Merge remote-tracking branch 'upstream/master'
.github/ISSUE_TEMPLATE/bug_report.md (2 changes)
@@ -31,7 +31,7 @@ If applicable, add screenshots to help explain your problem.
 - OS: [e.g. Windows 10/Raspberry Pi OS]
 - Python version: [e.g. python2.7]
 - Calibre-Web version: [e.g. 0.6.8 or 087c4c59 (git rev-parse --short HEAD)]:
-- Docker container: [None/Technosoft2000/Linuxuser]:
+- Docker container: [None/Technosoft2000/LinuxServer]:
 - Special Hardware: [e.g. Rasperry Pi Zero]
 - Browser: [e.g. Chrome 83.0.4103.97, Safari 13.3.7, Firefox 68.0.1 ESR]
.github/ISSUE_TEMPLATE/config.yml (new file, 1 change)
@@ -0,0 +1 @@
+blank_issues_enabled: false
.gitignore (3 changes)
@@ -10,6 +10,7 @@ env/
 venv/
 eggs/
 dist/
+executable/
 build/
 vendor/
 .eggs/
@@ -29,4 +30,4 @@ vendor/
 settings.yaml
 gdrive_credentials
 client_secrets.json
-
+gmail.json
README.md (10 changes)
@@ -12,7 +12,7 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d
 - full graphical setup
 - User management with fine-grained per-user permissions
 - Admin interface
-- User Interface in czech, dutch, english, finnish, french, german, greek, hungarian, italian, japanese, khmer, polish, russian, simplified chinese, spanish, swedish, turkish, ukrainian
+- User Interface in brazilian, czech, dutch, english, finnish, french, german, greek, hungarian, italian, japanese, khmer, polish, russian, simplified chinese, spanish, swedish, turkish, ukrainian
 - OPDS feed for eBook reader apps
 - Filter and search by titles, authors, tags, series and language
 - Create a custom book collection (shelves)
@@ -22,7 +22,7 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d
 - Support for public user registration
 - Send eBooks to Kindle devices with the click of a button
 - Sync your Kobo devices through Calibre-Web with your Calibre library
-- Support for reading eBooks directly in the browser (.txt, .epub, .pdf, .cbr, .cbt, .cbz)
+- Support for reading eBooks directly in the browser (.txt, .epub, .pdf, .cbr, .cbt, .cbz, .djvu)
 - Upload new books in many formats, including audio formats (.mp3, .m4a, .m4b)
 - Support for Calibre Custom Columns
 - Ability to hide content based on categories and Custom Column content per user
@@ -32,8 +32,8 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d

 ## Quick start

-1. Install dependencies by running `pip3 install --target vendor -r requirements.txt` (python3.x) or `pip install --target vendor -r requirements.txt` (python2.7).
-2. Execute the command: `python cps.py` (or `nohup python cps.py` - recommended if you want to exit the terminal window)
+1. Install dependencies by running `pip3 install --target vendor -r requirements.txt` (python3.x). Alternativly set up a python virtual environment.
+2. Execute the command: `python3 cps.py` (or `nohup python3 cps.py` - recommended if you want to exit the terminal window)
 3. Point your browser to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog
 4. Set `Location of Calibre database` to the path of the folder where your Calibre library (metadata.db) lives, push "submit" button\
    Optionally a Google Drive can be used to host the calibre library [-> Using Google Drive integration](https://github.com/janeczku/calibre-web/wiki/Configuration#using-google-drive-integration)
@@ -48,7 +48,7 @@ Please note that running the above install command can fail on some versions of

 ## Requirements

-python 3.x+, (Python 2.7+)
+python 3.x+

 Optionally, to enable on-the-fly conversion from one ebook format to another when using the send-to-kindle feature, or during editing of ebooks metadata:
cps.py (12 changes)
@@ -31,7 +31,7 @@ else:
     sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'vendor'))


-from cps import create_app, config
+from cps import create_app
 from cps import web_server
 from cps.opds import opds
 from cps.web import web
@@ -41,6 +41,8 @@ from cps.shelf import shelf
 from cps.admin import admi
 from cps.gdrive import gdrive
 from cps.editbooks import editbook
+from cps.remotelogin import remotelogin
+from cps.error_handler import init_errorhandler

 try:
     from cps.kobo import kobo, get_kobo_activated
@@ -58,14 +60,18 @@ except ImportError:

 def main():
     app = create_app()

+    init_errorhandler()
+
     app.register_blueprint(web)
     app.register_blueprint(opds)
     app.register_blueprint(jinjia)
     app.register_blueprint(about)
     app.register_blueprint(shelf)
     app.register_blueprint(admi)
-    if config.config_use_google_drive:
-        app.register_blueprint(gdrive)
+    app.register_blueprint(remotelogin)
+    # if config.config_use_google_drive:
+    app.register_blueprint(gdrive)
     app.register_blueprint(editbook)
     if kobo_available:
         app.register_blueprint(kobo)
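Note: for readers unfamiliar with the blueprint registration pattern used in main() above, a minimal standalone sketch (illustrative names only, not part of this commit):

    from flask import Blueprint, Flask

    web = Blueprint('web', __name__)

    @web.route('/')
    def index():
        return 'Calibre-Web'

    app = Flask(__name__)
    app.register_blueprint(web)  # cps.py attaches each feature blueprint this way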
cps/__init__.py
@@ -45,6 +45,7 @@ mimetypes.add_type('application/fb2+zip', '.fb2')
 mimetypes.add_type('application/x-mobipocket-ebook', '.mobi')
 mimetypes.add_type('application/x-mobipocket-ebook', '.prc')
 mimetypes.add_type('application/vnd.amazon.ebook', '.azw')
+mimetypes.add_type('application/x-mobi8-ebook', '.azw3')
 mimetypes.add_type('application/x-cbr', '.cbr')
 mimetypes.add_type('application/x-cbz', '.cbz')
 mimetypes.add_type('application/x-cbt', '.cbt')
@@ -94,9 +95,13 @@ def create_app():
         app.root_path = app.root_path.decode('utf-8')
         app.instance_path = app.instance_path.decode('utf-8')

-    cache_buster.init_cache_busting(app)
+    if os.environ.get('FLASK_DEBUG'):
+        cache_buster.init_cache_busting(app)

     log.info('Starting Calibre Web...')
+    if sys.version_info < (3, 0):
+        log.info('Python2 is EOL since end of 2019, this version of Calibre-Web is no longer supporting Python2 please consider upgrading to Python3')
+        print('Python2 is EOL since end of 2019, this version of Calibre-Web is no longer supporting Python2 please consider upgrading to Python3')
     Principal(app)
     lm.init_app(app)
     app.secret_key = os.getenv('SECRET_KEY', config_sql.get_flask_session_key(ub.session))
@@ -122,7 +127,7 @@ def get_locale():
     user = getattr(g, 'user', None)
     # user = None
     if user is not None and hasattr(user, "locale"):
-        if user.nickname != 'Guest':  # if the account is the guest account bypass the config lang settings
+        if user.name != 'Guest':  # if the account is the guest account bypass the config lang settings
             return user.locale

     preferred = list()
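Note: the effect of the new .azw3 registration can be checked with a one-off snippet (standalone, not part of the commit):

    import mimetypes

    mimetypes.add_type('application/x-mobi8-ebook', '.azw3')
    # guess_type returns a (type, encoding) tuple
    print(mimetypes.guess_type('book.azw3'))  # ('application/x-mobi8-ebook', None)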
cps/about.py (11 changes)
@@ -31,12 +31,13 @@ import werkzeug, flask, flask_login, flask_principal, jinja2
 from flask_babel import gettext as _

 from . import db, calibre_db, converter, uploader, server, isoLanguages, constants
-from .web import render_title_template
+from .render_template import render_title_template
 try:
     from flask_login import __version__ as flask_loginVersion
 except ImportError:
     from flask_login.__about__ import __version__ as flask_loginVersion
 try:
+    # pylint: disable=unused-import
     import unidecode
     # _() necessary to make babel aware of string for translation
     unidecode_version = _(u'installed')
@@ -48,6 +49,11 @@ try:
 except ImportError:
     flask_danceVersion = None

+try:
+    from greenlet import __version__ as greenlet_Version
+except ImportError:
+    greenlet_Version = None
+
 from . import services

 about = flask.Blueprint('about', __name__)
@@ -77,7 +83,8 @@ _VERSIONS = OrderedDict(
     python_LDAP = services.ldapVersion if bool(services.ldapVersion) else None,
     Goodreads = u'installed' if bool(services.goodreads_support) else None,
     jsonschema = services.SyncToken.__version__ if bool(services.SyncToken) else None,
-    flask_dance = flask_danceVersion
+    flask_dance = flask_danceVersion,
+    greenlet = greenlet_Version
 )
 _VERSIONS.update(uploader.get_versions())
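Note: the greenlet entry follows the same optional-dependency probe already used for the other version fields; a minimal standalone sketch of the pattern:

    try:
        from greenlet import __version__ as greenlet_Version
    except ImportError:
        greenlet_Version = None  # rendered as "not installed" on the About page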
cps/admin.py (1157 changes; large diff collapsed in this view)

cps/cache_buster.py
@@ -49,7 +49,7 @@ def init_cache_busting(app):
             # compute version component
             rooted_filename = os.path.join(dirpath, filename)
             with open(rooted_filename, 'rb') as f:
-                file_hash = hashlib.md5(f.read()).hexdigest()[:7]
+                file_hash = hashlib.md5(f.read()).hexdigest()[:7]  # nosec

             # save version to tables
             file_path = rooted_filename.replace(static_folder, "")
@@ -64,6 +64,7 @@ def init_cache_busting(app):
         return filename.split("?", 1)[0]

     @app.url_defaults
+    # pylint: disable=unused-variable
     def reverse_to_cache_busted_url(endpoint, values):
         """
         Make `url_for` produce busted filenames when using the 'static' endpoint.
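Note: a standalone sketch of the md5 cache-busting scheme above (hypothetical file path; the hash is only a cache key, not a security measure, hence the nosec marker):

    import hashlib

    with open('cps/static/css/style.css', 'rb') as f:
        file_hash = hashlib.md5(f.read()).hexdigest()[:7]  # nosec
    print('style.css?' + file_hash)  # e.g. style.css?1a2b3c4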
cps/cli.py (29 changes)
@@ -45,6 +45,7 @@ parser.add_argument('-v', '--version', action='version', help='Shows version num
                     version=version_info())
 parser.add_argument('-i', metavar='ip-address', help='Server IP-Address to listen')
 parser.add_argument('-s', metavar='user:pass', help='Sets specific username to new password')
+parser.add_argument('-f', action='store_true', help='Enables filepicker in unconfigured mode')
 args = parser.parse_args()

 if sys.version_info < (3, 0):
@@ -70,7 +71,7 @@ if args.c:
     if os.path.isfile(args.c):
         certfilepath = args.c
     else:
-        print("Certfilepath is invalid. Exiting...")
+        print("Certfile path is invalid. Exiting...")
         sys.exit(1)

 if args.c == "":
@@ -80,7 +81,7 @@ if args.k:
     if os.path.isfile(args.k):
         keyfilepath = args.k
     else:
-        print("Keyfilepath is invalid. Exiting...")
+        print("Keyfile path is invalid. Exiting...")
        sys.exit(1)

 if (args.k and not args.c) or (not args.k and args.c):
@@ -90,23 +91,29 @@ if (args.k and not args.c) or (not args.k and args.c):
 if args.k == "":
     keyfilepath = ""

-# handle and check ipadress argument
-ipadress = args.i or None
-if ipadress:
+# handle and check ip address argument
+ip_address = args.i or None
+if ip_address:
     try:
         # try to parse the given ip address with socket
         if hasattr(socket, 'inet_pton'):
-            if ':' in ipadress:
-                socket.inet_pton(socket.AF_INET6, ipadress)
+            if ':' in ip_address:
+                socket.inet_pton(socket.AF_INET6, ip_address)
             else:
-                socket.inet_pton(socket.AF_INET, ipadress)
+                socket.inet_pton(socket.AF_INET, ip_address)
         else:
             # on windows python < 3.4, inet_pton is not available
             # inet_atom only handles IPv4 addresses
-            socket.inet_aton(ipadress)
+            socket.inet_aton(ip_address)
     except socket.error as err:
-        print(ipadress, ':', err)
+        print(ip_address, ':', err)
         sys.exit(1)

 # handle and check user password argument
-user_password = args.s or None
+user_credentials = args.s or None
+if user_credentials and ":" not in user_credentials:
+    print("No valid 'username:password' format")
+    sys.exit(3)
+
+# Handles enabling of filepicker
+filepicker = args.f or None
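Note: a standalone sketch of the socket-based address check used above (assumes Python 3, where socket.error is an alias of OSError):

    import socket

    def is_valid_ip(ip_address):
        try:
            if ':' in ip_address:
                socket.inet_pton(socket.AF_INET6, ip_address)
            else:
                socket.inet_pton(socket.AF_INET, ip_address)
            return True
        except socket.error:
            return False

    assert is_valid_ip('127.0.0.1') and is_valid_ip('::1')
    assert not is_valid_ip('999.1.1.1')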
cps/comic.py (111 changes)
@@ -18,21 +18,21 @@

 from __future__ import division, print_function, unicode_literals
 import os
-import io

 from . import logger, isoLanguages
 from .constants import BookMeta

-try:
-    from PIL import Image as PILImage
-    use_PIL = True
-except ImportError as e:
-    use_PIL = False

 log = logger.create()

+try:
+    from wand.image import Image
+    use_IM = True
+except (ImportError, RuntimeError) as e:
+    use_IM = False

 try:
     from comicapi.comicarchive import ComicArchive, MetaDataStyle
     use_comic_meta = True
@@ -52,25 +52,63 @@ except (ImportError, LookupError) as e:
     use_rarfile = False
     use_comic_meta = False

+NO_JPEG_EXTENSIONS = ['.png', '.webp', '.bmp']
+COVER_EXTENSIONS = ['.png', '.webp', '.bmp', '.jpg', '.jpeg']

 def _cover_processing(tmp_file_name, img, extension):
-    if use_PIL:
+    tmp_cover_name = os.path.join(os.path.dirname(tmp_file_name), 'cover.jpg')
+    if use_IM:
         # convert to jpg because calibre only supports jpg
-        if extension in ('.png', '.webp'):
-            imgc = PILImage.open(io.BytesIO(img))
-            im = imgc.convert('RGB')
-            tmp_bytesio = io.BytesIO()
-            im.save(tmp_bytesio, format='JPEG')
-            img = tmp_bytesio.getvalue()
+        if extension in NO_JPEG_EXTENSIONS:
+            with Image(filename=tmp_file_name) as imgc:
+                imgc.format = 'jpeg'
+                imgc.transform_colorspace('rgb')
+                imgc.save(tmp_cover_name)
+                return tmp_cover_name

     if not img:
         return None

-    tmp_cover_name = os.path.join(os.path.dirname(tmp_file_name), 'cover.jpg')
     with open(tmp_cover_name, 'wb') as f:
         f.write(img)
     return tmp_cover_name


+def _extract_Cover_from_archive(original_file_extension, tmp_file_name, rarExecutable):
+    cover_data = None
+    if original_file_extension.upper() == '.CBZ':
+        cf = zipfile.ZipFile(tmp_file_name)
+        for name in cf.namelist():
+            ext = os.path.splitext(name)
+            if len(ext) > 1:
+                extension = ext[1].lower()
+                if extension in COVER_EXTENSIONS:
+                    cover_data = cf.read(name)
+                    break
+    elif original_file_extension.upper() == '.CBT':
+        cf = tarfile.TarFile(tmp_file_name)
+        for name in cf.getnames():
+            ext = os.path.splitext(name)
+            if len(ext) > 1:
+                extension = ext[1].lower()
+                if extension in COVER_EXTENSIONS:
+                    cover_data = cf.extractfile(name).read()
+                    break
+    elif original_file_extension.upper() == '.CBR' and use_rarfile:
+        try:
+            rarfile.UNRAR_TOOL = rarExecutable
+            cf = rarfile.RarFile(tmp_file_name)
+            for name in cf.getnames():
+                ext = os.path.splitext(name)
+                if len(ext) > 1:
+                    extension = ext[1].lower()
+                    if extension in COVER_EXTENSIONS:
+                        cover_data = cf.read(name)
+                        break
+        except Exception as ex:
+            log.debug('Rarfile failed with error: %s', ex)
+    return cover_data


 def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
     cover_data = extension = None
@@ -80,41 +118,11 @@ def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
-                if extension in ('.jpg', '.jpeg', '.png', '.webp'):
+                if extension in COVER_EXTENSIONS:
                     cover_data = archive.getPage(index)
                     break
     else:
-        if original_file_extension.upper() == '.CBZ':
-            cf = zipfile.ZipFile(tmp_file_name)
-            for name in cf.namelist():
-                ext = os.path.splitext(name)
-                if len(ext) > 1:
-                    extension = ext[1].lower()
-                    if extension in ('.jpg', '.jpeg', '.png', '.webp'):
-                        cover_data = cf.read(name)
-                        break
-        elif original_file_extension.upper() == '.CBT':
-            cf = tarfile.TarFile(tmp_file_name)
-            for name in cf.getnames():
-                ext = os.path.splitext(name)
-                if len(ext) > 1:
-                    extension = ext[1].lower()
-                    if extension in ('.jpg', '.jpeg', '.png', '.webp'):
-                        cover_data = cf.extractfile(name).read()
-                        break
-        elif original_file_extension.upper() == '.CBR' and use_rarfile:
-            try:
-                rarfile.UNRAR_TOOL = rarExecutable
-                cf = rarfile.RarFile(tmp_file_name)
-                for name in cf.getnames():
-                    ext = os.path.splitext(name)
-                    if len(ext) > 1:
-                        extension = ext[1].lower()
-                        if extension in ('.jpg', '.jpeg', '.png', '.webp'):
-                            cover_data = cf.read(name)
-                            break
-            except Exception as e:
-                log.debug('Rarfile failed with error: %s', e)
+        cover_data = _extract_Cover_from_archive(original_file_extension, tmp_file_name, rarExecutable)
     return _cover_processing(tmp_file_name, cover_data, extension)
@@ -139,13 +147,15 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r
                     file_path=tmp_file_path,
                     extension=original_file_extension,
                     title=loadedMetadata.title or original_file_name,
-                    author=" & ".join([credit["person"] for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
+                    author=" & ".join([credit["person"]
+                                       for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
                     cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
                     description=loadedMetadata.comments or "",
                     tags="",
                     series=loadedMetadata.series or "",
                     series_id=loadedMetadata.issue or "",
-                    languages=loadedMetadata.language)
+                    languages=loadedMetadata.language,
+                    publisher="")

     return BookMeta(
         file_path=tmp_file_path,
@@ -157,4 +167,5 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r
         tags="",
         series="",
         series_id="",
-        languages="")
+        languages="",
+        publisher="")
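Note: the PIL-to-Wand switch boils down to the following conversion pattern; a minimal standalone sketch (assumes the wand package and ImageMagick are installed; not part of the commit):

    from wand.image import Image

    def to_jpeg_cover(src_path, dest_path):
        with Image(filename=src_path) as img:  # e.g. a .png/.webp/.bmp page
            img.format = 'jpeg'
            img.transform_colorspace('rgb')
            img.save(filename=dest_path)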
cps/config_sql.py
@@ -20,11 +20,18 @@
 from __future__ import division, print_function, unicode_literals
 import os
 import sys
+import json

-from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
-from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy import Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
+from sqlalchemy.exc import OperationalError
+from sqlalchemy.sql.expression import text
+try:
+    # Compatibility with sqlalchemy 2.0
+    from sqlalchemy.orm import declarative_base
+except ImportError:
+    from sqlalchemy.ext.declarative import declarative_base

-from . import constants, cli, logger, ub
+from . import constants, cli, logger


 log = logger.create()
@@ -34,7 +41,7 @@ class _Flask_Settings(_Base):
     __tablename__ = 'flask_settings'

     id = Column(Integer, primary_key=True)
-    flask_session_key = Column(BLOB, default="")
+    flask_session_key = Column(BLOB, default=b"")

     def __init__(self, key):
         self.flask_session_key = key
@@ -53,6 +60,8 @@ class _Settings(_Base):
     mail_password = Column(String, default='mypassword')
     mail_from = Column(String, default='automailer <mail@example.com>')
     mail_size = Column(Integer, default=25*1024*1024)
+    mail_server_type = Column(SmallInteger, default=0)
+    mail_gmail_token = Column(JSON, default={})

     config_calibre_dir = Column(String)
     config_port = Column(Integer, default=constants.DEFAULT_PORT)
@@ -65,7 +74,7 @@ class _Settings(_Base):
     config_random_books = Column(Integer, default=4)
     config_authors_max = Column(Integer, default=0)
     config_read_column = Column(Integer, default=0)
-    config_title_regex = Column(String, default=u'^(A|The|An|Der|Die|Das|Den|Ein|Eine|Einen|Dem|Des|Einem|Eines)\s+')
+    config_title_regex = Column(String, default=r'^(A|The|An|Der|Die|Das|Den|Ein|Eine|Einen|Dem|Des|Einem|Eines)\s+')
     config_mature_content_tags = Column(String, default='')
     config_theme = Column(Integer, default=0)
@@ -145,15 +154,16 @@ class _ConfigSQL(object):
         self.load()

         change = False
-        if self.config_converterpath == None:
+        if self.config_converterpath == None:  # pylint: disable=access-member-before-definition
             change = True
             self.config_converterpath = autodetect_calibre_binary()

-        if self.config_kepubifypath == None:
+        if self.config_kepubifypath == None:  # pylint: disable=access-member-before-definition
             change = True
             self.config_kepubifypath = autodetect_kepubify_binary()

-        if self.config_rarfile_location == None:
+        if self.config_rarfile_location == None:  # pylint: disable=access-member-before-definition
             change = True
             self.config_rarfile_location = autodetect_unrar_binary()
         if change:
@@ -180,8 +190,9 @@ class _ConfigSQL(object):
             return None
         return self.config_keyfile

-    def get_config_ipaddress(self):
-        return cli.ipadress or ""
+    @staticmethod
+    def get_config_ipaddress():
+        return cli.ip_address or ""

     def _has_role(self, role_flag):
         return constants.has_flag(self.config_default_role, role_flag)
@@ -239,18 +250,18 @@ class _ConfigSQL(object):
         return {k:v for k, v in self.__dict__.items() if k.startswith('mail_')}

     def get_mail_server_configured(self):
-        return not bool(self.mail_server == constants.DEFAULT_MAIL_SERVER)
+        return bool((self.mail_server != constants.DEFAULT_MAIL_SERVER and self.mail_server_type == 0)
+                    or (self.mail_gmail_token != {} and self.mail_server_type == 1))

     def set_from_dictionary(self, dictionary, field, convertor=None, default=None, encode=None):
-        '''Possibly updates a field of this object.
+        """Possibly updates a field of this object.
         The new value, if present, is grabbed from the given dictionary, and optionally passed through a convertor.

         :returns: `True` if the field has changed value
-        '''
+        """
         new_value = dictionary.get(field, default)
         if new_value is None:
-            # log.debug("_ConfigSQL set_from_dictionary field '%s' not found", field)
             return False

         if field not in self.__dict__:
@@ -267,10 +278,17 @@ class _ConfigSQL(object):
         if current_value == new_value:
             return False

-        # log.debug("_ConfigSQL set_from_dictionary '%s' = %r (was %r)", field, new_value, current_value)
         setattr(self, field, new_value)
         return True

+    def toDict(self):
+        storage = {}
+        for k, v in self.__dict__.items():
+            if k[0] != '_' and not k.endswith("password") and not k.endswith("secret"):
+                storage[k] = v
+        return storage

     def load(self):
         '''Load all configuration values from the underlying storage.'''
         s = self._read_from_storage()  # type: _Settings
@@ -290,12 +308,20 @@ class _ConfigSQL(object):
         have_metadata_db = os.path.isfile(db_file)
         self.db_configured = have_metadata_db
         constants.EXTENSIONS_UPLOAD = [x.lstrip().rstrip().lower() for x in self.config_upload_formats.split(',')]
-        logfile = logger.setup(self.config_logfile, self.config_log_level)
+        if os.environ.get('FLASK_DEBUG'):
+            logfile = logger.setup(logger.LOG_TO_STDOUT, logger.logging.DEBUG)
+        else:
+            # pylint: disable=access-member-before-definition
+            logfile = logger.setup(self.config_logfile, self.config_log_level)
         if logfile != self.config_logfile:
             log.warning("Log path %s not valid, falling back to default", self.config_logfile)
             self.config_logfile = logfile
         self._session.merge(s)
-        self._session.commit()
+        try:
+            self._session.commit()
+        except OperationalError as e:
+            log.error('Database error: %s', e)
+            self._session.rollback()

     def save(self):
         '''Apply all configuration values to the underlying storage.'''
@@ -309,7 +335,11 @@ class _ConfigSQL(object):

         log.debug("_ConfigSQL updating storage")
         self._session.merge(s)
-        self._session.commit()
+        try:
+            self._session.commit()
+        except OperationalError as e:
+            log.error('Database error: %s', e)
+            self._session.rollback()
         self.load()

     def invalidate(self, error=None):
@@ -328,7 +358,7 @@ def _migrate_table(session, orm_class):
         if column_name[0] != '_':
             try:
                 session.query(column).first()
-            except exc.OperationalError as err:
+            except OperationalError as err:
                 log.debug("%s: %s", column_name, err.args[0])
                 if column.default is not None:
                     if sys.version_info < (3, 0):
@@ -338,19 +368,29 @@ def _migrate_table(session, orm_class):
                             column_default = ""
                     else:
                         if isinstance(column.default.arg, bool):
-                            column_default = ("DEFAULT %r" % int(column.default.arg))
+                            column_default = "DEFAULT {}".format(int(column.default.arg))
                         else:
-                            column_default = ("DEFAULT %r" % column.default.arg)
-                    alter_table = "ALTER TABLE %s ADD COLUMN `%s` %s %s" % (orm_class.__tablename__,
+                            column_default = "DEFAULT `{}`".format(column.default.arg)
+                    if isinstance(column.type, JSON):
+                        column_type = "JSON"
+                    else:
+                        column_type = column.type
+                    alter_table = text("ALTER TABLE %s ADD COLUMN `%s` %s %s" % (orm_class.__tablename__,
                                                                              column_name,
-                                                                             column.type,
-                                                                             column_default)
+                                                                             column_type,
+                                                                             column_default))
                     log.debug(alter_table)
                     session.execute(alter_table)
                     changed = True
+            except json.decoder.JSONDecodeError as e:
+                log.error("Database corrupt column: {}".format(column_name))
+                log.debug(e)

     if changed:
-        session.commit()
+        try:
+            session.commit()
+        except OperationalError:
+            session.rollback()
@@ -403,12 +443,12 @@ def load_configuration(session):
         session.commit()
     conf = _ConfigSQL(session)
     # Migrate from global restrictions to user based restrictions
-    if bool(conf.config_default_show & constants.MATURE_CONTENT) and conf.config_denied_tags == "":
-        conf.config_denied_tags = conf.config_mature_content_tags
-        conf.save()
-        session.query(ub.User).filter(ub.User.mature_content != True). \
-            update({"denied_tags": conf.config_mature_content_tags}, synchronize_session=False)
-        session.commit()
+    #if bool(conf.config_default_show & constants.MATURE_CONTENT) and conf.config_denied_tags == "":
+    #    conf.config_denied_tags = conf.config_mature_content_tags
+    #    conf.save()
+    #    session.query(ub.User).filter(ub.User.mature_content != True). \
+    #        update({"denied_tags": conf.config_mature_content_tags}, synchronize_session=False)
+    #    session.commit()
     return conf

 def get_flask_session_key(session):
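Note: the text()-wrapped DDL adopted above is required by newer SQLAlchemy, which no longer executes raw SQL strings directly; a standalone sketch (illustrative table and column names, assumes SQLAlchemy 1.4+):

    from sqlalchemy import create_engine
    from sqlalchemy.sql.expression import text

    engine = create_engine('sqlite://')
    with engine.begin() as connection:
        connection.execute(text('CREATE TABLE flask_settings (id INTEGER PRIMARY KEY)'))
        # raw SQL must be wrapped in text() before execution
        connection.execute(text('ALTER TABLE flask_settings ADD COLUMN `mail_server_type` SMALLINT DEFAULT 0'))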
cps/constants.py
@@ -21,7 +21,11 @@ import sys
 import os
 from collections import namedtuple

-HOME_CONFIG = False
+# if installed via pip this variable is set to true (empty file with name .HOMEDIR present)
+HOME_CONFIG = os.path.isfile(os.path.join(os.path.dirname(os.path.abspath(__file__)), '.HOMEDIR'))
+
+#In executables updater is not available, so variable is set to False there
+UPDATER_AVAILABLE = True

 # Base dir is parent of current file, necessary if called from different folder
 if sys.version_info < (3, 0):
@@ -40,7 +44,7 @@ if HOME_CONFIG:
         os.makedirs(home_dir)
     CONFIG_DIR = os.environ.get('CALIBRE_DBPATH', home_dir)
 else:
     CONFIG_DIR = os.environ.get('CALIBRE_DBPATH', BASE_DIR)


 ROLE_USER = 0 << 0
@@ -84,6 +88,26 @@ SIDEBAR_ARCHIVED = 1 << 15
 SIDEBAR_DOWNLOAD = 1 << 16
 SIDEBAR_LIST = 1 << 17

+sidebar_settings = {
+    "detail_random": DETAIL_RANDOM,
+    "sidebar_language": SIDEBAR_LANGUAGE,
+    "sidebar_series": SIDEBAR_SERIES,
+    "sidebar_category": SIDEBAR_CATEGORY,
+    "sidebar_random": SIDEBAR_RANDOM,
+    "sidebar_author": SIDEBAR_AUTHOR,
+    "sidebar_best_rated": SIDEBAR_BEST_RATED,
+    "sidebar_read_and_unread": SIDEBAR_READ_AND_UNREAD,
+    "sidebar_recent": SIDEBAR_RECENT,
+    "sidebar_sorted": SIDEBAR_SORTED,
+    "sidebar_publisher": SIDEBAR_PUBLISHER,
+    "sidebar_rating": SIDEBAR_RATING,
+    "sidebar_format": SIDEBAR_FORMAT,
+    "sidebar_archived": SIDEBAR_ARCHIVED,
+    "sidebar_download": SIDEBAR_DOWNLOAD,
+    "sidebar_list": SIDEBAR_LIST,
+}
+

 ADMIN_USER_ROLES = sum(r for r in ALL_ROLES.values()) & ~ROLE_ANONYMOUS
 ADMIN_USER_SIDEBAR = (SIDEBAR_LIST << 1) - 1
@@ -102,7 +126,7 @@ LDAP_AUTH_SIMPLE = 0

 DEFAULT_MAIL_SERVER = "mail.example.org"

-DEFAULT_PASSWORD = "admin123"
+DEFAULT_PASSWORD = "admin123"  # nosec
 DEFAULT_PORT = 8083
 env_CALIBRE_PORT = os.environ.get("CALIBRE_PORT", DEFAULT_PORT)
 try:
@@ -128,9 +152,9 @@ def selected_roles(dictionary):

 # :rtype: BookMeta
 BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
-                                  'series_id, languages')
+                                  'series_id, languages, publisher')

-STABLE_VERSION = {'version': '0.6.10 Beta'}
+STABLE_VERSION = {'version': '0.6.12 Beta'}

 NIGHTLY_VERSION = {}
 NIGHTLY_VERSION[0] = '$Format:%H$'
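Note: widening the BookMeta namedtuple means every construction site must now supply publisher, which is why cps/comic.py gains publisher="" above; a quick illustration (hypothetical values, standalone):

    from collections import namedtuple

    BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
                                      'series_id, languages, publisher')
    meta = BookMeta('/tmp/b.cbz', '.cbz', 'Example', 'Unknown', None, '', '', '', '', '', publisher='')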
cps/db.py (273 changes)
@@ -30,11 +30,17 @@ from sqlalchemy import Table, Column, ForeignKey, CheckConstraint
 from sqlalchemy import String, Integer, Boolean, TIMESTAMP, Float
 from sqlalchemy.orm import relationship, sessionmaker, scoped_session
 from sqlalchemy.orm.collections import InstrumentedList
-from sqlalchemy.ext.declarative import declarative_base, DeclarativeMeta
+from sqlalchemy.ext.declarative import DeclarativeMeta
+from sqlalchemy.exc import OperationalError
+try:
+    # Compatibility with sqlalchemy 2.0
+    from sqlalchemy.orm import declarative_base
+except ImportError:
+    from sqlalchemy.ext.declarative import declarative_base
 from sqlalchemy.pool import StaticPool
-from flask_login import current_user
 from sqlalchemy.sql.expression import and_, true, false, text, func, or_
 from sqlalchemy.ext.associationproxy import association_proxy
+from flask_login import current_user
 from babel import Locale as LC
 from babel.core import UnknownLocaleError
 from flask_babel import gettext as _
@@ -50,6 +56,8 @@ try:
 except ImportError:
     use_unidecode = False

+log = logger.create()
+
 cc_exceptions = ['datetime', 'comments', 'composite', 'series']
 cc_classes = {}

@@ -113,6 +121,8 @@ class Identifiers(Base):
             return u"Douban"
         elif format_type == "goodreads":
             return u"Goodreads"
+        elif format_type == "babelio":
+            return u"Babelio"
         elif format_type == "google":
             return u"Google Books"
         elif format_type == "kobo":
@@ -140,6 +150,8 @@ class Identifiers(Base):
             return u"https://dx.doi.org/{0}".format(self.val)
         elif format_type == "goodreads":
             return u"https://www.goodreads.com/book/show/{0}".format(self.val)
+        elif format_type == "babelio":
+            return u"https://www.babelio.com/livres/titre/{0}".format(self.val)
         elif format_type == "douban":
             return u"https://book.douban.com/subject/{0}".format(self.val)
         elif format_type == "google":
@@ -154,10 +166,8 @@ class Identifiers(Base):
             return u"https://portal.issn.org/resource/ISSN/{0}".format(self.val)
         elif format_type == "isfdb":
             return u"http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
-        elif format_type == "url":
-            return u"{0}".format(self.val)
         else:
-            return u""
+            return u"{0}".format(self.val)


 class Comments(Base):
@@ -326,7 +336,6 @@ class Books(Base):
     has_cover = Column(Integer, default=0)
     uuid = Column(String)
     isbn = Column(String(collation='NOCASE'), default="")
-    # Iccn = Column(String(collation='NOCASE'), default="")
     flags = Column(Integer, nullable=False, default=1)

     authors = relationship('Authors', secondary=books_authors_link, backref='books')
@@ -384,14 +393,14 @@ class Custom_Columns(Base):

 class AlchemyEncoder(json.JSONEncoder):

-    def default(self, obj):
-        if isinstance(obj.__class__, DeclarativeMeta):
+    def default(self, o):
+        if isinstance(o.__class__, DeclarativeMeta):
             # an SQLAlchemy class
             fields = {}
-            for field in [x for x in dir(obj) if not x.startswith('_') and x != 'metadata']:
+            for field in [x for x in dir(o) if not x.startswith('_') and x != 'metadata' and x!="password"]:
                 if field == 'books':
                     continue
-                data = obj.__getattribute__(field)
+                data = o.__getattribute__(field)
                 try:
                     if isinstance(data, str):
                         data = data.replace("'", "\'")
@@ -402,18 +411,21 @@ class AlchemyEncoder(json.JSONEncoder):
                             el.append(ele.get())
                         else:
                             el.append(json.dumps(ele, cls=AlchemyEncoder))
-                    data = ",".join(el)
+                    if field == 'authors':
+                        data = " & ".join(el)
+                    else:
+                        data = ",".join(el)
                     if data == '[]':
                         data = ""
                 else:
                     json.dumps(data)
                 fields[field] = data
-            except:
+            except Exception:
                 fields[field] = ""
             # a json-encodable dict
             return fields

-        return json.JSONEncoder.default(self, obj)
+        return json.JSONEncoder.default(self, o)


 class CalibreDB():
@@ -425,25 +437,96 @@ class CalibreDB():
     # instances alive once they reach the end of their respective scopes
     instances = WeakSet()

-    def __init__(self):
+    def __init__(self, expire_on_commit=True):
         """ Initialize a new CalibreDB session
         """
         self.session = None
         if self._init:
-            self.initSession()
+            self.initSession(expire_on_commit)

         self.instances.add(self)

-    def initSession(self):
+    def initSession(self, expire_on_commit=True):
         self.session = self.session_factory()
+        self.session.expire_on_commit = expire_on_commit
         self.update_title_sort(self.config)

+    @classmethod
+    def setup_db_cc_classes(self, cc):
+        cc_ids = []
+        books_custom_column_links = {}
+        for row in cc:
+            if row.datatype not in cc_exceptions:
+                if row.datatype == 'series':
+                    dicttable = {'__tablename__': 'books_custom_column_' + str(row.id) + '_link',
+                                 'id': Column(Integer, primary_key=True),
+                                 'book': Column(Integer, ForeignKey('books.id'),
+                                                primary_key=True),
+                                 'map_value': Column('value', Integer,
+                                                     ForeignKey('custom_column_' +
+                                                                str(row.id) + '.id'),
+                                                     primary_key=True),
+                                 'extra': Column(Float),
+                                 'asoc': relationship('custom_column_' + str(row.id), uselist=False),
+                                 'value': association_proxy('asoc', 'value')
+                                 }
+                    books_custom_column_links[row.id] = type(str('books_custom_column_' + str(row.id) + '_link'),
+                                                             (Base,), dicttable)
+                else:
+                    books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link',
+                                                              Base.metadata,
+                                                              Column('book', Integer, ForeignKey('books.id'),
+                                                                     primary_key=True),
+                                                              Column('value', Integer,
+                                                                     ForeignKey('custom_column_' +
+                                                                                str(row.id) + '.id'),
+                                                                     primary_key=True)
+                                                              )
+                cc_ids.append([row.id, row.datatype])
+
+                ccdict = {'__tablename__': 'custom_column_' + str(row.id),
+                          'id': Column(Integer, primary_key=True)}
+                if row.datatype == 'float':
+                    ccdict['value'] = Column(Float)
+                elif row.datatype == 'int':
+                    ccdict['value'] = Column(Integer)
+                elif row.datatype == 'bool':
+                    ccdict['value'] = Column(Boolean)
+                else:
+                    ccdict['value'] = Column(String)
+                if row.datatype in ['float', 'int', 'bool']:
+                    ccdict['book'] = Column(Integer, ForeignKey('books.id'))
+                cc_classes[row.id] = type(str('custom_column_' + str(row.id)), (Base,), ccdict)
+
+        for cc_id in cc_ids:
+            if (cc_id[1] == 'bool') or (cc_id[1] == 'int') or (cc_id[1] == 'float'):
+                setattr(Books,
+                        'custom_column_' + str(cc_id[0]),
+                        relationship(cc_classes[cc_id[0]],
+                                     primaryjoin=(
+                                         Books.id == cc_classes[cc_id[0]].book),
+                                     backref='books'))
+            elif (cc_id[1] == 'series'):
+                setattr(Books,
+                        'custom_column_' + str(cc_id[0]),
+                        relationship(books_custom_column_links[cc_id[0]],
+                                     backref='books'))
+            else:
+                setattr(Books,
+                        'custom_column_' + str(cc_id[0]),
+                        relationship(cc_classes[cc_id[0]],
+                                     secondary=books_custom_column_links[cc_id[0]],
+                                     backref='books'))
+
+        return cc_classes
+
     @classmethod
     def setup_db(cls, config, app_db_path):
         cls.config = config
         cls.dispose()

+        # toDo: if db changed -> delete shelfs, delete download books, delete read boks, kobo sync??
+
         if not config.config_calibre_dir:
             config.invalidate()
             return False
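Note: setup_db_cc_classes builds one mapped class per Calibre custom column at runtime via type(); a minimal standalone sketch of that trick (illustrative names, assumes SQLAlchemy 1.4+):

    from sqlalchemy import Column, Integer, String
    from sqlalchemy.orm import declarative_base

    Base = declarative_base()
    # type() creates a declarative model class dynamically, one per column id
    cc_class = type('custom_column_1', (Base,),
                    {'__tablename__': 'custom_column_1',
                     'id': Column(Integer, primary_key=True),
                     'value': Column(String)})
    print(cc_class.__table__.columns.keys())  # ['id', 'value']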
@ -459,84 +542,24 @@ class CalibreDB():
|
||||||
isolation_level="SERIALIZABLE",
|
isolation_level="SERIALIZABLE",
|
||||||
connect_args={'check_same_thread': False},
|
connect_args={'check_same_thread': False},
|
||||||
poolclass=StaticPool)
|
poolclass=StaticPool)
|
||||||
cls.engine.execute("attach database '{}' as calibre;".format(dbpath))
|
with cls.engine.begin() as connection:
|
||||||
cls.engine.execute("attach database '{}' as app_settings;".format(app_db_path))
|
connection.execute(text("attach database '{}' as calibre;".format(dbpath)))
|
||||||
|
connection.execute(text("attach database '{}' as app_settings;".format(app_db_path)))
|
||||||
|
|
||||||
conn = cls.engine.connect()
|
conn = cls.engine.connect()
|
||||||
         # conn.text_factory = lambda b: b.decode(errors = 'ignore') possible fix for #1302
-        except Exception as e:
-            config.invalidate(e)
+        except Exception as ex:
+            config.invalidate(ex)
             return False

         config.db_configured = True

         if not cc_classes:
-            cc = conn.execute("SELECT id, datatype FROM custom_columns")
-            cc_ids = []
-            books_custom_column_links = {}
-            for row in cc:
-                if row.datatype not in cc_exceptions:
-                    if row.datatype == 'series':
-                        dicttable = {'__tablename__': 'books_custom_column_' + str(row.id) + '_link',
-                                     'id': Column(Integer, primary_key=True),
-                                     'book': Column(Integer, ForeignKey('books.id'),
-                                                    primary_key=True),
-                                     'map_value': Column('value', Integer,
-                                                         ForeignKey('custom_column_' +
-                                                                    str(row.id) + '.id'),
-                                                         primary_key=True),
-                                     'extra': Column(Float),
-                                     'asoc': relationship('custom_column_' + str(row.id), uselist=False),
-                                     'value': association_proxy('asoc', 'value')
-                                     }
-                        books_custom_column_links[row.id] = type(str('books_custom_column_' + str(row.id) + '_link'),
-                                                                 (Base,), dicttable)
-                    else:
-                        books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link',
-                                                                  Base.metadata,
-                                                                  Column('book', Integer, ForeignKey('books.id'),
-                                                                         primary_key=True),
-                                                                  Column('value', Integer,
-                                                                         ForeignKey('custom_column_' +
-                                                                                    str(row.id) + '.id'),
-                                                                         primary_key=True)
-                                                                  )
-                    cc_ids.append([row.id, row.datatype])
-
-                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                              'id': Column(Integer, primary_key=True)}
-                    if row.datatype == 'float':
-                        ccdict['value'] = Column(Float)
-                    elif row.datatype == 'int':
-                        ccdict['value'] = Column(Integer)
-                    elif row.datatype == 'bool':
-                        ccdict['value'] = Column(Boolean)
-                    else:
-                        ccdict['value'] = Column(String)
-                    if row.datatype in ['float', 'int', 'bool']:
-                        ccdict['book'] = Column(Integer, ForeignKey('books.id'))
-                    cc_classes[row.id] = type(str('custom_column_' + str(row.id)), (Base,), ccdict)
-
-            for cc_id in cc_ids:
-                if (cc_id[1] == 'bool') or (cc_id[1] == 'int') or (cc_id[1] == 'float'):
-                    setattr(Books,
-                            'custom_column_' + str(cc_id[0]),
-                            relationship(cc_classes[cc_id[0]],
-                                         primaryjoin=(
-                                             Books.id == cc_classes[cc_id[0]].book),
-                                         backref='books'))
-                elif (cc_id[1] == 'series'):
-                    setattr(Books,
-                            'custom_column_' + str(cc_id[0]),
-                            relationship(books_custom_column_links[cc_id[0]],
-                                         backref='books'))
-                else:
-                    setattr(Books,
-                            'custom_column_' + str(cc_id[0]),
-                            relationship(cc_classes[cc_id[0]],
-                                         secondary=books_custom_column_links[cc_id[0]],
-                                         backref='books'))
+            try:
+                cc = conn.execute("SELECT id, datatype FROM custom_columns")
+                cls.setup_db_cc_classes(cc)
+            except OperationalError as e:
+                log.debug_or_exception(e)

         cls.session_factory = scoped_session(sessionmaker(autocommit=False,
                                                           autoflush=True,
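The inline block deleted above moves into a setup_db_cc_classes() helper on CalibreDB. For orientation, here is a minimal, self-contained sketch of the underlying technique — building one SQLAlchemy mapped class per Calibre custom column at runtime with type() — using illustrative names, not the exact helper code:

    from sqlalchemy import Column, ForeignKey, Integer, String
    from sqlalchemy.orm import declarative_base, relationship

    Base = declarative_base()

    class Books(Base):
        __tablename__ = 'books'
        id = Column(Integer, primary_key=True)

    def make_custom_column_class(cc_id, datatype):
        # One table per custom column: custom_column_<id>, shaped by its datatype
        attrs = {'__tablename__': 'custom_column_' + str(cc_id),
                 'id': Column(Integer, primary_key=True),
                 'book': Column(Integer, ForeignKey('books.id')),
                 'value': Column(Integer if datatype == 'int' else String)}
        # type() builds the declarative class at runtime, the same trick used above
        return type('custom_column_' + str(cc_id), (Base,), attrs)

    # Example: metadata.db declares column id 3 with datatype 'int'
    CustomColumn3 = make_custom_column_class(3, 'int')
    setattr(Books, 'custom_column_3',
            relationship(CustomColumn3,
                         primaryjoin=Books.id == CustomColumn3.book,
                         backref='books'))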
@@ -557,8 +580,8 @@ class CalibreDB():
     def get_book_by_uuid(self, book_uuid):
         return self.session.query(Books).filter(Books.uuid == book_uuid).first()

-    def get_book_format(self, book_id, format):
-        return self.session.query(Data).filter(Data.book == book_id).filter(Data.format == format).first()
+    def get_book_format(self, book_id, file_format):
+        return self.session.query(Data).filter(Data.book == book_id).filter(Data.format == file_format).first()

     # Language and content filters for displaying in the UI
     def common_filters(self, allow_show_archived=False):
@@ -597,6 +620,22 @@ class CalibreDB():
         return and_(lang_filter, pos_content_tags_filter, ~neg_content_tags_filter,
                     pos_content_cc_filter, ~neg_content_cc_filter, archived_filter)

+    @staticmethod
+    def get_checkbox_sorted(inputlist, state, offset, limit, order):
+        outcome = list()
+        elementlist = {ele.id: ele for ele in inputlist}
+        for entry in state:
+            try:
+                outcome.append(elementlist[entry])
+            except KeyError:
+                pass
+            del elementlist[entry]
+        for entry in elementlist:
+            outcome.append(elementlist[entry])
+        if order == "asc":
+            outcome.reverse()
+        return outcome[offset:offset + limit]
+
     # Fill indexpage with all requested data from database
     def fill_indexpage(self, page, pagesize, database, db_filter, order, *join):
         return self.fill_indexpage_with_archived_books(page, pagesize, database, db_filter, order, False, *join)
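The new get_checkbox_sorted() helper puts the entries whose ids appear in state first, appends the rest, then pages the result. A hedged usage sketch with a stand-in record type (note that the del sits outside the try/except, so an id in state with no matching row would raise KeyError):

    from collections import namedtuple

    Row = namedtuple('Row', ['id', 'name'])  # stand-in for an ORM row with an .id

    rows = [Row(1, 'Read'), Row(2, 'Wishlist'), Row(3, 'Favorites')]
    checked = [3, 1]  # ids ticked in the UI, in display order

    # Checked entries (3, then 1) lead, unchecked (2) follows; offset/limit page it
    page = CalibreDB.get_checkbox_sorted(rows, checked, 0, 10, "desc")
    print([r.id for r in page])  # -> [3, 1, 2]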
@@ -608,19 +647,29 @@ class CalibreDB():
             randm = self.session.query(Books) \
                 .filter(self.common_filters(allow_show_archived)) \
                 .order_by(func.random()) \
-                .limit(self.config.config_random_books)
+                .limit(self.config.config_random_books).all()
         else:
             randm = false()
         off = int(int(pagesize) * (page - 1))
-        query = self.session.query(database) \
-            .join(*join, isouter=True) \
-            .filter(db_filter) \
+        query = self.session.query(database)
+        if len(join) == 3:
+            query = query.outerjoin(join[0], join[1]).outerjoin(join[2])
+        elif len(join) == 2:
+            query = query.outerjoin(join[0], join[1])
+        elif len(join) == 1:
+            query = query.outerjoin(join[0])
+        query = query.filter(db_filter)\
             .filter(self.common_filters(allow_show_archived))
-        pagination = Pagination(page, pagesize,
-                                len(query.all()))
-        entries = query.order_by(*order).offset(off).limit(pagesize).all()
-        for book in entries:
-            book = self.order_authors(book)
+        entries = list()
+        pagination = list()
+        try:
+            pagination = Pagination(page, pagesize,
+                                    len(query.all()))
+            entries = query.order_by(*order).offset(off).limit(pagesize).all()
+        except Exception as ex:
+            log.debug_or_exception(ex)
+        #for book in entries:
+        #    book = self.order_authors(book)
         return entries, randm, pagination

     # Orders all Authors in the list according to authors sort
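The star-expanded .join(*join, isouter=True) could not express a (target, onclause) pair followed by a second join target, hence the explicit dispatch on len(join). A minimal sketch of the same dispatch as a reusable function (an illustration, not code from the diff):

    from sqlalchemy.orm import Query

    def apply_outer_joins(query: Query, *join) -> Query:
        # join may be (target,), (target, onclause), or
        # (target, onclause, second_target); anything longer is unsupported
        if len(join) == 3:
            return query.outerjoin(join[0], join[1]).outerjoin(join[2])
        if len(join) == 2:
            return query.outerjoin(join[0], join[1])
        if len(join) == 1:
            return query.outerjoin(join[0])
        return query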
@@ -660,23 +709,33 @@ class CalibreDB():
         return self.session.query(Books) \
             .filter(and_(Books.authors.any(and_(*q)), func.lower(Books.title).ilike("%" + title + "%"))).first()

-    # read search results from calibre-database and return it (function is used for feed and simple search
-    def get_search_results(self, term, offset=None, order=None, limit=None):
-        order = order or [Books.sort]
-        pagination = None
+    def search_query(self, term, *join):
         term.strip().lower()
         self.session.connection().connection.connection.create_function("lower", 1, lcase)
         q = list()
         authorterms = re.split("[, ]+", term)
         for authorterm in authorterms:
             q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + authorterm + "%")))
-        result = self.session.query(Books).filter(self.common_filters(True)).filter(
+        query = self.session.query(Books)
+        if len(join) == 3:
+            query = query.outerjoin(join[0], join[1]).outerjoin(join[2])
+        elif len(join) == 2:
+            query = query.outerjoin(join[0], join[1])
+        elif len(join) == 1:
+            query = query.outerjoin(join[0])
+        return query.filter(self.common_filters(True)).filter(
             or_(Books.tags.any(func.lower(Tags.name).ilike("%" + term + "%")),
                 Books.series.any(func.lower(Series.name).ilike("%" + term + "%")),
                 Books.authors.any(and_(*q)),
                 Books.publishers.any(func.lower(Publishers.name).ilike("%" + term + "%")),
                 func.lower(Books.title).ilike("%" + term + "%")
-                )).order_by(*order).all()
+                ))
+
+    # read search results from calibre-database and return it (function is used for feed and simple search
+    def get_search_results(self, term, offset=None, order=None, limit=None, *join):
+        order = order or [Books.sort]
+        pagination = None
+        result = self.search_query(term, *join).order_by(*order).all()
         result_count = len(result)
         if offset != None and limit != None:
             offset = int(offset)
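After this split, feed and simple search share one query builder, and callers can thread the same outer-join triple through the trailing *join. A hedged usage sketch (entity and link-table names as defined in this module; the three positional join arguments mirror the len(join) == 3 branch):

    # plain search, default Books.sort ordering
    books = calibre_db.get_search_results("dune")

    # feed-style search with offset/limit, joined so `order` can reference Series
    books = calibre_db.get_search_results(
        "dune", 0, [Series.name], 25,
        books_series_link, Books.id == books_series_link.c.book, Series)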
@@ -731,7 +790,7 @@ class CalibreDB():
         if old_session:
             try:
                 old_session.close()
-            except:
+            except Exception:
                 pass
             if old_session.bind:
                 try:
@@ -762,7 +821,7 @@ class CalibreDB():
 def lcase(s):
     try:
         return unidecode.unidecode(s.lower())
-    except Exception as e:
+    except Exception as ex:
         log = logger.create()
-        log.exception(e)
+        log.debug_or_exception(ex)
         return s.lower()
cps/debug_info.py

@@ -21,7 +21,12 @@ import shutil
 import glob
 import zipfile
 import json
-import io
+from io import BytesIO
+try:
+    from StringIO import StringIO
+except ImportError:
+    from io import StringIO

 import os

 from flask import send_file
@@ -32,11 +37,12 @@ from .about import collect_stats
 log = logger.create()


 def assemble_logfiles(file_name):
-    log_list = glob.glob(file_name + '*')
-    wfd = io.StringIO()
+    log_list = sorted(glob.glob(file_name + '*'), reverse=True)
+    wfd = StringIO()
     for f in log_list:
         with open(f, 'r') as fd:
             shutil.copyfileobj(fd, wfd)
+    wfd.seek(0)
     return send_file(wfd,
                      as_attachment=True,
                      attachment_filename=os.path.basename(file_name))
@@ -44,8 +50,12 @@ def assemble_logfiles(file_name):
 def send_debug():
     file_list = glob.glob(logger.get_logfile(config.config_logfile) + '*')
     file_list.extend(glob.glob(logger.get_accesslogfile(config.config_access_logfile) + '*'))
-    memory_zip = io.BytesIO()
+    for element in [logger.LOG_TO_STDOUT, logger.LOG_TO_STDERR]:
+        if element in file_list:
+            file_list.remove(element)
+    memory_zip = BytesIO()
     with zipfile.ZipFile(memory_zip, 'w', compression=zipfile.ZIP_DEFLATED) as zf:
+        zf.writestr('settings.txt', json.dumps(config.toDict()))
         zf.writestr('libs.txt', json.dumps(collect_stats()))
         for fp in file_list:
             zf.write(fp, os.path.basename(fp))
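The debug archive is assembled entirely in memory. A minimal self-contained sketch of the same pattern — zip into a BytesIO, rewind, hand it to Flask (attachment_filename is the Flask 1.x argument name, as used above; names are illustrative):

    import json
    import zipfile
    from io import BytesIO

    from flask import send_file

    def build_debug_zip(paths, settings_dict):
        # Assemble the archive in memory; no temp file touches the disk
        memory_zip = BytesIO()
        with zipfile.ZipFile(memory_zip, 'w', compression=zipfile.ZIP_DEFLATED) as zf:
            zf.writestr('settings.txt', json.dumps(settings_dict))
            for path in paths:
                zf.write(path)
        # Rewind before handing the buffer over, else the client receives 0 bytes
        memory_zip.seek(0)
        return send_file(memory_zip, as_attachment=True,
                         attachment_filename='debug.zip')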
cps/editbooks.py (899 changes)

cps/epub.py (57 changes)
@@ -87,18 +87,29 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
         lang = epub_metadata['language'].split('-', 1)[0].lower()
         epub_metadata['language'] = isoLanguages.get_lang3(lang)

-    series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content", namespaces=ns)
-    if len(series) > 0:
-        epub_metadata['series'] = series[0]
-    else:
-        epub_metadata['series'] = ''
+    epub_metadata = parse_epbub_series(ns, tree, epub_metadata)

-    series_id = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series_index']/@content", namespaces=ns)
-    if len(series_id) > 0:
-        epub_metadata['series_id'] = series_id[0]
-    else:
-        epub_metadata['series_id'] = '1'
+    coverfile = parse_ebpub_cover(ns, tree, epubZip, coverpath, tmp_file_path)
+
+    if not epub_metadata['title']:
+        title = original_file_name
+    else:
+        title = epub_metadata['title']
+
+    return BookMeta(
+        file_path=tmp_file_path,
+        extension=original_file_extension,
+        title=title.encode('utf-8').decode('utf-8'),
+        author=epub_metadata['creator'].encode('utf-8').decode('utf-8'),
+        cover=coverfile,
+        description=epub_metadata['description'],
+        tags=epub_metadata['subject'].encode('utf-8').decode('utf-8'),
+        series=epub_metadata['series'].encode('utf-8').decode('utf-8'),
+        series_id=epub_metadata['series_id'].encode('utf-8').decode('utf-8'),
+        languages=epub_metadata['language'],
+        publisher="")
+
+
+def parse_ebpub_cover(ns, tree, epubZip, coverpath, tmp_file_path):
     coversection = tree.xpath("/pkg:package/pkg:manifest/pkg:item[@id='cover-image']/@href", namespaces=ns)
     coverfile = None
     if len(coversection) > 0:
@@ -126,20 +137,18 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
             coverfile = extractCover(epubZip, filename, "", tmp_file_path)
     else:
         coverfile = extractCover(epubZip, coversection[0], coverpath, tmp_file_path)
+    return coverfile

-    if not epub_metadata['title']:
-        title = original_file_name
-    else:
-        title = epub_metadata['title']

-    return BookMeta(
-        file_path=tmp_file_path,
-        extension=original_file_extension,
-        title=title.encode('utf-8').decode('utf-8'),
-        author=epub_metadata['creator'].encode('utf-8').decode('utf-8'),
-        cover=coverfile,
-        description=epub_metadata['description'],
-        tags=epub_metadata['subject'].encode('utf-8').decode('utf-8'),
-        series=epub_metadata['series'].encode('utf-8').decode('utf-8'),
-        series_id=epub_metadata['series_id'].encode('utf-8').decode('utf-8'),
-        languages=epub_metadata['language'])
+def parse_epbub_series(ns, tree, epub_metadata):
+    series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content", namespaces=ns)
+    if len(series) > 0:
+        epub_metadata['series'] = series[0]
+    else:
+        epub_metadata['series'] = ''
+
+    series_id = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series_index']/@content", namespaces=ns)
+    if len(series_id) > 0:
+        epub_metadata['series_id'] = series_id[0]
+    else:
+        epub_metadata['series_id'] = '1'
+    return epub_metadata
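The extracted parse_epbub_series() reads Calibre's series annotations from the OPF metadata. A hedged standalone sketch of that XPath lookup against a minimal OPF document (the namespace map mirrors what the pkg: prefix needs):

    from lxml import etree

    ns = {'pkg': 'http://www.idpf.org/2007/opf'}

    opf = b"""<package xmlns="http://www.idpf.org/2007/opf">
      <metadata>
        <meta name="calibre:series" content="Dune Chronicles"/>
        <meta name="calibre:series_index" content="2"/>
      </metadata>
    </package>"""

    tree = etree.fromstring(opf)
    series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content",
                        namespaces=ns)
    print(series[0] if series else '')  # -> Dune Chronicles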
cps/error_handler.py (67 changes, new file)

@@ -0,0 +1,67 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2018-2020 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+import traceback
+from flask import render_template
+from werkzeug.exceptions import default_exceptions
+try:
+    from werkzeug.exceptions import FailedDependency
+except ImportError:
+    from werkzeug.exceptions import UnprocessableEntity as FailedDependency
+
+from . import config, app, logger, services
+
+
+log = logger.create()
+
+
+# custom error page
+def error_http(error):
+    return render_template('http_error.html',
+                           error_code="Error {0}".format(error.code),
+                           error_name=error.name,
+                           issue=False,
+                           instance=config.config_calibre_web_title
+                           ), error.code
+
+
+def internal_error(error):
+    return render_template('http_error.html',
+                           error_code="Internal Server Error",
+                           error_name=str(error),
+                           issue=True,
+                           error_stack=traceback.format_exc().split("\n"),
+                           instance=config.config_calibre_web_title
+                           ), 500
+
+
+def init_errorhandler():
+    # http error handling
+    for ex in default_exceptions:
+        if ex < 500:
+            app.register_error_handler(ex, error_http)
+        elif ex == 500:
+            app.register_error_handler(ex, internal_error)
+
+    if services.ldap:
+        # Only way of catching the LDAPException upon logging in with LDAP server down
+        @app.errorhandler(services.ldap.LDAPException)
+        # pylint: disable=unused-variable
+        def handle_exception(e):
+            log.debug('LDAP server not accessible while trying to login to opds feed')
+            return error_http(FailedDependency())
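init_errorhandler() is meant to be called once during app setup so every HTTP error code below 500 shares one template. A hedged sketch of the same registration pattern in a bare Flask app (template string is a stand-in for http_error.html):

    from flask import Flask, render_template_string
    from werkzeug.exceptions import default_exceptions

    app = Flask(__name__)

    def error_http(error):
        # One handler serves every 4xx code registered below
        return render_template_string("{{ code }}: {{ name }}",
                                      code=error.code, name=error.name), error.code

    def init_errorhandler():
        for code in default_exceptions:
            if code < 500:
                app.register_error_handler(code, error_http)

    init_errorhandler()  # call once, before the first request is served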
cps/fb2.py (27 changes)

@@ -30,51 +30,52 @@ def get_fb2_info(tmp_file_path, original_file_extension):
     }

     fb2_file = open(tmp_file_path)
-    tree = etree.fromstring(fb2_file.read())
+    tree = etree.fromstring(fb2_file.read().encode())

     authors = tree.xpath('/fb:FictionBook/fb:description/fb:title-info/fb:author', namespaces=ns)

     def get_author(element):
         last_name = element.xpath('fb:last-name/text()', namespaces=ns)
         if len(last_name):
-            last_name = last_name[0].encode('utf-8')
+            last_name = last_name[0]
         else:
             last_name = u''
         middle_name = element.xpath('fb:middle-name/text()', namespaces=ns)
         if len(middle_name):
-            middle_name = middle_name[0].encode('utf-8')
+            middle_name = middle_name[0]
         else:
             middle_name = u''
         first_name = element.xpath('fb:first-name/text()', namespaces=ns)
         if len(first_name):
-            first_name = first_name[0].encode('utf-8')
+            first_name = first_name[0]
         else:
             first_name = u''
-        return (first_name.decode('utf-8') + u' '
-                + middle_name.decode('utf-8') + u' '
-                + last_name.decode('utf-8')).encode('utf-8')
+        return (first_name + u' '
+                + middle_name + u' '
+                + last_name)

     author = str(", ".join(map(get_author, authors)))

     title = tree.xpath('/fb:FictionBook/fb:description/fb:title-info/fb:book-title/text()', namespaces=ns)
     if len(title):
-        title = str(title[0].encode('utf-8'))
+        title = str(title[0])
     else:
         title = u''
     description = tree.xpath('/fb:FictionBook/fb:description/fb:publish-info/fb:book-name/text()', namespaces=ns)
     if len(description):
-        description = str(description[0].encode('utf-8'))
+        description = str(description[0])
     else:
         description = u''

     return BookMeta(
         file_path=tmp_file_path,
         extension=original_file_extension,
-        title=title.decode('utf-8'),
-        author=author.decode('utf-8'),
+        title=title,
+        author=author,
         cover=None,
-        description=description.decode('utf-8'),
+        description=description,
         tags="",
         series="",
         series_id="",
-        languages="")
+        languages="",
+        publisher="")
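The added .encode() before etree.fromstring() matters because lxml refuses a str that carries an XML encoding declaration, which FB2 files typically start with. A small repro of why bytes are required:

    from lxml import etree

    xml = '<?xml version="1.0" encoding="utf-8"?><root/>'

    try:
        etree.fromstring(xml)           # str with an encoding declaration
    except ValueError as err:
        print(err)                      # lxml rejects it

    tree = etree.fromstring(xml.encode())  # bytes parse fine
    print(tree.tag)                        # -> root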
cps/gdrive.py

@@ -35,9 +35,9 @@ from flask_babel import gettext as _
 from flask_login import login_required

 from . import logger, gdriveutils, config, ub, calibre_db
-from .web import admin_required
+from .admin import admin_required

-gdrive = Blueprint('gdrive', __name__)
+gdrive = Blueprint('gdrive', __name__, url_prefix='/gdrive')
 log = logger.create()

 try:
@@ -47,10 +47,10 @@ except ImportError as err:

 current_milli_time = lambda: int(round(time() * 1000))

-gdrive_watch_callback_token = 'target=calibreweb-watch_files'
+gdrive_watch_callback_token = 'target=calibreweb-watch_files' #nosec


-@gdrive.route("/gdrive/authenticate")
+@gdrive.route("/authenticate")
 @login_required
 @admin_required
 def authenticate_google_drive():
@@ -63,7 +63,7 @@ def authenticate_google_drive():
     return redirect(authUrl)


-@gdrive.route("/gdrive/callback")
+@gdrive.route("/callback")
 def google_drive_callback():
     auth_code = request.args.get('code')
     if not auth_code:
@@ -77,18 +77,14 @@ def google_drive_callback():
     return redirect(url_for('admin.configuration'))


-@gdrive.route("/gdrive/watch/subscribe")
+@gdrive.route("/watch/subscribe")
 @login_required
 @admin_required
 def watch_gdrive():
     if not config.config_google_drive_watch_changes_response:
         with open(gdriveutils.CLIENT_SECRETS, 'r') as settings:
             filedata = json.load(settings)
-        if filedata['web']['redirect_uris'][0].endswith('/'):
-            filedata['web']['redirect_uris'][0] = filedata['web']['redirect_uris'][0][:-((len('/gdrive/callback')+1))]
-        else:
-            filedata['web']['redirect_uris'][0] = filedata['web']['redirect_uris'][0][:-(len('/gdrive/callback'))]
-        address = '%s/gdrive/watch/callback' % filedata['web']['redirect_uris'][0]
+        address = filedata['web']['redirect_uris'][0].rstrip('/').replace('/gdrive/callback', '/gdrive/watch/callback')
         notification_id = str(uuid4())
         try:
             result = gdriveutils.watchChange(gdriveutils.Gdrive.Instance().drive, notification_id,
@@ -98,14 +94,15 @@ def watch_gdrive():
     except HttpError as e:
         reason=json.loads(e.content)['error']['errors'][0]
         if reason['reason'] == u'push.webhookUrlUnauthorized':
-            flash(_(u'Callback domain is not verified, please follow steps to verify domain in google developer console'), category="error")
+            flash(_(u'Callback domain is not verified, '
+                    u'please follow steps to verify domain in google developer console'), category="error")
         else:
             flash(reason['message'], category="error")

     return redirect(url_for('admin.configuration'))


-@gdrive.route("/gdrive/watch/revoke")
+@gdrive.route("/watch/revoke")
 @login_required
 @admin_required
 def revoke_watch_gdrive():
@@ -121,14 +118,14 @@ def revoke_watch_gdrive():
     return redirect(url_for('admin.configuration'))


-@gdrive.route("/gdrive/watch/callback", methods=['GET', 'POST'])
+@gdrive.route("/watch/callback", methods=['GET', 'POST'])
 def on_received_watch_confirmation():
     if not config.config_google_drive_watch_changes_response:
         return ''
     if request.headers.get('X-Goog-Channel-Token') != gdrive_watch_callback_token \
             or request.headers.get('X-Goog-Resource-State') != 'change' \
             or not request.data:
-        return ''  # redirect(url_for('admin.configuration'))
+        return ''

     log.debug('%r', request.headers)
     log.debug('%r', request.data)
@@ -145,16 +142,19 @@ def on_received_watch_confirmation():
             else:
                 dbpath = os.path.join(config.config_calibre_dir, "metadata.db").encode()
                 if not response['deleted'] and response['file']['title'] == 'metadata.db' \
-                        and response['file']['md5Checksum'] != hashlib.md5(dbpath):
-                    tmpDir = tempfile.gettempdir()
+                        and response['file']['md5Checksum'] != hashlib.md5(dbpath):  # nosec
+                    tmp_dir = os.path.join(tempfile.gettempdir(), 'calibre_web')
+                    if not os.path.isdir(tmp_dir):
+                        os.mkdir(tmp_dir)
+
                     log.info('Database file updated')
-                    copyfile(dbpath, os.path.join(tmpDir, "metadata.db_" + str(current_milli_time())))
+                    copyfile(dbpath, os.path.join(tmp_dir, "metadata.db_" + str(current_milli_time())))
                     log.info('Backing up existing and downloading updated metadata.db')
-                    gdriveutils.downloadFile(None, "metadata.db", os.path.join(tmpDir, "tmp_metadata.db"))
+                    gdriveutils.downloadFile(None, "metadata.db", os.path.join(tmp_dir, "tmp_metadata.db"))
                     log.info('Setting up new DB')
-                    # prevent error on windows, as os.rename does on exisiting files
-                    move(os.path.join(tmpDir, "tmp_metadata.db"), dbpath)
+                    # prevent error on windows, as os.rename does on existing files, also allow cross hdd move
+                    move(os.path.join(tmp_dir, "tmp_metadata.db"), dbpath)
                     calibre_db.reconnect_db(config, ub.app_DB_path)
-        except Exception as e:
-            log.exception(e)
+        except Exception as ex:
+            log.debug_or_exception(ex)
         return ''
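The branchy string slicing removed in watch_gdrive() and its one-line replacement compute the same push-notification address from the OAuth redirect URI. A quick check with a made-up client_secrets URI (not taken from a real configuration):

    def watch_callback_address(redirect_uri):
        # Swap the OAuth callback path for the watch endpoint, tolerating an
        # optional trailing slash (which the old code special-cased)
        return redirect_uri.rstrip('/').replace('/gdrive/callback',
                                                '/gdrive/watch/callback')

    assert watch_callback_address('https://books.example.org/gdrive/callback/') == \
           'https://books.example.org/gdrive/watch/callback'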
cps/gdriveutils.py

@@ -28,20 +28,33 @@ from sqlalchemy import create_engine
 from sqlalchemy import Column, UniqueConstraint
 from sqlalchemy import String, Integer
 from sqlalchemy.orm import sessionmaker, scoped_session
-from sqlalchemy.ext.declarative import declarative_base
+try:
+    # Compatibility with sqlalchemy 2.0
+    from sqlalchemy.orm import declarative_base
+except ImportError:
+    from sqlalchemy.ext.declarative import declarative_base
 from sqlalchemy.exc import OperationalError, InvalidRequestError

 try:
-    from pydrive.auth import GoogleAuth
-    from pydrive.drive import GoogleDrive
-    from pydrive.auth import RefreshError
     from apiclient import errors
     from httplib2 import ServerNotFoundError
-    gdrive_support = True
     importError = None
-except ImportError as err:
-    importError = err
+    gdrive_support = True
+except ImportError as e:
+    importError = e
     gdrive_support = False
+try:
+    from pydrive2.auth import GoogleAuth
+    from pydrive2.drive import GoogleDrive
+    from pydrive2.auth import RefreshError
+except ImportError as err:
+    try:
+        from pydrive.auth import GoogleAuth
+        from pydrive.drive import GoogleDrive
+        from pydrive.auth import RefreshError
+    except ImportError as err:
+        importError = err
+        gdrive_support = False

 from . import logger, cli, config
 from .constants import CONFIG_DIR as _CONFIG_DIR
@@ -91,7 +104,7 @@ class Singleton:
         except AttributeError:
             self._instance = self._decorated()
             return self._instance
-        except ImportError as e:
+        except (ImportError, NameError) as e:
             log.debug(e)
             return None
@@ -189,8 +202,8 @@ def getDrive(drive=None, gauth=None):
             gauth.Refresh()
         except RefreshError as e:
             log.error("Google Drive error: %s", e)
-        except Exception as e:
-            log.exception(e)
+        except Exception as ex:
+            log.debug_or_exception(ex)
     else:
         # Initialize the saved creds
         gauth.Authorize()
@@ -208,7 +221,7 @@ def listRootFolders():
         drive = getDrive(Gdrive.Instance().drive)
         folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
         fileList = drive.ListFile({'q': folder}).GetList()
-    except ServerNotFoundError as e:
+    except (ServerNotFoundError, ssl.SSLError) as e:
         log.info("GDrive Error %s" % e)
         fileList = []
     return fileList
@@ -244,7 +257,12 @@ def getEbooksFolderId(drive=None):
         log.error('Error gDrive, root ID not found')
         gDriveId.path = '/'
     session.merge(gDriveId)
-    session.commit()
+    try:
+        session.commit()
+    except OperationalError as ex:
+        log.error("gdrive.db DB is not Writeable")
+        log.debug('Database error: %s', ex)
+        session.rollback()
     return gDriveId.gdrive_id
@@ -259,37 +277,42 @@ def getFile(pathId, fileName, drive):

 def getFolderId(path, drive):
     # drive = getDrive(drive)
-    currentFolderId = getEbooksFolderId(drive)
-    sqlCheckPath = path if path[-1] == '/' else path + '/'
-    storedPathName = session.query(GdriveId).filter(GdriveId.path == sqlCheckPath).first()
-
-    if not storedPathName:
-        dbChange = False
-        s = path.split('/')
-        for i, x in enumerate(s):
-            if len(x) > 0:
-                currentPath = "/".join(s[:i+1])
-                if currentPath[-1] != '/':
-                    currentPath = currentPath + '/'
-                storedPathName = session.query(GdriveId).filter(GdriveId.path == currentPath).first()
-                if storedPathName:
-                    currentFolderId = storedPathName.gdrive_id
-                else:
-                    currentFolder = getFolderInFolder(currentFolderId, x, drive)
-                    if currentFolder:
-                        gDriveId = GdriveId()
-                        gDriveId.gdrive_id = currentFolder['id']
-                        gDriveId.path = currentPath
-                        session.merge(gDriveId)
-                        dbChange = True
-                        currentFolderId = currentFolder['id']
-                    else:
-                        currentFolderId = None
-                        break
-        if dbChange:
-            session.commit()
-    else:
-        currentFolderId = storedPathName.gdrive_id
+    try:
+        currentFolderId = getEbooksFolderId(drive)
+        sqlCheckPath = path if path[-1] == '/' else path + '/'
+        storedPathName = session.query(GdriveId).filter(GdriveId.path == sqlCheckPath).first()
+
+        if not storedPathName:
+            dbChange = False
+            s = path.split('/')
+            for i, x in enumerate(s):
+                if len(x) > 0:
+                    currentPath = "/".join(s[:i+1])
+                    if currentPath[-1] != '/':
+                        currentPath = currentPath + '/'
+                    storedPathName = session.query(GdriveId).filter(GdriveId.path == currentPath).first()
+                    if storedPathName:
+                        currentFolderId = storedPathName.gdrive_id
+                    else:
+                        currentFolder = getFolderInFolder(currentFolderId, x, drive)
+                        if currentFolder:
+                            gDriveId = GdriveId()
+                            gDriveId.gdrive_id = currentFolder['id']
+                            gDriveId.path = currentPath
+                            session.merge(gDriveId)
+                            dbChange = True
+                            currentFolderId = currentFolder['id']
+                        else:
+                            currentFolderId = None
+                            break
+            if dbChange:
+                session.commit()
+        else:
+            currentFolderId = storedPathName.gdrive_id
+    except OperationalError as ex:
+        log.error("gdrive.db DB is not Writeable")
+        log.debug('Database error: %s', ex)
+        session.rollback()
     return currentFolderId
@@ -333,7 +356,7 @@ def moveGdriveFolderRemote(origin_file, target_folder):
                                           addParents=gFileTargetDir['id'],
                                           removeParents=previous_parents,
                                           fields='id, parents').execute()
-    # if previous_parents has no childs anymore, delete original fileparent
+    # if previous_parents has no children anymore, delete original fileparent
     if len(children['items']) == 1:
         deleteDatabaseEntry(previous_parents)
         drive.auth.service.files().delete(fileId=previous_parents).execute()
@@ -385,7 +408,8 @@ def uploadFileToEbooksFolder(destFile, f):
             if len(existingFiles) > 0:
                 driveFile = existingFiles[0]
             else:
-                driveFile = drive.CreateFile({'title': x, 'parents': [{"kind": "drive#fileLink", 'id': parent['id']}],})
+                driveFile = drive.CreateFile({'title': x,
+                                              'parents': [{"kind": "drive#fileLink", 'id': parent['id']}], })
             driveFile.SetContentFile(f)
             driveFile.Upload()
         else:
@@ -483,8 +507,8 @@ def getChangeById (drive, change_id):
     except (errors.HttpError) as error:
         log.error(error)
         return None
-    except Exception as e:
-        log.error(e)
+    except Exception as ex:
+        log.error(ex)
         return None
@@ -493,9 +517,10 @@ def deleteDatabaseOnChange():
     try:
         session.query(GdriveId).delete()
         session.commit()
-    except (OperationalError, InvalidRequestError):
+    except (OperationalError, InvalidRequestError) as ex:
         session.rollback()
-        log.info(u"GDrive DB is not Writeable")
+        log.debug('Database error: %s', ex)
+        log.error(u"GDrive DB is not Writeable")


 def updateGdriveCalibreFromLocal():
@@ -510,13 +535,23 @@ def updateDatabaseOnEdit(ID,newPath):
     storedPathName = session.query(GdriveId).filter(GdriveId.gdrive_id == ID).first()
     if storedPathName:
         storedPathName.path = sqlCheckPath
-        session.commit()
+        try:
+            session.commit()
+        except OperationalError as ex:
+            log.error("gdrive.db DB is not Writeable")
+            log.debug('Database error: %s', ex)
+            session.rollback()


 # Deletes the hashes in database of deleted book
 def deleteDatabaseEntry(ID):
     session.query(GdriveId).filter(GdriveId.gdrive_id == ID).delete()
-    session.commit()
+    try:
+        session.commit()
+    except OperationalError as ex:
+        log.error("gdrive.db DB is not Writeable")
+        log.debug('Database error: %s', ex)
+        session.rollback()


 # Gets cover file from gdrive
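The same try/commit/rollback block now appears inline at several call sites in this file. As an illustration only (not code from the diff), the pattern could be centralized in a small helper:

    from sqlalchemy.exc import OperationalError

    def try_commit(session, log):
        # Commit, but tolerate a locked/read-only gdrive.db, as the diff does inline
        try:
            session.commit()
        except OperationalError as ex:
            log.error("gdrive.db DB is not Writeable")
            log.debug('Database error: %s', ex)
            session.rollback()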
@@ -533,7 +568,12 @@ def get_cover_via_gdrive(cover_path):
             permissionAdded = PermissionAdded()
             permissionAdded.gdrive_id = df['id']
             session.add(permissionAdded)
-            session.commit()
+            try:
+                session.commit()
+            except OperationalError as ex:
+                log.error("gdrive.db DB is not Writeable")
+                log.debug('Database error: %s', ex)
+                session.rollback()
         return df.metadata.get('webContentLink')
     else:
         return None
@@ -547,21 +587,24 @@ def partial(total_byte_len, part_size_limit):
     return s


 # downloads files in chunks from gdrive
-def do_gdrive_download(df, headers):
+def do_gdrive_download(df, headers, convert_encoding=False):
     total_size = int(df.metadata.get('fileSize'))
     download_url = df.metadata.get('downloadUrl')
     s = partial(total_size, 1024 * 1024)  # I'm downloading BIG files, so 100M chunk size is fine for me

-    def stream():
+    def stream(convert_encoding):
         for byte in s:
             headers = {"Range": 'bytes=%s-%s' % (byte[0], byte[1])}
             resp, content = df.auth.Get_Http_Object().request(download_url, headers=headers)
             if resp.status == 206:
+                if convert_encoding:
+                    result = chardet.detect(content)
+                    content = content.decode(result['encoding']).encode('utf-8')
                 yield content
             else:
                 log.warning('An error occurred: %s', resp)
                 return
-    return Response(stream_with_context(stream()), headers=headers)
+    return Response(stream_with_context(stream(convert_encoding)), headers=headers)


 _SETTINGS_YAML_TEMPLATE = """
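The streaming idiom here — yield ranged chunks through Flask's stream_with_context — works for any HTTP source that honors Range requests. A hedged standalone sketch using requests as a stand-in for the pydrive HTTP object (URL is illustrative):

    import requests
    from flask import Flask, Response, stream_with_context

    app = Flask(__name__)

    CHUNK = 1024 * 1024  # 1 MiB ranges, mirroring partial() above

    @app.route('/proxy')
    def proxy_download():
        url = 'https://example.org/big-file'  # illustrative upstream URL
        total = int(requests.head(url).headers['Content-Length'])

        def stream():
            for start in range(0, total, CHUNK):
                end = min(start + CHUNK, total) - 1
                resp = requests.get(url, headers={"Range": 'bytes=%s-%s' % (start, end)})
                if resp.status_code == 206:   # partial content, as expected
                    yield resp.content
                else:                         # upstream refused the range request
                    return

        return Response(stream_with_context(stream()))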
cps/helper.py (249 changes)
@@ -24,10 +24,7 @@ import io
 import mimetypes
 import re
 import shutil
-import glob
 import time
-import zipfile
-import json
 import unicodedata
 from datetime import datetime, timedelta
 from tempfile import gettempdir
@@ -35,10 +32,10 @@ from tempfile import gettempdir
 import requests
 from babel.dates import format_datetime
 from babel.units import format_unit
-from flask import send_from_directory, make_response, redirect, abort, url_for, send_file
+from flask import send_from_directory, make_response, redirect, abort, url_for
 from flask_babel import gettext as _
 from flask_login import current_user
-from sqlalchemy.sql.expression import true, false, and_, text
+from sqlalchemy.sql.expression import true, false, and_, text, func
 from werkzeug.datastructures import Headers
 from werkzeug.security import generate_password_hash
@@ -53,13 +50,6 @@ try:
 except ImportError:
     use_unidecode = False

-try:
-    from PIL import Image as PILImage
-    from PIL import UnidentifiedImageError
-    use_PIL = True
-except ImportError:
-    use_PIL = False
-
 from . import calibre_db
 from .tasks.convert import TaskConvert
 from . import logger, config, get_locale, db, ub
@@ -69,9 +59,17 @@ from .subproc_wrapper import process_wait
 from .services.worker import WorkerThread, STAT_WAITING, STAT_FAIL, STAT_STARTED, STAT_FINISH_SUCCESS
 from .tasks.mail import TaskEmail

 log = logger.create()

+try:
+    from wand.image import Image
+    from wand.exceptions import MissingDelegateError
+    use_IM = True
+except (ImportError, RuntimeError) as e:
+    log.debug('Cannot import Image, generating covers from non jpg files will not work: %s', e)
+    use_IM = False
+    MissingDelegateError = BaseException
+

 # Convert existing book entry to new format
 def convert_book_format(book_id, calibrepath, old_book_format, new_book_format, user_id, kindle_mail=None):
@@ -112,21 +110,21 @@ def convert_book_format(book_id, calibrepath, old_book_format, new_book_format,
 def send_test_mail(kindle_mail, user_name):
     WorkerThread.add(user_name, TaskEmail(_(u'Calibre-Web test e-mail'), None, None,
                                           config.get_mail_settings(), kindle_mail, _(u"Test e-mail"),
                                           _(u'This e-mail has been sent via Calibre-Web.')))
     return


 # Send registration email or password reset email, depending on parameter resend (False means welcome email)
 def send_registration_mail(e_mail, user_name, default_password, resend=False):
-    text = "Hello %s!\r\n" % user_name
+    txt = "Hello %s!\r\n" % user_name
     if not resend:
-        text += "Your new account at Calibre-Web has been created. Thanks for joining us!\r\n"
-    text += "Please log in to your account using the following informations:\r\n"
-    text += "User name: %s\r\n" % user_name
-    text += "Password: %s\r\n" % default_password
-    text += "Don't forget to change your password after first login.\r\n"
-    text += "Sincerely\r\n\r\n"
-    text += "Your Calibre-Web team"
+        txt += "Your new account at Calibre-Web has been created. Thanks for joining us!\r\n"
+    txt += "Please log in to your account using the following informations:\r\n"
+    txt += "User name: %s\r\n" % user_name
+    txt += "Password: %s\r\n" % default_password
+    txt += "Don't forget to change your password after first login.\r\n"
+    txt += "Sincerely\r\n\r\n"
+    txt += "Your Calibre-Web team"
     WorkerThread.add(None, TaskEmail(
         subject=_(u'Get Started with Calibre-Web'),
         filepath=None,
@@ -134,64 +132,52 @@ def send_registration_mail(e_mail, user_name, default_password, resend=False):
         settings=config.get_mail_settings(),
         recipient=e_mail,
         taskMessage=_(u"Registration e-mail for user: %(name)s", name=user_name),
-        text=text
+        text=txt
     ))

     return


+def check_send_to_kindle_with_converter(formats):
+    bookformats = list()
+    if 'EPUB' in formats and 'MOBI' not in formats:
+        bookformats.append({'format': 'Mobi',
+                            'convert': 1,
+                            'text': _('Convert %(orig)s to %(format)s and send to Kindle',
+                                      orig='Epub',
+                                      format='Mobi')})
+    if 'AZW3' in formats and not 'MOBI' in formats:
+        bookformats.append({'format': 'Mobi',
+                            'convert': 2,
+                            'text': _('Convert %(orig)s to %(format)s and send to Kindle',
+                                      orig='Azw3',
+                                      format='Mobi')})
+    return bookformats
+
+
 def check_send_to_kindle(entry):
     """
         returns all available book formats for sending to Kindle
     """
+    formats = list()
+    bookformats = list()
     if len(entry.data):
-        bookformats = list()
-        if not config.config_converterpath:
-            # no converter - only for mobi and pdf formats
-            for ele in iter(entry.data):
-                if ele.uncompressed_size < config.mail_size:
-                    if 'MOBI' in ele.format:
-                        bookformats.append({'format': 'Mobi',
-                                            'convert': 0,
-                                            'text': _('Send %(format)s to Kindle', format='Mobi')})
-                    if 'PDF' in ele.format:
-                        bookformats.append({'format': 'Pdf',
-                                            'convert': 0,
-                                            'text': _('Send %(format)s to Kindle', format='Pdf')})
-                    if 'AZW' in ele.format:
-                        bookformats.append({'format': 'Azw',
-                                            'convert': 0,
-                                            'text': _('Send %(format)s to Kindle', format='Azw')})
-        else:
-            formats = list()
-            for ele in iter(entry.data):
-                if ele.uncompressed_size < config.mail_size:
-                    formats.append(ele.format)
-            if 'MOBI' in formats:
-                bookformats.append({'format': 'Mobi',
-                                    'convert': 0,
-                                    'text': _('Send %(format)s to Kindle', format='Mobi')})
-            if 'AZW' in formats:
-                bookformats.append({'format': 'Azw',
-                                    'convert': 0,
-                                    'text': _('Send %(format)s to Kindle', format='Azw')})
-            if 'PDF' in formats:
-                bookformats.append({'format': 'Pdf',
-                                    'convert': 0,
-                                    'text': _('Send %(format)s to Kindle', format='Pdf')})
-            if config.config_converterpath:
-                if 'EPUB' in formats and not 'MOBI' in formats:
-                    bookformats.append({'format': 'Mobi',
-                                        'convert':1,
-                                        'text': _('Convert %(orig)s to %(format)s and send to Kindle',
-                                                  orig='Epub',
-                                                  format='Mobi')})
-                if 'AZW3' in formats and not 'MOBI' in formats:
-                    bookformats.append({'format': 'Mobi',
-                                        'convert': 2,
-                                        'text': _('Convert %(orig)s to %(format)s and send to Kindle',
-                                                  orig='Azw3',
-                                                  format='Mobi')})
+        for ele in iter(entry.data):
+            if ele.uncompressed_size < config.mail_size:
+                formats.append(ele.format)
+        if 'MOBI' in formats:
+            bookformats.append({'format': 'Mobi',
+                                'convert': 0,
+                                'text': _('Send %(format)s to Kindle', format='Mobi')})
+        if 'PDF' in formats:
+            bookformats.append({'format': 'Pdf',
+                                'convert': 0,
+                                'text': _('Send %(format)s to Kindle', format='Pdf')})
+        if 'AZW' in formats:
+            bookformats.append({'format': 'Azw',
+                                'convert': 0,
+                                'text': _('Send %(format)s to Kindle', format='Azw')})
+        if config.config_converterpath:
+            bookformats.extend(check_send_to_kindle_with_converter(formats))
         return bookformats
     else:
         log.error(u'Cannot find book entry %d', entry.id)
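For orientation, the helper returns a list of offer dicts consumed by the UI. A hedged illustration of the output shape (gettext strings shown untranslated): a book whose only mailable format is EPUB, with a converter configured, yields one on-the-fly conversion offer.

    formats = ['EPUB']
    offers = check_send_to_kindle_with_converter(formats)
    # -> [{'format': 'Mobi', 'convert': 1,
    #      'text': 'Convert Epub to Mobi and send to Kindle'}]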
@@ -201,7 +187,7 @@ def check_send_to_kindle(entry):
 # Check if a reader is existing for any of the book formats, if not, return empty list, otherwise return
 # list with supported formats
 def check_read_formats(entry):
-    EXTENSIONS_READER = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR'}
+    EXTENSIONS_READER = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR', 'DJVU'}
     bookformats = list()
     if len(entry.data):
         for ele in iter(entry.data):
@@ -494,8 +480,8 @@ def reset_password(user_id):
         password = generate_random_password()
         existing_user.password = generate_password_hash(password)
         ub.session.commit()
-        send_registration_mail(existing_user.email, existing_user.nickname, password, True)
-        return 1, existing_user.nickname
+        send_registration_mail(existing_user.email, existing_user.name, password, True)
+        return 1, existing_user.name
     except Exception:
         ub.session.rollback()
         return 0, None
@@ -512,11 +498,37 @@ def generate_random_password():

 def uniq(inpt):
     output = []
+    inpt = [ " ".join(inp.split()) for inp in inpt]
     for x in inpt:
         if x not in output:
             output.append(x)
     return output

+
+def check_email(email):
+    email = valid_email(email)
+    if ub.session.query(ub.User).filter(func.lower(ub.User.email) == email.lower()).first():
+        log.error(u"Found an existing account for this e-mail address")
+        raise Exception(_(u"Found an existing account for this e-mail address"))
+    return email
+
+
+def check_username(username):
+    username = username.strip()
+    if ub.session.query(ub.User).filter(func.lower(ub.User.name) == username.lower()).scalar():
+        log.error(u"This username is already taken")
+        raise Exception (_(u"This username is already taken"))
+    return username
+
+
+def valid_email(email):
+    email = email.strip()
+    # Regex according to https://developer.mozilla.org/en-US/docs/Web/HTML/Element/input/email#validation
+    if not re.search(r"^[\w.!#$%&'*+\\/=?^_`{|}~-]+@[\w](?:[\w-]{0,61}[\w])?(?:\.[\w](?:[\w-]{0,61}[\w])?)*$",
+                     email):
+        log.error(u"Invalid e-mail address format")
+        raise Exception(_(u"Invalid e-mail address format"))
+    return email
+
 # ################################# External interface #################################
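The MDN-derived pattern in valid_email() requires a local part, an @, and dot-separated domain labels that start and end with a word character. A quick demonstration of the same regex on a few inputs:

    import re

    EMAIL_RE = (r"^[\w.!#$%&'*+\\/=?^_`{|}~-]+@[\w](?:[\w-]{0,61}[\w])?"
                r"(?:\.[\w](?:[\w-]{0,61}[\w])?)*$")

    for candidate in ("reader@example.org", "no-at-sign", "trailing-dot@example."):
        print(candidate, bool(re.search(EMAIL_RE, candidate)))
    # reader@example.org True / no-at-sign False / trailing-dot@example. False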
@@ -564,9 +576,8 @@ def get_book_cover_internal(book, use_generic_cover_on_failure):
             else:
                 log.error('%s/cover.jpg not found on Google Drive', book.path)
                 return get_cover_on_failure(use_generic_cover_on_failure)
-        except Exception as e:
-            log.exception(e)
-            # traceback.print_exc()
+        except Exception as ex:
+            log.debug_or_exception(ex)
             return get_cover_on_failure(use_generic_cover_on_failure)
     else:
         cover_file_path = os.path.join(config.config_calibre_dir, book.path)
@@ -589,29 +600,35 @@ def save_cover_from_url(url, book_path):
             requests.exceptions.Timeout) as ex:
         log.info(u'Cover Download Error %s', ex)
         return False, _("Error Downloading Cover")
-    except UnidentifiedImageError as ex:
+    except MissingDelegateError as ex:
         log.info(u'File Format Error %s', ex)
         return False, _("Cover Format Error")


 def save_cover_from_filestorage(filepath, saved_filename, img):
-    if hasattr(img, '_content'):
-        f = open(os.path.join(filepath, saved_filename), "wb")
-        f.write(img._content)
-        f.close()
-    else:
-        # check if file path exists, otherwise create it, copy file to calibre path and delete temp file
-        if not os.path.exists(filepath):
-            try:
-                os.makedirs(filepath)
-            except OSError:
-                log.error(u"Failed to create path for cover")
-                return False, _(u"Failed to create path for cover")
+    # check if file path exists, otherwise create it, copy file to calibre path and delete temp file
+    if not os.path.exists(filepath):
         try:
-            img.save(os.path.join(filepath, saved_filename))
-        except (IOError, OSError):
-            log.error(u"Cover-file is not a valid image file, or could not be stored")
-            return False, _(u"Cover-file is not a valid image file, or could not be stored")
+            os.makedirs(filepath)
+        except OSError:
+            log.error(u"Failed to create path for cover")
+            return False, _(u"Failed to create path for cover")
+    try:
+        # upload of jgp file without wand
+        if isinstance(img, requests.Response):
+            with open(os.path.join(filepath, saved_filename), 'wb') as f:
+                f.write(img.content)
+        else:
+            if hasattr(img, "metadata"):
+                # upload of jpg/png... via url
+                img.save(filename=os.path.join(filepath, saved_filename))
+                img.close()
+            else:
+                # upload of jpg/png... from hdd
+                img.save(os.path.join(filepath, saved_filename))
+    except (IOError, OSError):
+        log.error(u"Cover-file is not a valid image file, or could not be stored")
+        return False, _(u"Cover-file is not a valid image file, or could not be stored")
     return True, None
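The rewritten helper distinguishes three cover sources: a downloaded HTTP response, a wand image, and an uploaded file object. A minimal sketch of that dispatch, with stand-in classes so it runs without requests or wand installed (all three classes here are assumptions standing in for the real types):

    import os

    class FakeResponse:            # stands in for requests.Response
        content = b"\xff\xd8jpeg-bytes"

    class FakeWandImage:           # stands in for wand.image.Image
        metadata = {}
        def save(self, filename): print("wand save ->", filename)
        def close(self): pass

    class FakeFileStorage:         # stands in for werkzeug's FileStorage
        def save(self, dst): print("filestorage save ->", dst)

    def store_cover(img, filepath, saved_filename):
        target = os.path.join(filepath, saved_filename)
        if isinstance(img, FakeResponse):      # downloaded cover: raw bytes
            with open(target, "wb") as f:
                f.write(img.content)
        elif hasattr(img, "metadata"):         # wand image: save by filename=
            img.save(filename=target)
            img.close()
        else:                                  # uploaded file object
            img.save(target)

    store_cover(FakeWandImage(), "/tmp", "cover.jpg")
    store_cover(FakeFileStorage(), "/tmp", "cover.jpg")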
@@ -619,31 +636,33 @@ def save_cover_from_filestorage(filepath, saved_filename, img):
 def save_cover(img, book_path):
     content_type = img.headers.get('content-type')

-    if use_PIL:
-        if content_type not in ('image/jpeg', 'image/png', 'image/webp'):
-            log.error("Only jpg/jpeg/png/webp files are supported as coverfile")
-            return False, _("Only jpg/jpeg/png/webp files are supported as coverfile")
+    if use_IM:
+        if content_type not in ('image/jpeg', 'image/png', 'image/webp', 'image/bmp'):
+            log.error("Only jpg/jpeg/png/webp/bmp files are supported as coverfile")
+            return False, _("Only jpg/jpeg/png/webp/bmp files are supported as coverfile")
         # convert to jpg because calibre only supports jpg
-        if content_type in ('image/png', 'image/webp'):
+        if content_type != 'image/jpg':
             if hasattr(img, 'stream'):
-                imgc = PILImage.open(img.stream)
+                imgc = Image(blob=img.stream)
             else:
-                imgc = PILImage.open(io.BytesIO(img.content))
-            im = imgc.convert('RGB')
-            tmp_bytesio = io.BytesIO()
-            im.save(tmp_bytesio, format='JPEG')
-            img._content = tmp_bytesio.getvalue()
+                imgc = Image(blob=io.BytesIO(img.content))
+            imgc.format = 'jpeg'
+            imgc.transform_colorspace("rgb")
+            img = imgc
     else:
         if content_type not in 'image/jpeg':
             log.error("Only jpg/jpeg files are supported as coverfile")
             return False, _("Only jpg/jpeg files are supported as coverfile")

     if config.config_use_google_drive:
-        tmpDir = gettempdir()
-        ret, message = save_cover_from_filestorage(tmpDir, "uploaded_cover.jpg", img)
+        tmp_dir = os.path.join(gettempdir(), 'calibre_web')
+        if not os.path.isdir(tmp_dir):
+            os.mkdir(tmp_dir)
+        ret, message = save_cover_from_filestorage(tmp_dir, "uploaded_cover.jpg", img)
         if ret is True:
-            gd.uploadFileToEbooksFolder(os.path.join(book_path, 'cover.jpg'),
-                                        os.path.join(tmpDir, "uploaded_cover.jpg"))
+            gd.uploadFileToEbooksFolder(os.path.join(book_path, 'cover.jpg').replace("\\","/"),
+                                        os.path.join(tmp_dir, "uploaded_cover.jpg"))
             log.info("Cover is saved on Google Drive")
             return True, None
         else:
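The merge swaps Pillow for ImageMagick here (use_PIL becomes use_IM). A hedged sketch of the new conversion path, assuming the `wand` package (the ImageMagick binding this code appears to use) is installed:

    # Convert arbitrary image bytes to an RGB JPEG in memory, as the
    # use_IM branch above does. wand is an assumption of this sketch.
    import io
    from wand.image import Image

    def to_jpeg(raw_bytes):
        with Image(blob=raw_bytes) as img:
            img.format = 'jpeg'                 # re-encode as JPEG
            img.transform_colorspace('rgb')     # calibre expects RGB JPEG
            return img.make_blob()

    # jpeg_bytes = to_jpeg(open('cover.png', 'rb').read())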
@@ -674,6 +693,7 @@ def do_download_file(book, book_format, client, data, headers):
     # ToDo Check headers parameter
     for element in headers:
         response.headers[element[0]] = element[1]
+    log.info('Downloading file: {}'.format(os.path.join(filename, data.name + "." + book_format)))
     return response


 ##################################

@@ -697,7 +717,7 @@ def check_unrar(unrarLocation):
             log.debug("unrar version %s", version)
             break
     except (OSError, UnicodeDecodeError) as err:
-        log.exception(err)
+        log.debug_or_exception(err)
         return _('Error excecuting UnRar')

@@ -713,7 +733,6 @@ def json_serial(obj):
             'seconds': obj.seconds,
             'microseconds': obj.microseconds,
         }
-        # return obj.isoformat()
     raise TypeError("Type %s not serializable" % type(obj))

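json_serial is a `default=` hook for json.dumps; the dict layout above is how it flattens a timedelta. A runnable sketch (stdlib only, mirroring the hunk):

    import json
    import datetime

    def json_serial(obj):
        if isinstance(obj, datetime.timedelta):
            return {
                'days': obj.days,
                'seconds': obj.seconds,
                'microseconds': obj.microseconds,
            }
        raise TypeError("Type %s not serializable" % type(obj))

    print(json.dumps({'elapsed': datetime.timedelta(days=1, seconds=90)},
                     default=json_serial))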
@@ -737,8 +756,8 @@ def format_runtime(runtime):
 # helper function to apply localize status information in tasklist entries
 def render_task_status(tasklist):
     renderedtasklist = list()
-    for num, user, added, task in tasklist:
-        if user == current_user.nickname or current_user.role_admin():
+    for __, user, __, task in tasklist:
+        if user == current_user.name or current_user.role_admin():
             ret = {}
             if task.start_time:
                 ret['starttime'] = format_datetime(task.start_time, format='short', locale=get_locale())
@@ -776,8 +795,8 @@ def tags_filters():
 # checks if domain is in database (including wildcards)
 # example SELECT * FROM @TABLE WHERE 'abcdefg' LIKE Name;
 # from https://code.luasoftware.com/tutorials/flask/execute-raw-sql-in-flask-sqlalchemy/
+# in all calls the email address is checked for validity
 def check_valid_domain(domain_text):
-    # domain_text = domain_text.split('@', 1)[-1].lower()
     sql = "SELECT * FROM registration WHERE (:domain LIKE domain and allow = 1);"
     result = ub.session.query(ub.Registration).from_statement(text(sql)).params(domain=domain_text).all()
     if not len(result):
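The LIKE here is reversed on purpose: the wildcard pattern lives in the `domain` column and the concrete address is passed as the parameter. A self-contained demo (sqlite3 stdlib; table contents invented):

    import sqlite3

    con = sqlite3.connect(":memory:")
    con.execute("CREATE TABLE registration (domain TEXT, allow INTEGER)")
    con.execute("INSERT INTO registration VALUES ('%@example.com', 1)")

    for mail in ("reader@example.com", "spam@elsewhere.net"):
        hit = con.execute(
            "SELECT * FROM registration WHERE (:domain LIKE domain AND allow = 1)",
            {"domain": mail}).fetchall()
        print(mail, "allowed" if hit else "denied")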
@@ -811,6 +830,7 @@ def get_download_link(book_id, book_format, client):
     if book:
         data1 = calibre_db.get_book_format(book.id, book_format.upper())
     else:
+        log.error("Book id {} not found for downloading".format(book_id))
         abort(404)
     if data1:
         # collect downloaded books only for registered user and not for anonymous user
@@ -827,4 +847,3 @@ def get_download_link(book_id, book_format, client):
         return do_download_file(book, book_format, client, data1, headers)
     else:
         abort(404)
-

@@ -57,27 +57,30 @@ def get_language_name(locale, lang_code):

 def get_language_codes(locale, language_names, remainder=None):
     language_names = set(x.strip().lower() for x in language_names if x)
-    languages = list()
+    lang = list()
     for k, v in get_language_names(locale).items():
         v = v.lower()
         if v in language_names:
-            languages.append(k)
+            lang.append(k)
             language_names.remove(v)
-    if remainder is not None:
+    if remainder is not None and language_names:
         remainder.extend(language_names)
-    return languages
+    return lang


 def get_valid_language_codes(locale, language_names, remainder=None):
-    languages = list()
+    lang = list()
     if "" in language_names:
         language_names.remove("")
-    for k, v in get_language_names(locale).items():
+    for k, __ in get_language_names(locale).items():
         if k in language_names:
-            languages.append(k)
+            lang.append(k)
             language_names.remove(k)
     if remainder is not None and len(language_names):
         remainder.extend(language_names)
-    return languages
+    return lang


 def get_lang3(lang):
     try:
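Usage sketch of the renamed helper; get_language_names() is stubbed here with an invented three-entry mapping, whereas the real module returns a locale-aware {code: display-name} dict:

    def get_language_names(locale):
        return {'eng': 'English', 'deu': 'German', 'fra': 'French'}   # stub

    def get_language_codes(locale, language_names, remainder=None):
        language_names = set(x.strip().lower() for x in language_names if x)
        lang = list()
        for k, v in get_language_names(locale).items():
            v = v.lower()
            if v in language_names:
                lang.append(k)
                language_names.remove(v)
        if remainder is not None and language_names:
            remainder.extend(language_names)
        return lang

    leftover = []
    print(get_language_codes('en', ['English', 'Klingon'], leftover))  # ['eng']
    print(leftover)                                                    # ['klingon']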
@@ -82,7 +82,7 @@ def formatdate_filter(val):
     except AttributeError as e:
         log.error('Babel error: %s, Current user locale: %s, Current User: %s', e,
                   current_user.locale,
-                  current_user.nickname
+                  current_user.name
                   )
     return val

149
cps/kobo.py

@@ -42,7 +42,7 @@ from flask import (
 from flask_login import current_user
 from werkzeug.datastructures import Headers
 from sqlalchemy import func
-from sqlalchemy.sql.expression import and_, or_
+from sqlalchemy.sql.expression import and_
 from sqlalchemy.exc import StatementError
 import requests

@@ -56,6 +56,8 @@ KOBO_FORMATS = {"KEPUB": ["KEPUB"], "EPUB": ["EPUB3", "EPUB"]}
 KOBO_STOREAPI_URL = "https://storeapi.kobo.com"
 KOBO_IMAGEHOST_URL = "https://kbimages1-a.akamaihd.net"

+SYNC_ITEM_LIMIT = 100
+
 kobo = Blueprint("kobo", __name__, url_prefix="/kobo/<auth_token>")
 kobo_auth.disable_failed_auth_redirect_for_blueprint(kobo)
 kobo_auth.register_url_value_preprocessor(kobo)
@@ -142,68 +144,84 @@ def HandleSyncRequest():
     new_books_last_modified = sync_token.books_last_modified
     new_books_last_created = sync_token.books_last_created
     new_reading_state_last_modified = sync_token.reading_state_last_modified
+    new_archived_last_modified = datetime.datetime.min
     sync_results = []

     # We reload the book database so that the user get's a fresh view of the library
     # in case of external changes (e.g: adding a book through Calibre).
     calibre_db.reconnect_db(config, ub.app_DB_path)

-    archived_books = (
-        ub.session.query(ub.ArchivedBook)
-        .filter(ub.ArchivedBook.user_id == int(current_user.id))
-        .all()
-    )
-
-    # We join-in books that have had their Archived bit recently modified in order to either:
-    #   * Restore them to the user's device.
-    #   * Delete them from the user's device.
-    # (Ideally we would use a join for this logic, however cross-database joins don't look trivial in SqlAlchemy.)
-    recently_restored_or_archived_books = []
-    archived_book_ids = {}
-    new_archived_last_modified = datetime.datetime.min
-    for archived_book in archived_books:
-        if archived_book.last_modified > sync_token.archive_last_modified:
-            recently_restored_or_archived_books.append(archived_book.book_id)
-        if archived_book.is_archived:
-            archived_book_ids[archived_book.book_id] = True
-        new_archived_last_modified = max(
-            new_archived_last_modified, archived_book.last_modified)
-
-    # sqlite gives unexpected results when performing the last_modified comparison without the datetime cast.
-    # It looks like it's treating the db.Books.last_modified field as a string and may fail
-    # the comparison because of the +00:00 suffix.
-    changed_entries = (
-        calibre_db.session.query(db.Books)
-        .join(db.Data)
-        .filter(or_(func.datetime(db.Books.last_modified) > sync_token.books_last_modified,
-                    db.Books.id.in_(recently_restored_or_archived_books)))
-        .filter(db.Data.format.in_(KOBO_FORMATS))
-        .all()
-    )
+    if sync_token.books_last_id > -1:
+        changed_entries = (
+            calibre_db.session.query(db.Books, ub.ArchivedBook.last_modified, ub.ArchivedBook.is_archived)
+            .join(db.Data).outerjoin(ub.ArchivedBook, db.Books.id == ub.ArchivedBook.book_id)
+            .filter(db.Books.last_modified >= sync_token.books_last_modified)
+            .filter(db.Books.id>sync_token.books_last_id)
+            .filter(db.Data.format.in_(KOBO_FORMATS))
+            .order_by(db.Books.last_modified)
+            .order_by(db.Books.id)
+            .limit(SYNC_ITEM_LIMIT)
+        )
+    else:
+        changed_entries = (
+            calibre_db.session.query(db.Books, ub.ArchivedBook.last_modified, ub.ArchivedBook.is_archived)
+            .join(db.Data).outerjoin(ub.ArchivedBook, db.Books.id == ub.ArchivedBook.book_id)
+            .filter(db.Books.last_modified > sync_token.books_last_modified)
+            .filter(db.Data.format.in_(KOBO_FORMATS))
+            .order_by(db.Books.last_modified)
+            .order_by(db.Books.id)
+            .limit(SYNC_ITEM_LIMIT)
+        )

     reading_states_in_new_entitlements = []
     for book in changed_entries:
-        kobo_reading_state = get_or_create_reading_state(book.id)
+        formats = [data.format for data in book.Books.data]
+        if not 'KEPUB' in formats and config.config_kepubifypath and 'EPUB' in formats:
+            helper.convert_book_format(book.Books.id, config.config_calibre_dir, 'EPUB', 'KEPUB', current_user.name)
+
+        kobo_reading_state = get_or_create_reading_state(book.Books.id)
         entitlement = {
-            "BookEntitlement": create_book_entitlement(book, archived=(book.id in archived_book_ids)),
-            "BookMetadata": get_metadata(book),
+            "BookEntitlement": create_book_entitlement(book.Books, archived=(book.is_archived == True)),
+            "BookMetadata": get_metadata(book.Books),
         }

         if kobo_reading_state.last_modified > sync_token.reading_state_last_modified:
-            entitlement["ReadingState"] = get_kobo_reading_state_response(book, kobo_reading_state)
+            entitlement["ReadingState"] = get_kobo_reading_state_response(book.Books, kobo_reading_state)
             new_reading_state_last_modified = max(new_reading_state_last_modified, kobo_reading_state.last_modified)
-            reading_states_in_new_entitlements.append(book.id)
+            reading_states_in_new_entitlements.append(book.Books.id)

-        if book.timestamp > sync_token.books_last_created:
+        if book.Books.timestamp > sync_token.books_last_created:
             sync_results.append({"NewEntitlement": entitlement})
         else:
             sync_results.append({"ChangedEntitlement": entitlement})

         new_books_last_modified = max(
-            book.last_modified, new_books_last_modified
+            book.Books.last_modified, new_books_last_modified
         )
-        new_books_last_created = max(book.timestamp, new_books_last_created)
+        new_books_last_created = max(book.Books.timestamp, new_books_last_created)

+    max_change = (changed_entries
+                  .from_self()
+                  .filter(ub.ArchivedBook.is_archived)
+                  .order_by(func.datetime(ub.ArchivedBook.last_modified).desc())
+                  .first()
+                  )
+    if max_change:
+        max_change = max_change.last_modified
+    else:
+        max_change = new_archived_last_modified
+    new_archived_last_modified = max(new_archived_last_modified, max_change)
+
+    # no. of books returned
+    book_count = changed_entries.count()
+
+    # last entry:
+    if book_count:
+        books_last_id = changed_entries.all()[-1].Books.id or -1
+    else:
+        books_last_id = -1
+
+    # generate reading state data
     changed_reading_states = (
         ub.session.query(ub.KoboReadingState)
         .filter(and_(func.datetime(ub.KoboReadingState.last_modified) > sync_token.reading_state_last_modified,
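The rewrite replaces the load-everything query with keyset pagination: each sync returns at most SYNC_ITEM_LIMIT books, ordered by (last_modified, id), and the last id is carried in the sync token as the cursor for the next request. A generic sketch of that idea with a toy model (my own minimal schema, not Calibre-Web's; assumes SQLAlchemy 1.4+):

    import datetime
    from sqlalchemy import create_engine, Column, Integer, DateTime
    from sqlalchemy.orm import declarative_base, Session

    Base = declarative_base()

    class Book(Base):
        __tablename__ = 'books'
        id = Column(Integer, primary_key=True)
        last_modified = Column(DateTime, default=datetime.datetime.utcnow)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)
    LIMIT = 2

    with Session(engine) as s:
        s.add_all([Book() for _ in range(5)])
        s.commit()
        last_id = -1                      # cursor carried in the sync token
        while True:
            page = (s.query(Book)
                     .filter(Book.id > last_id)
                     .order_by(Book.id)
                     .limit(LIMIT)
                     .all())
            if not page:
                break
            print([b.id for b in page])   # one sync response per batch
            last_id = page[-1].id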
@@ -225,11 +243,12 @@ def HandleSyncRequest():
     sync_token.books_last_modified = new_books_last_modified
     sync_token.archive_last_modified = new_archived_last_modified
     sync_token.reading_state_last_modified = new_reading_state_last_modified
+    sync_token.books_last_id = books_last_id

-    return generate_sync_response(sync_token, sync_results)
+    return generate_sync_response(sync_token, sync_results, book_count)


-def generate_sync_response(sync_token, sync_results):
+def generate_sync_response(sync_token, sync_results, set_cont=False):
     extra_headers = {}
     if config.config_kobo_proxy:
         # Merge in sync results from the official Kobo store.

@@ -243,8 +262,10 @@ def generate_sync_response(sync_token, sync_results):
             extra_headers["x-kobo-sync-mode"] = store_response.headers.get("x-kobo-sync-mode")
             extra_headers["x-kobo-recent-reads"] = store_response.headers.get("x-kobo-recent-reads")

-        except Exception as e:
-            log.error("Failed to receive or parse response from Kobo's sync endpoint: " + str(e))
+        except Exception as ex:
+            log.error("Failed to receive or parse response from Kobo's sync endpoint: {}".format(ex))
+    if set_cont:
+        extra_headers["x-kobo-sync"] = "continue"
     sync_token.to_headers(extra_headers)

     response = make_response(jsonify(sync_results), extra_headers)
@@ -284,7 +305,8 @@ def get_download_url_for_book(book, book_format):
             book_format=book_format.lower()
         )
     return url_for(
-        "web.download_link",
+        "kobo.download_book",
+        auth_token=kobo_auth.get_auth_token(),
         book_id=book.id,
         book_format=book_format.lower(),
         _external=True,
@@ -443,8 +465,7 @@ def HandleTagCreate():
     items_unknown_to_calibre = add_items_to_shelf(items, shelf)
     if items_unknown_to_calibre:
         log.debug("Received request to add unknown books to a collection. Silently ignoring items.")
-    ub.session.commit()
+    ub.session_commit()

     return make_response(jsonify(str(shelf.uuid)), 201)


@@ -476,7 +497,7 @@ def HandleTagUpdate(tag_id):

         shelf.name = name
         ub.session.merge(shelf)
-        ub.session.commit()
+        ub.session_commit()
     return make_response(' ', 200)


@@ -528,8 +549,7 @@ def HandleTagAddItem(tag_id):
         log.debug("Received request to add an unknown book to a collection. Silently ignoring item.")

     ub.session.merge(shelf)
-    ub.session.commit()
+    ub.session_commit()

     return make_response('', 201)


@@ -569,7 +589,7 @@ def HandleTagRemoveItem(tag_id):
             shelf.books.filter(ub.BookShelf.book_id == book.id).delete()
         except KeyError:
             items_unknown_to_calibre.append(item)
-    ub.session.commit()
+    ub.session_commit()

     if items_unknown_to_calibre:
         log.debug("Received request to remove an unknown book to a collecition. Silently ignoring item.")

@@ -615,7 +635,7 @@ def sync_shelves(sync_token, sync_results):
             "ChangedTag": tag
         })
     sync_token.tags_last_modified = new_tags_last_modified
-    ub.session.commit()
+    ub.session_commit()
@@ -696,7 +716,7 @@ def HandleStateRequest(book_uuid):
         abort(400, description="Malformed request data is missing 'ReadingStates' key")

     ub.session.merge(kobo_reading_state)
-    ub.session.commit()
+    ub.session_commit()
     return jsonify({
         "RequestResult": "Success",
         "UpdateResults": [update_results_response],

@@ -734,7 +754,7 @@ def get_or_create_reading_state(book_id):
         kobo_reading_state.statistics = ub.KoboStatistics()
         book_read.kobo_reading_state = kobo_reading_state
     ub.session.add(book_read)
-    ub.session.commit()
+    ub.session_commit()
     return book_read.kobo_reading_state


@@ -837,8 +857,7 @@ def HandleBookDeletionRequest(book_uuid):
     archived_book.last_modified = datetime.datetime.utcnow()

     ub.session.merge(archived_book)
-    ub.session.commit()
+    ub.session_commit()

     return ("", 204)
@@ -874,17 +893,6 @@ def HandleProductsRequest(dummy=None):
     return redirect_or_proxy_request()


-'''@kobo.errorhandler(404)
-def handle_404(err):
-    # This handler acts as a catch-all for endpoints that we don't have an interest in
-    # implementing (e.g: v1/analytics/gettests, v1/user/recommendations, etc)
-    if err:
-        print('404')
-        return jsonify(error=str(err)), 404
-    log.debug("Unknown Request received: %s, method: %s, data: %s", request.base_url, request.method, request.data)
-    return redirect_or_proxy_request()'''
-
-
 def make_calibre_web_auth_response():
     # As described in kobo_auth.py, CalibreWeb doesn't make use practical use of this auth/device API call for
     # authentation (nor for authorization). We return a dummy response just to keep the device happy.
@@ -911,7 +919,7 @@ def HandleAuthRequest():
     if config.config_kobo_proxy:
         try:
             return redirect_or_proxy_request()
-        except:
+        except Exception:
             log.error("Failed to receive or parse response from Kobo's auth endpoint. Falling back to un-proxied mode.")
     return make_calibre_web_auth_response()

@@ -928,7 +936,7 @@ def HandleInitRequest():
             store_response_json = store_response.json()
             if "Resources" in store_response_json:
                 kobo_resources = store_response_json["Resources"]
-        except:
+        except Exception:
             log.error("Failed to receive or parse response from Kobo's init endpoint. Falling back to un-proxied mode.")
     if not kobo_resources:
         kobo_resources = NATIVE_KOBO_RESOURCES()

@@ -989,7 +997,6 @@ def HandleInitRequest():
 @requires_kobo_auth
 @download_required
 def download_book(book_id, book_format):
-
     return get_download_link(book_id, book_format, "kobo")
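The two `except:` → `except Exception:` changes matter because a bare except also swallows process exits and keyboard interrupts. A tiny stdlib demo of the difference:

    import sys

    try:
        sys.exit(1)
    except Exception:
        print("not reached: SystemExit is not an Exception subclass")
    except BaseException as exc:
        print("caught only by BaseException:", type(exc).__name__)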
@@ -64,11 +64,11 @@ from datetime import datetime
 from os import urandom

 from flask import g, Blueprint, url_for, abort, request
-from flask_login import login_user, login_required
+from flask_login import login_user, current_user, login_required
 from flask_babel import gettext as _

-from . import logger, ub, lm
-from .web import render_title_template
+from . import logger, config, calibre_db, db, helper, ub, lm
+from .render_template import render_title_template

 try:
     from functools import wraps

@@ -81,6 +81,7 @@ log = logger.create()

 def register_url_value_preprocessor(kobo):
     @kobo.url_value_preprocessor
+    # pylint: disable=unused-variable
     def pop_auth_token(__, values):
         g.auth_token = values.pop("auth_token")
@@ -147,7 +148,15 @@ def generate_auth_token(user_id):
         auth_token.token_type = 1

         ub.session.add(auth_token)
-        ub.session.commit()
+        ub.session_commit()
+
+    books = calibre_db.session.query(db.Books).join(db.Data).all()
+
+    for book in books:
+        formats = [data.format for data in book.data]
+        if not 'KEPUB' in formats and config.config_kepubifypath and 'EPUB' in formats:
+            helper.convert_book_format(book.id, config.config_calibre_dir, 'EPUB', 'KEPUB', current_user.name)
+
     return render_title_template(
         "generate_kobo_auth_url.html",
         title=_(u"Kobo Setup"),

@@ -164,5 +173,5 @@ def delete_auth_token(user_id):
     # Invalidate any prevously generated Kobo Auth token for this user.
     ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.user_id == user_id)\
         .filter(ub.RemoteAuthToken.token_type==1).delete()
-    ub.session.commit()
-    return ""
+
+    return ub.session_commit()
@@ -41,10 +41,37 @@ logging.addLevelName(logging.WARNING, "WARN")
 logging.addLevelName(logging.CRITICAL, "CRIT")


+class _Logger(logging.Logger):
+
+    def debug_or_exception(self, message, *args, **kwargs):
+        if sys.version_info > (3, 7):
+            if is_debug_enabled():
+                self.exception(message, stacklevel=2, *args, **kwargs)
+            else:
+                self.error(message, stacklevel=2, *args, **kwargs)
+        elif sys.version_info > (3, 0):
+            if is_debug_enabled():
+                self.exception(message, stack_info=True, *args, **kwargs)
+            else:
+                self.error(message, *args, **kwargs)
+        else:
+            if is_debug_enabled():
+                self.exception(message, *args, **kwargs)
+            else:
+                self.error(message, *args, **kwargs)
+
+    def debug_no_auth(self, message, *args, **kwargs):
+        message = message.strip("\r\n")
+        if message.startswith("send: AUTH"):
+            self.debug(message[:16], *args, **kwargs)
+        else:
+            self.debug(message, *args, **kwargs)
+
+
 def get(name=None):
     return logging.getLogger(name)


 def create():
     parent_frame = inspect.stack(0)[1]
     if hasattr(parent_frame, 'frame'):
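This custom logger class is what makes the many log.debug_or_exception() calls in the other hunks work: full tracebacks appear only when the root level is DEBUG. A condensed, runnable sketch of the wiring (setLoggerClass must run before the first getLogger() call):

    import logging

    def is_debug_enabled():
        return logging.root.level <= logging.DEBUG

    class _Logger(logging.Logger):
        def debug_or_exception(self, message, *args, **kwargs):
            # traceback only when debugging; plain error line otherwise
            if is_debug_enabled():
                self.exception(message, *args, **kwargs)
            else:
                self.error(message, *args, **kwargs)

    logging.setLoggerClass(_Logger)
    logging.basicConfig(level=logging.DEBUG)
    log = logging.getLogger("demo")

    try:
        1 / 0
    except ZeroDivisionError as ex:
        log.debug_or_exception(ex)   # traceback printed because DEBUG is on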
@@ -54,7 +81,6 @@ def create():
     parent_module = inspect.getmodule(parent_frame)
     return get(parent_module.__name__)

-
 def is_debug_enabled():
     return logging.root.level <= logging.DEBUG
@@ -99,6 +125,7 @@ def setup(log_file, log_level=None):
     May be called multiple times.
     '''
     log_level = log_level or DEFAULT_LOG_LEVEL
+    logging.setLoggerClass(_Logger)
     logging.getLogger(__package__).setLevel(log_level)

     r = logging.root
@@ -126,11 +153,11 @@ def setup(log_file, log_level=None):
         file_handler.baseFilename = log_file
     else:
         try:
-            file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
+            file_handler = RotatingFileHandler(log_file, maxBytes=100000, backupCount=2, encoding='utf-8')
         except IOError:
             if log_file == DEFAULT_LOG_FILE:
                 raise
-            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2, encoding='utf-8')
+            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=100000, backupCount=2, encoding='utf-8')
             log_file = ""
     file_handler.setFormatter(FORMATTER)
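Besides doubling the rotation size to 100 kB, the pattern here is try-the-configured-file, fall-back-to-the-default. A standalone sketch (the paths are examples for the demo, not the module's real constants):

    from logging.handlers import RotatingFileHandler

    DEFAULT_LOG_FILE = "/tmp/calibre-web.log"   # assumed default path

    def open_log(log_file):
        try:
            return RotatingFileHandler(log_file, maxBytes=100000,
                                       backupCount=2, encoding='utf-8')
        except IOError:
            if log_file == DEFAULT_LOG_FILE:
                raise                   # nothing left to fall back to
            return RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=100000,
                                       backupCount=2, encoding='utf-8')

    handler = open_log("/nonexistent-dir/app.log")
    print(handler.baseFilename)         # falls back to the default path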
238
cps/oauth.py

@@ -19,7 +19,6 @@
 from __future__ import division, print_function, unicode_literals
 from flask import session
-

 try:
     from flask_dance.consumer.backend.sqla import SQLAlchemyBackend, first, _get_real_user
     from sqlalchemy.orm.exc import NoResultFound
@@ -34,134 +33,131 @@ except ImportError:
     except ImportError:
         pass

-try:
-    class OAuthBackend(SQLAlchemyBackend):
-        # (body identical to the class below, one indent level deeper)
-except Exception:
-    pass
+class OAuthBackend(SQLAlchemyBackend):
+    """
+    Stores and retrieves OAuth tokens using a relational database through
+    the `SQLAlchemy`_ ORM.
+
+    .. _SQLAlchemy: https://www.sqlalchemy.org/
+    """
+    def __init__(self, model, session, provider_id,
+                 user=None, user_id=None, user_required=None, anon_user=None,
+                 cache=None):
+        self.provider_id = provider_id
+        super(OAuthBackend, self).__init__(model, session, user, user_id, user_required, anon_user, cache)
+
+    def get(self, blueprint, user=None, user_id=None):
+        if self.provider_id + '_oauth_token' in session and session[self.provider_id + '_oauth_token'] != '':
+            return session[self.provider_id + '_oauth_token']
+        # check cache
+        cache_key = self.make_cache_key(blueprint=blueprint, user=user, user_id=user_id)
+        token = self.cache.get(cache_key)
+        if token:
+            return token
+
+        # if not cached, make database queries
+        query = (
+            self.session.query(self.model)
+            .filter_by(provider=self.provider_id)
+        )
+        uid = first([user_id, self.user_id, blueprint.config.get("user_id")])
+        u = first(_get_real_user(ref, self.anon_user)
+                  for ref in (user, self.user, blueprint.config.get("user")))
+
+        use_provider_user_id = False
+        if self.provider_id + '_oauth_user_id' in session and session[self.provider_id + '_oauth_user_id'] != '':
+            query = query.filter_by(provider_user_id=session[self.provider_id + '_oauth_user_id'])
+            use_provider_user_id = True
+
+        if self.user_required and not u and not uid and not use_provider_user_id:
+            # raise ValueError("Cannot get OAuth token without an associated user")
+            return None
+        # check for user ID
+        if hasattr(self.model, "user_id") and uid:
+            query = query.filter_by(user_id=uid)
+        # check for user (relationship property)
+        elif hasattr(self.model, "user") and u:
+            query = query.filter_by(user=u)
+        # if we have the property, but not value, filter by None
+        elif hasattr(self.model, "user_id"):
+            query = query.filter_by(user_id=None)
+        # run query
+        try:
+            token = query.one().token
+        except NoResultFound:
+            token = None
+
+        # cache the result
+        self.cache.set(cache_key, token)
+
+        return token
+
+    def set(self, blueprint, token, user=None, user_id=None):
+        uid = first([user_id, self.user_id, blueprint.config.get("user_id")])
+        u = first(_get_real_user(ref, self.anon_user)
+                  for ref in (user, self.user, blueprint.config.get("user")))
+
+        if self.user_required and not u and not uid:
+            raise ValueError("Cannot set OAuth token without an associated user")
+
+        # if there was an existing model, delete it
+        existing_query = (
+            self.session.query(self.model)
+            .filter_by(provider=self.provider_id)
+        )
+        # check for user ID
+        has_user_id = hasattr(self.model, "user_id")
+        if has_user_id and uid:
+            existing_query = existing_query.filter_by(user_id=uid)
+        # check for user (relationship property)
+        has_user = hasattr(self.model, "user")
+        if has_user and u:
+            existing_query = existing_query.filter_by(user=u)
+        # queue up delete query -- won't be run until commit()
+        existing_query.delete()
+        # create a new model for this token
+        kwargs = {
+            "provider": self.provider_id,
+            "token": token,
+        }
+        if has_user_id and uid:
+            kwargs["user_id"] = uid
+        if has_user and u:
+            kwargs["user"] = u
+        self.session.add(self.model(**kwargs))
+        # commit to delete and add simultaneously
+        self.session.commit()
+        # invalidate cache
+        self.cache.delete(self.make_cache_key(
+            blueprint=blueprint, user=user, user_id=user_id
+        ))
+
+    def delete(self, blueprint, user=None, user_id=None):
+        query = (
+            self.session.query(self.model)
+            .filter_by(provider=self.provider_id)
+        )
+        uid = first([user_id, self.user_id, blueprint.config.get("user_id")])
+        u = first(_get_real_user(ref, self.anon_user)
+                  for ref in (user, self.user, blueprint.config.get("user")))
+
+        if self.user_required and not u and not uid:
+            raise ValueError("Cannot delete OAuth token without an associated user")
+
+        # check for user ID
+        if hasattr(self.model, "user_id") and uid:
+            query = query.filter_by(user_id=uid)
+        # check for user (relationship property)
+        elif hasattr(self.model, "user") and u:
+            query = query.filter_by(user=u)
+        # if we have the property, but not value, filter by None
+        elif hasattr(self.model, "user_id"):
+            query = query.filter_by(user_id=None)
+        # run query
+        query.delete()
+        self.session.commit()
+        # invalidate cache
+        self.cache.delete(self.make_cache_key(
+            blueprint=blueprint, user=user, user_id=user_id,
+        ))
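The backend keys every token on a (provider, user) pair and keeps session/cache lookups in front of the database. A minimal in-memory stand-in showing the same get/set/delete contract (a hypothetical class for illustration, not the flask-dance API):

    class MemoryTokenBackend:
        def __init__(self, provider_id):
            self.provider_id = provider_id
            self._tokens = {}            # (provider, user_id) -> token

        def get(self, user_id):
            return self._tokens.get((self.provider_id, user_id))

        def set(self, user_id, token):
            # replace any existing token for this provider/user pair
            self._tokens[(self.provider_id, user_id)] = token

        def delete(self, user_id):
            self._tokens.pop((self.provider_id, user_id), None)

    backend = MemoryTokenBackend("github")
    backend.set(1, {"access_token": "example"})
    print(backend.get(1))
    backend.delete(1)
    print(backend.get(1))                # None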
346
cps/oauth_bb.py

@@ -30,15 +30,20 @@ from flask_babel import gettext as _
 from flask_dance.consumer import oauth_authorized, oauth_error
 from flask_dance.contrib.github import make_github_blueprint, github
 from flask_dance.contrib.google import make_google_blueprint, google
-from flask_login import login_user, current_user
+from oauthlib.oauth2 import TokenExpiredError, InvalidGrantError
+from flask_login import login_user, current_user, login_required
 from sqlalchemy.orm.exc import NoResultFound

 from . import constants, logger, config, app, ub
-from .web import login_required
-from .oauth import OAuthBackend, backend_resultcode
+
+try:
+    from .oauth import OAuthBackend, backend_resultcode
+except NameError:
+    pass
+

 oauth_check = {}
+oauthblueprints = []
 oauth = Blueprint('oauth', __name__)
 log = logger.create()
@@ -84,11 +89,7 @@ def register_user_with_oauth(user=None):
         except NoResultFound:
             # no found, return error
             return
-    try:
-        ub.session.commit()
-    except Exception as e:
-        log.exception(e)
-        ub.session.rollback()
+    ub.session_commit("User {} with OAuth for provider {} registered".format(user.name, oauth_key))


 def logout_oauth_user():
@@ -97,19 +98,122 @@ def logout_oauth_user():
         session.pop(str(oauth_key) + '_oauth_user_id')


-if ub.oauth_support:
-    oauthblueprints = []
+def oauth_update_token(provider_id, token, provider_user_id):
+    session[provider_id + "_oauth_user_id"] = provider_user_id
+    session[provider_id + "_oauth_token"] = token
+
+    # Find this OAuth token in the database, or create it
+    query = ub.session.query(ub.OAuth).filter_by(
+        provider=provider_id,
+        provider_user_id=provider_user_id,
+    )
+    try:
+        oauth_entry = query.one()
+        # update token
+        oauth_entry.token = token
+    except NoResultFound:
+        oauth_entry = ub.OAuth(
+            provider=provider_id,
+            provider_user_id=provider_user_id,
+            token=token,
+        )
+    ub.session.add(oauth_entry)
+    ub.session_commit()
+
+    # Disable Flask-Dance's default behavior for saving the OAuth token
+    # Value differrs depending on flask-dance version
+    return backend_resultcode
+
+
+def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider_name):
+    query = ub.session.query(ub.OAuth).filter_by(
+        provider=provider_id,
+        provider_user_id=provider_user_id,
+    )
+    try:
+        oauth_entry = query.first()
+        # already bind with user, just login
+        if oauth_entry.user:
+            login_user(oauth_entry.user)
+            log.debug(u"You are now logged in as: '%s'", oauth_entry.user.name)
+            flash(_(u"you are now logged in as: '%(nickname)s'", nickname= oauth_entry.user.name),
+                  category="success")
+            return redirect(url_for('web.index'))
+        else:
+            # bind to current user
+            if current_user and current_user.is_authenticated:
+                oauth_entry.user = current_user
+                try:
+                    ub.session.add(oauth_entry)
+                    ub.session.commit()
+                    flash(_(u"Link to %(oauth)s Succeeded", oauth=provider_name), category="success")
+                    log.info("Link to {} Succeeded".format(provider_name))
+                    return redirect(url_for('web.profile'))
+                except Exception as ex:
+                    log.debug_or_exception(ex)
+                    ub.session.rollback()
+            else:
+                flash(_(u"Login failed, No User Linked With OAuth Account"), category="error")
+            log.info('Login failed, No User Linked With OAuth Account')
+            return redirect(url_for('web.login'))
+            # return redirect(url_for('web.login'))
+            # if config.config_public_reg:
+            #    return redirect(url_for('web.register'))
+            # else:
+            #    flash(_(u"Public registration is not enabled"), category="error")
+            #    return redirect(url_for(redirect_url))
+    except (NoResultFound, AttributeError):
+        return redirect(url_for(redirect_url))
+
+
+def get_oauth_status():
+    status = []
+    query = ub.session.query(ub.OAuth).filter_by(
+        user_id=current_user.id,
+    )
+    try:
+        oauths = query.all()
+        for oauth_entry in oauths:
+            status.append(int(oauth_entry.provider))
+        return status
+    except NoResultFound:
+        return None
+
+
+def unlink_oauth(provider):
+    if request.host_url + 'me' != request.referrer:
+        pass
+    query = ub.session.query(ub.OAuth).filter_by(
+        provider=provider,
+        user_id=current_user.id,
+    )
+    try:
+        oauth_entry = query.one()
+        if current_user and current_user.is_authenticated:
+            oauth_entry.user = current_user
+            try:
+                ub.session.delete(oauth_entry)
+                ub.session.commit()
+                logout_oauth_user()
+                flash(_(u"Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
+                log.info("Unlink to {} Succeeded".format(oauth_check[provider]))
+            except Exception as ex:
+                log.debug_or_exception(ex)
+                ub.session.rollback()
+                flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
+    except NoResultFound:
+        log.warning("oauth %s for user %d not found", provider, current_user.id)
+        flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
+    return redirect(url_for('web.profile'))
+
+
+def generate_oauth_blueprints():
     if not ub.session.query(ub.OAuthProvider).count():
-        oauthProvider = ub.OAuthProvider()
-        oauthProvider.provider_name = "github"
-        oauthProvider.active = False
-        ub.session.add(oauthProvider)
-        ub.session.commit()
-        oauthProvider = ub.OAuthProvider()
-        oauthProvider.provider_name = "google"
-        oauthProvider.active = False
-        ub.session.add(oauthProvider)
-        ub.session.commit()
+        for provider in ("github", "google"):
+            oauthProvider = ub.OAuthProvider()
+            oauthProvider.provider_name = provider
+            oauthProvider.active = False
+            ub.session.add(oauthProvider)
+            ub.session_commit("{} Blueprint Created".format(provider))

     oauth_ids = ub.session.query(ub.OAuthProvider).all()
     ele1 = dict(provider_name='github',
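oauth_update_token follows the classic one-or-create (upsert) pattern: query.one(), update on hit, construct on NoResultFound. A generic runnable sketch with a toy model (my own schema, not Calibre-Web's ub.OAuth; assumes SQLAlchemy 1.4+):

    from sqlalchemy import create_engine, Column, Integer, String
    from sqlalchemy.orm import declarative_base, Session
    from sqlalchemy.orm.exc import NoResultFound

    Base = declarative_base()

    class OAuthToken(Base):
        __tablename__ = 'oauth'
        id = Column(Integer, primary_key=True)
        provider = Column(String)
        provider_user_id = Column(String)
        token = Column(String)

    engine = create_engine('sqlite://')
    Base.metadata.create_all(engine)

    def update_token(session, provider, provider_user_id, token):
        query = session.query(OAuthToken).filter_by(
            provider=provider, provider_user_id=provider_user_id)
        try:
            entry = query.one()       # existing row: refresh the token
            entry.token = token
        except NoResultFound:         # first login: create the row
            entry = OAuthToken(provider=provider,
                               provider_user_id=provider_user_id, token=token)
        session.add(entry)
        session.commit()

    with Session(engine) as s:
        update_token(s, 'github', '42', 'tok-1')
        update_token(s, 'github', '42', 'tok-2')
        print(s.query(OAuthToken).count())   # 1 - updated, not duplicated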
@@ -146,17 +250,23 @@ if ub.oauth_support:
         app.register_blueprint(blueprint, url_prefix="/login")
         if element['active']:
             register_oauth_blueprint(element['id'], element['provider_name'])
+    return oauthblueprints
+
+
+if ub.oauth_support:
+    oauthblueprints = generate_oauth_blueprints()

     @oauth_authorized.connect_via(oauthblueprints[0]['blueprint'])
     def github_logged_in(blueprint, token):
         if not token:
             flash(_(u"Failed to log in with GitHub."), category="error")
+            log.error("Failed to log in with GitHub")
             return False

         resp = blueprint.session.get("/user")
         if not resp.ok:
             flash(_(u"Failed to fetch user info from GitHub."), category="error")
+            log.error("Failed to fetch user info from GitHub")
             return False

         github_info = resp.json()
@@ -168,11 +278,13 @@ if ub.oauth_support:
     def google_logged_in(blueprint, token):
         if not token:
             flash(_(u"Failed to log in with Google."), category="error")
+            log.error("Failed to log in with Google")
             return False

         resp = blueprint.session.get("/oauth2/v2/userinfo")
         if not resp.ok:
             flash(_(u"Failed to fetch user info from Google."), category="error")
+            log.error("Failed to fetch user info from Google")
             return False

         google_info = resp.json()
|
@@ -180,117 +292,6 @@ if ub.oauth_support:
         return oauth_update_token(str(oauthblueprints[1]['id']), token, google_user_id)
-
-
-    def oauth_update_token(provider_id, token, provider_user_id):
-        session[provider_id + "_oauth_user_id"] = provider_user_id
-        session[provider_id + "_oauth_token"] = token
-
-        # Find this OAuth token in the database, or create it
-        query = ub.session.query(ub.OAuth).filter_by(
-            provider=provider_id,
-            provider_user_id=provider_user_id,
-        )
-        try:
-            oauth_entry = query.one()
-            # update token
-            oauth_entry.token = token
-        except NoResultFound:
-            oauth_entry = ub.OAuth(
-                provider=provider_id,
-                provider_user_id=provider_user_id,
-                token=token,
-            )
-        try:
-            ub.session.add(oauth_entry)
-            ub.session.commit()
-        except Exception as e:
-            log.exception(e)
-            ub.session.rollback()
-
-        # Disable Flask-Dance's default behavior for saving the OAuth token
-        # Value differrs depending on flask-dance version
-        return backend_resultcode
-
-
-    def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider_name):
-        query = ub.session.query(ub.OAuth).filter_by(
-            provider=provider_id,
-            provider_user_id=provider_user_id,
-        )
-        try:
-            oauth_entry = query.first()
-            # already bind with user, just login
-            if oauth_entry.user:
-                login_user(oauth_entry.user)
-                log.debug(u"You are now logged in as: '%s'", oauth_entry.user.nickname)
-                flash(_(u"you are now logged in as: '%(nickname)s'", nickname= oauth_entry.user.nickname),
-                      category="success")
-                return redirect(url_for('web.index'))
-            else:
-                # bind to current user
-                if current_user and current_user.is_authenticated:
-                    oauth_entry.user = current_user
-                    try:
-                        ub.session.add(oauth_entry)
-                        ub.session.commit()
-                        flash(_(u"Link to %(oauth)s Succeeded", oauth=provider_name), category="success")
-                        return redirect(url_for('web.profile'))
-                    except Exception as e:
-                        log.exception(e)
-                        ub.session.rollback()
-                else:
-                    flash(_(u"Login failed, No User Linked With OAuth Account"), category="error")
-                log.info('Login failed, No User Linked With OAuth Account')
-                return redirect(url_for('web.login'))
-                # return redirect(url_for('web.login'))
-                # if config.config_public_reg:
-                #    return redirect(url_for('web.register'))
-                # else:
-                #    flash(_(u"Public registration is not enabled"), category="error")
-                #    return redirect(url_for(redirect_url))
-        except (NoResultFound, AttributeError):
-            return redirect(url_for(redirect_url))
-
-
-    def get_oauth_status():
-        status = []
-        query = ub.session.query(ub.OAuth).filter_by(
-            user_id=current_user.id,
-        )
-        try:
-            oauths = query.all()
-            for oauth_entry in oauths:
-                status.append(int(oauth_entry.provider))
-            return status
-        except NoResultFound:
-            return None
-
-
-    def unlink_oauth(provider):
-        if request.host_url + 'me' != request.referrer:
-            pass
-        query = ub.session.query(ub.OAuth).filter_by(
-            provider=provider,
-            user_id=current_user.id,
-        )
-        try:
-            oauth_entry = query.one()
-            if current_user and current_user.is_authenticated:
-                oauth_entry.user = current_user
-                try:
-                    ub.session.delete(oauth_entry)
-                    ub.session.commit()
-                    logout_oauth_user()
-                    flash(_(u"Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
-                except Exception as e:
-                    log.exception(e)
-                    ub.session.rollback()
-                    flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
-        except NoResultFound:
-            log.warning("oauth %s for user %d not found", provider, current_user.id)
-            flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
-        return redirect(url_for('web.profile'))
     # notify on OAuth provider error
     @oauth_error.connect_via(oauthblueprints[0]['blueprint'])
     def github_error(blueprint, error, error_description=None, error_uri=None):
@@ -305,39 +306,6 @@ if ub.oauth_support:
         ) # ToDo: Translate
         flash(msg, category="error")
-
-
-    @oauth.route('/link/github')
-    @oauth_required
-    def github_login():
-        if not github.authorized:
-            return redirect(url_for('github.login'))
-        account_info = github.get('/user')
-        if account_info.ok:
-            account_info_json = account_info.json()
-            return bind_oauth_or_register(oauthblueprints[0]['id'], account_info_json['id'], 'github.login', 'github')
-        flash(_(u"GitHub Oauth error, please retry later."), category="error")
-        return redirect(url_for('web.login'))
-
-
-    @oauth.route('/unlink/github', methods=["GET"])
-    @login_required
-    def github_login_unlink():
-        return unlink_oauth(oauthblueprints[0]['id'])
-
-
-    @oauth.route('/link/google')
-    @oauth_required
-    def google_login():
-        if not google.authorized:
-            return redirect(url_for("google.login"))
-        resp = google.get("/oauth2/v2/userinfo")
-        if resp.ok:
-            account_info_json = resp.json()
-            return bind_oauth_or_register(oauthblueprints[1]['id'], account_info_json['id'], 'google.login', 'google')
-        flash(_(u"Google Oauth error, please retry later."), category="error")
-        return redirect(url_for('web.login'))
-
-
     @oauth_error.connect_via(oauthblueprints[1]['blueprint'])
     def google_error(blueprint, error, error_description=None, error_uri=None):
         msg = (
@@ -352,7 +320,49 @@ if ub.oauth_support:
         flash(msg, category="error")


-    @oauth.route('/unlink/google', methods=["GET"])
-    @login_required
-    def google_login_unlink():
-        return unlink_oauth(oauthblueprints[1]['id'])
+    @oauth.route('/link/github')
+    @oauth_required
+    def github_login():
+        if not github.authorized:
+            return redirect(url_for('github.login'))
+        try:
+            account_info = github.get('/user')
+            if account_info.ok:
+                account_info_json = account_info.json()
+                return bind_oauth_or_register(oauthblueprints[0]['id'], account_info_json['id'], 'github.login', 'github')
+            flash(_(u"GitHub Oauth error, please retry later."), category="error")
+            log.error("GitHub Oauth error, please retry later")
+        except (InvalidGrantError, TokenExpiredError) as e:
+            flash(_(u"GitHub Oauth error: {}").format(e), category="error")
+            log.error(e)
+        return redirect(url_for('web.login'))
+
+
+    @oauth.route('/unlink/github', methods=["GET"])
+    @login_required
+    def github_login_unlink():
+        return unlink_oauth(oauthblueprints[0]['id'])
+
+
+    @oauth.route('/link/google')
+    @oauth_required
+    def google_login():
+        if not google.authorized:
+            return redirect(url_for("google.login"))
+        try:
+            resp = google.get("/oauth2/v2/userinfo")
+            if resp.ok:
+                account_info_json = resp.json()
+                return bind_oauth_or_register(oauthblueprints[1]['id'], account_info_json['id'], 'google.login', 'google')
+            flash(_(u"Google Oauth error, please retry later."), category="error")
+            log.error("Google Oauth error, please retry later")
+        except (InvalidGrantError, TokenExpiredError) as e:
+            flash(_(u"Google Oauth error: {}").format(e), category="error")
+            log.error(e)
+        return redirect(url_for('web.login'))
+
+
+    @oauth.route('/unlink/google', methods=["GET"])
+    @login_required
+    def google_login_unlink():
+        return unlink_oauth(oauthblueprints[1]['id'])
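Note on the rewritten login handlers above: with flask-dance, an expired or revoked grant surfaces as an oauthlib exception rather than as a failed HTTP response, which is why the provider call now sits inside a try/except. A minimal standalone sketch of that pattern (it assumes a GitHub blueprint registered via make_github_blueprint() and a 'web.login' endpoint, as in Calibre-Web):

# Sketch only: not the project's code, just the same error-handling shape.
from flask import flash, redirect, url_for
from flask_dance.contrib.github import github
from oauthlib.oauth2 import InvalidGrantError, TokenExpiredError

def github_account_info():
    if not github.authorized:
        # No usable token yet: start the provider's consent flow.
        return redirect(url_for("github.login"))
    try:
        resp = github.get("/user")        # raises once the stored token is stale
        if resp.ok:
            return resp.json()
        flash("GitHub Oauth error, please retry later.", category="error")
    except (InvalidGrantError, TokenExpiredError) as e:
        # Stale grants arrive as exceptions, not HTTP errors, so catch them here.
        flash("GitHub Oauth error: {}".format(e), category="error")
    return redirect(url_for("web.login"))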
163
cps/opds.py
@@ -27,13 +27,14 @@ from functools import wraps

 from flask import Blueprint, request, render_template, Response, g, make_response, abort
 from flask_login import current_user
-from sqlalchemy.sql.expression import func, text, or_, and_
+from sqlalchemy.sql.expression import func, text, or_, and_, true
 from werkzeug.security import check_password_hash

 from . import constants, logger, config, db, calibre_db, ub, services, get_locale, isoLanguages
 from .helper import get_download_link, get_book_cover
 from .pagination import Pagination
-from .web import render_read_books, download_required, load_user_from_request
+from .web import render_read_books
+from .usermanagement import load_user_from_request
 from flask_babel import gettext as _
 from babel import Locale as LC
 from babel.core import UnknownLocaleError
@@ -93,7 +94,45 @@ def feed_cc_search(query):
 @opds.route("/opds/search", methods=["GET"])
 @requires_basic_auth_if_no_ano
 def feed_normal_search():
-    return feed_search(request.args.get("query").strip())
+    return feed_search(request.args.get("query", "").strip())
+
+
+@opds.route("/opds/books")
+@requires_basic_auth_if_no_ano
+def feed_booksindex():
+    shift = 0
+    off = int(request.args.get("offset") or 0)
+    entries = calibre_db.session.query(func.upper(func.substr(db.Books.sort, 1, 1)).label('id'))\
+        .filter(calibre_db.common_filters()).group_by(func.upper(func.substr(db.Books.sort, 1, 1))).all()
+
+    elements = []
+    if off == 0:
+        elements.append({'id': "00", 'name': _("All")})
+        shift = 1
+    for entry in entries[
+                 off + shift - 1:
+                 int(off + int(config.config_books_per_page) - shift)]:
+        elements.append({'id': entry.id, 'name': entry.id})
+
+    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
+                            len(entries) + 1)
+    return render_xml_template('feed.xml',
+                               letterelements=elements,
+                               folder='opds.feed_letter_books',
+                               pagination=pagination)
+
+
+@opds.route("/opds/books/letter/<book_id>")
+@requires_basic_auth_if_no_ano
+def feed_letter_books(book_id):
+    off = request.args.get("offset") or 0
+    letter = true() if book_id == "00" else func.upper(db.Books.sort).startswith(book_id)
+    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
+                                                        db.Books,
+                                                        letter,
+                                                        [db.Books.sort])
+
+    return render_xml_template('feed.xml', entries=entries, pagination=pagination)


 @opds.route("/opds/new")
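The letter index added above pages over one synthetic "All" entry plus the first-letter groups, and the slice bounds fold that extra entry in via `shift`. The arithmetic is easier to see in isolation; this standalone sketch substitutes a plain list of letters for the grouped query result (page size and data are illustrative only):

# Reproduces the paging slice used by feed_booksindex and its siblings above.
letters = [chr(c) for c in range(ord('A'), ord('Z') + 1)]  # stand-in for the DB group-by result
per_page = 10

def letter_page(off):
    shift = 0
    elements = []
    if off == 0:
        elements.append("All")        # synthetic entry, only on the first page
        shift = 1
    # shift narrows the window so "All" still counts against the page size
    elements.extend(letters[off + shift - 1:off + per_page - shift])
    return elements

print(letter_page(0))    # first page: 'All' plus letters A..I
print(letter_page(10))   # following page continues at letter J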
@@ -149,14 +188,41 @@ def feed_hot():
 @opds.route("/opds/author")
 @requires_basic_auth_if_no_ano
 def feed_authorindex():
-    off = request.args.get("offset") or 0
-    entries = calibre_db.session.query(db.Authors).join(db.books_authors_link).join(db.Books)\
-        .filter(calibre_db.common_filters())\
-        .group_by(text('books_authors_link.author'))\
-        .order_by(db.Authors.sort).limit(config.config_books_per_page)\
-        .offset(off)
+    shift = 0
+    off = int(request.args.get("offset") or 0)
+    entries = calibre_db.session.query(func.upper(func.substr(db.Authors.sort, 1, 1)).label('id'))\
+        .join(db.books_authors_link).join(db.Books).filter(calibre_db.common_filters())\
+        .group_by(func.upper(func.substr(db.Authors.sort, 1, 1))).all()
+
+    elements = []
+    if off == 0:
+        elements.append({'id': "00", 'name': _("All")})
+        shift = 1
+    for entry in entries[
+                 off + shift - 1:
+                 int(off + int(config.config_books_per_page) - shift)]:
+        elements.append({'id': entry.id, 'name': entry.id})
+
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(calibre_db.session.query(db.Authors).all()))
+                            len(entries) + 1)
+    return render_xml_template('feed.xml',
+                               letterelements=elements,
+                               folder='opds.feed_letter_author',
+                               pagination=pagination)
+
+
+@opds.route("/opds/author/letter/<book_id>")
+@requires_basic_auth_if_no_ano
+def feed_letter_author(book_id):
+    off = request.args.get("offset") or 0
+    letter = true() if book_id == "00" else func.upper(db.Authors.sort).startswith(book_id)
+    entries = calibre_db.session.query(db.Authors).join(db.books_authors_link).join(db.Books)\
+        .filter(calibre_db.common_filters()).filter(letter)\
+        .group_by(text('books_authors_link.author'))\
+        .order_by(db.Authors.sort)
+    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
+                            entries.count())
+    entries = entries.limit(config.config_books_per_page).offset(off).all()
     return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_author', pagination=pagination)
@@ -200,17 +266,41 @@ def feed_publisher(book_id):
 @opds.route("/opds/category")
 @requires_basic_auth_if_no_ano
 def feed_categoryindex():
+    shift = 0
+    off = int(request.args.get("offset") or 0)
+    entries = calibre_db.session.query(func.upper(func.substr(db.Tags.name, 1, 1)).label('id'))\
+        .join(db.books_tags_link).join(db.Books).filter(calibre_db.common_filters())\
+        .group_by(func.upper(func.substr(db.Tags.name, 1, 1))).all()
+    elements = []
+    if off == 0:
+        elements.append({'id': "00", 'name': _("All")})
+        shift = 1
+    for entry in entries[
+                 off + shift - 1:
+                 int(off + int(config.config_books_per_page) - shift)]:
+        elements.append({'id': entry.id, 'name': entry.id})
+
+    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
+                            len(entries) + 1)
+    return render_xml_template('feed.xml',
+                               letterelements=elements,
+                               folder='opds.feed_letter_category',
+                               pagination=pagination)
+
+
+@opds.route("/opds/category/letter/<book_id>")
+@requires_basic_auth_if_no_ano
+def feed_letter_category(book_id):
     off = request.args.get("offset") or 0
+    letter = true() if book_id == "00" else func.upper(db.Tags.name).startswith(book_id)
     entries = calibre_db.session.query(db.Tags)\
         .join(db.books_tags_link)\
         .join(db.Books)\
-        .filter(calibre_db.common_filters())\
+        .filter(calibre_db.common_filters()).filter(letter)\
         .group_by(text('books_tags_link.tag'))\
-        .order_by(db.Tags.name)\
-        .offset(off)\
-        .limit(config.config_books_per_page)
+        .order_by(db.Tags.name)
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(calibre_db.session.query(db.Tags).all()))
+                            entries.count())
+    entries = entries.offset(off).limit(config.config_books_per_page).all()
     return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_category', pagination=pagination)
@@ -228,16 +318,40 @@ def feed_category(book_id):
 @opds.route("/opds/series")
 @requires_basic_auth_if_no_ano
 def feed_seriesindex():
+    shift = 0
+    off = int(request.args.get("offset") or 0)
+    entries = calibre_db.session.query(func.upper(func.substr(db.Series.sort, 1, 1)).label('id'))\
+        .join(db.books_series_link).join(db.Books).filter(calibre_db.common_filters())\
+        .group_by(func.upper(func.substr(db.Series.sort, 1, 1))).all()
+    elements = []
+    if off == 0:
+        elements.append({'id': "00", 'name': _("All")})
+        shift = 1
+    for entry in entries[
+                 off + shift - 1:
+                 int(off + int(config.config_books_per_page) - shift)]:
+        elements.append({'id': entry.id, 'name': entry.id})
+    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
+                            len(entries) + 1)
+    return render_xml_template('feed.xml',
+                               letterelements=elements,
+                               folder='opds.feed_letter_series',
+                               pagination=pagination)
+
+
+@opds.route("/opds/series/letter/<book_id>")
+@requires_basic_auth_if_no_ano
+def feed_letter_series(book_id):
     off = request.args.get("offset") or 0
+    letter = true() if book_id == "00" else func.upper(db.Series.sort).startswith(book_id)
     entries = calibre_db.session.query(db.Series)\
         .join(db.books_series_link)\
         .join(db.Books)\
-        .filter(calibre_db.common_filters())\
+        .filter(calibre_db.common_filters()).filter(letter)\
         .group_by(text('books_series_link.series'))\
-        .order_by(db.Series.sort)\
-        .offset(off).all()
+        .order_by(db.Series.sort)
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(calibre_db.session.query(db.Series).all()))
+                            entries.count())
+    entries = entries.offset(off).limit(config.config_books_per_page).all()
     return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_series', pagination=pagination)
@@ -268,7 +382,7 @@ def feed_ratingindex():
                             len(entries))
     element = list()
     for entry in entries:
-        element.append(FeedObject(entry[0].id, "{} Stars".format(entry.name)))
+        element.append(FeedObject(entry[0].id, _("{} Stars").format(entry.name)))
     return render_xml_template('feed.xml', listelements=element, folder='opds.feed_ratings', pagination=pagination)
@@ -427,9 +541,14 @@ def check_auth(username, password):
         username = username.encode('windows-1252')
     except UnicodeEncodeError:
         username = username.encode('utf-8')
-    user = ub.session.query(ub.User).filter(func.lower(ub.User.nickname) ==
-                                            username.decode('utf-8').lower()).first()
-    return bool(user and check_password_hash(str(user.password), password))
+    user = ub.session.query(ub.User).filter(func.lower(ub.User.name) ==
+                                            username.decode('utf-8').lower()).first()
+    if bool(user and check_password_hash(str(user.password), password)):
+        return True
+    else:
+        ip_Address = request.headers.get('X-Forwarded-For', request.remote_addr)
+        log.warning('OPDS Login failed for user "%s" IP-address: %s', username.decode('utf-8'), ip_Address)
+        return False


 def authenticate():
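The check_auth change above matches the user case-insensitively against ub.User.name and logs failed OPDS logins together with the requester's address. The password comparison itself is plain werkzeug; a self-contained sketch (the dict stands in for the ub.User table):

from werkzeug.security import generate_password_hash, check_password_hash

users = {"admin": generate_password_hash("secret")}   # username -> stored hash

def check_auth(username, password):
    # Usernames compare case-insensitively, as in the handler above.
    stored = users.get(username.lower())
    return bool(stored and check_password_hash(stored, password))

print(check_auth("Admin", "secret"))   # True
print(check_auth("admin", "wrong"))    # False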
138
cps/remotelogin.py
Normal file
@@ -0,0 +1,138 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+#   Copyright (C) 2018-2019 OzzieIsaacs, cervinko, jkrehm, bodybybuddha, ok11,
+#                           andy29485, idalin, Kyosfonica, wuqi, Kennyl, lemmsh,
+#                           falgh1, grunjol, csitko, ytils, xybydy, trasba, vrabe,
+#                           ruben-herold, marblepebble, JackED42, SiphonSquirrel,
+#                           apetresc, nanu-c, mutschler
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+import json
+from datetime import datetime
+
+from flask import Blueprint, request, make_response, abort, url_for, flash, redirect
+from flask_login import login_required, current_user, login_user
+from flask_babel import gettext as _
+from sqlalchemy.sql.expression import true
+
+from . import config, logger, ub
+from .render_template import render_title_template
+
+try:
+    from functools import wraps
+except ImportError:
+    pass  # We're not using Python 3
+
+remotelogin = Blueprint('remotelogin', __name__)
+log = logger.create()
+
+
+def remote_login_required(f):
+    @wraps(f)
+    def inner(*args, **kwargs):
+        if config.config_remote_login:
+            return f(*args, **kwargs)
+        if request.headers.get('X-Requested-With') == 'XMLHttpRequest':
+            data = {'status': 'error', 'message': 'Forbidden'}
+            response = make_response(json.dumps(data, ensure_ascii=False))
+            response.headers["Content-Type"] = "application/json; charset=utf-8"
+            return response, 403
+        abort(403)
+
+    return inner
+
+
+@remotelogin.route('/remote/login')
+@remote_login_required
+def remote_login():
+    auth_token = ub.RemoteAuthToken()
+    ub.session.add(auth_token)
+    ub.session_commit()
+    verify_url = url_for('remotelogin.verify_token', token=auth_token.auth_token, _external=true)
+    log.debug(u"Remote Login request with token: %s", auth_token.auth_token)
+    return render_title_template('remote_login.html', title=_(u"login"), token=auth_token.auth_token,
+                                 verify_url=verify_url, page="remotelogin")
+
+
+@remotelogin.route('/verify/<token>')
+@remote_login_required
+@login_required
+def verify_token(token):
+    auth_token = ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.auth_token == token).first()
+
+    # Token not found
+    if auth_token is None:
+        flash(_(u"Token not found"), category="error")
+        log.error(u"Remote Login token not found")
+        return redirect(url_for('web.index'))
+
+    # Token expired
+    elif datetime.now() > auth_token.expiration:
+        ub.session.delete(auth_token)
+        ub.session_commit()
+
+        flash(_(u"Token has expired"), category="error")
+        log.error(u"Remote Login token expired")
+        return redirect(url_for('web.index'))
+
+    # Update token with user information
+    auth_token.user_id = current_user.id
+    auth_token.verified = True
+    ub.session_commit()
+
+    flash(_(u"Success! Please return to your device"), category="success")
+    log.debug(u"Remote Login token for userid %s verified", auth_token.user_id)
+    return redirect(url_for('web.index'))
+
+
+@remotelogin.route('/ajax/verify_token', methods=['POST'])
+@remote_login_required
+def token_verified():
+    token = request.form['token']
+    auth_token = ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.auth_token == token).first()
+
+    data = {}
+
+    # Token not found
+    if auth_token is None:
+        data['status'] = 'error'
+        data['message'] = _(u"Token not found")
+
+    # Token expired
+    elif datetime.now() > auth_token.expiration:
+        ub.session.delete(auth_token)
+        ub.session_commit()
+
+        data['status'] = 'error'
+        data['message'] = _(u"Token has expired")
+
+    elif not auth_token.verified:
+        data['status'] = 'not_verified'
+
+    else:
+        user = ub.session.query(ub.User).filter(ub.User.id == auth_token.user_id).first()
+        login_user(user)
+
+        ub.session.delete(auth_token)
+        ub.session_commit("User {} logged in via remotelogin, token deleted".format(user.name))
+
+        data['status'] = 'success'
+        log.debug(u"Remote Login for userid %s succeeded", user.id)
+        flash(_(u"you are now logged in as: '%(nickname)s'", nickname=user.name), category="success")
+
+    response = make_response(json.dumps(data, ensure_ascii=False))
+    response.headers["Content-Type"] = "application/json; charset=utf-8"
+
+    return response
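On the device, Calibre-Web's own page polls /ajax/verify_token with JavaScript, but any HTTP client can drive the flow the same way. A hedged sketch with the requests library (base_url and the timing values are assumptions, not part of the diff):

import time
import requests

def wait_for_remote_login(base_url, token, interval=2, attempts=30):
    session = requests.Session()              # keep cookies so the login persists
    for _ in range(attempts):
        resp = session.post(base_url + "/ajax/verify_token", data={"token": token})
        status = resp.json().get("status")
        if status == "success":
            return session                    # now carries the logged-in session cookie
        if status == "error":
            return None                       # token unknown or expired
        time.sleep(interval)                  # 'not_verified': user hasn't confirmed yet
    return None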
122
cps/render_template.py
Normal file
@@ -0,0 +1,122 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+#   Copyright (C) 2018-2020 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from flask import render_template
+from flask_babel import gettext as _
+from flask import g
+from werkzeug.local import LocalProxy
+from flask_login import current_user
+
+from . import config, constants, ub, logger, db, calibre_db
+from .ub import User
+
+
+log = logger.create()
+
+
+def get_sidebar_config(kwargs=None):
+    kwargs = kwargs or []
+    if 'content' in kwargs:
+        content = kwargs['content']
+        content = isinstance(content, (User, LocalProxy)) and not content.role_anonymous()
+    else:
+        content = 'conf' in kwargs
+    sidebar = list()
+    sidebar.append({"glyph": "glyphicon-book", "text": _('Books'), "link": 'web.index', "id": "new",
+                    "visibility": constants.SIDEBAR_RECENT, 'public': True, "page": "root",
+                    "show_text": _('Show recent books'), "config_show": False})
+    sidebar.append({"glyph": "glyphicon-fire", "text": _('Hot Books'), "link": 'web.books_list', "id": "hot",
+                    "visibility": constants.SIDEBAR_HOT, 'public': True, "page": "hot",
+                    "show_text": _('Show Hot Books'), "config_show": True})
+    if current_user.role_admin():
+        sidebar.append({"glyph": "glyphicon-download", "text": _('Downloaded Books'), "link": 'web.download_list',
+                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD,
+                        'public': (not g.user.is_anonymous),
+                        "page": "download", "show_text": _('Show Downloaded Books'),
+                        "config_show": content})
+    else:
+        sidebar.append({"glyph": "glyphicon-download", "text": _('Downloaded Books'), "link": 'web.books_list',
+                        "id": "download", "visibility": constants.SIDEBAR_DOWNLOAD,
+                        'public': (not g.user.is_anonymous),
+                        "page": "download", "show_text": _('Show Downloaded Books'),
+                        "config_show": content})
+    sidebar.append(
+        {"glyph": "glyphicon-star", "text": _('Top Rated Books'), "link": 'web.books_list', "id": "rated",
+         "visibility": constants.SIDEBAR_BEST_RATED, 'public': True, "page": "rated",
+         "show_text": _('Show Top Rated Books'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-eye-open", "text": _('Read Books'), "link": 'web.books_list', "id": "read",
+                    "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not g.user.is_anonymous),
+                    "page": "read", "show_text": _('Show read and unread'), "config_show": content})
+    sidebar.append(
+        {"glyph": "glyphicon-eye-close", "text": _('Unread Books'), "link": 'web.books_list', "id": "unread",
+         "visibility": constants.SIDEBAR_READ_AND_UNREAD, 'public': (not g.user.is_anonymous), "page": "unread",
+         "show_text": _('Show unread'), "config_show": False})
+    sidebar.append({"glyph": "glyphicon-random", "text": _('Discover'), "link": 'web.books_list', "id": "rand",
+                    "visibility": constants.SIDEBAR_RANDOM, 'public': True, "page": "discover",
+                    "show_text": _('Show random books'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-inbox", "text": _('Categories'), "link": 'web.category_list', "id": "cat",
+                    "visibility": constants.SIDEBAR_CATEGORY, 'public': True, "page": "category",
+                    "show_text": _('Show category selection'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-bookmark", "text": _('Series'), "link": 'web.series_list', "id": "serie",
+                    "visibility": constants.SIDEBAR_SERIES, 'public': True, "page": "series",
+                    "show_text": _('Show series selection'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-user", "text": _('Authors'), "link": 'web.author_list', "id": "author",
+                    "visibility": constants.SIDEBAR_AUTHOR, 'public': True, "page": "author",
+                    "show_text": _('Show author selection'), "config_show": True})
+    sidebar.append(
+        {"glyph": "glyphicon-text-size", "text": _('Publishers'), "link": 'web.publisher_list', "id": "publisher",
+         "visibility": constants.SIDEBAR_PUBLISHER, 'public': True, "page": "publisher",
+         "show_text": _('Show publisher selection'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-flag", "text": _('Languages'), "link": 'web.language_overview', "id": "lang",
+                    "visibility": constants.SIDEBAR_LANGUAGE, 'public': (g.user.filter_language() == 'all'),
+                    "page": "language",
+                    "show_text": _('Show language selection'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-star-empty", "text": _('Ratings'), "link": 'web.ratings_list', "id": "rate",
+                    "visibility": constants.SIDEBAR_RATING, 'public': True,
+                    "page": "rating", "show_text": _('Show ratings selection'), "config_show": True})
+    sidebar.append({"glyph": "glyphicon-file", "text": _('File formats'), "link": 'web.formats_list', "id": "format",
+                    "visibility": constants.SIDEBAR_FORMAT, 'public': True,
+                    "page": "format", "show_text": _('Show file formats selection'), "config_show": True})
+    sidebar.append(
+        {"glyph": "glyphicon-trash", "text": _('Archived Books'), "link": 'web.books_list', "id": "archived",
+         "visibility": constants.SIDEBAR_ARCHIVED, 'public': (not g.user.is_anonymous), "page": "archived",
+         "show_text": _('Show archived books'), "config_show": content})
+    sidebar.append(
+        {"glyph": "glyphicon-th-list", "text": _('Books List'), "link": 'web.books_table', "id": "list",
+         "visibility": constants.SIDEBAR_LIST, 'public': (not g.user.is_anonymous), "page": "list",
+         "show_text": _('Show Books List'), "config_show": content})
+
+    return sidebar
+
+
+def get_readbooks_ids():
+    if not config.config_read_column:
+        readBooks = ub.session.query(ub.ReadBook).filter(ub.ReadBook.user_id == int(current_user.id))\
+            .filter(ub.ReadBook.read_status == ub.ReadBook.STATUS_FINISHED).all()
+        return frozenset([x.book_id for x in readBooks])
+    else:
+        try:
+            readBooks = calibre_db.session.query(db.cc_classes[config.config_read_column])\
+                .filter(db.cc_classes[config.config_read_column].value == True).all()
+            return frozenset([x.book for x in readBooks])
+        except (KeyError, AttributeError):
+            log.error("Custom Column No.%d is not existing in calibre database", config.config_read_column)
+            return []
+
+
+# Returns the template for rendering and includes the instance name
+def render_title_template(*args, **kwargs):
+    sidebar = get_sidebar_config(kwargs)
+    return render_template(instance=config.config_calibre_web_title, sidebar=sidebar,
+                           accept=constants.EXTENSIONS_UPLOAD, read_book_ids=get_readbooks_ids(),
+                           *args, **kwargs)
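render_title_template above is a thin wrapper that injects shared context (instance name, sidebar, upload extensions, read-book ids) before delegating to Flask's render_template. The forwarding pattern in isolation, runnable without Calibre-Web (the template and context values here are made up):

from flask import Flask, render_template_string

app = Flask(__name__)
SIDEBAR = [{"id": "new", "text": "Books"}, {"id": "hot", "text": "Hot Books"}]

def render_title_template(template, *args, **kwargs):
    # Shared context first; the view's own kwargs ride along unchanged.
    return render_template_string(template, *args, instance="Calibre-Web",
                                  sidebar=SIDEBAR, **kwargs)

with app.app_context():
    print(render_title_template("{{ instance }} | {{ title }} | {{ sidebar|length }} sidebar entries",
                                title="Home"))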
@@ -22,6 +22,7 @@ import os
 import errno
 import signal
 import socket
+import subprocess  # nosec

 try:
     from gevent.pywsgi import WSGIServer
@@ -136,6 +137,64 @@ class WebServer(object):

         return sock, _readable_listen_address(*address)

+    @staticmethod
+    def _get_args_for_reloading():
+        """Determine how the script was executed, and return the args needed
+        to execute it again in a new process.
+        Code from https://github.com/pyload/pyload. Author GammaC0de, voulter
+        """
+        rv = [sys.executable]
+        py_script = sys.argv[0]
+        args = sys.argv[1:]
+        # Need to look at main module to determine how it was executed.
+        __main__ = sys.modules["__main__"]
+
+        # The value of __package__ indicates how Python was called. It may
+        # not exist if a setuptools script is installed as an egg. It may be
+        # set incorrectly for entry points created with pip on Windows.
+        if getattr(__main__, "__package__", None) is None or (
+            os.name == "nt"
+            and __main__.__package__ == ""
+            and not os.path.exists(py_script)
+            and os.path.exists("{}.exe".format(py_script))
+        ):
+            # Executed a file, like "python app.py".
+            py_script = os.path.abspath(py_script)
+
+            if os.name == "nt":
+                # Windows entry points have ".exe" extension and should be
+                # called directly.
+                if not os.path.exists(py_script) and os.path.exists("{}.exe".format(py_script)):
+                    py_script += ".exe"
+
+                if (
+                    os.path.splitext(sys.executable)[1] == ".exe"
+                    and os.path.splitext(py_script)[1] == ".exe"
+                ):
+                    rv.pop(0)
+
+            rv.append(py_script)
+        else:
+            # Executed a module, like "python -m module".
+            if sys.argv[0] == "-m":
+                args = sys.argv
+            else:
+                if os.path.isfile(py_script):
+                    # Rewritten by Python from "-m script" to "/path/to/script.py".
+                    py_module = __main__.__package__
+                    name = os.path.splitext(os.path.basename(py_script))[0]
+
+                    if name != "__main__":
+                        py_module += ".{}".format(name)
+                else:
+                    # Incorrectly rewritten by pydevd debugger from "-m script" to "script".
+                    py_module = py_script
+
+                rv.extend(("-m", py_module.lstrip(".")))
+
+        rv.extend(args)
+        return rv
+
     def _start_gevent(self):
         ssl_args = self.ssl_args or {}
@@ -192,18 +251,16 @@ class WebServer(object):
         finally:
             self.wsgiserver = None

+        # prevent irritating log of pending tasks message from asyncio
+        logger.get('asyncio').setLevel(logger.logging.CRITICAL)
+
         if not self.restart:
             log.info("Performing shutdown of Calibre-Web")
-            # prevent irritiating log of pending tasks message from asyncio
-            logger.get('asyncio').setLevel(logger.logging.CRITICAL)
             return True

         log.info("Performing restart of Calibre-Web")
-        arguments = list(sys.argv)
-        arguments.insert(0, sys.executable)
-        if os.name == 'nt':
-            arguments = ["\"%s\"" % a for a in arguments]
-        os.execv(sys.executable, arguments)
+        args = self._get_args_for_reloading()
+        subprocess.call(args, close_fds=True)  # nosec
         return True

     def _killServer(self, __, ___):
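The restart path above now re-launches the process with subprocess.call instead of os.execv, using argument reconstruction that also survives "python -m" and Windows entry-point starts. For the plain "python script.py" case the reconstruction reduces to roughly this sketch (simplified: the module and .exe branches are omitted):

import os
import subprocess
import sys

def restart_args():
    # [interpreter, absolute script path, original CLI arguments]
    rv = [sys.executable, os.path.abspath(sys.argv[0])]
    rv.extend(sys.argv[1:])
    return rv

print(restart_args())
# A restart would then be: subprocess.call(restart_args(), close_fds=True)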
@@ -22,6 +22,7 @@ from base64 import b64decode, b64encode
 from jsonschema import validate, exceptions, __version__
 from datetime import datetime
 try:
+    # pylint: disable=unused-import
     from urllib import unquote
 except ImportError:
     from urllib.parse import unquote
@@ -64,7 +65,7 @@ class SyncToken:
     books_last_modified: Datetime representing the last modified book that the device knows about.
     """

-    SYNC_TOKEN_HEADER = "x-kobo-synctoken"
+    SYNC_TOKEN_HEADER = "x-kobo-synctoken"  # nosec
     VERSION = "1-1-0"
     LAST_MODIFIED_ADDED_VERSION = "1-1-0"
     MIN_VERSION = "1-0-0"
@@ -85,6 +86,7 @@ class SyncToken:
             "archive_last_modified": {"type": "string"},
             "reading_state_last_modified": {"type": "string"},
             "tags_last_modified": {"type": "string"},
+            "books_last_id": {"type": "integer", "optional": True}
         },
     }

@@ -96,18 +98,20 @@ class SyncToken:
         archive_last_modified=datetime.min,
         reading_state_last_modified=datetime.min,
         tags_last_modified=datetime.min,
-    ):
+        books_last_id=-1
+    ):  # nosec
         self.raw_kobo_store_token = raw_kobo_store_token
         self.books_last_created = books_last_created
         self.books_last_modified = books_last_modified
         self.archive_last_modified = archive_last_modified
         self.reading_state_last_modified = reading_state_last_modified
         self.tags_last_modified = tags_last_modified
+        self.books_last_id = books_last_id

     @staticmethod
     def from_headers(headers):
         sync_token_header = headers.get(SyncToken.SYNC_TOKEN_HEADER, "")
-        if sync_token_header == "":
+        if sync_token_header == "":  # nosec
             return SyncToken()

         # On the first sync from a Kobo device, we may receive the SyncToken
@@ -137,9 +141,12 @@ class SyncToken:
             archive_last_modified = get_datetime_from_json(data_json, "archive_last_modified")
             reading_state_last_modified = get_datetime_from_json(data_json, "reading_state_last_modified")
             tags_last_modified = get_datetime_from_json(data_json, "tags_last_modified")
+            books_last_id = data_json["books_last_id"]
         except TypeError:
             log.error("SyncToken timestamps don't parse to a datetime.")
             return SyncToken(raw_kobo_store_token=raw_kobo_store_token)
+        except KeyError:
+            books_last_id = -1

         return SyncToken(
             raw_kobo_store_token=raw_kobo_store_token,
@@ -147,7 +154,8 @@ class SyncToken:
             books_last_modified=books_last_modified,
             archive_last_modified=archive_last_modified,
             reading_state_last_modified=reading_state_last_modified,
-            tags_last_modified=tags_last_modified
+            tags_last_modified=tags_last_modified,
+            books_last_id=books_last_id
         )

     def set_kobo_store_header(self, store_headers):
@@ -170,7 +178,8 @@ class SyncToken:
                 "books_last_created": to_epoch_timestamp(self.books_last_created),
                 "archive_last_modified": to_epoch_timestamp(self.archive_last_modified),
                 "reading_state_last_modified": to_epoch_timestamp(self.reading_state_last_modified),
-                "tags_last_modified": to_epoch_timestamp(self.tags_last_modified)
+                "tags_last_modified": to_epoch_timestamp(self.tags_last_modified),
+                "books_last_id": self.books_last_id
             },
         }
         return b64encode_json(token)
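The books_last_id field added above travels inside the base64-encoded JSON blob that b64encode_json produces, which is why older tokens can simply omit the key and fall into the new KeyError branch. A round-trip sketch (field values are illustrative):

import base64
import json

def b64encode_json(obj):
    return base64.b64encode(json.dumps(obj).encode()).decode()

token = b64encode_json({"version": "1-1-0", "data": {"books_last_id": 42}})
data = json.loads(base64.b64decode(token))["data"]
try:
    books_last_id = data["books_last_id"]
except KeyError:                 # pre-upgrade tokens lack the key
    books_last_id = -1
print(books_last_id)             # 42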
@@ -45,3 +45,9 @@ except ImportError as err:
     log.debug("Cannot import SyncToken, syncing books with Kobo Devices will not work: %s", err)
     kobo = None
     SyncToken = None
+
+try:
+    from . import gmail
+except ImportError as err:
+    log.debug("Cannot import gmail, sending books via Gmail Oauth2 Verification will not work: %s", err)
+    gmail = None
83
cps/services/gmail.py
Normal file
@@ -0,0 +1,83 @@
+from __future__ import print_function
+import os.path
+from google_auth_oauthlib.flow import InstalledAppFlow
+from google.auth.transport.requests import Request
+from googleapiclient.discovery import build
+from google.oauth2.credentials import Credentials
+
+from datetime import datetime
+import base64
+from flask_babel import gettext as _
+from ..constants import BASE_DIR
+from .. import logger
+
+
+log = logger.create()
+
+SCOPES = ['openid', 'https://www.googleapis.com/auth/gmail.send', 'https://www.googleapis.com/auth/userinfo.email']
+
+
+def setup_gmail(token):
+    # If there are no (valid) credentials available, let the user log in.
+    creds = None
+    if "token" in token:
+        creds = Credentials(
+            token=token['token'],
+            refresh_token=token['refresh_token'],
+            token_uri=token['token_uri'],
+            client_id=token['client_id'],
+            client_secret=token['client_secret'],
+            scopes=token['scopes'],
+        )
+        creds.expiry = datetime.fromisoformat(token['expiry'])
+
+    if not creds or not creds.valid:
+        # don't forget to dump one more time after the refresh
+        # also, some file-locking routines wouldn't be needless
+        if creds and creds.expired and creds.refresh_token:
+            creds.refresh(Request())
+        else:
+            cred_file = os.path.join(BASE_DIR, 'gmail.json')
+            if not os.path.exists(cred_file):
+                raise Exception(_("Found no valid gmail.json file with OAuth information"))
+            flow = InstalledAppFlow.from_client_secrets_file(
+                os.path.join(BASE_DIR, 'gmail.json'), SCOPES)
+            creds = flow.run_local_server(port=0)
+        user_info = get_user_info(creds)
+        return {
+            'token': creds.token,
+            'refresh_token': creds.refresh_token,
+            'token_uri': creds.token_uri,
+            'client_id': creds.client_id,
+            'client_secret': creds.client_secret,
+            'scopes': creds.scopes,
+            'expiry': creds.expiry.isoformat(),
+            'email': user_info
+        }
+    return {}
+
+
+def get_user_info(credentials):
+    user_info_service = build(serviceName='oauth2', version='v2', credentials=credentials)
+    user_info = user_info_service.userinfo().get().execute()
+    return user_info.get('email', "")
+
+
+def send_messsage(token, msg):
+    log.debug("Start sending email via Gmail")
+    creds = Credentials(
+        token=token['token'],
+        refresh_token=token['refresh_token'],
+        token_uri=token['token_uri'],
+        client_id=token['client_id'],
+        client_secret=token['client_secret'],
+        scopes=token['scopes'],
+    )
+    creds.expiry = datetime.fromisoformat(token['expiry'])
+    if creds and creds.expired and creds.refresh_token:
+        creds.refresh(Request())
+    service = build('gmail', 'v1', credentials=creds)
+    message_as_bytes = msg.as_bytes()  # the message should be converted from string to bytes
+    message_as_base64 = base64.urlsafe_b64encode(message_as_bytes)  # encode in base64 (printable letters coding)
+    raw = message_as_base64.decode()  # convert to something JSON serializable
+    body = {'raw': raw}
+
+    (service.users().messages().send(userId='me', body=body).execute())
+    log.debug("Email sent successfully via Gmail")
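send_messsage above expects an already-built email.message object; the Gmail API wants it urlsafe-base64 encoded inside {'raw': ...}. How such a payload is prepared, without any network call (addresses are placeholders):

import base64
from email.mime.multipart import MIMEMultipart
from email.mime.text import MIMEText

msg = MIMEMultipart()
msg["Subject"] = "Send to Kindle"
msg["From"] = "me@example.com"
msg["To"] = "kindle@example.com"
msg.attach(MIMEText("This e-mail has been sent via Calibre-Web.", "plain"))

# as_bytes() -> urlsafe base64 -> str: the body handed to users().messages().send()
body = {"raw": base64.urlsafe_b64encode(msg.as_bytes()).decode()}
print(sorted(body), len(body["raw"]))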
|
@ -45,7 +45,7 @@ class ImprovedQueue(queue.Queue):
|
||||||
with self.mutex:
|
with self.mutex:
|
||||||
return list(self.queue)
|
return list(self.queue)
|
||||||
|
|
||||||
#Class for all worker tasks in the background
|
# Class for all worker tasks in the background
|
||||||
class WorkerThread(threading.Thread):
|
class WorkerThread(threading.Thread):
|
||||||
_instance = None
|
_instance = None
|
||||||
|
|
||||||
|
@ -69,6 +69,7 @@ class WorkerThread(threading.Thread):
|
||||||
def add(cls, user, task):
|
def add(cls, user, task):
|
||||||
ins = cls.getInstance()
|
ins = cls.getInstance()
|
||||||
ins.num += 1
|
ins.num += 1
|
||||||
|
log.debug("Add Task for user: {}: {}".format(user, task))
|
||||||
ins.queue.put(QueuedTask(
|
ins.queue.put(QueuedTask(
|
||||||
num=ins.num,
|
num=ins.num,
|
||||||
user=user,
|
user=user,
|
||||||
|
@ -110,7 +111,7 @@ class WorkerThread(threading.Thread):
|
||||||
# We don't use a daemon here because we don't want the tasks to just be abruptly halted, leading to
|
# We don't use a daemon here because we don't want the tasks to just be abruptly halted, leading to
|
||||||
# possible file / database corruption
|
# possible file / database corruption
|
||||||
item = self.queue.get(timeout=1)
|
item = self.queue.get(timeout=1)
|
||||||
except queue.Empty as ex:
|
except queue.Empty:
|
||||||
time.sleep(1)
|
time.sleep(1)
|
||||||
continue
|
continue
|
||||||
|
|
||||||
|
@ -159,9 +160,9 @@ class CalibreTask:
|
||||||
# catch any unhandled exceptions in a task and automatically fail it
|
# catch any unhandled exceptions in a task and automatically fail it
|
||||||
try:
|
try:
|
||||||
self.run(*args)
|
self.run(*args)
|
||||||
except Exception as e:
|
except Exception as ex:
|
||||||
self._handleError(str(e))
|
self._handleError(str(ex))
|
||||||
log.exception(e)
|
log.debug_or_exception(ex)
|
||||||
|
|
||||||
self.end_time = datetime.now()
|
self.end_time = datetime.now()
|
||||||
|
|
||||||
|
@ -210,7 +211,6 @@ class CalibreTask:
|
||||||
self._progress = x
|
self._progress = x
|
||||||
|
|
||||||
def _handleError(self, error_message):
|
def _handleError(self, error_message):
|
||||||
log.exception(error_message)
|
|
||||||
self.stat = STAT_FAIL
|
self.stat = STAT_FAIL
|
||||||
self.progress = 1
|
self.progress = 1
|
||||||
self.error = error_message
|
self.error = error_message
|
||||||
|
|
300
cps/shelf.py
|
@ -22,15 +22,17 @@
|
||||||
|
|
||||||
from __future__ import division, print_function, unicode_literals
|
from __future__ import division, print_function, unicode_literals
|
||||||
from datetime import datetime
|
from datetime import datetime
|
||||||
|
import sys
|
||||||
|
|
||||||
from flask import Blueprint, request, flash, redirect, url_for
|
from flask import Blueprint, request, flash, redirect, url_for
|
||||||
from flask_babel import gettext as _
|
from flask_babel import gettext as _
|
||||||
from flask_login import login_required, current_user
|
from flask_login import login_required, current_user
|
||||||
from sqlalchemy.sql.expression import func
|
from sqlalchemy.sql.expression import func, true
|
||||||
from sqlalchemy.exc import OperationalError, InvalidRequestError
|
from sqlalchemy.exc import OperationalError, InvalidRequestError
|
||||||
|
|
||||||
from . import logger, ub, calibre_db
|
from . import logger, ub, calibre_db, db
|
||||||
from .web import login_required_if_no_ano, render_title_template
|
from .render_template import render_title_template
|
||||||
|
from .usermanagement import login_required_if_no_ano
|
||||||
|
|
||||||
|
|
||||||
shelf = Blueprint('shelf', __name__)
|
shelf = Blueprint('shelf', __name__)
|
||||||
|
@ -97,12 +99,14 @@ def add_to_shelf(shelf_id, book_id):
|
||||||
ub.session.commit()
|
ub.session.commit()
|
||||||
except (OperationalError, InvalidRequestError):
|
except (OperationalError, InvalidRequestError):
|
||||||
ub.session.rollback()
|
ub.session.rollback()
|
||||||
|
log.error("Settings DB is not Writeable")
|
||||||
flash(_(u"Settings DB is not Writeable"), category="error")
|
flash(_(u"Settings DB is not Writeable"), category="error")
|
||||||
if "HTTP_REFERER" in request.environ:
|
if "HTTP_REFERER" in request.environ:
|
||||||
return redirect(request.environ["HTTP_REFERER"])
|
return redirect(request.environ["HTTP_REFERER"])
|
||||||
else:
|
else:
|
||||||
return redirect(url_for('web.index'))
|
return redirect(url_for('web.index'))
|
||||||
if not xhr:
|
if not xhr:
|
||||||
|
log.debug("Book has been added to shelf: {}".format(shelf.name))
|
||||||
flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
|
flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
|
||||||
if "HTTP_REFERER" in request.environ:
|
if "HTTP_REFERER" in request.environ:
|
||||||
return redirect(request.environ["HTTP_REFERER"])
|
return redirect(request.environ["HTTP_REFERER"])
|
||||||
|
@ -121,6 +125,7 @@ def search_to_shelf(shelf_id):
|
||||||
return redirect(url_for('web.index'))
|
return redirect(url_for('web.index'))
|
||||||
|
|
||||||
if not check_shelf_edit_permissions(shelf):
|
if not check_shelf_edit_permissions(shelf):
|
||||||
|
log.warning("You are not allowed to add a book to the the shelf: {}".format(shelf.name))
|
||||||
flash(_(u"You are not allowed to add a book to the the shelf: %(name)s", name=shelf.name), category="error")
|
flash(_(u"You are not allowed to add a book to the the shelf: %(name)s", name=shelf.name), category="error")
|
||||||
return redirect(url_for('web.index'))
|
return redirect(url_for('web.index'))
|
||||||
|
|
||||||
|
@@ -138,18 +143,14 @@ def search_to_shelf(shelf_id):
             books_for_shelf = ub.searched_ids[current_user.id]
 
         if not books_for_shelf:
-            log.error("Books are already part of %s", shelf)
+            log.error("Books are already part of {}".format(shelf.name))
             flash(_(u"Books are already part of the shelf: %(name)s", name=shelf.name), category="error")
             return redirect(url_for('web.index'))
 
-        maxOrder = ub.session.query(func.max(ub.BookShelf.order)).filter(ub.BookShelf.shelf == shelf_id).first()
-        if maxOrder[0] is None:
-            maxOrder = 0
-        else:
-            maxOrder = maxOrder[0]
+        maxOrder = ub.session.query(func.max(ub.BookShelf.order)).filter(ub.BookShelf.shelf == shelf_id).first()[0] or 0
 
         for book in books_for_shelf:
-            maxOrder = maxOrder + 1
+            maxOrder += 1
             shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book, order=maxOrder))
         shelf.last_modified = datetime.utcnow()
         try:
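Note: the new one-liner relies on an aggregate query always returning a single row, whose value is None when the shelf is empty; "or 0" then covers the case the removed if/else handled. A self-contained sketch of the idiom (model and table names are illustrative):

    from sqlalchemy import Column, Integer, create_engine, func
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class BookShelf(Base):
        __tablename__ = "book_shelf_link"
        id = Column(Integer, primary_key=True)
        shelf = Column(Integer)
        order = Column(Integer)

    engine = create_engine("sqlite://")
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        # .first() returns a one-element row even for an empty table; its
        # value is then None, so "or 0" yields a starting order of 0.
        max_order = session.query(func.max(BookShelf.order)) \
            .filter(BookShelf.shelf == 1).first()[0] or 0
        print(max_order)  # 0 for an empty shelf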
@@ -158,8 +159,10 @@ def search_to_shelf(shelf_id):
             flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
         except (OperationalError, InvalidRequestError):
             ub.session.rollback()
-            flash(_(u"Settings DB is not Writeable"), category="error")
+            log.error("Settings DB is not Writeable")
+            flash(_("Settings DB is not Writeable"), category="error")
     else:
+        log.error("Could not add books to shelf: {}".format(shelf.name))
         flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
     return redirect(url_for('web.index'))
 
@@ -170,7 +173,7 @@ def remove_from_shelf(shelf_id, book_id):
     xhr = request.headers.get('X-Requested-With') == 'XMLHttpRequest'
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     if shelf is None:
-        log.error("Invalid shelf specified: %s", shelf_id)
+        log.error("Invalid shelf specified: {}".format(shelf_id))
         if not xhr:
             return redirect(url_for('web.index'))
         return "Invalid shelf specified", 400
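Note: the xhr flag computed above follows the common convention that AJAX libraries (jQuery among them) send X-Requested-With: XMLHttpRequest, letting one view serve both AJAX callers and normal navigation. A sketch under that assumption:

    from flask import Flask, request

    app = Flask(__name__)

    @app.route("/xhr-demo")
    def xhr_demo():
        xhr = request.headers.get('X-Requested-With') == 'XMLHttpRequest'
        if xhr:
            return "", 204      # AJAX caller: a bare status code suffices
        return "full page"      # regular request: render something visible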
@@ -199,7 +202,8 @@ def remove_from_shelf(shelf_id, book_id):
             ub.session.commit()
         except (OperationalError, InvalidRequestError):
             ub.session.rollback()
-            flash(_(u"Settings DB is not Writeable"), category="error")
+            log.error("Settings DB is not Writeable")
+            flash(_("Settings DB is not Writeable"), category="error")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
@@ -213,6 +217,7 @@ def remove_from_shelf(shelf_id, book_id):
         return "", 204
     else:
         if not xhr:
+            log.warning("You are not allowed to remove a book from shelf: {}".format(shelf.name))
             flash(_(u"Sorry you are not allowed to remove a book from this shelf: %(sname)s", sname=shelf.name),
                   category="error")
             return redirect(url_for('web.index'))
@@ -223,96 +228,79 @@ def remove_from_shelf(shelf_id, book_id):
 @login_required
 def create_shelf():
     shelf = ub.Shelf()
-    if request.method == "POST":
-        to_save = request.form.to_dict()
-        if "is_public" in to_save:
-            shelf.is_public = 1
-        shelf.name = to_save["title"]
-        shelf.user_id = int(current_user.id)
-
-        is_shelf_name_unique = False
-        if shelf.is_public == 1:
-            is_shelf_name_unique = ub.session.query(ub.Shelf) \
-                .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1)) \
-                .first() is None
-
-            if not is_shelf_name_unique:
-                flash(_(u"A public shelf with the name '%(title)s' already exists.", title=to_save["title"]),
-                      category="error")
-        else:
-            is_shelf_name_unique = ub.session.query(ub.Shelf) \
-                .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 0) &
-                        (ub.Shelf.user_id == int(current_user.id)))\
-                .first() is None
-
-            if not is_shelf_name_unique:
-                flash(_(u"A private shelf with the name '%(title)s' already exists.", title=to_save["title"]),
-                      category="error")
-
-        if is_shelf_name_unique:
-            try:
-                ub.session.add(shelf)
-                ub.session.commit()
-                flash(_(u"Shelf %(title)s created", title=to_save["title"]), category="success")
-                return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
-            except (OperationalError, InvalidRequestError):
-                ub.session.rollback()
-                flash(_(u"Settings DB is not Writeable"), category="error")
-            except Exception:
-                ub.session.rollback()
-                flash(_(u"There was an error"), category="error")
-        return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
-    else:
-        return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
+    return create_edit_shelf(shelf, title=_(u"Create a Shelf"), page="shelfcreate")
 
 
 @shelf.route("/shelf/edit/<int:shelf_id>", methods=["GET", "POST"])
 @login_required
 def edit_shelf(shelf_id):
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
-    if request.method == "POST":
-        to_save = request.form.to_dict()
-
-        is_shelf_name_unique = False
-        if shelf.is_public == 1:
-            is_shelf_name_unique = ub.session.query(ub.Shelf) \
-                .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1)) \
-                .filter(ub.Shelf.id != shelf_id) \
-                .first() is None
-
-            if not is_shelf_name_unique:
-                flash(_(u"A public shelf with the name '%(title)s' already exists.", title=to_save["title"]),
-                      category="error")
-        else:
-            is_shelf_name_unique = ub.session.query(ub.Shelf) \
-                .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 0) &
-                        (ub.Shelf.user_id == int(current_user.id)))\
-                .filter(ub.Shelf.id != shelf_id)\
-                .first() is None
-
-            if not is_shelf_name_unique:
-                flash(_(u"A private shelf with the name '%(title)s' already exists.", title=to_save["title"]),
-                      category="error")
-
-        if is_shelf_name_unique:
-            shelf.name = to_save["title"]
-            shelf.last_modified = datetime.utcnow()
-            if "is_public" in to_save:
-                shelf.is_public = 1
-            else:
-                shelf.is_public = 0
-            try:
-                ub.session.commit()
-                flash(_(u"Shelf %(title)s changed", title=to_save["title"]), category="success")
-            except (OperationalError, InvalidRequestError):
-                ub.session.rollback()
-                flash(_(u"Settings DB is not Writeable"), category="error")
-            except Exception:
-                ub.session.rollback()
-                flash(_(u"There was an error"), category="error")
-        return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
-    else:
-        return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
+    return create_edit_shelf(shelf, title=_(u"Edit a shelf"), page="shelfedit", shelf_id=shelf_id)
+
+
+# if shelf ID is set, we are editing a shelf
+def create_edit_shelf(shelf, title, page, shelf_id=False):
+    if request.method == "POST":
+        to_save = request.form.to_dict()
+        if "is_public" in to_save:
+            shelf.is_public = 1
+        else:
+            shelf.is_public = 0
+        if check_shelf_is_unique(shelf, to_save, shelf_id):
+            shelf.name = to_save["title"]
+            # shelf.last_modified = datetime.utcnow()
+            if not shelf_id:
+                shelf.user_id = int(current_user.id)
+                ub.session.add(shelf)
+                shelf_action = "created"
+                flash_text = _(u"Shelf %(title)s created", title=to_save["title"])
+            else:
+                shelf_action = "changed"
+                flash_text = _(u"Shelf %(title)s changed", title=to_save["title"])
+            try:
+                ub.session.commit()
+                log.info(u"Shelf {} {}".format(to_save["title"], shelf_action))
+                flash(flash_text, category="success")
+                return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
+            except (OperationalError, InvalidRequestError) as ex:
+                ub.session.rollback()
+                log.debug_or_exception(ex)
+                log.error("Settings DB is not Writeable")
+                flash(_("Settings DB is not Writeable"), category="error")
+            except Exception as ex:
+                ub.session.rollback()
+                log.debug_or_exception(ex)
+                flash(_(u"There was an error"), category="error")
+    return render_title_template('shelf_edit.html', shelf=shelf, title=title, page=page)
+
+
+def check_shelf_is_unique(shelf, to_save, shelf_id=False):
+    if shelf_id:
+        ident = ub.Shelf.id != shelf_id
+    else:
+        ident = true()
+    if shelf.is_public == 1:
+        is_shelf_name_unique = ub.session.query(ub.Shelf) \
+            .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1)) \
+            .filter(ident) \
+            .first() is None
+
+        if not is_shelf_name_unique:
+            log.error("A public shelf with the name '{}' already exists.".format(to_save["title"]))
+            flash(_(u"A public shelf with the name '%(title)s' already exists.", title=to_save["title"]),
+                  category="error")
+    else:
+        is_shelf_name_unique = ub.session.query(ub.Shelf) \
+            .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 0) &
+                    (ub.Shelf.user_id == int(current_user.id))) \
+            .filter(ident) \
+            .first() is None
+
+        if not is_shelf_name_unique:
+            log.error("A private shelf with the name '{}' already exists.".format(to_save["title"]))
+            flash(_(u"A private shelf with the name '%(title)s' already exists.", title=to_save["title"]),
+                  category="error")
+    return is_shelf_name_unique
 
 
 def delete_shelf_helper(cur_shelf):
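Note: in check_shelf_is_unique(), sqlalchemy's true() renders as a constant-true SQL expression, so .filter(ident) is a no-op when creating and an id-exclusion when editing. A runnable sketch of that trick (the Shelf model here is a stand-in, not the real ub.Shelf):

    from sqlalchemy import Column, Integer, String, create_engine, true
    from sqlalchemy.orm import Session, declarative_base

    Base = declarative_base()

    class Shelf(Base):
        __tablename__ = "shelf"
        id = Column(Integer, primary_key=True)
        name = Column(String)

    engine = create_engine("sqlite://")
    Base.metadata.create_all(engine)

    with Session(engine) as session:
        session.add(Shelf(id=1, name="SciFi"))
        session.commit()
        for shelf_id in (None, 1):
            # Editing shelf 1 excludes it from the duplicate check;
            # creating (no id) keeps every row via the constant filter.
            ident = Shelf.id != shelf_id if shelf_id else true()
            taken = session.query(Shelf).filter(Shelf.name == "SciFi") \
                .filter(ident).first() is not None
            print(shelf_id, taken)  # None -> True, 1 -> False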
@@ -322,9 +310,7 @@ def delete_shelf_helper(cur_shelf):
     ub.session.delete(cur_shelf)
     ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).delete()
     ub.session.add(ub.ShelfArchive(uuid=cur_shelf.uuid, user_id=cur_shelf.user_id))
-    ub.session.commit()
-    log.info("successfully deleted %s", cur_shelf)
+    ub.session_commit("successfully deleted Shelf {}".format(cur_shelf.name))
 
 
 @shelf.route("/shelf/delete/<int:shelf_id>")
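Note: ub.session_commit() is not defined in this hunk; judging only from its call sites here, it plausibly bundles commit, success logging and rollback-on-error into one helper, roughly like the following sketch (an assumption, not the actual implementation):

    from sqlalchemy.exc import InvalidRequestError, OperationalError

    def session_commit(session, log, success_message=""):
        try:
            session.commit()
            if success_message:
                log.info(success_message)  # e.g. "successfully deleted Shelf ..."
        except (OperationalError, InvalidRequestError):
            session.rollback()
            log.error("Settings DB is not Writeable")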
@@ -333,44 +319,25 @@ def delete_shelf(shelf_id):
     cur_shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     try:
         delete_shelf_helper(cur_shelf)
-    except (OperationalError, InvalidRequestError):
+    except InvalidRequestError:
         ub.session.rollback()
-        flash(_(u"Settings DB is not Writeable"), category="error")
+        log.error("Settings DB is not Writeable")
+        flash(_("Settings DB is not Writeable"), category="error")
     return redirect(url_for('web.index'))
 
 
-@shelf.route("/shelf/<int:shelf_id>", defaults={'shelf_type': 1})
-@shelf.route("/shelf/<int:shelf_id>/<int:shelf_type>")
+@shelf.route("/simpleshelf/<int:shelf_id>")
 @login_required_if_no_ano
-def show_shelf(shelf_type, shelf_id):
-    shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
-
-    result = list()
-    # user is allowed to access shelf
-    if shelf and check_shelf_view_permissions(shelf):
-        page = "shelf.html" if shelf_type == 1 else 'shelfdown.html'
-
-        books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id)\
-            .order_by(ub.BookShelf.order.asc()).all()
-        for book in books_in_shelf:
-            cur_book = calibre_db.get_filtered_book(book.book_id)
-            if cur_book:
-                result.append(cur_book)
-            else:
-                cur_book = calibre_db.get_book(book.book_id)
-                if not cur_book:
-                    log.info('Not existing book %s in %s deleted', book.book_id, shelf)
-                    try:
-                        ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
-                        ub.session.commit()
-                    except (OperationalError, InvalidRequestError):
-                        ub.session.rollback()
-                        flash(_(u"Settings DB is not Writeable"), category="error")
-        return render_title_template(page, entries=result, title=_(u"Shelf: '%(name)s'", name=shelf.name),
-                                     shelf=shelf, page="shelf")
-    else:
-        flash(_(u"Error opening shelf. Shelf does not exist or is not accessible"), category="error")
-        return redirect(url_for("web.index"))
+def show_simpleshelf(shelf_id):
+    return render_show_shelf(2, shelf_id, 1, None)
+
+
+@shelf.route("/shelf/<int:shelf_id>", defaults={"sort_param": "order", 'page': 1})
+@shelf.route("/shelf/<int:shelf_id>/<sort_param>", defaults={'page': 1})
+@shelf.route("/shelf/<int:shelf_id>/<sort_param>/<int:page>")
+@login_required_if_no_ano
+def show_shelf(shelf_id, sort_param, page):
+    return render_show_shelf(1, shelf_id, page, sort_param)
 
 
 @shelf.route("/shelf/order/<int:shelf_id>", methods=["GET", "POST"])
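Note: the three stacked routes work through Flask's per-rule defaults: each shorter URL fills in the missing view arguments, so /shelf/1, /shelf/1/abc and /shelf/1/abc/2 all reach the same view function. Minimal self-contained sketch of the mechanism:

    from flask import Flask

    app = Flask(__name__)

    @app.route("/shelf/<int:shelf_id>", defaults={'sort_param': "order", 'page': 1})
    @app.route("/shelf/<int:shelf_id>/<sort_param>", defaults={'page': 1})
    @app.route("/shelf/<int:shelf_id>/<sort_param>/<int:page>")
    def show_shelf(shelf_id, sort_param, page):
        return "shelf={} sort={} page={}".format(shelf_id, sort_param, page)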
@@ -389,27 +356,86 @@ def order_shelf(shelf_id):
             ub.session.commit()
         except (OperationalError, InvalidRequestError):
             ub.session.rollback()
-            flash(_(u"Settings DB is not Writeable"), category="error")
+            log.error("Settings DB is not Writeable")
+            flash(_("Settings DB is not Writeable"), category="error")
 
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     result = list()
     if shelf and check_shelf_view_permissions(shelf):
-        books_in_shelf2 = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id) \
-            .order_by(ub.BookShelf.order.asc()).all()
-        for book in books_in_shelf2:
-            cur_book = calibre_db.get_filtered_book(book.book_id)
-            if cur_book:
-                result.append({'title': cur_book.title,
-                               'id': cur_book.id,
-                               'author': cur_book.authors,
-                               'series': cur_book.series,
-                               'series_index': cur_book.series_index})
-            else:
-                cur_book = calibre_db.get_book(book.book_id)
-                result.append({'title': _('Hidden Book'),
-                               'id': cur_book.id,
-                               'author': [],
-                               'series': []})
+        result = calibre_db.session.query(db.Books)\
+            .join(ub.BookShelf, ub.BookShelf.book_id == db.Books.id, isouter=True) \
+            .add_columns(calibre_db.common_filters().label("visible")) \
+            .filter(ub.BookShelf.shelf == shelf_id).order_by(ub.BookShelf.order.asc()).all()
     return render_title_template('shelf_order.html', entries=result,
                                  title=_(u"Change order of Shelf: '%(name)s'", name=shelf.name),
                                  shelf=shelf, page="shelforder")
+
+
+def change_shelf_order(shelf_id, order):
+    result = calibre_db.session.query(db.Books).join(ub.BookShelf, ub.BookShelf.book_id == db.Books.id)\
+        .filter(ub.BookShelf.shelf == shelf_id).order_by(*order).all()
+    for index, entry in enumerate(result):
+        book = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id) \
+            .filter(ub.BookShelf.book_id == entry.id).first()
+        book.order = index
+    ub.session_commit("Shelf-id:{} - Order changed".format(shelf_id))
+
+
+def render_show_shelf(shelf_type, shelf_id, page_no, sort_param):
+    shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
+
+    # check user is allowed to access shelf
+    if shelf and check_shelf_view_permissions(shelf):
+
+        if shelf_type == 1:
+            # order = [ub.BookShelf.order.asc()]
+            if sort_param == 'pubnew':
+                change_shelf_order(shelf_id, [db.Books.pubdate.desc()])
+            if sort_param == 'pubold':
+                change_shelf_order(shelf_id, [db.Books.pubdate])
+            if sort_param == 'abc':
+                change_shelf_order(shelf_id, [db.Books.sort])
+            if sort_param == 'zyx':
+                change_shelf_order(shelf_id, [db.Books.sort.desc()])
+            if sort_param == 'new':
+                change_shelf_order(shelf_id, [db.Books.timestamp.desc()])
+            if sort_param == 'old':
+                change_shelf_order(shelf_id, [db.Books.timestamp])
+            if sort_param == 'authaz':
+                change_shelf_order(shelf_id, [db.Books.author_sort.asc()])
+            if sort_param == 'authza':
+                change_shelf_order(shelf_id, [db.Books.author_sort.desc()])
+            page = "shelf.html"
+            pagesize = 0
+        else:
+            pagesize = sys.maxsize
+            page = 'shelfdown.html'
+
+        result, __, pagination = calibre_db.fill_indexpage(page_no, pagesize,
+                                                           db.Books,
+                                                           ub.BookShelf.shelf == shelf_id,
+                                                           [ub.BookShelf.order.asc()],
+                                                           ub.BookShelf, ub.BookShelf.book_id == db.Books.id)
+        # delete shelf entries where book is not existent anymore, can happen if book is deleted outside calibre-web
+        wrong_entries = calibre_db.session.query(ub.BookShelf)\
+            .join(db.Books, ub.BookShelf.book_id == db.Books.id, isouter=True)\
+            .filter(db.Books.id == None).all()
+        for entry in wrong_entries:
+            log.info('Not existing book {} in {} deleted'.format(entry.book_id, shelf))
+            try:
+                ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == entry.book_id).delete()
+                ub.session.commit()
+            except (OperationalError, InvalidRequestError):
+                ub.session.rollback()
+                log.error("Settings DB is not Writeable")
+                flash(_("Settings DB is not Writeable"), category="error")
+
+        return render_title_template(page,
+                                     entries=result,
+                                     pagination=pagination,
+                                     title=_(u"Shelf: '%(name)s'", name=shelf.name),
+                                     shelf=shelf,
+                                     page="shelf")
+    else:
+        flash(_(u"Error opening shelf. Shelf does not exist or is not accessible"), category="error")
+        return redirect(url_for("web.index"))
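Note: the chain of sort_param comparisons in render_show_shelf() is a dispatch table written out longhand; an equivalent formulation (illustrative only, reusing the db.Books and change_shelf_order names from this patch) would be:

    SORT_ORDERS = {
        'pubnew': lambda: [db.Books.pubdate.desc()],
        'pubold': lambda: [db.Books.pubdate],
        'abc': lambda: [db.Books.sort],
        'zyx': lambda: [db.Books.sort.desc()],
        'new': lambda: [db.Books.timestamp.desc()],
        'old': lambda: [db.Books.timestamp],
        'authaz': lambda: [db.Books.author_sort.asc()],
        'authza': lambda: [db.Books.author_sort.desc()],
    }

    order = SORT_ORDERS.get(sort_param)
    if order:
        change_shelf_order(shelf_id, order())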
@@ -1,17 +1,24 @@
-body.serieslist.grid-view div.container-fluid>div>div.col-sm-10:before{
+body.serieslist.grid-view div.container-fluid > div > div.col-sm-10::before {
     display: none;
 }
 
-.cover .badge{
+.cover .badge {
     position: absolute;
     top: 0;
     left: 0;
+    color: #fff;
     background-color: #cc7b19;
     border-radius: 0;
     padding: 0 8px;
-    box-shadow: 0 0 4px rgba(0,0,0,.6);
+    box-shadow: 0 0 4px rgba(0, 0, 0, 0.6);
     line-height: 24px;
 }
 
-.cover{
-    box-shadow: 0 0 4px rgba(0,0,0,.6);
+.cover {
+    box-shadow: 0 0 4px rgba(0, 0, 0, 0.6);
+}
+
+.cover .read {
+    padding: 0 0;
+    line-height: 15px;
 }
BIN	cps/static/css/img/clear.png (new file, 509 B)
BIN	cps/static/css/img/loading.gif (new file, 1.8 KiB)
@@ -33,7 +33,6 @@ body {
     position: relative;
     cursor: pointer;
     padding: 4px;
-
     transition: all 0.2s ease;
 }
 
@@ -45,7 +44,7 @@ body {
 
 #sidebar a.active,
 #sidebar a.active img + span {
-    background-color: #45B29D;
+    background-color: #45b29d;
 }
 
 #sidebar li img {
@@ -85,21 +84,30 @@ body {
 #progress .bar-load,
 #progress .bar-read {
     display: flex;
-    align-items: flex-end;
-    justify-content: flex-end;
     position: absolute;
     top: 0;
-    left: 0;
     bottom: 0;
     transition: width 150ms ease-in-out;
 }
 
+#progress .from-left {
+    left: 0;
+    align-items: flex-end;
+    justify-content: flex-end;
+}
+
+#progress .from-right {
+    right: 0;
+    align-items: flex-start;
+    justify-content: flex-start;
+}
+
 #progress .bar-load {
     color: #000;
     background-color: #ccc;
 }
 
 #progress .bar-read {
     color: #fff;
     background-color: #45b29d;
 }
6	cps/static/css/libs/bootstrap-select.min.css (vendored, new file)
4	cps/static/css/libs/bootstrap-table.min.css (vendored)
6	cps/static/css/libs/images/findbarButton-next-dark.svg (new file)
@@ -0,0 +1,6 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
+fill="rgba(255,255,255,1)"><path d="M8 12a1 1 0 0 1-.707-.293l-5-5a1 1 0 0 1 1.414-1.414L8
+9.586l4.293-4.293a1 1 0 0 1 1.414 1.414l-5 5A1 1 0 0 1 8 12z"></path></svg>
4	cps/static/css/libs/images/findbarButton-next.svg (new file)

@@ -0,0 +1,4 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"><path d="M8 12a1 1 0 0 1-.707-.293l-5-5a1 1 0 0 1 1.414-1.414L8 9.586l4.293-4.293a1 1 0 0 1 1.414 1.414l-5 5A1 1 0 0 1 8 12z"></path></svg>
@@ -0,0 +1,5 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
+fill="rgba(255,255,255,1)"><path d="M13 11a1 1 0 0 1-.707-.293L8 6.414l-4.293 4.293a1 1 0 0 1-1.414-1.414l5-5a1 1 0 0 1 1.414 0l5 5A1 1 0 0 1 13 11z"></path></svg>

4	cps/static/css/libs/images/findbarButton-previous.svg (new file)

@@ -0,0 +1,4 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"><path d="M13 11a1 1 0 0 1-.707-.293L8 6.414l-4.293 4.293a1 1 0 0 1-1.414-1.414l5-5a1 1 0 0 1 1.414 0l5 5A1 1 0 0 1 13 11z"></path></svg>

24	cps/static/css/libs/images/loading-dark.svg (new file)
@@ -0,0 +1,24 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
+fill="rgba(255,255,255,1)" style="animation:spinLoadingIcon 1s steps(12,end)
+infinite"><style>@keyframes
+spinLoadingIcon{to{transform:rotate(360deg)}}</style><path
+d="M7 3V1s0-1 1-1 1 1 1 1v2s0 1-1 1-1-1-1-1z"/><path d="M4.63
+4.1l-1-1.73S3.13 1.5 4 1c.87-.5 1.37.37 1.37.37l1 1.73s.5.87-.37
+1.37c-.87.57-1.37-.37-1.37-.37z" fill-opacity=".93"/><path
+d="M3.1 6.37l-1.73-1S.5 4.87 1 4c.5-.87 1.37-.37 1.37-.37l1.73 1s.87.5.37
+1.37c-.5.87-1.37.37-1.37.37z" fill-opacity=".86"/><path d="M3
+9H1S0 9 0 8s1-1 1-1h2s1 0 1 1-1 1-1 1z" fill-opacity=".79"/><path d="M4.1 11.37l-1.73 1S1.5 12.87 1
+12c-.5-.87.37-1.37.37-1.37l1.73-1s.87-.5 1.37.37c.5.87-.37 1.37-.37 1.37z"
+fill-opacity=".72"/><path d="M3.63 13.56l1-1.73s.5-.87
+1.37-.37c.87.5.37 1.37.37 1.37l-1 1.73s-.5.87-1.37.37c-.87-.5-.37-1.37-.37-1.37z"
+fill-opacity=".65"/><path d="M7 15v-2s0-1 1-1 1 1 1 1v2s0 1-1
+1-1-1-1-1z" fill-opacity=".58"/><path d="M10.63
+14.56l-1-1.73s-.5-.87.37-1.37c.87-.5 1.37.37 1.37.37l1 1.73s.5.87-.37
+1.37c-.87.5-1.37-.37-1.37-.37z" fill-opacity=".51"/><path
+d="M13.56 12.37l-1.73-1s-.87-.5-.37-1.37c.5-.87 1.37-.37 1.37-.37l1.73 1s.87.5.37
+1.37c-.5.87-1.37.37-1.37.37z" fill-opacity=".44"/><path d="M15
+9h-2s-1 0-1-1 1-1 1-1h2s1 0 1 1-1 1-1 1z" fill-opacity=".37"/><path d="M14.56 5.37l-1.73
+1s-.87.5-1.37-.37c-.5-.87.37-1.37.37-1.37l1.73-1s.87-.5 1.37.37c.5.87-.37 1.37-.37
+1.37z" fill-opacity=".3"/><path d="M9.64 3.1l.98-1.66s.5-.874
+1.37-.37c.87.5.37 1.37.37 1.37l-1 1.73s-.5.87-1.37.37c-.87-.5-.37-1.37-.37-1.37z"
+fill-opacity=".23"/></svg>

1	cps/static/css/libs/images/loading.svg (new file)

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16" style="animation:spinLoadingIcon 1s steps(12,end) infinite"><style>@keyframes spinLoadingIcon{to{transform:rotate(360deg)}}</style><path d="M7 3V1s0-1 1-1 1 1 1 1v2s0 1-1 1-1-1-1-1z"/><path d="M4.63 4.1l-1-1.73S3.13 1.5 4 1c.87-.5 1.37.37 1.37.37l1 1.73s.5.87-.37 1.37c-.87.57-1.37-.37-1.37-.37z" fill-opacity=".93"/><path d="M3.1 6.37l-1.73-1S.5 4.87 1 4c.5-.87 1.37-.37 1.37-.37l1.73 1s.87.5.37 1.37c-.5.87-1.37.37-1.37.37z" fill-opacity=".86"/><path d="M3 9H1S0 9 0 8s1-1 1-1h2s1 0 1 1-1 1-1 1z" fill-opacity=".79"/><path d="M4.1 11.37l-1.73 1S1.5 12.87 1 12c-.5-.87.37-1.37.37-1.37l1.73-1s.87-.5 1.37.37c.5.87-.37 1.37-.37 1.37z" fill-opacity=".72"/><path d="M3.63 13.56l1-1.73s.5-.87 1.37-.37c.87.5.37 1.37.37 1.37l-1 1.73s-.5.87-1.37.37c-.87-.5-.37-1.37-.37-1.37z" fill-opacity=".65"/><path d="M7 15v-2s0-1 1-1 1 1 1 1v2s0 1-1 1-1-1-1-1z" fill-opacity=".58"/><path d="M10.63 14.56l-1-1.73s-.5-.87.37-1.37c.87-.5 1.37.37 1.37.37l1 1.73s.5.87-.37 1.37c-.87.5-1.37-.37-1.37-.37z" fill-opacity=".51"/><path d="M13.56 12.37l-1.73-1s-.87-.5-.37-1.37c.5-.87 1.37-.37 1.37-.37l1.73 1s.87.5.37 1.37c-.5.87-1.37.37-1.37.37z" fill-opacity=".44"/><path d="M15 9h-2s-1 0-1-1 1-1 1-1h2s1 0 1 1-1 1-1 1z" fill-opacity=".37"/><path d="M14.56 5.37l-1.73 1s-.87.5-1.37-.37c-.5-.87.37-1.37.37-1.37l1.73-1s.87-.5 1.37.37c.5.87-.37 1.37-.37 1.37z" fill-opacity=".3"/><path d="M9.64 3.1l.98-1.66s.5-.874 1.37-.37c.87.5.37 1.37.37 1.37l-1 1.73s-.5.87-1.37.37c-.87-.5-.37-1.37-.37-1.37z" fill-opacity=".23"/></svg>
@@ -0,0 +1,16 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16
+16"
+fill="rgba(255,255,255,1)">
+<path
+d="M8 16a8 8 0 1 1 8-8 8.009 8.009 0 0 1-8 8zM8 2a6 6 0 1 0 6 6 6.006 6.006 0 0 0-6-6z">
+</path>
+<path
+d="M8 7a1 1 0 0 0-1 1v3a1 1 0 0 0 2 0V8a1 1 0 0 0-1-1z">
+</path>
+<circle
+cx="8" cy="5" r="1.188">
+</circle>
+</svg>

@@ -0,0 +1,15 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16
+16">
+<path
+d="M8 16a8 8 0 1 1 8-8 8.009 8.009 0 0 1-8 8zM8 2a6 6 0 1 0 6 6 6.006 6.006 0 0 0-6-6z">
+</path>
+<path
+d="M8 7a1 1 0 0 0-1 1v3a1 1 0 0 0 2 0V8a1 1 0 0 0-1-1z">
+</path>
+<circle
+cx="8" cy="5" r="1.188">
+</circle>
+</svg>
@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
+fill="rgba(255,255,255,1)"><path d="M13 13c-.3 0-.5-.1-.7-.3L8 8.4l-4.3 4.3c-.9.9-2.3-.5-1.4-1.4l5-5c.4-.4 1-.4 1.4 0l5 5c.6.6.2 1.7-.7 1.7zm0-11H3C1.7 2 1.7 4 3 4h10c1.3 0 1.3-2 0-2z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"><path d="M13 13c-.3 0-.5-.1-.7-.3L8 8.4l-4.3 4.3c-.9.9-2.3-.5-1.4-1.4l5-5c.4-.4 1-.4 1.4 0l5 5c.6.6.2 1.7-.7 1.7zm0-11H3C1.7 2 1.7 4 3 4h10c1.3 0 1.3-2 0-2z"/></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
+fill="rgba(255,255,255,1)"><path d="M15 3.7V13c0 1.5-1.53 3-3 3H7.13c-.72 0-1.63-.5-2.13-1l-5-5s.84-1 .87-1c.13-.1.33-.2.53-.2.1 0 .3.1.4.2L4 10.6V2.7c0-.6.4-1 1-1s1 .4 1 1v4.6h1V1c0-.6.4-1 1-1s1 .4 1 1v6.3h1V1.7c0-.6.4-1 1-1s1 .4 1 1v5.7h1V3.7c0-.6.4-1 1-1s1 .4 1 1z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="M15 3.7V13c0 1.5-1.53 3-3 3H7.13c-.72 0-1.63-.5-2.13-1l-5-5s.84-1 .87-1c.13-.1.33-.2.53-.2.1 0 .3.1.4.2L4 10.6V2.7c0-.6.4-1 1-1s1 .4 1 1v4.6h1V1c0-.6.4-1 1-1s1 .4 1 1v6.3h1V1.7c0-.6.4-1 1-1s1 .4 1 1v5.7h1V3.7c0-.6.4-1 1-1s1 .4 1 1z"/></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
+fill="rgba(255,255,255,1)"><path d="M8 10c-.3 0-.5-.1-.7-.3l-5-5c-.9-.9.5-2.3 1.4-1.4L8 7.6l4.3-4.3c.9-.9 2.3.5 1.4 1.4l-5 5c-.2.2-.4.3-.7.3zm5 2H3c-1.3 0-1.3 2 0 2h10c1.3 0 1.3-2 0-2z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"><path d="M8 10c-.3 0-.5-.1-.7-.3l-5-5c-.9-.9.5-2.3 1.4-1.4L8 7.6l4.3-4.3c.9-.9 2.3.5 1.4 1.4l-5 5c-.2.2-.4.3-.7.3zm5 2H3c-1.3 0-1.3 2 0 2h10c1.3 0 1.3-2 0-2z"/></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
+fill="rgba(255,255,255,1)"><path d="M1 1a1 1 0 011 1v2.4A7 7 0 118 15a7 7 0 01-4.9-2 1 1 0 011.4-1.5 5 5 0 10-1-5.5H6a1 1 0 010 2H1a1 1 0 01-1-1V2a1 1 0 011-1z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="M1 1a1 1 0 011 1v2.4A7 7 0 118 15a7 7 0 01-4.9-2 1 1 0 011.4-1.5 5 5 0 10-1-5.5H6a1 1 0 010 2H1a1 1 0 01-1-1V2a1 1 0 011-1z"/></svg>

@@ -0,0 +1,5 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
+fill="rgba(255,255,255,1)"><path d="M15 1a1 1 0 0 0-1 1v2.418A6.995 6.995 0 1 0 8 15a6.954 6.954 0 0 0 4.95-2.05 1 1 0 0 0-1.414-1.414A5.019 5.019 0 1 1 12.549 6H10a1 1 0 0 0 0 2h5a1 1 0 0 0 1-1V2a1 1 0 0 0-1-1z"></path></svg>

@@ -0,0 +1,4 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"><path d="M15 1a1 1 0 0 0-1 1v2.418A6.995 6.995 0 1 0 8 15a6.954 6.954 0 0 0 4.95-2.05 1 1 0 0 0-1.414-1.414A5.019 5.019 0 1 1 12.549 6H10a1 1 0 0 0 0 2h5a1 1 0 0 0 1-1V2a1 1 0 0 0-1-1z"></path></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
+fill="rgba(255,255,255,1)"><path d="M0 4h1.5c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5H0zM9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM16 4h-1.5c-1 0-1.5.5-1.5 1.5v5c0 1 .5 1.5 1.5 1.5H16z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="M0 4h1.5c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5H0zM9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM16 4h-1.5c-1 0-1.5.5-1.5 1.5v5c0 1 .5 1.5 1.5 1.5H16z"/></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"
+fill="rgba(255,255,255,1)"><path d="M9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM11 0v.5c0 1-.5 1.5-1.5 1.5h-3C5.5 2 5 1.5 5 .5V0h6zM11 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" height="16" width="16"><path d="M9.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C5 4.5 5.5 4 6.5 4zM11 0v.5c0 1-.5 1.5-1.5 1.5h-3C5.5 2 5 1.5 5 .5V0h6zM11 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6z"/></svg>

@@ -0,0 +1,2 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"
+fill="rgba(255,255,255,1)"><path d="M5.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C1 4.5 1.5 4 2.5 4zM7 0v.5C7 1.5 6.5 2 5.5 2h-3C1.5 2 1 1.5 1 .5V0h6zM7 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6zM13.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5c0-1 .5-1.5 1.5-1.5zM15 0v.5c0 1-.5 1.5-1.5 1.5h-3C9.5 2 9 1.5 9 .5V0h6zM15 16v-.507c0-1-.5-1.5-1.5-1.5h-3C9.5 14 9 14.5 9 15.5v.5h6z"/></svg>

@@ -0,0 +1 @@
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16"><path d="M5.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5C1 4.5 1.5 4 2.5 4zM7 0v.5C7 1.5 6.5 2 5.5 2h-3C1.5 2 1 1.5 1 .5V0h6zM7 16v-.5c0-1-.5-1.5-1.5-1.5h-3c-1 0-1.5.5-1.5 1.5v.5h6zM13.5 4c1 0 1.5.5 1.5 1.5v5c0 1-.5 1.5-1.5 1.5h-3c-1 0-1.5-.5-1.5-1.5v-5c0-1 .5-1.5 1.5-1.5zM15 0v.5c0 1-.5 1.5-1.5 1.5h-3C9.5 2 9 1.5 9 .5V0h6zM15 16v-.507c0-1-.5-1.5-1.5-1.5h-3C9.5 14 9 14.5 9 15.5v.5h6z"/></svg>

@@ -0,0 +1,5 @@
+<!-- This Source Code Form is subject to the terms of the Mozilla Public
+   - License, v. 2.0. If a copy of the MPL was not distributed with this
+   - file, You can obtain one at http://mozilla.org/MPL/2.0/. -->
+<svg xmlns="http://www.w3.org/2000/svg" width="16" height="16" viewBox="0 0 16 16"
+fill="rgba(255,255,255,1)"><path d="M12.408 8.217l-8.083-6.7A.2.2 0 0 0 4 1.672V12.3a.2.2 0 0 0 .333.146l2.56-2.372 1.857 3.9A1.125 1.125 0 1 0 10.782 13L8.913 9.075l3.4-.51a.2.2 0 0 0 .095-.348z"></path></svg>