Merge branch 'cover_thumbnail' into Develop
commit db03fb3edd

.gitignore (vendored, 1 changed line)

@@ -23,6 +23,7 @@ vendor/
 # calibre-web
 *.db
 *.log
+cps/cache
 
 .idea/
 *.bak

README.md (27 changed lines)

@@ -1,6 +1,6 @@
 # About
 
-Calibre-Web is a web app providing a clean interface for browsing, reading and downloading eBooks using an existing [Calibre](https://calibre-ebook.com) database.
+Calibre-Web is a web app providing a clean interface for browsing, reading and downloading eBooks using a valid [Calibre](https://calibre-ebook.com) database.
 
 [![GitHub License](https://img.shields.io/github/license/janeczku/calibre-web?style=flat-square)](https://github.com/janeczku/calibre-web/blob/master/LICENSE)
 [![GitHub commit activity](https://img.shields.io/github/commit-activity/w/janeczku/calibre-web?logo=github&style=flat-square&label=commits)]()

@@ -40,23 +40,20 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d
 ## Installation
 
 #### Installation via pip (recommended)
-1. Install calibre web via pip with the command `pip install calibreweb` (Depending on your OS and or distro the command could also be `pip3`).
-2. Optional features can also be installed via pip, please refer to [this page](https://github.com/janeczku/calibre-web/wiki/Dependencies-in-Calibre-Web-Linux-Windows) for details
-3. Calibre-Web can be started afterwards by typing `cps` or `python3 -m cps`
+1. To avoid problems with already installed python dependencies, it's recommended to create a virtual environment for Calibre-Web
+2. Install Calibre-Web via pip with the command `pip install calibreweb` (Depending on your OS and or distro the command could also be `pip3`).
+3. Optional features can also be installed via pip, please refer to [this page](https://github.com/janeczku/calibre-web/wiki/Dependencies-in-Calibre-Web-Linux-Windows) for details
+4. Calibre-Web can be started afterwards by typing `cps`
 
-#### Manual installation
-1. Install dependencies by running `pip3 install --target vendor -r requirements.txt` (python3.x). Alternativly set up a python virtual environment.
-2. Execute the command: `python3 cps.py` (or `nohup python3 cps.py` - recommended if you want to exit the terminal window)
-
-Issues with Ubuntu:
-Please note that running the above install command can fail on some versions of Ubuntu, saying `"can't combine user with prefix"`. This is a [known bug](https://github.com/pypa/pip/issues/3826) and can be remedied by using the command `pip install --system --target vendor -r requirements.txt` instead.
+In the Wiki there are also examples for: a [manual installation](https://github.com/janeczku/calibre-web/wiki/Manual-installation), [installation on Linux Mint](https://github.com/janeczku/calibre-web/wiki/How-To:Install-Calibre-Web-in-Linux-Mint-19-or-20), [installation on a Cloud Provider](https://github.com/janeczku/calibre-web/wiki/How-To:-Install-Calibre-Web-on-a-Cloud-Provider).
 
 ## Quick start
 
-Point your browser to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog
-Set `Location of Calibre database` to the path of the folder where your Calibre library (metadata.db) lives, push "submit" button\
-Optionally a Google Drive can be used to host the calibre library [-> Using Google Drive integration](https://github.com/janeczku/calibre-web/wiki/Configuration#using-google-drive-integration)
-Go to Login page
+Point your browser to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog \
+Login with default admin login \
+Set `Location of Calibre database` to the path of the folder where your Calibre library (metadata.db) lives, push "submit" button \
+Optionally a Google Drive can be used to host the calibre library [-> Using Google Drive integration](https://github.com/janeczku/calibre-web/wiki/Configuration#using-google-drive-integration) \
+Afterwards you can configure your Calibre-Web instance ([Basic Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#basic-configuration) and [UI Configuration](https://github.com/janeczku/calibre-web/wiki/Configuration#ui-configuration) on admin page)
 
 #### Default admin login:
 *Username:* admin\

@@ -71,7 +68,7 @@ Optionally, to enable on-the-fly conversion from one ebook format to another whe
 
 [Download and install](https://calibre-ebook.com/download) the Calibre desktop program for your platform and enter the folder including program name (normally /opt/calibre/ebook-convert, or C:\Program Files\calibre\ebook-convert.exe) in the field "calibre's converter tool" on the setup page.
 
-[Download](https://github.com/pgaskin/kepubify/releases/latest) Kepubify tool for your platform and place the binary starting with `kepubify` in Linux: `\opt\kepubify` Windows: `C:\Program Files\kepubify`.
+[Download](https://github.com/pgaskin/kepubify/releases/latest) Kepubify tool for your platform and place the binary starting with `kepubify` in Linux: `/opt/kepubify` Windows: `C:\Program Files\kepubify`.
 
 ## Docker Images
 

SECURITY.md (18 changed lines)

@@ -24,16 +24,22 @@ To receive fixes for security vulnerabilities it is required to always upgrade t
 | V 0.6.13 | JavaScript could get executed in the shelf title ||
 | V 0.6.13 | Login with the old session cookie after logout. Thanks to @ibarrionuevo ||
 | V 0.6.14 | CSRF was possible. Thanks to @mik317 and Hagai Wechsler (WhiteSource) |CVE-2021-25965|
-| V 0.6.14 | Migrated some routes to POST-requests (CSRF protection). Thanks to @scara31 ||
-| V 0.6.15 | Fix for "javascript:" script links in identifier. Thanks to @scara31 ||
+| V 0.6.14 | Migrated some routes to POST-requests (CSRF protection). Thanks to @scara31 |CVE-2021-4164|
+| V 0.6.15 | Fix for "javascript:" script links in identifier. Thanks to @scara31 |CVE-2021-4170|
 | V 0.6.15 | Cross-Site Scripting vulnerability on uploaded cover file names. Thanks to @ibarrionuevo ||
 | V 0.6.15 | Creating public shelfs is now denied if user is missing the edit public shelf right. Thanks to @ibarrionuevo ||
 | V 0.6.15 | Changed error message in case of trying to delete a shelf unauthorized. Thanks to @ibarrionuevo ||
-| V 0.6.16 | JavaScript could get executed on authors page. Thanks to @alicaz ||
-| V 0.6.16 | Localhost can no longer be used to upload covers. Thanks to @scara31 ||
-| V 0.6.16 | Another case where public shelfs could be created without permission is prevented. Thanks to @nhiephon ||
+| V 0.6.16 | JavaScript could get executed on authors page. Thanks to @alicaz |CVE-2022-0352|
+| V 0.6.16 | Localhost can no longer be used to upload covers. Thanks to @scara31 |CVE-2022-0339|
+| V 0.6.16 | Another case where public shelfs could be created without permission is prevented. Thanks to @nhiephon |CVE-2022-0273|
+| V 0.6.16 | It's prevented to get the name of a private shelfs. Thanks to @nhiephon |CVE-2022-0405|
+| V 0.6.17 | The SSRF Protection can no longer be bypassed via an HTTP redirect. Thanks to @416e6e61 |CVE-2022-0767|
+| V 0.6.17 | The SSRF Protection can no longer be bypassed via 0.0.0.0 and it's ipv6 equivalent. Thanks to @r0hanSH |CVE-2022-0766|
+| V 0.6.18 | Possible SQL Injection is prevented in user table Thanks to Iman Sharafaldin (Forward Security) ||
+| V 0.6.18 | The SSRF protection no longer can be bypassed by IPV6/IPV4 embedding. Thanks to @416e6e61 |CVE-2022-0939|
+| V 0.6.18 | The SSRF protection no longer can be bypassed to connect to other servers in the local network. Thanks to @michaellrowley |CVE-2022-0990|
 
 
-## Staement regarding Log4j (CVE-2021-44228 and related)
+## Statement regarding Log4j (CVE-2021-44228 and related)
 
 Calibre-web is not affected by bugs related to Log4j. Calibre-Web is a python program, therefore not using Java, and not using the Java logging feature log4j.

cps.py (15 changed lines)

@@ -16,11 +16,6 @@
 #
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
-try:
-    from gevent import monkey
-    monkey.patch_all()
-except ImportError:
-    pass
 
 import sys
 import os

@@ -40,10 +35,11 @@ from cps.about import about
 from cps.shelf import shelf
 from cps.admin import admi
 from cps.gdrive import gdrive
-from cps.editbooks import editbook
+from cps.editbooks import EditBook
 from cps.remotelogin import remotelogin
 from cps.search_metadata import meta
 from cps.error_handler import init_errorhandler
+from cps.schedule import register_scheduled_tasks, register_startup_tasks
 
 try:
     from cps.kobo import kobo, get_kobo_activated

@@ -73,12 +69,17 @@ def main():
     app.register_blueprint(remotelogin)
     app.register_blueprint(meta)
     app.register_blueprint(gdrive)
-    app.register_blueprint(editbook)
+    app.register_blueprint(EditBook)
     if kobo_available:
         app.register_blueprint(kobo)
         app.register_blueprint(kobo_auth)
     if oauth_available:
         app.register_blueprint(oauth)
 
+    # Register scheduled tasks
+    register_scheduled_tasks()  # ToDo only reconnect if reconnect is enabled
+    register_startup_tasks()
+
     success = web_server.start()
     sys.exit(0 if success else 1)
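
The new cps.schedule module used above (register_scheduled_tasks, register_startup_tasks, and the schedule.use_APScheduler flag checked in cps/admin.py) is not part of this excerpt. As a rough, hypothetical sketch only, assuming APScheduler's BackgroundScheduler and a daily start hour from the configuration, registering such jobs can look like this; the bodies below are illustrative and are not Calibre-Web's actual code:

# Hypothetical sketch, not cps/schedule.py: shows how tasks could be
# registered with APScheduler's BackgroundScheduler.
from apscheduler.schedulers.background import BackgroundScheduler

scheduler = BackgroundScheduler()


def generate_book_covers():
    # stand-in for the real cover-thumbnail task
    print("refreshing cover thumbnails")


def register_scheduled_tasks(start_hour=4):
    # run the cover refresh once a day at the configured start hour
    scheduler.add_job(generate_book_covers, trigger="cron", hour=start_hour)
    if not scheduler.running:
        scheduler.start()


def register_startup_tasks():
    # queue a one-off run right after the web server comes up
    scheduler.add_job(generate_book_covers, trigger="date")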

cps/__init__.py

@@ -91,7 +91,7 @@ if wtf_present:
 else:
     csrf = None
 
-ub.init_db(cli.settingspath)
+ub.init_db(cli.settings_path)
 # pylint: disable=no-member
 config = config_sql.load_configuration(ub.session)
 

@@ -106,7 +106,7 @@ log = logger.create()
 from . import services
 
 db.CalibreDB.update_config(config)
-db.CalibreDB.setup_db(config.config_calibre_dir, cli.settingspath)
+db.CalibreDB.setup_db(config.config_calibre_dir, cli.settings_path)
 
 
 calibre_db = db.CalibreDB()

@@ -156,9 +156,10 @@ def create_app():
        services.goodreads_support.connect(config.config_goodreads_api_key,
                                           config.config_goodreads_api_secret,
                                           config.config_use_goodreads)
-    config.store_calibre_uuid(calibre_db, db.Library_Id)
+    config.store_calibre_uuid(calibre_db, db.LibraryId)
     return app
+
 
 @babel.localeselector
 def get_locale():
     # if a user is logged in, use the locale from the user settings

@@ -178,12 +179,6 @@ def get_locale():
     return negotiate_locale(preferred or ['en'], _BABEL_TRANSLATIONS)
 
 
-@babel.timezoneselector
-def get_timezone():
-    user = getattr(g, 'user', None)
-    return user.timezone if user else None
-
-
 from .updater import Updater
 updater_thread = Updater()

cps/about.py

@@ -69,9 +69,9 @@ _VERSIONS.update(uploader.get_versions(False))
 
 
 def collect_stats():
-    _VERSIONS['ebook converter'] = _(converter.get_calibre_version())
-    _VERSIONS['unrar'] = _(converter.get_unrar_version())
-    _VERSIONS['kepubify'] = _(converter.get_kepubify_version())
+    _VERSIONS['ebook converter'] = converter.get_calibre_version()
+    _VERSIONS['unrar'] = converter.get_unrar_version()
+    _VERSIONS['kepubify'] = converter.get_kepubify_version()
     return _VERSIONS
 

cps/admin.py (221 changed lines)

@@ -24,12 +24,12 @@ import os
 import re
 import base64
 import json
-import time
 import operator
-from datetime import datetime, timedelta
+from datetime import datetime, timedelta, time
+from functools import wraps
 
-from babel import Locale as LC
-from babel.dates import format_datetime
+from babel import Locale
+from babel.dates import format_datetime, format_time, format_timedelta
 from flask import Blueprint, flash, redirect, url_for, abort, request, make_response, send_from_directory, g, Response
 from flask_login import login_required, current_user, logout_user, confirm_login
 from flask_babel import gettext as _

@@ -40,14 +40,15 @@ from sqlalchemy.exc import IntegrityError, OperationalError, InvalidRequestError
 from sqlalchemy.sql.expression import func, or_, text
 
 from . import constants, logger, helper, services, cli
-from . import db, calibre_db, ub, web_server, get_locale, config, updater_thread, babel, gdriveutils, kobo_sync_status
+from . import db, calibre_db, ub, web_server, get_locale, config, updater_thread, babel, gdriveutils, \
+    kobo_sync_status, schedule
 from .helper import check_valid_domain, send_test_mail, reset_password, generate_password_hash, check_email, \
-    valid_email, check_username
+    valid_email, check_username, update_thumbnail_cache
 from .gdriveutils import is_gdrive_ready, gdrive_support
 from .render_template import render_title_template, get_sidebar_config
+from .services.worker import WorkerThread
 from . import debug_info, _BABEL_TRANSLATIONS
 
-from functools import wraps
 
 log = logger.create()
 

@@ -56,7 +57,8 @@ feature_support = {
     'goodreads': bool(services.goodreads_support),
     'kobo': bool(services.kobo),
     'updater': constants.UPDATER_AVAILABLE,
-    'gmail': bool(services.gmail)
+    'gmail': bool(services.gmail),
+    'scheduler': schedule.use_APScheduler
 }
 
 try:

@@ -159,7 +161,7 @@ def shutdown():
 # needed for docker applications, as changes on metadata.db from host are not visible to application
 @admi.route("/reconnect", methods=['GET'])
 def reconnect():
-    if cli.args.r:
+    if cli.reconnect_enable:
         calibre_db.reconnect_db(config, ub.app_DB_path)
         return json.dumps({})
     else:

@@ -167,10 +169,22 @@ def reconnect():
         abort(404)
 
 
+@admi.route("/ajax/updateThumbnails", methods=['POST'])
+@admin_required
+@login_required
+def update_thumbnails():
+    content = config.get_scheduled_task_settings()
+    if content['schedule_generate_book_covers']:
+        log.info("Update of Cover cache requested")
+        update_thumbnail_cache()
+    return ""
+
+
 @admi.route("/admin/view")
 @login_required
 @admin_required
 def admin():
+    locale = get_locale()
     version = updater_thread.get_current_version_info()
     if version is False:
         commit = _(u'Unknown')

@@ -185,15 +199,19 @@ def admin():
                    form_date -= timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
                elif commit[19] == '-':
                    form_date += timedelta(hours=int(commit[20:22]), minutes=int(commit[23:]))
-            commit = format_datetime(form_date - tz, format='short', locale=get_locale())
+            commit = format_datetime(form_date - tz, format='short', locale=locale)
        else:
            commit = version['version']
 
-    allUser = ub.session.query(ub.User).all()
+    all_user = ub.session.query(ub.User).all()
     email_settings = config.get_mail_settings()
-    kobo_support = feature_support['kobo'] and config.config_kobo_sync
-    return render_title_template("admin.html", allUser=allUser, email=email_settings, config=config, commit=commit,
-                                 feature_support=feature_support, kobo_support=kobo_support,
+    schedule_time = format_time(time(hour=config.schedule_start_time), format="short", locale=locale)
+    t = timedelta(hours=config.schedule_duration // 60, minutes=config.schedule_duration % 60)
+    schedule_duration = format_timedelta(t, format="short", threshold=.99, locale=locale)
+
+    return render_title_template("admin.html", allUser=all_user, email=email_settings, config=config, commit=commit,
+                                 feature_support=feature_support, schedule_time=schedule_time,
+                                 schedule_duration=schedule_duration,
                                  title=_(u"Admin page"), page="admin")
 

@@ -242,12 +260,12 @@ def calibreweb_alive():
 @login_required
 @admin_required
 def view_configuration():
-    read_column = calibre_db.session.query(db.Custom_Columns)\
-        .filter(and_(db.Custom_Columns.datatype == 'bool', db.Custom_Columns.mark_for_delete == 0)).all()
-    restrict_columns = calibre_db.session.query(db.Custom_Columns)\
-        .filter(and_(db.Custom_Columns.datatype == 'text', db.Custom_Columns.mark_for_delete == 0)).all()
+    read_column = calibre_db.session.query(db.CustomColumns)\
+        .filter(and_(db.CustomColumns.datatype == 'bool', db.CustomColumns.mark_for_delete == 0)).all()
+    restrict_columns = calibre_db.session.query(db.CustomColumns)\
+        .filter(and_(db.CustomColumns.datatype == 'text', db.CustomColumns.mark_for_delete == 0)).all()
     languages = calibre_db.speaking_language()
-    translations = [LC('en')] + babel.list_translations()
+    translations = [Locale('en')] + babel.list_translations()
     return render_title_template("config_view_edit.html", conf=config, readColumns=read_column,
                                  restrictColumns=restrict_columns,
                                  languages=languages,

@@ -261,8 +279,8 @@ def view_configuration():
 def edit_user_table():
     visibility = current_user.view_settings.get('useredit', {})
     languages = calibre_db.speaking_language()
-    translations = babel.list_translations() + [LC('en')]
-    allUser = ub.session.query(ub.User)
+    translations = babel.list_translations() + [Locale('en')]
+    all_user = ub.session.query(ub.User)
     tags = calibre_db.session.query(db.Tags)\
         .join(db.books_tags_link)\
         .join(db.Books)\

@@ -274,10 +292,10 @@ def edit_user_table():
     else:
         custom_values = []
     if not config.config_anonbrowse:
-        allUser = allUser.filter(ub.User.role.op('&')(constants.ROLE_ANONYMOUS) != constants.ROLE_ANONYMOUS)
+        all_user = all_user.filter(ub.User.role.op('&')(constants.ROLE_ANONYMOUS) != constants.ROLE_ANONYMOUS)
     kobo_support = feature_support['kobo'] and config.config_kobo_sync
     return render_title_template("user_table.html",
-                                 users=allUser.all(),
+                                 users=all_user.all(),
                                  tags=tags,
                                  custom_values=custom_values,
                                  translations=translations,

@@ -298,10 +316,13 @@ def list_users():
     limit = int(request.args.get("limit") or 10)
     search = request.args.get("search")
     sort = request.args.get("sort", "id")
-    order = request.args.get("order", "").lower()
     state = None
     if sort == "state":
         state = json.loads(request.args.get("state", "[]"))
+    else:
+        if sort not in ub.User.__table__.columns.keys():
+            sort = "id"
+    order = request.args.get("order", "").lower()
 
     if sort != "state" and order:
         order = text(sort + " " + order)

@@ -329,7 +350,7 @@ def list_users():
         if user.default_language == "all":
             user.default = _("All")
         else:
-            user.default = LC.parse(user.default_language).get_language_name(get_locale())
+            user.default = Locale.parse(user.default_language).get_language_name(get_locale())
 
     table_entries = {'totalNotFiltered': total_count, 'total': filtered_count, "rows": users}
     js_list = json.dumps(table_entries, cls=db.AlchemyEncoder)

@@ -377,7 +398,7 @@ def delete_user():
 @login_required
 @admin_required
 def table_get_locale():
-    locale = babel.list_translations() + [LC('en')]
+    locale = babel.list_translations() + [Locale('en')]
     ret = list()
     current_locale = get_locale()
     for loc in locale:

@@ -441,7 +462,7 @@ def edit_list_user(param):
         elif param.endswith('role'):
             value = int(vals['field_index'])
             if user.name == "Guest" and value in \
                     [constants.ROLE_ADMIN, constants.ROLE_PASSWD, constants.ROLE_EDIT_SHELFS]:
                 raise Exception(_("Guest can't have this role"))
             # check for valid value, last on checks for power of 2 value
             if value > 0 and value <= constants.ROLE_VIEWER and (value & value-1 == 0 or value == 1):

@@ -496,7 +517,7 @@ def edit_list_user(param):
         else:
             return _("Parameter not found"), 400
     except Exception as ex:
-        log.debug_or_exception(ex)
+        log.error_or_exception(ex)
         return str(ex), 400
     ub.session_commit()
     return ""

@@ -521,16 +542,16 @@ def update_table_settings():
 
 def check_valid_read_column(column):
     if column != "0":
-        if not calibre_db.session.query(db.Custom_Columns).filter(db.Custom_Columns.id == column) \
-                .filter(and_(db.Custom_Columns.datatype == 'bool', db.Custom_Columns.mark_for_delete == 0)).all():
+        if not calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.id == column) \
+                .filter(and_(db.CustomColumns.datatype == 'bool', db.CustomColumns.mark_for_delete == 0)).all():
             return False
     return True
 
 
 def check_valid_restricted_column(column):
     if column != "0":
-        if not calibre_db.session.query(db.Custom_Columns).filter(db.Custom_Columns.id == column) \
-                .filter(and_(db.Custom_Columns.datatype == 'text', db.Custom_Columns.mark_for_delete == 0)).all():
+        if not calibre_db.session.query(db.CustomColumns).filter(db.CustomColumns.id == column) \
+                .filter(and_(db.CustomColumns.datatype == 'text', db.CustomColumns.mark_for_delete == 0)).all():
             return False
     return True
 

@@ -607,6 +628,8 @@ def load_dialogtexts(element_id):
         texts["main"] = _('Are you sure you want to change shelf sync behavior for the selected user(s)?')
     elif element_id == "db_submit":
         texts["main"] = _('Are you sure you want to change Calibre library location?')
+    elif element_id == "admin_refresh_cover_cache":
+        texts["main"] = _('Calibre-Web will search for updated Covers and update Cover Thumbnails, this may take a while?')
     elif element_id == "btnfullsync":
         texts["main"] = _("Are you sure you want delete Calibre-Web's sync database "
                           "to force a full sync with your Kobo Reader?")

@@ -860,10 +883,10 @@ def delete_restriction(res_type, user_id):
             usr = current_user
         if element['id'].startswith('a'):
             usr.allowed_tags = restriction_deletion(element, usr.list_allowed_tags)
-            ub.session_commit("Deleted allowed tags of user {}: {}".format(usr.name, usr.list_allowed_tags))
+            ub.session_commit("Deleted allowed tags of user {}: {}".format(usr.name, element['Element']))
         elif element['id'].startswith('d'):
             usr.denied_tags = restriction_deletion(element, usr.list_denied_tags)
-            ub.session_commit("Deleted denied tags of user {}: {}".format(usr.name, usr.list_allowed_tags))
+            ub.session_commit("Deleted denied tag of user {}: {}".format(usr.name, element['Element']))
     elif res_type == 3:  # Columns per user
         if isinstance(user_id, int):
             usr = ub.session.query(ub.User).filter(ub.User.id == int(user_id)).first()

@@ -872,12 +895,12 @@ def delete_restriction(res_type, user_id):
         if element['id'].startswith('a'):
             usr.allowed_column_value = restriction_deletion(element, usr.list_allowed_column_values)
             ub.session_commit("Deleted allowed columns of user {}: {}".format(usr.name,
-                                                                              usr.list_allowed_column_values))
+                                                                              usr.list_allowed_column_values()))
 
         elif element['id'].startswith('d'):
             usr.denied_column_value = restriction_deletion(element, usr.list_denied_column_values)
             ub.session_commit("Deleted denied columns of user {}: {}".format(usr.name,
-                                                                             usr.list_denied_column_values))
+                                                                             usr.list_denied_column_values()))
     return ""
 

@@ -1075,12 +1098,12 @@ def _configuration_oauth_helper(to_save):
     reboot_required = False
     for element in oauthblueprints:
         if to_save["config_" + str(element['id']) + "_oauth_client_id"] != element['oauth_client_id'] \
                 or to_save["config_" + str(element['id']) + "_oauth_client_secret"] != element['oauth_client_secret']:
             reboot_required = True
             element['oauth_client_id'] = to_save["config_" + str(element['id']) + "_oauth_client_id"]
             element['oauth_client_secret'] = to_save["config_" + str(element['id']) + "_oauth_client_secret"]
         if to_save["config_" + str(element['id']) + "_oauth_client_id"] \
                 and to_save["config_" + str(element['id']) + "_oauth_client_secret"]:
             active_oauths += 1
             element["active"] = 1
         else:

@@ -1133,7 +1156,7 @@ def _configuration_ldap_helper(to_save):
     if not config.config_ldap_provider_url \
             or not config.config_ldap_port \
             or not config.config_ldap_dn \
             or not config.config_ldap_user_object:
         return reboot_required, _configuration_result(_('Please Enter a LDAP Provider, '
                                                          'Port, DN and User Object Identifier'))
 

@@ -1208,15 +1231,16 @@ def _db_configuration_update_helper():
                                              '',
                                              to_save['config_calibre_dir'],
                                              flags=re.IGNORECASE)
+    db_valid = False
     try:
         db_change, db_valid = _db_simulate_change()
 
         # gdrive_error drive setup
         gdrive_error = _configuration_gdrive_helper(to_save)
-    except (OperationalError, InvalidRequestError):
+    except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        _db_configuration_result(_("Settings DB is not Writeable"), gdrive_error)
+        log.error_or_exception("Settings Database error: {}".format(e))
+        _db_configuration_result(_(u"Database error: %(error)s.", error=e.orig), gdrive_error)
     try:
         metadata_db = os.path.join(to_save['config_calibre_dir'], "metadata.db")
         if config.config_use_google_drive and is_gdrive_ready() and not os.path.exists(metadata_db):

@@ -1226,14 +1250,14 @@ def _db_configuration_update_helper():
         return _db_configuration_result('{}'.format(ex), gdrive_error)
 
     if db_change or not db_valid or not config.db_configured \
             or config.config_calibre_dir != to_save["config_calibre_dir"]:
         if not calibre_db.setup_db(to_save['config_calibre_dir'], ub.app_DB_path):
             return _db_configuration_result(_('DB Location is not Valid, Please Enter Correct Path'),
                                             gdrive_error)
-        config.store_calibre_uuid(calibre_db, db.Library_Id)
+        config.store_calibre_uuid(calibre_db, db.LibraryId)
         # if db changed -> delete shelfs, delete download books, delete read books, kobo sync...
         if db_change:
-            log.info("Calibre Database changed, delete all Calibre-Web info related to old Database")
+            log.info("Calibre Database changed, all Calibre-Web info related to old Database gets deleted")
             ub.session.query(ub.Downloads).delete()
             ub.session.query(ub.ArchivedBook).delete()
             ub.session.query(ub.ReadBook).delete()

@@ -1242,6 +1266,7 @@ def _db_configuration_update_helper():
             ub.session.query(ub.KoboReadingState).delete()
             ub.session.query(ub.KoboStatistics).delete()
             ub.session.query(ub.KoboSyncedBooks).delete()
+            helper.delete_thumbnail_cache()
             ub.session_commit()
         _config_string(to_save, "config_calibre_dir")
         calibre_db.update_config(config)

@@ -1269,7 +1294,7 @@ def _configuration_update_helper():
     _config_checkbox_int(to_save, "config_unicode_filename")
     # Reboot on config_anonbrowse with enabled ldap, as decoraters are changed in this case
     reboot_required |= (_config_checkbox_int(to_save, "config_anonbrowse")
                         and config.config_login_type == constants.LOGIN_LDAP)
     _config_checkbox_int(to_save, "config_public_reg")
     _config_checkbox_int(to_save, "config_register_email")
     reboot_required |= _config_checkbox_int(to_save, "config_kobo_sync")

@@ -1329,10 +1354,10 @@ def _configuration_update_helper():
         unrar_status = helper.check_unrar(config.config_rarfile_location)
         if unrar_status:
             return _configuration_result(unrar_status)
-    except (OperationalError, InvalidRequestError):
+    except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        _configuration_result(_("Settings DB is not Writeable"))
+        log.error_or_exception("Settings Database error: {}".format(e))
+        _configuration_result(_(u"Database error: %(error)s.", error=e.orig))
 
     config.save()
     if reboot_required:

@@ -1427,10 +1452,10 @@ def _handle_new_user(to_save, content, languages, translations, kobo_support):
         ub.session.rollback()
         log.error("Found an existing account for {} or {}".format(content.name, content.email))
         flash(_("Found an existing account for this e-mail address or name."), category="error")
-    except OperationalError:
+    except OperationalError as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        flash(_("Settings DB is not Writeable"), category="error")
+        log.error_or_exception("Settings Database error: {}".format(e))
+        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
 
 
 def _delete_user(content):

@@ -1489,7 +1514,7 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
         content.role &= ~constants.ROLE_ANONYMOUS
 
     val = [int(k[5:]) for k in to_save if k.startswith('show_')]
-    sidebar = get_sidebar_config()
+    sidebar, __ = get_sidebar_config()
     for element in sidebar:
         value = element['visibility']
         if value in val and not content.check_visibility(value):

@@ -1544,10 +1569,10 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
         ub.session.rollback()
         log.error("An unknown error occurred while changing user: {}".format(str(ex)))
         flash(_(u"An unknown error occurred. Please try again later."), category="error")
-    except OperationalError:
+    except OperationalError as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        flash(_("Settings DB is not Writeable"), category="error")
+        log.error_or_exception("Settings Database error: {}".format(e))
+        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
     return ""
 
 

@@ -1557,7 +1582,7 @@ def _handle_edit_user(to_save, content, languages, translations, kobo_support):
 def new_user():
     content = ub.User()
     languages = calibre_db.speaking_language()
-    translations = [LC('en')] + babel.list_translations()
+    translations = [Locale('en')] + babel.list_translations()
     kobo_support = feature_support['kobo'] and config.config_kobo_sync
     if request.method == "POST":
         to_save = request.form.to_dict()

@@ -1613,10 +1638,10 @@ def update_mailsettings():
     _config_int(to_save, "mail_size", lambda y: int(y)*1024*1024)
     try:
         config.save()
-    except (OperationalError, InvalidRequestError):
+    except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        flash(_("Settings DB is not Writeable"), category="error")
+        log.error_or_exception("Settings Database error: {}".format(e))
+        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
         return edit_mailsettings()
 
     if to_save.get("test"):

@@ -1635,6 +1660,66 @@ def update_mailsettings():
     return edit_mailsettings()
 
 
+@admi.route("/admin/scheduledtasks")
+@login_required
+@admin_required
+def edit_scheduledtasks():
+    content = config.get_scheduled_task_settings()
+    time_field = list()
+    duration_field = list()
+
+    locale = get_locale()
+    for n in range(24):
+        time_field.append((n , format_time(time(hour=n), format="short", locale=locale)))
+    for n in range(5, 65, 5):
+        t = timedelta(hours=n // 60, minutes=n % 60)
+        duration_field.append((n, format_timedelta(t, format="short", threshold=.99, locale=locale)))
+
+    return render_title_template("schedule_edit.html", config=content, starttime=time_field, duration=duration_field, title=_(u"Edit Scheduled Tasks Settings"))
+
+
+@admi.route("/admin/scheduledtasks", methods=["POST"])
+@login_required
+@admin_required
+def update_scheduledtasks():
+    error = False
+    to_save = request.form.to_dict()
+    if 0 <= int(to_save.get("schedule_start_time")) <= 23:
+        _config_int(to_save, "schedule_start_time")
+    else:
+        flash(_(u"Invalid start time for task specified"), category="error")
+        error = True
+    if 0 < int(to_save.get("schedule_duration")) <= 60:
+        _config_int(to_save, "schedule_duration")
+    else:
+        flash(_(u"Invalid duration for task specified"), category="error")
+        error = True
+    _config_checkbox(to_save, "schedule_generate_book_covers")
+    _config_checkbox(to_save, "schedule_generate_series_covers")
+    _config_checkbox(to_save, "schedule_reconnect")
+
+    if not error:
+        try:
+            config.save()
+            flash(_(u"Scheduled tasks settings updated"), category="success")
+
+            # Cancel any running tasks
+            schedule.end_scheduled_tasks()
+
+            # Re-register tasks with new settings
+            schedule.register_scheduled_tasks(config.schedule_reconnect)
+        except IntegrityError:
+            ub.session.rollback()
+            log.error("An unknown error occurred while saving scheduled tasks settings")
+            flash(_(u"An unknown error occurred. Please try again later."), category="error")
+        except OperationalError:
+            ub.session.rollback()
+            log.error("Settings DB is not Writeable")
+            flash(_("Settings DB is not Writeable"), category="error")
+
+    return edit_scheduledtasks()
+
+
 @admi.route("/admin/user/<int:user_id>", methods=["GET", "POST"])
 @login_required
 @admin_required

@@ -1644,7 +1729,7 @@ def edit_user(user_id):
         flash(_(u"User not found"), category="error")
         return redirect(url_for('admin.admin'))
     languages = calibre_db.speaking_language(return_all_languages=True)
-    translations = babel.list_translations() + [LC('en')]
+    translations = babel.list_translations() + [Locale('en')]
     kobo_support = feature_support['kobo'] and config.config_kobo_sync
     if request.method == "POST":
         to_save = request.form.to_dict()

@@ -1848,7 +1933,7 @@ def import_ldap_users():
     try:
         new_users = services.ldap.get_group_members(config.config_ldap_group_name)
     except (services.ldap.LDAPException, TypeError, AttributeError, KeyError) as e:
-        log.debug_or_exception(e)
+        log.error_or_exception(e)
         showtext['text'] = _(u'Error: %(ldaperror)s', ldaperror=e)
         return json.dumps(showtext)
     if not new_users:

@@ -1876,7 +1961,7 @@ def import_ldap_users():
         try:
             user_data = services.ldap.get_object_details(user=user_identifier, query_filter=query_filter)
         except AttributeError as ex:
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
             continue
         if user_data:
             user_count, message = ldap_import_create_user(user, user_data)

@@ -1911,3 +1996,13 @@ def extract_dynamic_field_from_filter(user, filtr):
 def extract_user_identifier(user, filtr):
     dynamic_field = extract_dynamic_field_from_filter(user, filtr)
     return extract_user_data_from_field(user, dynamic_field)
+
+
+@admi.route("/ajax/canceltask", methods=['POST'])
+@login_required
+@admin_required
+def cancel_task():
+    task_id = request.get_json().get('task_id', None)
+    worker = WorkerThread.get_instance()
+    worker.end_task(task_id)
+    return ""
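
The admin page now renders the scheduled-task window with Babel's locale-aware formatters, mirroring the format_time and format_timedelta calls added to admin() above. A small standalone illustration with example values (hour 4, 30 minutes; these are not Calibre-Web defaults):

# Standalone illustration of the Babel calls used in admin() above.
from datetime import time, timedelta
from babel.dates import format_time, format_timedelta

locale = "en"
schedule_start_time = 4   # hour of day; update_scheduledtasks() accepts 0..23
schedule_duration = 30    # minutes; update_scheduledtasks() accepts 1..60

print(format_time(time(hour=schedule_start_time), format="short", locale=locale))
# e.g. "4:00 AM"

t = timedelta(hours=schedule_duration // 60, minutes=schedule_duration % 60)
print(format_timedelta(t, format="short", threshold=.99, locale=locale))
# e.g. "30 min"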

cps/cache_buster.py

@@ -47,13 +47,16 @@ def init_cache_busting(app):
         for filename in filenames:
             # compute version component
             rooted_filename = os.path.join(dirpath, filename)
-            with open(rooted_filename, 'rb') as f:
-                file_hash = hashlib.md5(f.read()).hexdigest()[:7]  # nosec
+            try:
+                with open(rooted_filename, 'rb') as f:
+                    file_hash = hashlib.md5(f.read()).hexdigest()[:7]  # nosec
+                # save version to tables
+                file_path = rooted_filename.replace(static_folder, "")
+                file_path = file_path.replace("\\", "/")  # Convert Windows path to web path
+                hash_table[file_path] = file_hash
+            except PermissionError:
+                log.error("No permission to access {} file.".format(rooted_filename))
 
-            # save version to tables
-            file_path = rooted_filename.replace(static_folder, "")
-            file_path = file_path.replace("\\", "/")  # Convert Windows path to web path
-            hash_table[file_path] = file_hash
     log.debug('Finished computing cache-busting values')
 
 def bust_filename(filename):
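
The cache buster tags every static file with the first seven hex characters of its MD5 digest so changed assets get new URLs; the change above only wraps the read in a try/except so unreadable files are logged instead of aborting startup. A minimal standalone reproduction of that hashing step (the directory name is just an example):

# Minimal reproduction of the hashing loop in init_cache_busting().
import hashlib
import os


def short_hashes(static_folder):
    hashes = {}
    for dirpath, _dirnames, filenames in os.walk(static_folder):
        for filename in filenames:
            rooted_filename = os.path.join(dirpath, filename)
            try:
                with open(rooted_filename, 'rb') as f:
                    file_hash = hashlib.md5(f.read()).hexdigest()[:7]  # nosec
            except PermissionError:
                continue  # the real code logs the path and keeps going
            file_path = rooted_filename.replace(static_folder, "").replace("\\", "/")
            hashes[file_path] = file_hash
    return hashes


print(short_hashes("cps/static"))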

cps/cli.py (19 changed lines)

@@ -24,7 +24,7 @@ import socket
 from .constants import CONFIG_DIR as _CONFIG_DIR
 from .constants import STABLE_VERSION as _STABLE_VERSION
 from .constants import NIGHTLY_VERSION as _NIGHTLY_VERSION
+from .constants import DEFAULT_SETTINGS_FILE, DEFAULT_GDRIVE_FILE
 
 def version_info():
     if _NIGHTLY_VERSION[1].startswith('$Format'):

@@ -51,8 +51,15 @@ parser.add_argument('-d', action='store_true', help='Dry run of updater to check
 parser.add_argument('-r', action='store_true', help='Enable public database reconnect route under /reconnect')
 args = parser.parse_args()
 
-settingspath = args.p or os.path.join(_CONFIG_DIR, "app.db")
-gdpath = args.g or os.path.join(_CONFIG_DIR, "gdrive.db")
+settings_path = args.p or os.path.join(_CONFIG_DIR, DEFAULT_SETTINGS_FILE)
+gd_path = args.g or os.path.join(_CONFIG_DIR, DEFAULT_GDRIVE_FILE)
+
+if os.path.isdir(settings_path):
+    settings_path = os.path.join(settings_path, DEFAULT_SETTINGS_FILE)
+
+if os.path.isdir(gd_path):
+    gd_path = os.path.join(gd_path, DEFAULT_GDRIVE_FILE)
+
 
 # handle and check parameter for ssl encryption
 certfilepath = None

@@ -84,10 +91,14 @@ if args.k == "":
 
 # dry run updater
 dry_run = args.d or None
+# enable reconnect endpoint for docker database reconnect
+reconnect_enable = args.r or os.environ.get("CALIBRE_RECONNECT", None)
 # load covers from localhost
-allow_localhost = args.l or None
+allow_localhost = args.l or os.environ.get("CALIBRE_LOCALHOST", None)
 # handle and check ip address argument
 ip_address = args.i or None
 
 
 if ip_address:
     try:
         # try to parse the given ip address with socket
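
The command-line handling above now accepts a directory for -p and -g and appends the default file name in that case. A short sketch of the same resolution logic; DEFAULT_SETTINGS_FILE presumably equals the old literal "app.db", and the fallback directory below is only an example:

# Sketch of the settings-path resolution added above.
import os

DEFAULT_SETTINGS_FILE = "app.db"  # assumed value of the new constant


def resolve_settings_path(arg_value, config_dir="/config"):
    settings_path = arg_value or os.path.join(config_dir, DEFAULT_SETTINGS_FILE)
    if os.path.isdir(settings_path):
        # -p pointed at a directory, so place the default file inside it
        settings_path = os.path.join(settings_path, DEFAULT_SETTINGS_FILE)
    return settings_path


print(resolve_settings_path(None))      # /config/app.db
print(resolve_settings_path("/data"))   # /data/app.db when /data is a directory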
88	cps/comic.py

@@ -1,7 +1,7 @@
 # -*- coding: utf-8 -*-

 # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
-# Copyright (C) 2018 OzzieIsaacs
+# Copyright (C) 2018-2022 OzzieIsaacs
 #
 # This program is free software: you can redistribute it and/or modify
 # it under the terms of the GNU General Public License as published by
@@ -18,19 +18,16 @@
 import os

-from . import logger, isoLanguages
+from . import logger, isoLanguages, cover
 from .constants import BookMeta

-log = logger.create()
-
 try:
     from wand.image import Image
     use_IM = True
 except (ImportError, RuntimeError) as e:
     use_IM = False

+log = logger.create()

 try:
     from comicapi.comicarchive import ComicArchive, MetaDataStyle
@@ -51,29 +48,8 @@ except (ImportError, LookupError) as e:
     use_rarfile = False
 use_comic_meta = False

-NO_JPEG_EXTENSIONS = ['.png', '.webp', '.bmp']
-COVER_EXTENSIONS = ['.png', '.webp', '.bmp', '.jpg', '.jpeg']
-
-def _cover_processing(tmp_file_name, img, extension):
-    tmp_cover_name = os.path.join(os.path.dirname(tmp_file_name), 'cover.jpg')
-    if extension in NO_JPEG_EXTENSIONS:
-        if use_IM:
-            with Image(blob=img) as imgc:
-                imgc.format = 'jpeg'
-                imgc.transform_colorspace('rgb')
-                imgc.save(filename=tmp_cover_name)
-                return tmp_cover_name
-        else:
-            return None
-    if img:
-        with open(tmp_cover_name, 'wb') as f:
-            f.write(img)
-        return tmp_cover_name
-    else:
-        return None
-
-
-def _extract_Cover_from_archive(original_file_extension, tmp_file_name, rarExecutable):
+def _extract_cover_from_archive(original_file_extension, tmp_file_name, rar_executable):
     cover_data = extension = None
     if original_file_extension.upper() == '.CBZ':
         cf = zipfile.ZipFile(tmp_file_name)
@@ -81,7 +57,7 @@
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
-                if extension in COVER_EXTENSIONS:
+                if extension in cover.COVER_EXTENSIONS:
                     cover_data = cf.read(name)
                     break
     elif original_file_extension.upper() == '.CBT':
@@ -90,44 +66,44 @@
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
-                if extension in COVER_EXTENSIONS:
+                if extension in cover.COVER_EXTENSIONS:
                     cover_data = cf.extractfile(name).read()
                     break
     elif original_file_extension.upper() == '.CBR' and use_rarfile:
         try:
-            rarfile.UNRAR_TOOL = rarExecutable
+            rarfile.UNRAR_TOOL = rar_executable
             cf = rarfile.RarFile(tmp_file_name)
-            for name in cf.getnames():
+            for name in cf.namelist():
                 ext = os.path.splitext(name)
                 if len(ext) > 1:
                     extension = ext[1].lower()
-                    if extension in COVER_EXTENSIONS:
+                    if extension in cover.COVER_EXTENSIONS:
                         cover_data = cf.read(name)
                         break
         except Exception as ex:
-            log.debug('Rarfile failed with error: %s', ex)
+            log.debug('Rarfile failed with error: {}'.format(ex))
     return cover_data, extension


-def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
+def _extract_cover(tmp_file_name, original_file_extension, rar_executable):
     cover_data = extension = None
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_name, rar_exe_path=rarExecutable)
+        archive = ComicArchive(tmp_file_name, rar_exe_path=rar_executable)
         for index, name in enumerate(archive.getPageNameList()):
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
-                if extension in COVER_EXTENSIONS:
+                if extension in cover.COVER_EXTENSIONS:
                     cover_data = archive.getPage(index)
                     break
     else:
-        cover_data, extension = _extract_Cover_from_archive(original_file_extension, tmp_file_name, rarExecutable)
-    return _cover_processing(tmp_file_name, cover_data, extension)
+        cover_data, extension = _extract_cover_from_archive(original_file_extension, tmp_file_name, rar_executable)
+    return cover.cover_processing(tmp_file_name, cover_data, extension)


-def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExecutable):
+def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rar_executable):
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_path, rar_exe_path=rarExecutable)
+        archive = ComicArchive(tmp_file_path, rar_exe_path=rar_executable)
         if archive.seemsToBeAComicArchive():
             if archive.hasMetadata(MetaDataStyle.CIX):
                 style = MetaDataStyle.CIX
@@ -137,34 +113,38 @@
                 style = None

             # if style is not None:
-            loadedMetadata = archive.readMetadata(style)
+            loaded_metadata = archive.readMetadata(style)

-            lang = loadedMetadata.language or ""
-            loadedMetadata.language = isoLanguages.get_lang3(lang)
+            lang = loaded_metadata.language or ""
+            loaded_metadata.language = isoLanguages.get_lang3(lang)

             return BookMeta(
                 file_path=tmp_file_path,
                 extension=original_file_extension,
-                title=loadedMetadata.title or original_file_name,
+                title=loaded_metadata.title or original_file_name,
                 author=" & ".join([credit["person"]
-                                   for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
-                cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
-                description=loadedMetadata.comments or "",
+                                   for credit in loaded_metadata.credits if credit["role"] == "Writer"]) or 'Unknown',
+                cover=_extract_cover(tmp_file_path, original_file_extension, rar_executable),
+                description=loaded_metadata.comments or "",
                 tags="",
-                series=loadedMetadata.series or "",
-                series_id=loadedMetadata.issue or "",
-                languages=loadedMetadata.language,
-                publisher="")
+                series=loaded_metadata.series or "",
+                series_id=loaded_metadata.issue or "",
+                languages=loaded_metadata.language,
+                publisher="",
+                pubdate="",
+                identifiers=[])

     return BookMeta(
         file_path=tmp_file_path,
         extension=original_file_extension,
         title=original_file_name,
         author=u'Unknown',
-        cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
+        cover=_extract_cover(tmp_file_path, original_file_extension, rar_executable),
         description="",
         tags="",
         series="",
         series_id="",
         languages="",
-        publisher="")
+        publisher="",
+        pubdate="",
+        identifiers=[])
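For orientation, a minimal usage sketch of the renamed comic helper; the file names and paths are hypothetical, and an empty rar_executable simply means no unrar binary is configured.

# Hedged sketch: reading metadata (including the cover) from an uploaded comic archive.
from cps import comic

meta = comic.get_comic_info(tmp_file_path="/tmp/upload/my_comic.cbz",   # hypothetical path
                            original_file_name="my_comic",
                            original_file_extension=".cbz",
                            rar_executable="")
# BookMeta now also carries the new pubdate and identifiers fields
print(meta.title, meta.languages, meta.cover, meta.pubdate, meta.identifiers)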
@@ -134,13 +134,19 @@ class _Settings(_Base):
     config_calibre = Column(String)
     config_rarfile_location = Column(String, default=None)
     config_upload_formats = Column(String, default=','.join(constants.EXTENSIONS_UPLOAD))
-    config_unicode_filename =Column(Boolean, default=False)
+    config_unicode_filename = Column(Boolean, default=False)

     config_updatechannel = Column(Integer, default=constants.UPDATE_STABLE)

     config_reverse_proxy_login_header_name = Column(String)
     config_allow_reverse_proxy_header_login = Column(Boolean, default=False)

+    schedule_start_time = Column(Integer, default=4)
+    schedule_duration = Column(Integer, default=10)
+    schedule_generate_book_covers = Column(Boolean, default=False)
+    schedule_generate_series_covers = Column(Boolean, default=False)
+    schedule_reconnect = Column(Boolean, default=False)
+
     def __repr__(self):
         return self.__class__.__name__

@@ -171,7 +177,6 @@ class _ConfigSQL(object):
         if change:
             self.save()

     def _read_from_storage(self):
         if self._settings is None:
             log.debug("_ConfigSQL._read_from_storage")
@@ -255,6 +260,8 @@ class _ConfigSQL(object):
         return bool((self.mail_server != constants.DEFAULT_MAIL_SERVER and self.mail_server_type == 0)
                     or (self.mail_gmail_token != {} and self.mail_server_type == 1))

+    def get_scheduled_task_settings(self):
+        return {k: v for k, v in self.__dict__.items() if k.startswith('schedule_')}
+
     def set_from_dictionary(self, dictionary, field, convertor=None, default=None, encode=None):
         """Possibly updates a field of this object.
@@ -290,7 +297,6 @@ class _ConfigSQL(object):
             storage[k] = v
         return storage

     def load(self):
         '''Load all configuration values from the underlying storage.'''
         s = self._read_from_storage()  # type: _Settings
@@ -411,6 +417,7 @@ def autodetect_calibre_binary():
             return element
     return ""


 def autodetect_unrar_binary():
     if sys.platform == "win32":
         calibre_path = ["C:\\program files\\WinRar\\unRAR.exe",
@@ -422,6 +429,7 @@ def autodetect_unrar_binary():
             return element
     return ""


 def autodetect_kepubify_binary():
     if sys.platform == "win32":
         calibre_path = ["C:\\program files\\kepubify\\kepubify-windows-64Bit.exe",
@@ -433,6 +441,7 @@ def autodetect_kepubify_binary():
             return element
     return ""


 def _migrate_database(session):
     # make sure the table is created, if it does not exist
     _Base.metadata.create_all(session.bind)
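A small sketch of how the new schedule_* settings could be read back as one dictionary through get_scheduled_task_settings(); "config" here stands for the application's loaded _ConfigSQL instance (variable name assumed for illustration).

# Hedged sketch: collecting the new scheduler settings from the loaded config object.
schedule = config.get_scheduled_task_settings()
# e.g. {'schedule_start_time': 4, 'schedule_duration': 10,
#       'schedule_generate_book_covers': False, 'schedule_reconnect': False, ...}
if schedule.get('schedule_generate_book_covers'):
    start_hour = schedule['schedule_start_time']   # default 4 (o'clock)
    max_hours = schedule['schedule_duration']      # default 10 hours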
@@ -23,6 +23,9 @@ from sqlalchemy import __version__ as sql_version

 sqlalchemy_version2 = ([int(x) for x in sql_version.split('.')] >= [2, 0, 0])

+# APP_MODE - production, development, or test
+APP_MODE = os.environ.get('APP_MODE', 'production')
+
 # if installed via pip this variable is set to true (empty file with name .HOMEDIR present)
 HOME_CONFIG = os.path.isfile(os.path.join(os.path.dirname(os.path.abspath(__file__)), '.HOMEDIR'))

@@ -35,6 +38,10 @@ STATIC_DIR = os.path.join(BASE_DIR, 'cps', 'static')
 TEMPLATES_DIR = os.path.join(BASE_DIR, 'cps', 'templates')
 TRANSLATIONS_DIR = os.path.join(BASE_DIR, 'cps', 'translations')

+# Cache dir - use CACHE_DIR environment variable, otherwise use the default directory: cps/cache
+DEFAULT_CACHE_DIR = os.path.join(BASE_DIR, 'cps', 'cache')
+CACHE_DIR = os.environ.get('CACHE_DIR', DEFAULT_CACHE_DIR)
+
 if HOME_CONFIG:
     home_dir = os.path.join(os.path.expanduser("~"), ".calibre-web")
     if not os.path.exists(home_dir):
@@ -43,6 +50,8 @@ if HOME_CONFIG:
 else:
     CONFIG_DIR = os.environ.get('CALIBRE_DBPATH', BASE_DIR)

+DEFAULT_SETTINGS_FILE = "app.db"
+DEFAULT_GDRIVE_FILE = "gdrive.db"

 ROLE_USER = 0 << 0
 ROLE_ADMIN = 1 << 0
@@ -152,9 +161,9 @@ def selected_roles(dictionary):

 # :rtype: BookMeta
 BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
-                                  'series_id, languages, publisher')
+                                  'series_id, languages, publisher, pubdate, identifiers')

-STABLE_VERSION = {'version': '0.6.17 Beta'}
+STABLE_VERSION = {'version': '0.6.19 Beta'}

 NIGHTLY_VERSION = dict()
 NIGHTLY_VERSION[0] = '$Format:%H$'
@@ -162,6 +171,19 @@ NIGHTLY_VERSION[1] = '$Format:%cI$'
 # NIGHTLY_VERSION[0] = 'bb7d2c6273ae4560e83950d36d64533343623a57'
 # NIGHTLY_VERSION[1] = '2018-09-09T10:13:08+02:00'

+# CACHE
+CACHE_TYPE_THUMBNAILS = 'thumbnails'
+
+# Thumbnail Types
+THUMBNAIL_TYPE_COVER = 1
+THUMBNAIL_TYPE_SERIES = 2
+THUMBNAIL_TYPE_AUTHOR = 3
+
+# Thumbnails Sizes
+COVER_THUMBNAIL_ORIGINAL = 0
+COVER_THUMBNAIL_SMALL = 1
+COVER_THUMBNAIL_MEDIUM = 2
+COVER_THUMBNAIL_LARGE = 3
+
 # clean-up the module namespace
 del sys, os, namedtuple
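A hedged illustration of how the new cache constants could be combined into a per-resolution thumbnail path; the file-naming scheme shown here is only an assumption for illustration, not necessarily the scheme the thumbnail generator uses.

# Hypothetical sketch only: composing a cache location from the new constants.
import os
from cps import constants

def thumbnail_cache_path(book_id, resolution=constants.COVER_THUMBNAIL_SMALL):
    # CACHE_DIR honours the CACHE_DIR environment variable, otherwise cps/cache
    folder = os.path.join(constants.CACHE_DIR, constants.CACHE_TYPE_THUMBNAILS)
    # the name format below is an assumption, used here only to show the pieces fitting together
    return os.path.join(folder, "book_{}_{}.jpg".format(book_id, resolution))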
@@ -18,7 +18,8 @@

 import os
 import re
-from flask_babel import gettext as _
+from flask_babel import lazy_gettext as N_

 from . import config, logger
 from .subproc_wrapper import process_wait
@@ -26,10 +27,9 @@ from .subproc_wrapper import process_wait

 log = logger.create()

-# _() necessary to make babel aware of string for translation
-_NOT_CONFIGURED = _('not configured')
-_NOT_INSTALLED = _('not installed')
-_EXECUTION_ERROR = _('Execution permissions missing')
+# strings getting translated when used
+_NOT_INSTALLED = N_('not installed')
+_EXECUTION_ERROR = N_('Execution permissions missing')


 def _get_command_version(path, pattern, argument=None):
@@ -48,14 +48,16 @@ def _get_command_version(path, pattern, argument=None):


 def get_calibre_version():
-    return _get_command_version(config.config_converterpath, r'ebook-convert.*\(calibre', '--version') \
-        or _NOT_CONFIGURED
+    return _get_command_version(config.config_converterpath, r'ebook-convert.*\(calibre', '--version')


 def get_unrar_version():
-    return _get_command_version(config.config_rarfile_location, r'UNRAR.*\d') or _NOT_CONFIGURED
+    unrar_version = _get_command_version(config.config_rarfile_location, r'UNRAR.*\d')
+    if unrar_version == "not installed":
+        unrar_version = _get_command_version(config.config_rarfile_location, r'unrar.*\d','-V')
+    return unrar_version


 def get_kepubify_version():
-    return _get_command_version(config.config_kepubifypath, r'kepubify\s','--version') or _NOT_CONFIGURED
+    return _get_command_version(config.config_kepubifypath, r'kepubify\s','--version')
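The switch from gettext to lazy_gettext matters because these strings are created at import time, before any request or locale exists; a minimal sketch of the difference, independent of the rest of Calibre-Web.

# Hedged sketch: why module-level strings use lazy_gettext (N_) instead of gettext (_).
from flask_babel import gettext as _, lazy_gettext as N_

EAGER = _('not installed')   # resolved once, at import time, before a user locale is known
LAZY = N_('not installed')   # a lazy proxy, resolved each time it is rendered

# str(LAZY) evaluated inside a request context picks up the current user's locale.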
48	cps/cover.py (new file)

@@ -0,0 +1,48 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2022 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+import os
+
+try:
+    from wand.image import Image
+    use_IM = True
+except (ImportError, RuntimeError) as e:
+    use_IM = False
+
+
+NO_JPEG_EXTENSIONS = ['.png', '.webp', '.bmp']
+COVER_EXTENSIONS = ['.png', '.webp', '.bmp', '.jpg', '.jpeg']
+
+
+def cover_processing(tmp_file_name, img, extension):
+    tmp_cover_name = os.path.join(os.path.dirname(tmp_file_name), 'cover.jpg')
+    if extension in NO_JPEG_EXTENSIONS:
+        if use_IM:
+            with Image(blob=img) as imgc:
+                imgc.format = 'jpeg'
+                imgc.transform_colorspace('rgb')
+                imgc.save(filename=tmp_cover_name)
+                return tmp_cover_name
+        else:
+            return None
+    if img:
+        with open(tmp_cover_name, 'wb') as f:
+            f.write(img)
+        return tmp_cover_name
+    else:
+        return None
257	cps/db.py

@@ -17,14 +17,15 @@
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.

-import sys
 import os
 import re
 import ast
 import json
 from datetime import datetime
 from urllib.parse import quote
+import unidecode

+from sqlite3 import OperationalError as sqliteOperationalError
 from sqlalchemy import create_engine
 from sqlalchemy import Table, Column, ForeignKey, CheckConstraint
 from sqlalchemy import String, Integer, Boolean, TIMESTAMP, Float
@@ -49,11 +50,6 @@ from .pagination import Pagination

 from weakref import WeakSet

-try:
-    import unidecode
-    use_unidecode = True
-except ImportError:
-    use_unidecode = False
-
 log = logger.create()

@@ -93,7 +89,7 @@ books_publishers_link = Table('books_publishers_link', Base.metadata,
                               )


-class Library_Id(Base):
+class LibraryId(Base):
     __tablename__ = 'library_id'
     id = Column(Integer, primary_key=True)
     uuid = Column(String, nullable=False)
@@ -112,7 +108,7 @@ class Identifiers(Base):
         self.type = id_type
         self.book = book

-    def formatType(self):
+    def format_type(self):
         format_type = self.type.lower()
         if format_type == 'amazon':
             return u"Amazon"
@@ -184,8 +180,8 @@ class Comments(Base):
     book = Column(Integer, ForeignKey('books.id'), nullable=False, unique=True)
     text = Column(String(collation='NOCASE'), nullable=False)

-    def __init__(self, text, book):
-        self.text = text
+    def __init__(self, comment, book):
+        self.text = comment
         self.book = book

     def get(self):
@@ -367,7 +363,6 @@ class Books(Base):
         self.path = path
         self.has_cover = (has_cover != None)

     def __repr__(self):
         return u"<Books('{0},{1}{2}{3}{4}{5}{6}{7}{8}')>".format(self.title, self.sort, self.author_sort,
                                                                  self.timestamp, self.pubdate, self.series_index,
@@ -375,10 +370,10 @@ class Books(Base):

     @property
     def atom_timestamp(self):
-        return (self.timestamp.strftime('%Y-%m-%dT%H:%M:%S+00:00') or '')
+        return self.timestamp.strftime('%Y-%m-%dT%H:%M:%S+00:00') or ''


-class Custom_Columns(Base):
+class CustomColumns(Base):
     __tablename__ = 'custom_columns'

     id = Column(Integer, primary_key=True)
@@ -436,7 +431,7 @@ class AlchemyEncoder(json.JSONEncoder):
         return json.JSONEncoder.default(self, o)


-class CalibreDB():
+class CalibreDB:
     _init = False
     engine = None
     config = None
@@ -450,17 +445,17 @@ class CalibreDB():
         """
         self.session = None
         if self._init:
-            self.initSession(expire_on_commit)
+            self.init_session(expire_on_commit)

         self.instances.add(self)

-    def initSession(self, expire_on_commit=True):
+    def init_session(self, expire_on_commit=True):
         self.session = self.session_factory()
         self.session.expire_on_commit = expire_on_commit
         self.update_title_sort(self.config)

     @classmethod
-    def setup_db_cc_classes(self, cc):
+    def setup_db_cc_classes(cls, cc):
         cc_ids = []
         books_custom_column_links = {}
         for row in cc:
@@ -539,16 +534,16 @@ class CalibreDB():
             return False, False
         try:
             check_engine = create_engine('sqlite://',
                                          echo=False,
                                          isolation_level="SERIALIZABLE",
                                          connect_args={'check_same_thread': False},
                                          poolclass=StaticPool)
             with check_engine.begin() as connection:
                 connection.execute(text("attach database '{}' as calibre;".format(dbpath)))
                 connection.execute(text("attach database '{}' as app_settings;".format(app_db_path)))
                 local_session = scoped_session(sessionmaker())
                 local_session.configure(bind=connection)
-                database_uuid = local_session().query(Library_Id).one_or_none()
+                database_uuid = local_session().query(LibraryId).one_or_none()
                 # local_session.dispose()

             check_engine.connect()
@@ -597,13 +592,14 @@ class CalibreDB():
                 cc = conn.execute(text("SELECT id, datatype FROM custom_columns"))
                 cls.setup_db_cc_classes(cc)
         except OperationalError as e:
-            log.debug_or_exception(e)
+            log.error_or_exception(e)
+            return False

         cls.session_factory = scoped_session(sessionmaker(autocommit=False,
                                                           autoflush=True,
                                                           bind=cls.engine))
         for inst in cls.instances:
-            inst.initSession()
+            inst.init_session()

         cls._init = True
         return True
@@ -626,8 +622,8 @@ class CalibreDB():
             bd = (self.session.query(Books, read_column.value, ub.ArchivedBook.is_archived).select_from(Books)
                   .join(read_column, read_column.book == book_id,
                         isouter=True))
-        except (KeyError, AttributeError):
-            log.error("Custom Column No.%d is not existing in calibre database", read_column)
+        except (KeyError, AttributeError, IndexError):
+            log.error("Custom Column No.{} is not existing in calibre database".format(read_column))
             # Skip linking read column and return None instead of read status
             bd = self.session.query(Books, None, ub.ArchivedBook.is_archived)
         return (bd.filter(Books.id == book_id)
@@ -644,12 +640,10 @@ class CalibreDB():
     # Language and content filters for displaying in the UI
     def common_filters(self, allow_show_archived=False, return_all_languages=False):
         if not allow_show_archived:
-            archived_books = (
-                ub.session.query(ub.ArchivedBook)
-                .filter(ub.ArchivedBook.user_id == int(current_user.id))
-                .filter(ub.ArchivedBook.is_archived == True)
-                .all()
-            )
+            archived_books = (ub.session.query(ub.ArchivedBook)
+                              .filter(ub.ArchivedBook.user_id == int(current_user.id))
+                              .filter(ub.ArchivedBook.is_archived == True)
+                              .all())
             archived_book_ids = [archived_book.book_id for archived_book in archived_books]
             archived_filter = Books.id.notin_(archived_book_ids)
         else:
@@ -668,16 +662,16 @@ class CalibreDB():
                 pos_cc_list = current_user.allowed_column_value.split(',')
                 pos_content_cc_filter = true() if pos_cc_list == [''] else \
                     getattr(Books, 'custom_column_' + str(self.config.config_restricted_column)). \
                     any(cc_classes[self.config.config_restricted_column].value.in_(pos_cc_list))
                 neg_cc_list = current_user.denied_column_value.split(',')
                 neg_content_cc_filter = false() if neg_cc_list == [''] else \
                     getattr(Books, 'custom_column_' + str(self.config.config_restricted_column)). \
                     any(cc_classes[self.config.config_restricted_column].value.in_(neg_cc_list))
-            except (KeyError, AttributeError):
+            except (KeyError, AttributeError, IndexError):
                 pos_content_cc_filter = false()
                 neg_content_cc_filter = true()
-                log.error(u"Custom Column No.%d is not existing in calibre database",
-                          self.config.config_restricted_column)
+                log.error("Custom Column No.{} is not existing in calibre database".format(
+                    self.config.config_restricted_column))
                 flash(_("Custom Column No.%(column)d is not existing in calibre database",
                         column=self.config.config_restricted_column),
                       category="error")
@@ -688,6 +682,25 @@ class CalibreDB():
         return and_(lang_filter, pos_content_tags_filter, ~neg_content_tags_filter,
                     pos_content_cc_filter, ~neg_content_cc_filter, archived_filter)

+    def generate_linked_query(self, config_read_column, database):
+        if not config_read_column:
+            query = (self.session.query(database, ub.ArchivedBook.is_archived, ub.ReadBook.read_status)
+                     .select_from(Books)
+                     .outerjoin(ub.ReadBook,
+                                and_(ub.ReadBook.user_id == int(current_user.id), ub.ReadBook.book_id == Books.id)))
+        else:
+            try:
+                read_column = cc_classes[config_read_column]
+                query = (self.session.query(database, ub.ArchivedBook.is_archived, read_column.value)
+                         .select_from(Books)
+                         .outerjoin(read_column, read_column.book == Books.id))
+            except (KeyError, AttributeError, IndexError):
+                log.error("Custom Column No.{} is not existing in calibre database".format(config_read_column))
+                # Skip linking read column and return None instead of read status
+                query = self.session.query(database, None, ub.ArchivedBook.is_archived)
+        return query.outerjoin(ub.ArchivedBook, and_(Books.id == ub.ArchivedBook.book_id,
+                                                     int(current_user.id) == ub.ArchivedBook.user_id))
+
     @staticmethod
     def get_checkbox_sorted(inputlist, state, offset, limit, order, combo=False):
         outcome = list()
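A hedged sketch of the new shared helper in use. It assumes CalibreDB.setup_db() has already run, the code executes inside a request with a logged-in user (current_user), and config_read_column is the id of a custom "read" column (0 or None when the built-in read flag is used).

# Hedged sketch: one joined query returning book, archived flag and read state together.
from cps.db import CalibreDB, Books

calibre_db = CalibreDB()
query = calibre_db.generate_linked_query(config_read_column=0, database=Books)
for book, is_archived, read_state in query.limit(5):
    print(book.title, is_archived, read_state)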
@@ -717,30 +730,14 @@ class CalibreDB():
                             join_archive_read, config_read_column, *join):
         pagesize = pagesize or self.config.config_books_per_page
         if current_user.show_detail_random():
-            randm = self.session.query(Books) \
-                .filter(self.common_filters(allow_show_archived)) \
-                .order_by(func.random()) \
-                .limit(self.config.config_random_books).all()
+            random_query = self.generate_linked_query(config_read_column, database)
+            randm = (random_query.filter(self.common_filters(allow_show_archived))
+                     .order_by(func.random())
+                     .limit(self.config.config_random_books).all())
         else:
             randm = false()
         if join_archive_read:
-            if not config_read_column:
-                query = (self.session.query(database, ub.ReadBook.read_status, ub.ArchivedBook.is_archived)
-                         .select_from(Books)
-                         .outerjoin(ub.ReadBook,
-                                    and_(ub.ReadBook.user_id == int(current_user.id), ub.ReadBook.book_id == Books.id)))
-            else:
-                try:
-                    read_column = cc_classes[config_read_column]
-                    query = (self.session.query(database, read_column.value, ub.ArchivedBook.is_archived)
-                             .select_from(Books)
-                             .outerjoin(read_column, read_column.book == Books.id))
-                except (KeyError, AttributeError):
-                    log.error("Custom Column No.%d is not existing in calibre database", read_column)
-                    # Skip linking read column and return None instead of read status
-                    query =self.session.query(database, None, ub.ArchivedBook.is_archived)
-            query = query.outerjoin(ub.ArchivedBook, and_(Books.id == ub.ArchivedBook.book_id,
-                                                          int(current_user.id) == ub.ArchivedBook.user_id))
+            query = self.generate_linked_query(config_read_column, database)
         else:
             query = self.session.query(database)
         off = int(int(pagesize) * (page - 1))
@@ -769,13 +766,15 @@ class CalibreDB():
                                     len(query.all()))
             entries = query.order_by(*order).offset(off).limit(pagesize).all()
         except Exception as ex:
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
         # display authors in right order
         entries = self.order_authors(entries, True, join_archive_read)
         return entries, randm, pagination

     # Orders all Authors in the list according to authors sort
     def order_authors(self, entries, list_return=False, combined=False):
+        # entries_copy = copy.deepcopy(entries)
+        # entries_copy =[]
         for entry in entries:
             if combined:
                 sort_authors = entry.Books.author_sort.split('&')
@@ -785,25 +784,30 @@ class CalibreDB():
                 sort_authors = entry.author_sort.split('&')
             ids = [a.id for a in entry.authors]
             authors_ordered = list()
-            error = False
+            # error = False
             for auth in sort_authors:
                 results = self.session.query(Authors).filter(Authors.sort == auth.lstrip().strip()).all()
-                # ToDo: How to handle not found authorname
+                # ToDo: How to handle not found author name
                 if not len(results):
-                    error = True
+                    log.error("Author {} not found to display name in right order".format(auth.strip()))
+                    # error = True
                     break
                 for r in results:
                     if r.id in ids:
                         authors_ordered.append(r)
-            if not error:
+                        ids.remove(r.id)
+            for author_id in ids:
+                result = self.session.query(Authors).filter(Authors.id == author_id).first()
+                authors_ordered.append(result)
+
+            if list_return:
                 if combined:
                     entry.Books.authors = authors_ordered
                 else:
-                    entry.authors = authors_ordered
-        if list_return:
-            return entries
-        else:
-            return authors_ordered
+                    entry.ordered_authors = authors_ordered
+            else:
+                return authors_ordered
+        return entries

     def get_typeahead(self, database, query, replace=('', ''), tag_filter=true()):
         query = query or ''
@@ -817,36 +821,21 @@ class CalibreDB():
     def check_exists_book(self, authr, title):
         self.session.connection().connection.connection.create_function("lower", 1, lcase)
         q = list()
-        authorterms = re.split(r'\s*&\s*', authr)
-        for authorterm in authorterms:
-            q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + authorterm + "%")))
+        author_terms = re.split(r'\s*&\s*', authr)
+        for author_term in author_terms:
+            q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + author_term + "%")))

         return self.session.query(Books) \
             .filter(and_(Books.authors.any(and_(*q)), func.lower(Books.title).ilike("%" + title + "%"))).first()

-    def search_query(self, term, config_read_column, *join):
+    def search_query(self, term, config, *join):
         term.strip().lower()
         self.session.connection().connection.connection.create_function("lower", 1, lcase)
         q = list()
-        authorterms = re.split("[, ]+", term)
-        for authorterm in authorterms:
-            q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + authorterm + "%")))
-        if not config_read_column:
-            query = (self.session.query(Books, ub.ArchivedBook.is_archived, ub.ReadBook).select_from(Books)
-                     .outerjoin(ub.ReadBook, and_(Books.id == ub.ReadBook.book_id,
-                                                  int(current_user.id) == ub.ReadBook.user_id)))
-        else:
-            try:
-                read_column = cc_classes[config_read_column]
-                query = (self.session.query(Books, ub.ArchivedBook.is_archived, read_column.value).select_from(Books)
-                         .outerjoin(read_column, read_column.book == Books.id))
-            except (KeyError, AttributeError):
-                log.error("Custom Column No.%d is not existing in calibre database", config_read_column)
-                # Skip linking read column
-                query = self.session.query(Books, ub.ArchivedBook.is_archived, None)
-            query = query.outerjoin(ub.ArchivedBook, and_(Books.id == ub.ArchivedBook.book_id,
-                                                          int(current_user.id) == ub.ArchivedBook.user_id))
+        author_terms = re.split("[, ]+", term)
+        for author_term in author_terms:
+            q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + author_term + "%")))
+        query = self.generate_linked_query(config.config_read_column, Books)

         if len(join) == 6:
             query = query.outerjoin(join[0], join[1]).outerjoin(join[2]).outerjoin(join[3], join[4]).outerjoin(join[5])
         if len(join) == 3:
@@ -855,20 +844,42 @@ class CalibreDB():
             query = query.outerjoin(join[0], join[1])
         elif len(join) == 1:
             query = query.outerjoin(join[0])
-        return query.filter(self.common_filters(True)).filter(
-            or_(Books.tags.any(func.lower(Tags.name).ilike("%" + term + "%")),
-                Books.series.any(func.lower(Series.name).ilike("%" + term + "%")),
-                Books.authors.any(and_(*q)),
-                Books.publishers.any(func.lower(Publishers.name).ilike("%" + term + "%")),
-                func.lower(Books.title).ilike("%" + term + "%")
-                ))
+
+        cc = self.get_cc_columns(config, filter_config_custom_read=True)
+        filter_expression = [Books.tags.any(func.lower(Tags.name).ilike("%" + term + "%")),
+                             Books.series.any(func.lower(Series.name).ilike("%" + term + "%")),
+                             Books.authors.any(and_(*q)),
+                             Books.publishers.any(func.lower(Publishers.name).ilike("%" + term + "%")),
+                             func.lower(Books.title).ilike("%" + term + "%")]
+        for c in cc:
+            if c.datatype not in ["datetime", "rating", "bool", "int", "float"]:
+                filter_expression.append(
+                    getattr(Books,
+                            'custom_column_' + str(c.id)).any(
+                        func.lower(cc_classes[c.id].value).ilike("%" + term + "%")))
+        return query.filter(self.common_filters(True)).filter(or_(*filter_expression))
+
+    def get_cc_columns(self, config, filter_config_custom_read=False):
+        tmp_cc = self.session.query(CustomColumns).filter(CustomColumns.datatype.notin_(cc_exceptions)).all()
+        cc = []
+        r = None
+        if config.config_columns_to_ignore:
+            r = re.compile(config.config_columns_to_ignore)
+
+        for col in tmp_cc:
+            if filter_config_custom_read and config.config_read_column and config.config_read_column == col.id:
+                continue
+            if r and r.match(col.name):
+                continue
+            cc.append(col)
+
+        return cc

     # read search results from calibre-database and return it (function is used for feed and simple search
-    def get_search_results(self, term, offset=None, order=None, limit=None, allow_show_archived=False,
-                           config_read_column=False, *join):
+    def get_search_results(self, term, config, offset=None, order=None, limit=None, *join):
         order = order[0] if order else [Books.sort]
         pagination = None
-        result = self.search_query(term, config_read_column, *join).order_by(*order).all()
+        result = self.search_query(term, config, *join).order_by(*order).all()
         result_count = len(result)
         if offset != None and limit != None:
             offset = int(offset)
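With get_cc_columns() in place, the simple search also matches text-type custom columns. A hedged sketch of the changed call signature follows; "config" is assumed to be the global _ConfigSQL instance, and the three-value return (entries, count, pagination) mirrors how the rest of the code base unpacks it.

# Hedged sketch: the search API now takes the config object instead of a bare read-column id.
entries, result_count, pagination = calibre_db.get_search_results(
    term="dune",
    config=config,    # supplies config_read_column and config_columns_to_ignore
    offset=0,
    limit=25)
print(result_count)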
|
@ -893,9 +904,20 @@ class CalibreDB():
|
||||||
.join(books_languages_link).join(Books)\
|
.join(books_languages_link).join(Books)\
|
||||||
.filter(self.common_filters(return_all_languages=return_all_languages)) \
|
.filter(self.common_filters(return_all_languages=return_all_languages)) \
|
||||||
.group_by(text('books_languages_link.lang_code')).all()
|
.group_by(text('books_languages_link.lang_code')).all()
|
||||||
|
tags = list()
|
||||||
for lang in languages:
|
for lang in languages:
|
||||||
lang[0].name = isoLanguages.get_language_name(get_locale(), lang[0].lang_code)
|
tag = Category(isoLanguages.get_language_name(get_locale(), lang[0].lang_code), lang[0].lang_code)
|
||||||
return sorted(languages, key=lambda x: x[0].name, reverse=reverse_order)
|
tags.append([tag, lang[1]])
|
||||||
|
# Append all books without language to list
|
||||||
|
if not return_all_languages:
|
||||||
|
no_lang_count = (self.session.query(Books)
|
||||||
|
.outerjoin(books_languages_link).outerjoin(Languages)
|
||||||
|
.filter(Languages.lang_code == None)
|
||||||
|
.filter(self.common_filters())
|
||||||
|
.count())
|
||||||
|
if no_lang_count:
|
||||||
|
tags.append([Category(_("None"), "none"), no_lang_count])
|
||||||
|
return sorted(tags, key=lambda x: x[0].name, reverse=reverse_order)
|
||||||
else:
|
else:
|
||||||
if not languages:
|
if not languages:
|
||||||
languages = self.session.query(Languages) \
|
languages = self.session.query(Languages) \
|
||||||
|
@ -907,7 +929,6 @@ class CalibreDB():
|
||||||
lang.name = isoLanguages.get_language_name(get_locale(), lang.lang_code)
|
lang.name = isoLanguages.get_language_name(get_locale(), lang.lang_code)
|
||||||
return sorted(languages, key=lambda x: x.name, reverse=reverse_order)
|
return sorted(languages, key=lambda x: x.name, reverse=reverse_order)
|
||||||
|
|
||||||
|
|
||||||
def update_title_sort(self, config, conn=None):
|
def update_title_sort(self, config, conn=None):
|
||||||
# user defined sort function for calibre databases (Series, etc.)
|
# user defined sort function for calibre databases (Series, etc.)
|
||||||
def _title_sort(title):
|
def _title_sort(title):
|
||||||
|
@ -920,7 +941,10 @@ class CalibreDB():
|
||||||
return title.strip()
|
return title.strip()
|
||||||
|
|
||||||
conn = conn or self.session.connection().connection.connection
|
conn = conn or self.session.connection().connection.connection
|
||||||
conn.create_function("title_sort", 1, _title_sort)
|
try:
|
||||||
|
conn.create_function("title_sort", 1, _title_sort)
|
||||||
|
except sqliteOperationalError:
|
||||||
|
pass
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def dispose(cls):
|
def dispose(cls):
|
||||||
|
@ -965,6 +989,25 @@ def lcase(s):
|
||||||
try:
|
try:
|
||||||
return unidecode.unidecode(s.lower())
|
return unidecode.unidecode(s.lower())
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
log = logger.create()
|
_log = logger.create()
|
||||||
log.debug_or_exception(ex)
|
_log.error_or_exception(ex)
|
||||||
return s.lower()
|
return s.lower()
|
||||||
|
|
||||||
|
|
||||||
|
class Category:
|
||||||
|
name = None
|
||||||
|
id = None
|
||||||
|
count = None
|
||||||
|
rating = None
|
||||||
|
|
||||||
|
def __init__(self, name, cat_id, rating=None):
|
||||||
|
self.name = name
|
||||||
|
self.id = cat_id
|
||||||
|
self.rating = rating
|
||||||
|
self.count = 1
|
||||||
|
|
||||||
|
'''class Count:
|
||||||
|
count = None
|
||||||
|
|
||||||
|
def __init__(self, count):
|
||||||
|
self.count = count'''
|
||||||
|
|
|
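A sketch of how the new Category value object is used by the language listing above: each language becomes a (Category, count) pair, with a synthetic "None" bucket for books that carry no language entry. The concrete values below are illustrative only.

# Hedged sketch: the shape of the data the language listing produces after this change.
from cps.db import Category

english = Category("English", "eng")           # display name plus the key used in links
tags = [[english, 42]]                         # (Category, number of books) pairs
tags.append([Category("None", "none"), 3])     # books without any language entry
tags.sort(key=lambda x: x[0].name)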
@@ -5,7 +5,7 @@ import json

 from .constants import BASE_DIR
 try:
-    from importlib_metadata import version
+    from importlib.metadata import version
     importlib = True
     ImportNotFound = BaseException
 except ImportError:
751	cps/editbooks.py
(File diff suppressed because it is too large)
71
cps/epub.py
71
cps/epub.py
|
@ -20,23 +20,26 @@ import os
|
||||||
import zipfile
|
import zipfile
|
||||||
from lxml import etree
|
from lxml import etree
|
||||||
|
|
||||||
from . import isoLanguages
|
from . import isoLanguages, cover
|
||||||
from .helper import split_authors
|
from .helper import split_authors
|
||||||
from .constants import BookMeta
|
from .constants import BookMeta
|
||||||
|
|
||||||
|
|
||||||
def extract_cover(zip_file, cover_file, cover_path, tmp_file_name):
|
def _extract_cover(zip_file, cover_file, cover_path, tmp_file_name):
|
||||||
if cover_file is None:
|
if cover_file is None:
|
||||||
return None
|
return None
|
||||||
else:
|
else:
|
||||||
|
cf = extension = None
|
||||||
zip_cover_path = os.path.join(cover_path, cover_file).replace('\\', '/')
|
zip_cover_path = os.path.join(cover_path, cover_file).replace('\\', '/')
|
||||||
cf = zip_file.read(zip_cover_path)
|
|
||||||
prefix = os.path.splitext(tmp_file_name)[0]
|
prefix = os.path.splitext(tmp_file_name)[0]
|
||||||
tmp_cover_name = prefix + '.' + os.path.basename(zip_cover_path)
|
tmp_cover_name = prefix + '.' + os.path.basename(zip_cover_path)
|
||||||
image = open(tmp_cover_name, 'wb')
|
ext = os.path.splitext(tmp_cover_name)
|
||||||
image.write(cf)
|
if len(ext) > 1:
|
||||||
image.close()
|
extension = ext[1].lower()
|
||||||
return tmp_cover_name
|
if extension in cover.COVER_EXTENSIONS:
|
||||||
|
cf = zip_file.read(zip_cover_path)
|
||||||
|
return cover.cover_processing(tmp_file_name, cf, extension)
|
||||||
|
|
||||||
|
|
||||||
def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
|
def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
|
||||||
|
@ -50,31 +53,39 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
|
||||||
|
|
||||||
txt = epub_zip.read('META-INF/container.xml')
|
txt = epub_zip.read('META-INF/container.xml')
|
||||||
tree = etree.fromstring(txt)
|
tree = etree.fromstring(txt)
|
||||||
cfname = tree.xpath('n:rootfiles/n:rootfile/@full-path', namespaces=ns)[0]
|
cf_name = tree.xpath('n:rootfiles/n:rootfile/@full-path', namespaces=ns)[0]
|
||||||
cf = epub_zip.read(cfname)
|
cf = epub_zip.read(cf_name)
|
||||||
tree = etree.fromstring(cf)
|
tree = etree.fromstring(cf)
|
||||||
|
|
||||||
coverpath = os.path.dirname(cfname)
|
cover_path = os.path.dirname(cf_name)
|
||||||
|
|
||||||
p = tree.xpath('/pkg:package/pkg:metadata', namespaces=ns)[0]
|
p = tree.xpath('/pkg:package/pkg:metadata', namespaces=ns)[0]
|
||||||
|
|
||||||
epub_metadata = {}
|
epub_metadata = {}
|
||||||
|
|
||||||
for s in ['title', 'description', 'creator', 'language', 'subject']:
|
for s in ['title', 'description', 'creator', 'language', 'subject', 'publisher', 'date']:
|
||||||
tmp = p.xpath('dc:%s/text()' % s, namespaces=ns)
|
tmp = p.xpath('dc:%s/text()' % s, namespaces=ns)
|
||||||
if len(tmp) > 0:
|
if len(tmp) > 0:
|
||||||
if s == 'creator':
|
if s == 'creator':
|
||||||
epub_metadata[s] = ' & '.join(split_authors(tmp))
|
epub_metadata[s] = ' & '.join(split_authors(tmp))
|
||||||
elif s == 'subject':
|
elif s == 'subject':
|
||||||
epub_metadata[s] = ', '.join(tmp)
|
epub_metadata[s] = ', '.join(tmp)
|
||||||
|
elif s == 'date':
|
||||||
|
epub_metadata[s] = tmp[0][:10]
|
||||||
else:
|
else:
|
||||||
epub_metadata[s] = tmp[0]
|
epub_metadata[s] = tmp[0]
|
||||||
else:
|
else:
|
||||||
epub_metadata[s] = u'Unknown'
|
epub_metadata[s] = 'Unknown'
|
||||||
|
|
||||||
if epub_metadata['subject'] == u'Unknown':
|
if epub_metadata['subject'] == 'Unknown':
|
||||||
epub_metadata['subject'] = ''
|
epub_metadata['subject'] = ''
|
||||||
|
|
||||||
|
if epub_metadata['publisher'] == u'Unknown':
|
||||||
|
epub_metadata['publisher'] = ''
|
||||||
|
|
||||||
|
if epub_metadata['date'] == u'Unknown':
|
||||||
|
epub_metadata['date'] = ''
|
||||||
|
|
||||||
if epub_metadata['description'] == u'Unknown':
|
if epub_metadata['description'] == u'Unknown':
|
||||||
description = tree.xpath("//*[local-name() = 'description']/text()")
|
description = tree.xpath("//*[local-name() = 'description']/text()")
|
||||||
if len(description) > 0:
|
if len(description) > 0:
|
||||||
|
@@ -87,7 +98,15 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
     epub_metadata = parse_epub_series(ns, tree, epub_metadata)

-    cover_file = parse_epub_cover(ns, tree, epub_zip, coverpath, tmp_file_path)
+    cover_file = parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path)
+
+    identifiers = []
+    for node in p.xpath('dc:identifier', namespaces=ns):
+        identifier_name=node.attrib.values()[-1];
+        identifier_value=node.text;
+        if identifier_name in ('uuid','calibre'):
+            continue;
+        identifiers.append( [identifier_name, identifier_value] )

     if not epub_metadata['title']:
         title = original_file_name

@@ -105,15 +124,20 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
                     series=epub_metadata['series'].encode('utf-8').decode('utf-8'),
                     series_id=epub_metadata['series_id'].encode('utf-8').decode('utf-8'),
                     languages=epub_metadata['language'],
-                    publisher="")
+                    publisher=epub_metadata['publisher'].encode('utf-8').decode('utf-8'),
+                    pubdate=epub_metadata['date'],
+                    identifiers=identifiers)


 def parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path):
     cover_section = tree.xpath("/pkg:package/pkg:manifest/pkg:item[@id='cover-image']/@href", namespaces=ns)
     cover_file = None
-    if len(cover_section) > 0:
-        cover_file = extract_cover(epub_zip, cover_section[0], cover_path, tmp_file_path)
-    else:
+    # if len(cover_section) > 0:
+    for cs in cover_section:
+        cover_file = _extract_cover(epub_zip, cs, cover_path, tmp_file_path)
+        if cover_file:
+            break
+    if not cover_file:
         meta_cover = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='cover']/@content", namespaces=ns)
         if len(meta_cover) > 0:
             cover_section = tree.xpath(

@@ -123,10 +147,10 @@ def parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path):
                 "/pkg:package/pkg:manifest/pkg:item[@properties='" + meta_cover[0] + "']/@href", namespaces=ns)
         else:
             cover_section = tree.xpath("/pkg:package/pkg:guide/pkg:reference/@href", namespaces=ns)
-        if len(cover_section) > 0:
-            filetype = cover_section[0].rsplit('.', 1)[-1]
+        for cs in cover_section:
+            filetype = cs.rsplit('.', 1)[-1]
             if filetype == "xhtml" or filetype == "html":  # if cover is (x)html format
-                markup = epub_zip.read(os.path.join(cover_path, cover_section[0]))
+                markup = epub_zip.read(os.path.join(cover_path, cs))
                 markup_tree = etree.fromstring(markup)
                 # no matter xhtml or html with no namespace
                 img_src = markup_tree.xpath("//*[local-name() = 'img']/@src")

@@ -137,9 +161,10 @@ def parse_epub_cover(ns, tree, epub_zip, cover_path, tmp_file_path):
                 # img_src maybe start with "../"" so fullpath join then relpath to cwd
                 filename = os.path.relpath(os.path.join(os.path.dirname(os.path.join(cover_path, cover_section[0])),
                                                         img_src[0]))
-                cover_file = extract_cover(epub_zip, filename, "", tmp_file_path)
+                cover_file = _extract_cover(epub_zip, filename, "", tmp_file_path)
             else:
-                cover_file = extract_cover(epub_zip, cover_section[0], cover_path, tmp_file_path)
+                cover_file = _extract_cover(epub_zip, cs, cover_path, tmp_file_path)
+            if cover_file: break
     return cover_file
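Note: the new identifier extraction above takes the last attribute value of each dc:identifier node as the scheme name and skips the internal uuid/calibre entries. A minimal, self-contained illustration of that behaviour follows; the OPF fragment and namespace map are made up for the example and are not part of the commit.

    from lxml import etree

    # Hypothetical OPF metadata fragment; real files carry more entries.
    opf = b"""<metadata xmlns:dc="http://purl.org/dc/elements/1.1/"
                        xmlns:opf="http://www.idpf.org/2007/opf">
      <dc:identifier opf:scheme="ISBN">9780000000000</dc:identifier>
      <dc:identifier opf:scheme="uuid">0f0f0f0f-aaaa-bbbb-cccc-121212121212</dc:identifier>
    </metadata>"""

    ns = {"dc": "http://purl.org/dc/elements/1.1/"}
    tree = etree.fromstring(opf)

    identifiers = []
    for node in tree.xpath("dc:identifier", namespaces=ns):
        name = node.attrib.values()[-1]          # last attribute value is treated as the scheme
        if name in ('uuid', 'calibre'):
            continue                             # internal identifiers are not exposed
        identifiers.append([name, node.text])

    print(identifiers)                           # [['ISBN', '9780000000000']]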
@@ -42,8 +42,9 @@ def error_http(error):

 def internal_error(error):
     return render_template('http_error.html',
-                           error_code="Internal Server Error",
-                           error_name=str(error),
+                           error_code="500 Internal Server Error",
+                           error_name='The server encountered an internal error and was unable to complete your '
+                                      'request. There is an error in the application.',
                            issue=True,
                            unconfigured=False,
                            error_stack=traceback.format_exc().split("\n"),
@@ -77,4 +77,6 @@ def get_fb2_info(tmp_file_path, original_file_extension):
         series="",
         series_id="",
         languages="",
-        publisher="")
+        publisher="",
+        pubdate="",
+        identifiers=[])
95  cps/fs.py  (new file)

@@ -0,0 +1,95 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2020 mmonkey
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from . import logger
+from .constants import CACHE_DIR
+from os import makedirs, remove
+from os.path import isdir, isfile, join
+from shutil import rmtree
+
+
+class FileSystem:
+    _instance = None
+    _cache_dir = CACHE_DIR
+
+    def __new__(cls):
+        if cls._instance is None:
+            cls._instance = super(FileSystem, cls).__new__(cls)
+            cls.log = logger.create()
+        return cls._instance
+
+    def get_cache_dir(self, cache_type=None):
+        if not isdir(self._cache_dir):
+            try:
+                makedirs(self._cache_dir)
+            except OSError:
+                self.log.info(f'Failed to create path {self._cache_dir} (Permission denied).')
+                return False
+
+        path = join(self._cache_dir, cache_type)
+        if cache_type and not isdir(path):
+            try:
+                makedirs(path)
+            except OSError:
+                self.log.info(f'Failed to create path {path} (Permission denied).')
+                return False
+
+        return path if cache_type else self._cache_dir
+
+    def get_cache_file_dir(self, filename, cache_type=None):
+        path = join(self.get_cache_dir(cache_type), filename[:2])
+        if not isdir(path):
+            try:
+                makedirs(path)
+            except OSError:
+                self.log.info(f'Failed to create path {path} (Permission denied).')
+                return False
+
+        return path
+
+    def get_cache_file_path(self, filename, cache_type=None):
+        return join(self.get_cache_file_dir(filename, cache_type), filename) if filename else None
+
+    def get_cache_file_exists(self, filename, cache_type=None):
+        path = self.get_cache_file_path(filename, cache_type)
+        return isfile(path)
+
+    def delete_cache_dir(self, cache_type=None):
+        if not cache_type and isdir(self._cache_dir):
+            try:
+                rmtree(self._cache_dir)
+            except OSError:
+                self.log.info(f'Failed to delete path {self._cache_dir} (Permission denied).')
+                return False
+
+        path = join(self._cache_dir, cache_type)
+        if cache_type and isdir(path):
+            try:
+                rmtree(path)
+            except OSError:
+                self.log.info(f'Failed to delete path {path} (Permission denied).')
+                return False
+
+    def delete_cache_file(self, filename, cache_type=None):
+        path = self.get_cache_file_path(filename, cache_type)
+        if isfile(path):
+            try:
+                remove(path)
+            except OSError:
+                self.log.info(f'Failed to delete path {path} (Permission denied).')
+                return False
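Note: the new FileSystem helper is a small singleton around the cps/cache directory, bucketing files by the first two characters of their name. A rough usage sketch, assuming the package is importable; the 'thumbnails' cache-type string here is illustrative, the real constant lives in cps/constants.py.

    from cps.fs import FileSystem

    cache = FileSystem()                        # singleton: repeated calls return the same instance

    # Resolve (and create, if needed) a per-type cache directory under cps/cache.
    thumb_dir = cache.get_cache_dir('thumbnails')

    # Files are bucketed by the first two characters of their name.
    name = 'ab12cd34.jpg'
    if not cache.get_cache_file_exists(name, 'thumbnails'):
        path = cache.get_cache_file_path(name, 'thumbnails')
        # ... write the generated thumbnail to `path` ...

    # Drop a single cached file, or a whole cache type at once.
    cache.delete_cache_file(name, 'thumbnails')
    cache.delete_cache_dir('thumbnails')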
@@ -152,7 +152,7 @@ try:
             move(os.path.join(tmp_dir, "tmp_metadata.db"), dbpath)
             calibre_db.reconnect_db(config, ub.app_DB_path)
         except Exception as ex:
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
             return ''
     except AttributeError:
         pass
@@ -32,13 +32,9 @@ try:
     from sqlalchemy.orm import declarative_base
 except ImportError:
     from sqlalchemy.ext.declarative import declarative_base
-from sqlalchemy.exc import OperationalError, InvalidRequestError
+from sqlalchemy.exc import OperationalError, InvalidRequestError, IntegrityError
 from sqlalchemy.sql.expression import text

-#try:
-#    from six import __version__ as six_version
-#except ImportError:
-#    six_version = "not installed"
 try:
     from httplib2 import __version__ as httplib2_version
 except ImportError:

@@ -81,7 +77,7 @@ if gdrive_support:
     if not logger.is_debug_enabled():
         logger.get('googleapiclient.discovery').setLevel(logger.logging.ERROR)
 else:
-    log.debug("Cannot import pydrive,httplib2, using gdrive will not work: %s", importError)
+    log.debug("Cannot import pydrive, httplib2, using gdrive will not work: {}".format(importError))


 class Singleton:

@@ -141,11 +137,12 @@ class Gdrive:
     def __init__(self):
         self.drive = getDrive(gauth=Gauth.Instance().auth)


 def is_gdrive_ready():
     return os.path.exists(SETTINGS_YAML) and os.path.exists(CREDENTIALS)


-engine = create_engine('sqlite:///{0}'.format(cli.gdpath), echo=False)
+engine = create_engine('sqlite:///{0}'.format(cli.gd_path), echo=False)
 Base = declarative_base()

 # Open session for database connection

@@ -193,11 +190,11 @@ def migrate():
             session.execute('ALTER TABLE gdrive_ids2 RENAME to gdrive_ids')
         break

-if not os.path.exists(cli.gdpath):
+if not os.path.exists(cli.gd_path):
     try:
         Base.metadata.create_all(engine)
     except Exception as ex:
-        log.error("Error connect to database: {} - {}".format(cli.gdpath, ex))
+        log.error("Error connect to database: {} - {}".format(cli.gd_path, ex))
         raise
 migrate()

@@ -213,9 +210,9 @@ def getDrive(drive=None, gauth=None):
         try:
             gauth.Refresh()
         except RefreshError as e:
-            log.error("Google Drive error: %s", e)
+            log.error("Google Drive error: {}".format(e))
         except Exception as ex:
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
     else:
         # Initialize the saved creds
         gauth.Authorize()

@@ -225,7 +222,7 @@ def getDrive(drive=None, gauth=None):
         try:
             drive.auth.Refresh()
         except RefreshError as e:
-            log.error("Google Drive error: %s", e)
+            log.error("Google Drive error: {}".format(e))
     return drive


 def listRootFolders():

@@ -234,7 +231,7 @@ def listRootFolders():
         folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
         fileList = drive.ListFile({'q': folder}).GetList()
     except (ServerNotFoundError, ssl.SSLError, RefreshError) as e:
-        log.info("GDrive Error %s" % e)
+        log.info("GDrive Error {}".format(e))
         fileList = []
     return fileList

@@ -272,8 +269,7 @@ def getEbooksFolderId(drive=None):
     try:
         session.commit()
     except OperationalError as ex:
-        log.error("gdrive.db DB is not Writeable")
-        log.debug('Database error: %s', ex)
+        log.error_or_exception('Database error: {}'.format(ex))
         session.rollback()
     return gDriveId.gdrive_id

@@ -289,6 +285,7 @@ def getFile(pathId, fileName, drive):

 def getFolderId(path, drive):
     # drive = getDrive(drive)
+    currentFolderId = None
     try:
         currentFolderId = getEbooksFolderId(drive)
         sqlCheckPath = path if path[-1] == '/' else path + '/'

@@ -321,9 +318,8 @@ def getFolderId(path, drive):
             session.commit()
         else:
             currentFolderId = storedPathName.gdrive_id
-    except OperationalError as ex:
-        log.error("gdrive.db DB is not Writeable")
-        log.debug('Database error: %s', ex)
+    except (OperationalError, IntegrityError) as ex:
+        log.error_or_exception('Database error: {}'.format(ex))
         session.rollback()
     except ApiRequestError as ex:
         log.error('{} {}'.format(ex.error['message'], path))

@@ -547,8 +543,7 @@ def deleteDatabaseOnChange():
         session.commit()
     except (OperationalError, InvalidRequestError) as ex:
         session.rollback()
-        log.debug('Database error: %s', ex)
-        log.error(u"GDrive DB is not Writeable")
+        log.error_or_exception('Database error: {}'.format(ex))


 def updateGdriveCalibreFromLocal():

@@ -566,8 +561,7 @@ def updateDatabaseOnEdit(ID,newPath):
     try:
         session.commit()
     except OperationalError as ex:
-        log.error("gdrive.db DB is not Writeable")
-        log.debug('Database error: %s', ex)
+        log.error_or_exception('Database error: {}'.format(ex))
         session.rollback()

@@ -577,8 +571,7 @@ def deleteDatabaseEntry(ID):
     try:
         session.commit()
     except OperationalError as ex:
-        log.error("gdrive.db DB is not Writeable")
-        log.debug('Database error: %s', ex)
+        log.error_or_exception('Database error: {}'.format(ex))
         session.rollback()

@@ -599,8 +592,7 @@ def get_cover_via_gdrive(cover_path):
         try:
             session.commit()
         except OperationalError as ex:
-            log.error("gdrive.db DB is not Writeable")
-            log.debug('Database error: %s', ex)
+            log.error_or_exception('Database error: {}'.format(ex))
             session.rollback()
         return df.metadata.get('webContentLink')
     else:

@@ -622,7 +614,7 @@ def do_gdrive_download(df, headers, convert_encoding=False):

     def stream(convert_encoding):
         for byte in s:
-            headers = {"Range": 'bytes=%s-%s' % (byte[0], byte[1])}
+            headers = {"Range": 'bytes={}-{}'.format(byte[0], byte[1])}
             resp, content = df.auth.Get_Http_Object().request(download_url, headers=headers)
             if resp.status == 206:
                 if convert_encoding:

@@ -630,7 +622,7 @@ def do_gdrive_download(df, headers, convert_encoding=False):
                     content = content.decode(result['encoding']).encode('utf-8')
                 yield content
             else:
-                log.warning('An error occurred: %s', resp)
+                log.warning('An error occurred: {}'.format(resp))
                 return
     return Response(stream_with_context(stream(convert_encoding)), headers=headers)
29  cps/gevent_wsgi.py  (new file)

@@ -0,0 +1,29 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2022 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+from gevent.pywsgi import WSGIHandler
+
+
+class MyWSGIHandler(WSGIHandler):
+    def get_environ(self):
+        env = super().get_environ()
+        path, __ = self.path.split('?', 1) if '?' in self.path else (self.path, '')
+        env['RAW_URI'] = path
+        return env
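Note: MyWSGIHandler only adds a RAW_URI key (the request path without the query string) to the WSGI environ. A minimal sketch of how such a handler is typically plugged into gevent's WSGI server; the demo app, address and port below are placeholders, not the server wiring that Calibre-Web itself uses.

    from gevent.pywsgi import WSGIServer
    from cps.gevent_wsgi import MyWSGIHandler

    # Stand-in WSGI app; in Calibre-Web this would be the Flask application object.
    def application(environ, start_response):
        start_response('200 OK', [('Content-Type', 'text/plain')])
        return [environ.get('RAW_URI', '').encode()]

    # handler_class lets gevent build each request's environ through MyWSGIHandler.
    server = WSGIServer(('0.0.0.0', 8083), application, handler_class=MyWSGIHandler)
    server.serve_forever()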
445  cps/helper.py

@@ -19,42 +19,49 @@

 import os
 import io
+import sys
 import mimetypes
 import re
 import shutil
 import socket
-import unicodedata
 from datetime import datetime, timedelta
 from tempfile import gettempdir
-from urllib.parse import urlparse
 import requests
+import unidecode

 from babel.dates import format_datetime
 from babel.units import format_unit
 from flask import send_from_directory, make_response, redirect, abort, url_for
 from flask_babel import gettext as _
+from flask_babel import lazy_gettext as N_
 from flask_login import current_user
-from sqlalchemy.sql.expression import true, false, and_, text, func
+from sqlalchemy.sql.expression import true, false, and_, or_, text, func
 from sqlalchemy.exc import InvalidRequestError, OperationalError
 from werkzeug.datastructures import Headers
 from werkzeug.security import generate_password_hash
 from markupsafe import escape
 from urllib.parse import quote

 try:
-    import unidecode
-    use_unidecode = True
+    import advocate
+    from advocate.exceptions import UnacceptableAddressException
+    use_advocate = True
 except ImportError:
-    use_unidecode = False
+    use_advocate = False
+    advocate = requests
+    UnacceptableAddressException = MissingSchema = BaseException

 from . import calibre_db, cli
 from .tasks.convert import TaskConvert
-from . import logger, config, get_locale, db, ub, kobo_sync_status
+from . import logger, config, get_locale, db, ub, fs
 from . import gdriveutils as gd
-from .constants import STATIC_DIR as _STATIC_DIR
+from .constants import STATIC_DIR as _STATIC_DIR, CACHE_TYPE_THUMBNAILS, THUMBNAIL_TYPE_COVER, THUMBNAIL_TYPE_SERIES
 from .subproc_wrapper import process_wait
-from .services.worker import WorkerThread, STAT_WAITING, STAT_FAIL, STAT_STARTED, STAT_FINISH_SUCCESS
+from .services.worker import WorkerThread, STAT_WAITING, STAT_FAIL, STAT_STARTED, STAT_FINISH_SUCCESS, STAT_ENDED, \
+    STAT_CANCELLED
 from .tasks.mail import TaskEmail
+from .tasks.thumbnail import TaskClearCoverThumbnailCache, TaskGenerateCoverThumbnails

 log = logger.create()
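Note: the advocate import block above follows a common optional-dependency pattern: if the SSRF-hardened HTTP client is not installed, the module rebinds plain requests under the same name and substitutes a catch-all exception type, so later call sites need no availability checks. A stripped-down sketch of the same pattern, kept generic rather than copied from the commit:

    import requests

    try:
        import advocate                                 # requests-compatible client that blocks internal addresses
        from advocate.exceptions import UnacceptableAddressException
        use_advocate = True
    except ImportError:
        use_advocate = False
        advocate = requests                             # same call surface, no address filtering
        UnacceptableAddressException = BaseException    # placeholder so except clauses still bind

    def fetch(url):
        # Call sites stay identical; only the binding of `advocate` changes.
        return advocate.get(url, timeout=(10, 200), allow_redirects=False)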
@@ -105,9 +112,10 @@ def convert_book_format(book_id, calibrepath, old_book_format, new_book_format,
     return None


+# Texts are not lazy translated as they are supposed to get send out as is
 def send_test_mail(kindle_mail, user_name):
     WorkerThread.add(user_name, TaskEmail(_(u'Calibre-Web test e-mail'), None, None,
-                                          config.get_mail_settings(), kindle_mail, _(u"Test e-mail"),
+                                          config.get_mail_settings(), kindle_mail, N_(u"Test e-mail"),
                                           _(u'This e-mail has been sent via Calibre-Web.')))
     return

@@ -129,7 +137,7 @@ def send_registration_mail(e_mail, user_name, default_password, resend=False):
                          attachment=None,
                          settings=config.get_mail_settings(),
                          recipient=e_mail,
-                         taskMessage=_(u"Registration e-mail for user: %(name)s", name=user_name),
+                         task_message=N_(u"Registration e-mail for user: %(name)s", name=user_name),
                          text=txt
                          ))
     return

@@ -143,7 +151,7 @@ def check_send_to_kindle_with_converter(formats):
                             'text': _('Convert %(orig)s to %(format)s and send to Kindle',
                                       orig='Epub',
                                       format='Mobi')})
-    if 'AZW3' in formats and not 'MOBI' in formats:
+    if 'AZW3' in formats and 'MOBI' not in formats:
         bookformats.append({'format': 'Mobi',
                             'convert': 2,
                             'text': _('Convert %(orig)s to %(format)s and send to Kindle',

@@ -185,11 +193,11 @@ def check_send_to_kindle(entry):
 # Check if a reader is existing for any of the book formats, if not, return empty list, otherwise return
 # list with supported formats
 def check_read_formats(entry):
-    EXTENSIONS_READER = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR', 'DJVU'}
+    extensions_reader = {'TXT', 'PDF', 'EPUB', 'CBZ', 'CBT', 'CBR', 'DJVU'}
     bookformats = list()
     if len(entry.data):
         for ele in iter(entry.data):
-            if ele.format.upper() in EXTENSIONS_READER:
+            if ele.format.upper() in extensions_reader:
                 bookformats.append(ele.format.lower())
     return bookformats
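Note: several hunks above swap _ (gettext) for N_ (lazy_gettext) on strings that are stored inside a task and rendered later. Roughly, gettext resolves the translation at call time against the current locale, while lazy_gettext returns a proxy that is only translated when the string is actually used, e.g. when the tasks view is rendered. A small illustration, assuming a configured Flask-Babel application context:

    from flask_babel import gettext as _
    from flask_babel import lazy_gettext as N_

    # Evaluated immediately: fixed to the locale active right now.
    eager = _(u"Test e-mail")

    # Evaluated lazily: translated when the proxy is rendered, so it follows
    # the locale of whoever views the stored task later on.
    lazy = N_(u"Test e-mail")

    print(type(lazy))   # a lazy string proxy, not a plain str
    print(str(lazy))    # forces evaluation against the locale active at this point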
@@ -213,37 +221,45 @@ def send_mail(book_id, book_format, convert, kindle_mail, calibrepath, user_id):
         if entry.format.upper() == book_format.upper():
             converted_file_name = entry.name + '.' + book_format.lower()
             link = '<a href="{}">{}</a>'.format(url_for('web.show_book', book_id=book_id), escape(book.title))
-            EmailText = _(u"%(book)s send to Kindle", book=link)
+            email_text = N_(u"%(book)s send to Kindle", book=link)
             WorkerThread.add(user_id, TaskEmail(_(u"Send to Kindle"), book.path, converted_file_name,
                                                 config.get_mail_settings(), kindle_mail,
-                                                EmailText, _(u'This e-mail has been sent via Calibre-Web.')))
+                                                email_text, _(u'This e-mail has been sent via Calibre-Web.')))
             return
     return _(u"The requested file could not be read. Maybe wrong permissions?")


+def shorten_component(s, by_what):
+    l = len(s)
+    if l < by_what:
+        return s
+    l = (l - by_what)//2
+    if l <= 0:
+        return s
+    return s[:l] + s[-l:]
+
+
 def get_valid_filename(value, replace_whitespace=True, chars=128):
     """
     Returns the given string converted to a string that can be used for a clean
     filename. Limits num characters to 128 max.
     """

     if value[-1:] == u'.':
         value = value[:-1]+u'_'
     value = value.replace("/", "_").replace(":", "_").strip('\0')
-    if use_unidecode:
-        if config.config_unicode_filename:
-            value = (unidecode.unidecode(value))
-    else:
-        value = value.replace(u'§', u'SS')
-        value = value.replace(u'ß', u'ss')
-        value = unicodedata.normalize('NFKD', value)
-        re_slugify = re.compile(r'[\W\s-]', re.UNICODE)
-        value = re_slugify.sub('', value)
+    if config.config_unicode_filename:
+        value = (unidecode.unidecode(value))
     if replace_whitespace:
         # *+:\"/<>? are replaced by _
         value = re.sub(r'[*+:\\\"/<>?]+', u'_', value, flags=re.U)
         # pipe has to be replaced with comma
         value = re.sub(r'[|]+', u',', value, flags=re.U)
-    value = value[:chars].strip()
+
+    filename_encoding_for_length = 'utf-16' if sys.platform == "win32" or sys.platform == "darwin" else 'utf-8'
+    value = value.encode(filename_encoding_for_length)[:chars].decode('utf-8', errors='ignore').strip()
+
     if not value:
         raise ValueError("Filename cannot be empty")
     return value
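Note: the reworked get_valid_filename truncates by encoded byte length (UTF-16 on Windows/macOS, UTF-8 elsewhere) rather than by character count, so multi-byte titles cannot overflow filesystem limits. A quick illustration of the difference between the two truncation styles; the sample string and the 16-unit budget are arbitrary:

    title = u"Bücher über Öl und Kaffee"        # arbitrary example string

    # Character-based truncation (old behaviour): counts code points.
    by_chars = title[:16]

    # Byte-based truncation (new behaviour): counts encoded bytes, so multi-byte
    # characters such as 'ü' or 'Ö' consume more of the budget.
    by_bytes = title.encode('utf-8')[:16].decode('utf-8', errors='ignore')

    print(len(by_chars), repr(by_chars))
    print(len(by_bytes.encode('utf-8')), repr(by_bytes))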
@@ -266,6 +282,7 @@ def split_authors(values):


 def get_sorted_author(value):
+    value2 = None
     try:
         if ',' not in value:
             regexes = [r"^(JR|SR)\.?$", r"^I{1,3}\.?$", r"^IV\.?$"]

@@ -290,6 +307,7 @@ def get_sorted_author(value):
         value2 = value
     return value2


 def edit_book_read_status(book_id, read_status=None):
     if not config.config_read_column:
         book = ub.session.query(ub.ReadBook).filter(and_(ub.ReadBook.user_id == int(current_user.id),

@@ -303,9 +321,9 @@ def edit_book_read_status(book_id, read_status=None):
             else:
                 book.read_status = ub.ReadBook.STATUS_FINISHED if read_status else ub.ReadBook.STATUS_UNREAD
         else:
-            readBook = ub.ReadBook(user_id=current_user.id, book_id = book_id)
-            readBook.read_status = ub.ReadBook.STATUS_FINISHED
-            book = readBook
+            read_book = ub.ReadBook(user_id=current_user.id, book_id=book_id)
+            read_book.read_status = ub.ReadBook.STATUS_FINISHED
+            book = read_book
         if not book.kobo_reading_state:
             kobo_reading_state = ub.KoboReadingState(user_id=current_user.id, book_id=book_id)
             kobo_reading_state.current_bookmark = ub.KoboBookmark()

@@ -329,15 +347,17 @@ def edit_book_read_status(book_id, read_status=None):
                 new_cc = cc_class(value=read_status or 1, book=book_id)
                 calibre_db.session.add(new_cc)
                 calibre_db.session.commit()
-        except (KeyError, AttributeError):
-            log.error(u"Custom Column No.%d is not existing in calibre database", config.config_read_column)
+        except (KeyError, AttributeError, IndexError):
+            log.error(
+                "Custom Column No.{} is not existing in calibre database".format(config.config_read_column))
             return "Custom Column No.{} is not existing in calibre database".format(config.config_read_column)
-        except (OperationalError, InvalidRequestError) as e:
+        except (OperationalError, InvalidRequestError) as ex:
             calibre_db.session.rollback()
-            log.error(u"Read status could not set: {}".format(e))
-            return "Read status could not set: {}".format(e), 400
+            log.error(u"Read status could not set: {}".format(ex))
+            return _("Read status could not set: {}".format(ex.orig))
     return ""


 # Deletes a book fro the local filestorage, returns True if deleting is successfull, otherwise false
 def delete_book_file(book, calibrepath, book_format=None):
     # check that path is 2 elements deep, check that target path has no subfolders
@@ -361,15 +381,15 @@ def delete_book_file(book, calibrepath, book_format=None):
                               id=book.id,
                               path=book.path)
                 shutil.rmtree(path)
-            except (IOError, OSError) as e:
-                log.error("Deleting book %s failed: %s", book.id, e)
-                return False, _("Deleting book %(id)s failed: %(message)s", id=book.id, message=e)
+            except (IOError, OSError) as ex:
+                log.error("Deleting book %s failed: %s", book.id, ex)
+                return False, _("Deleting book %(id)s failed: %(message)s", id=book.id, message=ex)
             authorpath = os.path.join(calibrepath, os.path.split(book.path)[0])
             if not os.listdir(authorpath):
                 try:
                     shutil.rmtree(authorpath)
-                except (IOError, OSError) as e:
-                    log.error("Deleting authorpath for book %s failed: %s", book.id, e)
+                except (IOError, OSError) as ex:
+                    log.error("Deleting authorpath for book %s failed: %s", book.id, ex)
             return True, None

     log.error("Deleting book %s from database only, book path in database not valid: %s",
@ -395,21 +415,21 @@ def clean_author_database(renamed_author, calibre_path="", local_book=None, gdri
|
||||||
all_titledir = book.path.split('/')[1]
|
all_titledir = book.path.split('/')[1]
|
||||||
all_new_path = os.path.join(calibre_path, all_new_authordir, all_titledir)
|
all_new_path = os.path.join(calibre_path, all_new_authordir, all_titledir)
|
||||||
all_new_name = get_valid_filename(book.title, chars=42) + ' - ' \
|
all_new_name = get_valid_filename(book.title, chars=42) + ' - ' \
|
||||||
+ get_valid_filename(new_author.name, chars=42)
|
+ get_valid_filename(new_author.name, chars=42)
|
||||||
# change location in database to new author/title path
|
# change location in database to new author/title path
|
||||||
book.path = os.path.join(all_new_authordir, all_titledir).replace('\\', '/')
|
book.path = os.path.join(all_new_authordir, all_titledir).replace('\\', '/')
|
||||||
for file_format in book.data:
|
for file_format in book.data:
|
||||||
if not gdrive:
|
if not gdrive:
|
||||||
shutil.move(os.path.normcase(os.path.join(all_new_path,
|
shutil.move(os.path.normcase(os.path.join(all_new_path,
|
||||||
file_format.name + '.' + file_format.format.lower())),
|
file_format.name + '.' + file_format.format.lower())),
|
||||||
os.path.normcase(os.path.join(all_new_path,
|
os.path.normcase(os.path.join(all_new_path,
|
||||||
all_new_name + '.' + file_format.format.lower())))
|
all_new_name + '.' + file_format.format.lower())))
|
||||||
else:
|
else:
|
||||||
gFile = gd.getFileFromEbooksFolder(all_new_path,
|
g_file = gd.getFileFromEbooksFolder(all_new_path,
|
||||||
file_format.name + '.' + file_format.format.lower())
|
file_format.name + '.' + file_format.format.lower())
|
||||||
if gFile:
|
if g_file:
|
||||||
gd.moveGdriveFileRemote(gFile, all_new_name + u'.' + file_format.format.lower())
|
gd.moveGdriveFileRemote(g_file, all_new_name + u'.' + file_format.format.lower())
|
||||||
gd.updateDatabaseOnEdit(gFile['id'], all_new_name + u'.' + file_format.format.lower())
|
gd.updateDatabaseOnEdit(g_file['id'], all_new_name + u'.' + file_format.format.lower())
|
||||||
else:
|
else:
|
||||||
log.error("File {} not found on gdrive"
|
log.error("File {} not found on gdrive"
|
||||||
.format(all_new_path, file_format.name + '.' + file_format.format.lower()))
|
.format(all_new_path, file_format.name + '.' + file_format.format.lower()))
|
||||||
|
@ -426,16 +446,16 @@ def rename_all_authors(first_author, renamed_author, calibre_path="", localbook=
|
||||||
old_author_dir = get_valid_filename(r, chars=96)
|
old_author_dir = get_valid_filename(r, chars=96)
|
||||||
new_author_rename_dir = get_valid_filename(new_author.name, chars=96)
|
new_author_rename_dir = get_valid_filename(new_author.name, chars=96)
|
||||||
if gdrive:
|
if gdrive:
|
||||||
gFile = gd.getFileFromEbooksFolder(None, old_author_dir)
|
g_file = gd.getFileFromEbooksFolder(None, old_author_dir)
|
||||||
if gFile:
|
if g_file:
|
||||||
gd.moveGdriveFolderRemote(gFile, new_author_rename_dir)
|
gd.moveGdriveFolderRemote(g_file, new_author_rename_dir)
|
||||||
else:
|
else:
|
||||||
if os.path.isdir(os.path.join(calibre_path, old_author_dir)):
|
if os.path.isdir(os.path.join(calibre_path, old_author_dir)):
|
||||||
try:
|
try:
|
||||||
old_author_path = os.path.join(calibre_path, old_author_dir)
|
old_author_path = os.path.join(calibre_path, old_author_dir)
|
||||||
new_author_path = os.path.join(calibre_path, new_author_rename_dir)
|
new_author_path = os.path.join(calibre_path, new_author_rename_dir)
|
||||||
shutil.move(os.path.normcase(old_author_path), os.path.normcase(new_author_path))
|
shutil.move(os.path.normcase(old_author_path), os.path.normcase(new_author_path))
|
||||||
except (OSError) as ex:
|
except OSError as ex:
|
||||||
log.error("Rename author from: %s to %s: %s", old_author_path, new_author_path, ex)
|
log.error("Rename author from: %s to %s: %s", old_author_path, new_author_path, ex)
|
||||||
log.debug(ex, exc_info=True)
|
log.debug(ex, exc_info=True)
|
||||||
return _("Rename author from: '%(src)s' to '%(dest)s' failed with error: %(error)s",
|
return _("Rename author from: '%(src)s' to '%(dest)s' failed with error: %(error)s",
|
||||||
|
@ -444,34 +464,35 @@ def rename_all_authors(first_author, renamed_author, calibre_path="", localbook=
|
||||||
new_authordir = get_valid_filename(localbook.authors[0].name, chars=96)
|
new_authordir = get_valid_filename(localbook.authors[0].name, chars=96)
|
||||||
return new_authordir
|
return new_authordir
|
||||||
|
|
||||||
|
|
||||||
# Moves files in file storage during author/title rename, or from temp dir to file storage
|
# Moves files in file storage during author/title rename, or from temp dir to file storage
|
||||||
def update_dir_structure_file(book_id, calibre_path, first_author, original_filepath, db_filename, renamed_author):
|
def update_dir_structure_file(book_id, calibre_path, first_author, original_filepath, db_filename, renamed_author):
|
||||||
# get book database entry from id, if original path overwrite source with original_filepath
|
# get book database entry from id, if original path overwrite source with original_filepath
|
||||||
localbook = calibre_db.get_book(book_id)
|
local_book = calibre_db.get_book(book_id)
|
||||||
if original_filepath:
|
if original_filepath:
|
||||||
path = original_filepath
|
path = original_filepath
|
||||||
else:
|
else:
|
||||||
path = os.path.join(calibre_path, localbook.path)
|
path = os.path.join(calibre_path, local_book.path)
|
||||||
|
|
||||||
# Create (current) authordir and titledir from database
|
# Create (current) author_dir and title_dir from database
|
||||||
authordir = localbook.path.split('/')[0]
|
author_dir = local_book.path.split('/')[0]
|
||||||
titledir = localbook.path.split('/')[1]
|
title_dir = local_book.path.split('/')[1]
|
||||||
|
|
||||||
# Create new_authordir from parameter or from database
|
# Create new_author_dir from parameter or from database
|
||||||
# Create new titledir from database and add id
|
# Create new title_dir from database and add id
|
||||||
new_authordir = rename_all_authors(first_author, renamed_author, calibre_path, localbook)
|
new_author_dir = rename_all_authors(first_author, renamed_author, calibre_path, local_book)
|
||||||
if first_author:
|
if first_author:
|
||||||
if first_author.lower() in [r.lower() for r in renamed_author]:
|
if first_author.lower() in [r.lower() for r in renamed_author]:
|
||||||
if os.path.isdir(os.path.join(calibre_path, new_authordir)):
|
if os.path.isdir(os.path.join(calibre_path, new_author_dir)):
|
||||||
path = os.path.join(calibre_path, new_authordir, titledir)
|
path = os.path.join(calibre_path, new_author_dir, title_dir)
|
||||||
|
|
||||||
new_titledir = get_valid_filename(localbook.title, chars=96) + " (" + str(book_id) + ")"
|
new_title_dir = get_valid_filename(local_book.title, chars=96) + " (" + str(book_id) + ")"
|
||||||
|
|
||||||
if titledir != new_titledir or authordir != new_authordir or original_filepath:
|
if title_dir != new_title_dir or author_dir != new_author_dir or original_filepath:
|
||||||
error = move_files_on_change(calibre_path,
|
error = move_files_on_change(calibre_path,
|
||||||
new_authordir,
|
new_author_dir,
|
||||||
new_titledir,
|
new_title_dir,
|
||||||
localbook,
|
local_book,
|
||||||
db_filename,
|
db_filename,
|
||||||
original_filepath,
|
original_filepath,
|
||||||
path)
|
path)
|
||||||
|
@ -479,26 +500,23 @@ def update_dir_structure_file(book_id, calibre_path, first_author, original_file
|
||||||
return error
|
return error
|
||||||
|
|
||||||
# Rename all files from old names to new names
|
# Rename all files from old names to new names
|
||||||
return rename_files_on_change(first_author, renamed_author, localbook, original_filepath, path, calibre_path)
|
return rename_files_on_change(first_author, renamed_author, local_book, original_filepath, path, calibre_path)
|
||||||
|
|
||||||
|
|
||||||
def upload_new_file_gdrive(book_id, first_author, renamed_author, title, title_dir, original_filepath, filename_ext):
|
def upload_new_file_gdrive(book_id, first_author, renamed_author, title, title_dir, original_filepath, filename_ext):
|
||||||
error = False
|
|
||||||
book = calibre_db.get_book(book_id)
|
book = calibre_db.get_book(book_id)
|
||||||
file_name = get_valid_filename(title, chars=42) + ' - ' + \
|
file_name = get_valid_filename(title, chars=42) + ' - ' + \
|
||||||
get_valid_filename(first_author, chars=42) + \
|
get_valid_filename(first_author, chars=42) + filename_ext
|
||||||
filename_ext
|
|
||||||
rename_all_authors(first_author, renamed_author, gdrive=True)
|
rename_all_authors(first_author, renamed_author, gdrive=True)
|
||||||
gdrive_path = os.path.join(get_valid_filename(first_author, chars=96),
|
gdrive_path = os.path.join(get_valid_filename(first_author, chars=96),
|
||||||
title_dir + " (" + str(book_id) + ")")
|
title_dir + " (" + str(book_id) + ")")
|
||||||
book.path = gdrive_path.replace("\\", "/")
|
book.path = gdrive_path.replace("\\", "/")
|
||||||
gd.uploadFileToEbooksFolder(os.path.join(gdrive_path, file_name).replace("\\", "/"), original_filepath)
|
gd.uploadFileToEbooksFolder(os.path.join(gdrive_path, file_name).replace("\\", "/"), original_filepath)
|
||||||
error |= rename_files_on_change(first_author, renamed_author, localbook=book, gdrive=True)
|
return rename_files_on_change(first_author, renamed_author, local_book=book, gdrive=True)
|
||||||
return error
|
|
||||||
|
|
||||||
|
|
||||||
def update_dir_structure_gdrive(book_id, first_author, renamed_author):
|
def update_dir_structure_gdrive(book_id, first_author, renamed_author):
|
||||||
error = False
|
|
||||||
book = calibre_db.get_book(book_id)
|
book = calibre_db.get_book(book_id)
|
||||||
|
|
||||||
authordir = book.path.split('/')[0]
|
authordir = book.path.split('/')[0]
|
||||||
|
@ -507,27 +525,26 @@ def update_dir_structure_gdrive(book_id, first_author, renamed_author):
|
||||||
new_titledir = get_valid_filename(book.title, chars=96) + u" (" + str(book_id) + u")"
|
new_titledir = get_valid_filename(book.title, chars=96) + u" (" + str(book_id) + u")"
|
||||||
|
|
||||||
if titledir != new_titledir:
|
if titledir != new_titledir:
|
||||||
gFile = gd.getFileFromEbooksFolder(os.path.dirname(book.path), titledir)
|
g_file = gd.getFileFromEbooksFolder(os.path.dirname(book.path), titledir)
|
||||||
if gFile:
|
if g_file:
|
||||||
gd.moveGdriveFileRemote(gFile, new_titledir)
|
gd.moveGdriveFileRemote(g_file, new_titledir)
|
||||||
book.path = book.path.split('/')[0] + u'/' + new_titledir
|
book.path = book.path.split('/')[0] + u'/' + new_titledir
|
||||||
gd.updateDatabaseOnEdit(gFile['id'], book.path) # only child folder affected
|
gd.updateDatabaseOnEdit(g_file['id'], book.path) # only child folder affected
|
||||||
else:
|
else:
|
||||||
error = _(u'File %(file)s not found on Google Drive', file=book.path) # file not found
|
return _(u'File %(file)s not found on Google Drive', file=book.path) # file not found
|
||||||
|
|
||||||
if authordir != new_authordir and authordir not in renamed_author:
|
if authordir != new_authordir and authordir not in renamed_author:
|
||||||
gFile = gd.getFileFromEbooksFolder(os.path.dirname(book.path), new_titledir)
|
g_file = gd.getFileFromEbooksFolder(os.path.dirname(book.path), new_titledir)
|
||||||
if gFile:
|
if g_file:
|
||||||
gd.moveGdriveFolderRemote(gFile, new_authordir)
|
gd.moveGdriveFolderRemote(g_file, new_authordir)
|
||||||
book.path = new_authordir + u'/' + book.path.split('/')[1]
|
book.path = new_authordir + u'/' + book.path.split('/')[1]
|
||||||
gd.updateDatabaseOnEdit(gFile['id'], book.path)
|
gd.updateDatabaseOnEdit(g_file['id'], book.path)
|
||||||
else:
|
else:
|
||||||
error = _(u'File %(file)s not found on Google Drive', file=authordir) # file not found
|
return _(u'File %(file)s not found on Google Drive', file=authordir) # file not found
|
||||||
|
|
||||||
# change location in database to new author/title path
|
# change location in database to new author/title path
|
||||||
book.path = os.path.join(new_authordir, new_titledir).replace('\\', '/')
|
book.path = os.path.join(new_authordir, new_titledir).replace('\\', '/')
|
||||||
error |= rename_files_on_change(first_author, renamed_author, book, gdrive=True)
|
return rename_files_on_change(first_author, renamed_author, book, gdrive=True)
|
||||||
return error
|
|
||||||
|
|
||||||
|
|
||||||
def move_files_on_change(calibre_path, new_authordir, new_titledir, localbook, db_filename, original_filepath, path):
|
def move_files_on_change(calibre_path, new_authordir, new_titledir, localbook, db_filename, original_filepath, path):
|
||||||
|
@ -545,18 +562,17 @@ def move_files_on_change(calibre_path, new_authordir, new_titledir, localbook, d
|
||||||
# move original path to new path
|
# move original path to new path
|
||||||
log.debug("Moving title: %s to %s", path, new_path)
|
log.debug("Moving title: %s to %s", path, new_path)
|
||||||
shutil.move(os.path.normcase(path), os.path.normcase(new_path))
|
shutil.move(os.path.normcase(path), os.path.normcase(new_path))
|
||||||
else: # path is valid copy only files to new location (merge)
|
else: # path is valid copy only files to new location (merge)
|
||||||
log.info("Moving title: %s into existing: %s", path, new_path)
|
log.info("Moving title: %s into existing: %s", path, new_path)
|
||||||
# Take all files and subfolder from old path (strange command)
|
# Take all files and subfolder from old path (strange command)
|
||||||
for dir_name, __, file_list in os.walk(path):
|
for dir_name, __, file_list in os.walk(path):
|
||||||
for file in file_list:
|
for file in file_list:
|
||||||
shutil.move(os.path.normcase(os.path.join(dir_name, file)),
|
shutil.move(os.path.normcase(os.path.join(dir_name, file)),
|
||||||
os.path.normcase(os.path.join(new_path + dir_name[len(path):], file)))
|
os.path.normcase(os.path.join(new_path + dir_name[len(path):], file)))
|
||||||
# change location in database to new author/title path
|
# change location in database to new author/title path
|
||||||
localbook.path = os.path.join(new_authordir, new_titledir).replace('\\','/')
|
localbook.path = os.path.join(new_authordir, new_titledir).replace('\\', '/')
|
||||||
except OSError as ex:
|
except OSError as ex:
|
||||||
log.error("Rename title from: %s to %s: %s", path, new_path, ex)
|
log.error_or_exception("Rename title from {} to {} failed with error: {}".format(path, new_path, ex))
|
||||||
log.debug(ex, exc_info=True)
|
|
||||||
return _("Rename title from: '%(src)s' to '%(dest)s' failed with error: %(error)s",
|
return _("Rename title from: '%(src)s' to '%(dest)s' failed with error: %(error)s",
|
||||||
src=path, dest=new_path, error=str(ex))
|
src=path, dest=new_path, error=str(ex))
|
||||||
return False
|
return False
|
||||||
|
@ -564,8 +580,8 @@ def move_files_on_change(calibre_path, new_authordir, new_titledir, localbook, d
|
||||||
|
|
||||||
def rename_files_on_change(first_author,
|
def rename_files_on_change(first_author,
|
||||||
renamed_author,
|
renamed_author,
|
||||||
localbook,
|
local_book,
|
||||||
orignal_filepath="",
|
original_filepath="",
|
||||||
path="",
|
path="",
|
||||||
calibre_path="",
|
calibre_path="",
|
||||||
gdrive=False):
|
gdrive=False):
|
||||||
|
@ -573,13 +589,12 @@ def rename_files_on_change(first_author,
|
||||||
try:
|
try:
|
||||||
clean_author_database(renamed_author, calibre_path, gdrive=gdrive)
|
clean_author_database(renamed_author, calibre_path, gdrive=gdrive)
|
||||||
if first_author and first_author not in renamed_author:
|
if first_author and first_author not in renamed_author:
|
||||||
clean_author_database([first_author], calibre_path, localbook, gdrive)
|
clean_author_database([first_author], calibre_path, local_book, gdrive)
|
||||||
if not gdrive and not renamed_author and not orignal_filepath and len(os.listdir(os.path.dirname(path))) == 0:
|
if not gdrive and not renamed_author and not original_filepath and len(os.listdir(os.path.dirname(path))) == 0:
|
||||||
shutil.rmtree(os.path.dirname(path))
|
shutil.rmtree(os.path.dirname(path))
|
||||||
except (OSError, FileNotFoundError) as ex:
|
except (OSError, FileNotFoundError) as ex:
|
||||||
log.error("Error in rename file in path %s", ex)
|
log.error_or_exception("Error in rename file in path {}".format(ex))
|
||||||
log.debug(ex, exc_info=True)
|
return _("Error in rename file in path: {}".format(str(ex)))
|
||||||
return _("Error in rename file in path: %(error)s", error=str(ex))
|
|
||||||
return False
|
return False
|
||||||
|
|
||||||
|
|
||||||
|
@ -590,12 +605,12 @@ def delete_book_gdrive(book, book_format):
|
||||||
for entry in book.data:
|
for entry in book.data:
|
||||||
if entry.format.upper() == book_format:
|
if entry.format.upper() == book_format:
|
||||||
name = entry.name + '.' + book_format
|
name = entry.name + '.' + book_format
|
||||||
gFile = gd.getFileFromEbooksFolder(book.path, name)
|
g_file = gd.getFileFromEbooksFolder(book.path, name)
|
||||||
else:
|
else:
|
||||||
gFile = gd.getFileFromEbooksFolder(os.path.dirname(book.path), book.path.split('/')[1])
|
g_file = gd.getFileFromEbooksFolder(os.path.dirname(book.path), book.path.split('/')[1])
|
||||||
if gFile:
|
if g_file:
|
||||||
gd.deleteDatabaseEntry(gFile['id'])
|
gd.deleteDatabaseEntry(g_file['id'])
|
||||||
gFile.Trash()
|
g_file.Trash()
|
||||||
else:
|
else:
|
||||||
error = _(u'Book path %(path)s not found on Google Drive', path=book.path) # file not found
|
error = _(u'Book path %(path)s not found on Google Drive', path=book.path) # file not found
|
||||||
|
|
||||||
|
@ -627,12 +642,13 @@ def generate_random_password():
|
||||||
|
|
||||||
def uniq(inpt):
|
def uniq(inpt):
|
||||||
output = []
|
output = []
|
||||||
inpt = [ " ".join(inp.split()) for inp in inpt]
|
inpt = [" ".join(inp.split()) for inp in inpt]
|
||||||
for x in inpt:
|
for x in inpt:
|
||||||
if x not in output:
|
if x not in output:
|
||||||
output.append(x)
|
output.append(x)
|
||||||
return output
|
return output
|
||||||
|
|
||||||
|
|
||||||
def check_email(email):
|
def check_email(email):
|
||||||
email = valid_email(email)
|
email = valid_email(email)
|
||||||
if ub.session.query(ub.User).filter(func.lower(ub.User.email) == email.lower()).first():
|
if ub.session.query(ub.User).filter(func.lower(ub.User.email) == email.lower()).first():
|
||||||
|
@ -645,7 +661,7 @@ def check_username(username):
|
||||||
username = username.strip()
|
username = username.strip()
|
||||||
if ub.session.query(ub.User).filter(func.lower(ub.User.name) == username.lower()).scalar():
|
if ub.session.query(ub.User).filter(func.lower(ub.User.name) == username.lower()).scalar():
|
||||||
log.error(u"This username is already taken")
|
log.error(u"This username is already taken")
|
||||||
raise Exception (_(u"This username is already taken"))
|
raise Exception(_(u"This username is already taken"))
|
||||||
return username
|
return username
|
||||||
|
|
||||||
|
|
||||||
|
@ -679,6 +695,7 @@ def update_dir_structure(book_id,
|
||||||
|
|
||||||
|
|
||||||
def delete_book(book, calibrepath, book_format):
|
def delete_book(book, calibrepath, book_format):
|
||||||
|
clear_cover_thumbnail_cache(book.id)
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
return delete_book_gdrive(book, book_format)
|
return delete_book_gdrive(book, book_format)
|
||||||
else:
|
else:
|
||||||
|
@ -687,24 +704,38 @@ def delete_book(book, calibrepath, book_format):
|
||||||
|
|
||||||
def get_cover_on_failure(use_generic_cover):
|
def get_cover_on_failure(use_generic_cover):
|
||||||
if use_generic_cover:
|
if use_generic_cover:
|
||||||
return send_from_directory(_STATIC_DIR, "generic_cover.jpg")
|
try:
|
||||||
else:
|
return send_from_directory(_STATIC_DIR, "generic_cover.jpg")
|
||||||
return None
|
except PermissionError:
|
||||||
|
log.error("No permission to access generic_cover.jpg file.")
|
||||||
|
abort(403)
|
||||||
|
abort(404)
|
||||||
|
|
||||||
|
|
||||||
def get_book_cover(book_id):
|
def get_book_cover(book_id, resolution=None):
|
||||||
book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
|
book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
|
||||||
return get_book_cover_internal(book, use_generic_cover_on_failure=True)
|
return get_book_cover_internal(book, use_generic_cover_on_failure=True, resolution=resolution)
|
||||||
|
|
||||||
|
|
||||||
def get_book_cover_with_uuid(book_uuid,
|
# Called only by kobo sync -> cover not found should be answered with 404 and not with default cover
|
||||||
use_generic_cover_on_failure=True):
|
 def get_book_cover_with_uuid(book_uuid, resolution=None):
     book = calibre_db.get_book_by_uuid(book_uuid)
-    return get_book_cover_internal(book, use_generic_cover_on_failure)
+    return get_book_cover_internal(book, use_generic_cover_on_failure=False, resolution=resolution)


-def get_book_cover_internal(book, use_generic_cover_on_failure):
+def get_book_cover_internal(book, use_generic_cover_on_failure, resolution=None):
     if book and book.has_cover:

+        # Send the book cover thumbnail if it exists in cache
+        if resolution:
+            thumbnail = get_book_cover_thumbnail(book, resolution)
+            if thumbnail:
+                cache = fs.FileSystem()
+                if cache.get_cache_file_exists(thumbnail.filename, CACHE_TYPE_THUMBNAILS):
+                    return send_from_directory(cache.get_cache_file_dir(thumbnail.filename, CACHE_TYPE_THUMBNAILS),
+                                               thumbnail.filename)
+
+        # Send the book cover from Google Drive if configured
         if config.config_use_google_drive:
             try:
                 if not gd.is_gdrive_ready():

@@ -713,11 +744,13 @@ def get_book_cover_internal(book, use_generic_cover_on_failure):
                 if path:
                     return redirect(path)
                 else:
-                    log.error('%s/cover.jpg not found on Google Drive', book.path)
+                    log.error('{}/cover.jpg not found on Google Drive'.format(book.path))
                     return get_cover_on_failure(use_generic_cover_on_failure)
             except Exception as ex:
-                log.debug_or_exception(ex)
+                log.error_or_exception(ex)
                 return get_cover_on_failure(use_generic_cover_on_failure)

+        # Send the book cover from the Calibre directory
         else:
             cover_file_path = os.path.join(config.config_calibre_dir, book.path)
             if os.path.isfile(os.path.join(cover_file_path, "cover.jpg")):

@@ -728,27 +761,82 @@ def get_book_cover_internal(book, use_generic_cover_on_failure):
     return get_cover_on_failure(use_generic_cover_on_failure)


+def get_book_cover_thumbnail(book, resolution):
+    if book and book.has_cover:
+        return ub.session \
+            .query(ub.Thumbnail) \
+            .filter(ub.Thumbnail.type == THUMBNAIL_TYPE_COVER) \
+            .filter(ub.Thumbnail.entity_id == book.id) \
+            .filter(ub.Thumbnail.resolution == resolution) \
+            .filter(or_(ub.Thumbnail.expiration.is_(None), ub.Thumbnail.expiration > datetime.utcnow())) \
+            .first()
+
+
+def get_series_thumbnail_on_failure(series_id, resolution):
+    book = calibre_db.session \
+        .query(db.Books) \
+        .join(db.books_series_link) \
+        .join(db.Series) \
+        .filter(db.Series.id == series_id) \
+        .filter(db.Books.has_cover == 1) \
+        .first()
+
+    return get_book_cover_internal(book, use_generic_cover_on_failure=True, resolution=resolution)
+
+
+def get_series_cover_thumbnail(series_id, resolution=None):
+    return get_series_cover_internal(series_id, resolution)
+
+
+def get_series_cover_internal(series_id, resolution=None):
+    # Send the series thumbnail if it exists in cache
+    if resolution:
+        thumbnail = get_series_thumbnail(series_id, resolution)
+        if thumbnail:
+            cache = fs.FileSystem()
+            if cache.get_cache_file_exists(thumbnail.filename, CACHE_TYPE_THUMBNAILS):
+                return send_from_directory(cache.get_cache_file_dir(thumbnail.filename, CACHE_TYPE_THUMBNAILS),
+                                           thumbnail.filename)
+
+    return get_series_thumbnail_on_failure(series_id, resolution)
+
+
+def get_series_thumbnail(series_id, resolution):
+    return ub.session \
+        .query(ub.Thumbnail) \
+        .filter(ub.Thumbnail.type == THUMBNAIL_TYPE_SERIES) \
+        .filter(ub.Thumbnail.entity_id == series_id) \
+        .filter(ub.Thumbnail.resolution == resolution) \
+        .filter(or_(ub.Thumbnail.expiration.is_(None), ub.Thumbnail.expiration > datetime.utcnow())) \
+        .first()

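Note (not part of the diff): the hunks above first try a pre-generated thumbnail from the cache directory for the requested resolution and only then fall back to Google Drive or the library's cover.jpg. A minimal standalone sketch of that lookup-then-fallback order; the names CACHE_DIR and serve_cover are illustrative and do not exist in calibre-web:

    import os

    CACHE_DIR = "/tmp/thumbnail-cache"   # stand-in for the fs.FileSystem() cache location

    def serve_cover(book_id, resolution=None):
        """Return the path of a cached thumbnail when one exists, else the full-size cover."""
        if resolution:
            candidate = os.path.join(CACHE_DIR, "{}-{}.jpg".format(book_id, resolution))
            if os.path.isfile(candidate):
                return candidate                                     # cache hit
        return os.path.join("/library", str(book_id), "cover.jpg")   # fallback to the original cover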
 # saves book cover from url
 def save_cover_from_url(url, book_path):
     try:
-        if not cli.allow_localhost:
-            # 127.0.x.x, localhost, [::1], [::ffff:7f00:1]
-            ip = socket.getaddrinfo(urlparse(url).hostname, 0)[0][4][0]
-            if ip.startswith("127.") or ip.startswith('::ffff:7f') or ip == "::1":
-                log.error("Localhost was accessed for cover upload")
-                return False, _("You are not allowed to access localhost for cover uploads")
-        img = requests.get(url, timeout=(10, 200))  # ToDo: Error Handling
+        if cli.allow_localhost:
+            img = requests.get(url, timeout=(10, 200), allow_redirects=False)  # ToDo: Error Handling
+        elif use_advocate:
+            img = advocate.get(url, timeout=(10, 200), allow_redirects=False)  # ToDo: Error Handling
+        else:
+            log.error("python modul advocate is not installed but is needed")
+            return False, _("Python modul 'advocate' is not installed but is needed for cover downloads")
         img.raise_for_status()
         return save_cover(img, book_path)
     except (socket.gaierror,
             requests.exceptions.HTTPError,
+            requests.exceptions.InvalidURL,
             requests.exceptions.ConnectionError,
             requests.exceptions.Timeout) as ex:
-        log.info(u'Cover Download Error %s', ex)
+        # "Invalid host" can be the result of a redirect response
+        log.error(u'Cover Download Error %s', ex)
         return False, _("Error Downloading Cover")
     except MissingDelegateError as ex:
         log.info(u'File Format Error %s', ex)
         return False, _("Cover Format Error")
+    except UnacceptableAddressException as e:
+        log.error("Localhost or local network was accessed for cover upload")
+        return False, _("You are not allowed to access localhost or the local network for cover uploads")


 def save_cover_from_filestorage(filepath, saved_filename, img):

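Note (not part of the diff): save_cover_from_url now leaves the "no localhost, no private networks" check to the advocate library instead of resolving addresses by hand; advocate raises UnacceptableAddressException for such targets, which the new except branch turns into a user-facing message. A hedged sketch of the same guard, assuming advocate is installed; fetch_cover is an illustrative name:

    import requests

    try:
        import advocate
        from advocate.exceptions import UnacceptableAddressException
        use_advocate = True
    except ImportError:
        advocate = None
        use_advocate = False
        UnacceptableAddressException = RuntimeError  # placeholder so the except clause stays valid

    def fetch_cover(url):
        # advocate.get mirrors the requests API but refuses loopback and private address ranges
        getter = advocate.get if use_advocate else requests.get
        try:
            resp = getter(url, timeout=(10, 200), allow_redirects=False)
            resp.raise_for_status()
            return resp.content
        except UnacceptableAddressException:
            return None  # refused: the URL points at localhost or the local network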
@@ -783,24 +871,23 @@ def save_cover(img, book_path):
     content_type = img.headers.get('content-type')

     if use_IM:
-        if content_type not in ('image/jpeg', 'image/png', 'image/webp', 'image/bmp'):
+        if content_type not in ('image/jpeg', 'image/jpg', 'image/png', 'image/webp', 'image/bmp'):
             log.error("Only jpg/jpeg/png/webp/bmp files are supported as coverfile")
             return False, _("Only jpg/jpeg/png/webp/bmp files are supported as coverfile")
         # convert to jpg because calibre only supports jpg
-        if content_type != 'image/jpg':
-            try:
-                if hasattr(img, 'stream'):
-                    imgc = Image(blob=img.stream)
-                else:
-                    imgc = Image(blob=io.BytesIO(img.content))
-                imgc.format = 'jpeg'
-                imgc.transform_colorspace("rgb")
-                img = imgc
-            except (BlobError, MissingDelegateError):
-                log.error("Invalid cover file content")
-                return False, _("Invalid cover file content")
+        try:
+            if hasattr(img, 'stream'):
+                imgc = Image(blob=img.stream)
+            else:
+                imgc = Image(blob=io.BytesIO(img.content))
+            imgc.format = 'jpeg'
+            imgc.transform_colorspace("rgb")
+            img = imgc
+        except (BlobError, MissingDelegateError):
+            log.error("Invalid cover file content")
+            return False, _("Invalid cover file content")
     else:
-        if content_type not in 'image/jpeg':
+        if content_type not in ['image/jpeg', 'image/jpg']:
             log.error("Only jpg/jpeg files are supported as coverfile")
             return False, _("Only jpg/jpeg files are supported as coverfile")

@@ -811,7 +898,7 @@ def save_cover(img, book_path):
         os.mkdir(tmp_dir)
         ret, message = save_cover_from_filestorage(tmp_dir, "uploaded_cover.jpg", img)
         if ret is True:
-            gd.uploadFileToEbooksFolder(os.path.join(book_path, 'cover.jpg').replace("\\","/"),
+            gd.uploadFileToEbooksFolder(os.path.join(book_path, 'cover.jpg').replace("\\", "/"),
                                         os.path.join(tmp_dir, "uploaded_cover.jpg"))
             log.info("Cover is saved on Google Drive")
             return True, None

|
||||||
|
|
||||||
def do_download_file(book, book_format, client, data, headers):
|
def do_download_file(book, book_format, client, data, headers):
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
#startTime = time.time()
|
# startTime = time.time()
|
||||||
df = gd.getFileFromEbooksFolder(book.path, data.name + "." + book_format)
|
df = gd.getFileFromEbooksFolder(book.path, data.name + "." + book_format)
|
||||||
#log.debug('%s', time.time() - startTime)
|
# log.debug('%s', time.time() - startTime)
|
||||||
if df:
|
if df:
|
||||||
return gd.do_gdrive_download(df, headers)
|
return gd.do_gdrive_download(df, headers)
|
||||||
else:
|
else:
|
||||||
|
@ -849,22 +936,22 @@ def do_download_file(book, book_format, client, data, headers):
|
||||||
##################################
|
##################################
|
||||||
|
|
||||||
|
|
||||||
def check_unrar(unrarLocation):
|
def check_unrar(unrar_location):
|
||||||
if not unrarLocation:
|
if not unrar_location:
|
||||||
return
|
return
|
||||||
|
|
||||||
if not os.path.exists(unrarLocation):
|
if not os.path.exists(unrar_location):
|
||||||
return _('Unrar binary file not found')
|
return _('Unrar binary file not found')
|
||||||
|
|
||||||
try:
|
try:
|
||||||
unrarLocation = [unrarLocation]
|
unrar_location = [unrar_location]
|
||||||
value = process_wait(unrarLocation, pattern='UNRAR (.*) freeware')
|
value = process_wait(unrar_location, pattern='UNRAR (.*) freeware')
|
||||||
if value:
|
if value:
|
||||||
version = value.group(1)
|
version = value.group(1)
|
||||||
log.debug("unrar version %s", version)
|
log.debug("unrar version %s", version)
|
||||||
|
|
||||||
except (OSError, UnicodeDecodeError) as err:
|
except (OSError, UnicodeDecodeError) as err:
|
||||||
log.debug_or_exception(err)
|
log.error_or_exception(err)
|
||||||
return _('Error excecuting UnRar')
|
return _('Error excecuting UnRar')
|
||||||
|
|
||||||
|
|
||||||
|
@@ -885,25 +972,25 @@ def json_serial(obj):

 # helper function for displaying the runtime of tasks
 def format_runtime(runtime):
-    retVal = ""
+    ret_val = ""
     if runtime.days:
-        retVal = format_unit(runtime.days, 'duration-day', length="long", locale=get_locale()) + ', '
+        ret_val = format_unit(runtime.days, 'duration-day', length="long", locale=get_locale()) + ', '
     mins, seconds = divmod(runtime.seconds, 60)
     hours, minutes = divmod(mins, 60)
     # ToDo: locale.number_symbols._data['timeSeparator'] -> localize time separator ?
     if hours:
-        retVal += '{:d}:{:02d}:{:02d}s'.format(hours, minutes, seconds)
+        ret_val += '{:d}:{:02d}:{:02d}s'.format(hours, minutes, seconds)
     elif minutes:
-        retVal += '{:2d}:{:02d}s'.format(minutes, seconds)
+        ret_val += '{:2d}:{:02d}s'.format(minutes, seconds)
     else:
-        retVal += '{:2d}s'.format(seconds)
-    return retVal
+        ret_val += '{:2d}s'.format(seconds)
+    return ret_val

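Note (not part of the diff): the rename from retVal to ret_val does not change the output of format_runtime. For reference, the divmod arithmetic it relies on behaves like this (values illustrative):

    from datetime import timedelta

    runtime = timedelta(hours=2, minutes=5, seconds=9)
    mins, seconds = divmod(runtime.seconds, 60)   # 125, 9
    hours, minutes = divmod(mins, 60)             # 2, 5
    print('{:d}:{:02d}:{:02d}s'.format(hours, minutes, seconds))  # prints 2:05:09s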
 # helper function to apply localize status information in tasklist entries
 def render_task_status(tasklist):
     renderedtasklist = list()
-    for __, user, __, task in tasklist:
+    for __, user, __, task, __ in tasklist:
         if user == current_user.name or current_user.role_admin():
             ret = {}
             if task.start_time:

@@ -920,12 +1007,22 @@ def render_task_status(tasklist):
                 ret['status'] = _(u'Started')
             elif task.stat == STAT_FINISH_SUCCESS:
                 ret['status'] = _(u'Finished')
+            elif task.stat == STAT_ENDED:
+                ret['status'] = _(u'Ended')
+            elif task.stat == STAT_CANCELLED:
+                ret['status'] = _(u'Cancelled')
             else:
                 ret['status'] = _(u'Unknown Status')

-            ret['taskMessage'] = "{}: {}".format(_(task.name), task.message)
+            ret['taskMessage'] = "{}: {}".format(task.name, task.message) if task.message else task.name
             ret['progress'] = "{} %".format(int(task.progress * 100))
             ret['user'] = escape(user)  # prevent xss

+            # Hidden fields
+            ret['task_id'] = task.id
+            ret['stat'] = task.stat
+            ret['is_cancellable'] = task.is_cancellable
+
             renderedtasklist.append(ret)

     return renderedtasklist

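Note (not part of the diff): with the new hidden fields, each entry returned by render_task_status carries enough state for the frontend to decide whether to offer a cancel action. One rendered row looks roughly like this; the values are illustrative, only the key names come from the hunk above:

    {
        'user': 'admin',
        'status': 'Started',
        'taskMessage': 'Convert: some_book.epub',  # or just the task name when there is no message
        'progress': '42 %',
        'task_id': 7,                # hidden: identifies the task, e.g. for cancel requests
        'stat': 1,                   # hidden: raw STAT_* value
        'is_cancellable': True,      # hidden: whether the UI may show a cancel control
    }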
@@ -953,27 +1050,10 @@ def check_valid_domain(domain_text):
     return not len(result)


-def get_cc_columns(filter_config_custom_read=False):
-    tmpcc = calibre_db.session.query(db.Custom_Columns)\
-        .filter(db.Custom_Columns.datatype.notin_(db.cc_exceptions)).all()
-    cc = []
-    r = None
-    if config.config_columns_to_ignore:
-        r = re.compile(config.config_columns_to_ignore)
-
-    for col in tmpcc:
-        if filter_config_custom_read and config.config_read_column and config.config_read_column == col.id:
-            continue
-        if r and r.match(col.name):
-            continue
-        cc.append(col)
-
-    return cc
-
-
 def get_download_link(book_id, book_format, client):
     book_format = book_format.split(".")[0]
     book = calibre_db.get_filtered_book(book_id, allow_show_archived=True)
+    data1= ""
     if book:
         data1 = calibre_db.get_book_format(book.id, book_format.upper())
     else:

@@ -994,3 +1074,24 @@ def get_download_link(book_id, book_format, client):
         return do_download_file(book, book_format, client, data1, headers)
     else:
         abort(404)
+
+
+def clear_cover_thumbnail_cache(book_id):
+    WorkerThread.add(None, TaskClearCoverThumbnailCache(book_id), hidden=True)
+
+
+def replace_cover_thumbnail_cache(book_id):
+    WorkerThread.add(None, TaskClearCoverThumbnailCache(book_id), hidden=True)
+    WorkerThread.add(None, TaskGenerateCoverThumbnails(book_id), hidden=True)
+
+
+def delete_thumbnail_cache():
+    WorkerThread.add(None, TaskClearCoverThumbnailCache(-1))
+
+
+def add_book_to_thumbnail_cache(book_id):
+    WorkerThread.add(None, TaskGenerateCoverThumbnails(book_id), hidden=True)
+
+
+def update_thumbnail_cache():
+    WorkerThread.add(None, TaskGenerateCoverThumbnails())

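Note (not part of the diff): the new module-level helpers only queue background tasks; nothing touches the cache synchronously. A calling site that replaces a cover would typically do something like this (a sketch, assuming this commit's cps.helper module and an arbitrary book id):

    from cps.helper import replace_cover_thumbnail_cache, delete_thumbnail_cache

    # after a cover upload or edit for book 42: drop stale thumbnails, queue fresh ones
    replace_cover_thumbnail_cache(42)

    # on a library-wide change (e.g. the thumbnail feature is reconfigured): clear everything
    delete_thumbnail_cache()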
cps/jinjia.py

@@ -31,8 +31,7 @@ from flask import Blueprint, request, url_for
 from flask_babel import get_locale
 from flask_login import current_user
 from markupsafe import escape
-from . import logger
+from . import constants, logger


 jinjia = Blueprint('jinjia', __name__)
 log = logger.create()

@@ -128,12 +127,55 @@ def formatseriesindex_filter(series_index):
             return series_index
     return 0


 @jinjia.app_template_filter('escapedlink')
 def escapedlink_filter(url, text):
     return "<a href='{}'>{}</a>".format(url, escape(text))


 @jinjia.app_template_filter('uuidfilter')
 def uuidfilter(var):
     return uuid4()


+@jinjia.app_template_filter('cache_timestamp')
+def cache_timestamp(rolling_period='month'):
+    if rolling_period == 'day':
+        return str(int(datetime.datetime.today().replace(hour=1, minute=1).timestamp()))
+    elif rolling_period == 'year':
+        return str(int(datetime.datetime.today().replace(day=1).timestamp()))
+    else:
+        return str(int(datetime.datetime.today().replace(month=1, day=1).timestamp()))
+
+
+@jinjia.app_template_filter('last_modified')
+def book_last_modified(book):
+    return str(int(book.last_modified.timestamp()))
+
+
+@jinjia.app_template_filter('get_cover_srcset')
+def get_cover_srcset(book):
+    srcset = list()
+    resolutions = {
+        constants.COVER_THUMBNAIL_SMALL: 'sm',
+        constants.COVER_THUMBNAIL_MEDIUM: 'md',
+        constants.COVER_THUMBNAIL_LARGE: 'lg'
+    }
+    for resolution, shortname in resolutions.items():
+        url = url_for('web.get_cover', book_id=book.id, resolution=shortname, c=book_last_modified(book))
+        srcset.append(f'{url} {resolution}x')
+    return ', '.join(srcset)
+
+
+@jinjia.app_template_filter('get_series_srcset')
+def get_cover_srcset(series):
+    srcset = list()
+    resolutions = {
+        constants.COVER_THUMBNAIL_SMALL: 'sm',
+        constants.COVER_THUMBNAIL_MEDIUM: 'md',
+        constants.COVER_THUMBNAIL_LARGE: 'lg'
+    }
+    for resolution, shortname in resolutions.items():
+        url = url_for('web.get_series_cover', series_id=series.id, resolution=shortname, c=cache_timestamp())
+        srcset.append(f'{url} {resolution}x')
+    return ', '.join(srcset)

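Note (not part of the diff): the get_cover_srcset filter is meant to feed an img srcset attribute from the templates. Assuming constants.COVER_THUMBNAIL_SMALL/MEDIUM/LARGE are the small integers 1, 2 and 3, the result is a density-descriptor list; the exact URL shape depends on the web.get_cover route and is only illustrative here:

    # book|get_cover_srcset  ->  roughly:
    #   "/cover/42?resolution=sm&c=1650000000 1x, /cover/42?resolution=md&c=1650000000 2x, /cover/42?resolution=lg&c=1650000000 3x"
    # which a Jinja2 template can use as:
    #   <img src="..." srcset="{{ book|get_cover_srcset }}">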
cps/kobo.py (90 changed lines)

@@ -45,7 +45,7 @@ import requests

 from . import config, logger, kobo_auth, db, calibre_db, helper, shelf as shelf_lib, ub, csrf, kobo_sync_status
-from .constants import sqlalchemy_version2
+from .constants import sqlalchemy_version2, COVER_THUMBNAIL_SMALL
 from .helper import get_download_link
 from .services import SyncToken as SyncToken
 from .web import download_required

@@ -148,8 +148,8 @@ def HandleSyncRequest():
         sync_token.books_last_created = datetime.datetime.min
         sync_token.reading_state_last_modified = datetime.datetime.min

     new_books_last_modified = sync_token.books_last_modified  # needed for sync selected shelfs only
     new_books_last_created = sync_token.books_last_created  # needed to distinguish between new and changed entitlement
     new_reading_state_last_modified = sync_token.reading_state_last_modified

     new_archived_last_modified = datetime.datetime.min

@@ -176,18 +176,17 @@ def HandleSyncRequest():
                 .join(db.Data).outerjoin(ub.ArchivedBook, and_(db.Books.id == ub.ArchivedBook.book_id,
                                                                ub.ArchivedBook.user_id == current_user.id))
                 .filter(db.Books.id.notin_(calibre_db.session.query(ub.KoboSyncedBooks.book_id)
                                            .filter(ub.KoboSyncedBooks.user_id == current_user.id)))
                 .filter(ub.BookShelf.date_added > sync_token.books_last_modified)
                 .filter(db.Data.format.in_(KOBO_FORMATS))
                 .filter(calibre_db.common_filters(allow_show_archived=True))
                 .order_by(db.Books.id)
                 .order_by(ub.ArchivedBook.last_modified)
                 .join(ub.BookShelf, db.Books.id == ub.BookShelf.book_id)
                 .join(ub.Shelf)
                 .filter(ub.Shelf.user_id == current_user.id)
                 .filter(ub.Shelf.kobo_sync)
-                .distinct()
-                )
+                .distinct())
         else:
             if sqlalchemy_version2:
                 changed_entries = select(db.Books, ub.ArchivedBook.last_modified, ub.ArchivedBook.is_archived)

@@ -196,16 +195,14 @@ def HandleSyncRequest():
                                                 ub.ArchivedBook.last_modified,
                                                 ub.ArchivedBook.is_archived)
             changed_entries = (changed_entries
                                .join(db.Data).outerjoin(ub.ArchivedBook, and_(db.Books.id == ub.ArchivedBook.book_id,
                                                                               ub.ArchivedBook.user_id == current_user.id))
                                .filter(db.Books.id.notin_(calibre_db.session.query(ub.KoboSyncedBooks.book_id)
                                                           .filter(ub.KoboSyncedBooks.user_id == current_user.id)))
                                .filter(calibre_db.common_filters(allow_show_archived=True))
                                .filter(db.Data.format.in_(KOBO_FORMATS))
                                .order_by(db.Books.last_modified)
-                               .order_by(db.Books.id)
-                               )
+                               .order_by(db.Books.id))

     reading_states_in_new_entitlements = []
     if sqlalchemy_version2:

@@ -215,7 +212,7 @@ def HandleSyncRequest():
     log.debug("Books to Sync: {}".format(len(books.all())))
     for book in books:
         formats = [data.format for data in book.Books.data]
-        if not 'KEPUB' in formats and config.config_kepubifypath and 'EPUB' in formats:
+        if 'KEPUB' not in formats and config.config_kepubifypath and 'EPUB' in formats:
             helper.convert_book_format(book.Books.id, config.config_calibre_dir, 'EPUB', 'KEPUB', current_user.name)

         kobo_reading_state = get_or_create_reading_state(book.Books.id)

@@ -262,7 +259,7 @@ def HandleSyncRequest():
             .columns(db.Books).first()
     else:
         max_change = changed_entries.from_self().filter(ub.ArchivedBook.is_archived)\
-            .filter(ub.ArchivedBook.user_id==current_user.id) \
+            .filter(ub.ArchivedBook.user_id == current_user.id) \
             .order_by(func.datetime(ub.ArchivedBook.last_modified).desc()).first()

     max_change = max_change.last_modified if max_change else new_archived_last_modified

@@ -425,9 +422,9 @@ def get_author(book):
     author_list = []
     autor_roles = []
     for author in book.authors:
-        autor_roles.append({"Name":author.name}) #.encode('unicode-escape').decode('latin-1')
+        autor_roles.append({"Name": author.name})
         author_list.append(author.name)
-    return {"ContributorRoles": autor_roles, "Contributors":author_list}
+    return {"ContributorRoles": autor_roles, "Contributors": author_list}


 def get_publisher(book):

@@ -441,6 +438,7 @@ def get_series(book):
         return None
     return book.series[0].name

+
 def get_seriesindex(book):
     return book.series_index or 1

@@ -485,7 +483,7 @@ def get_metadata(book):
         "Language": "en",
         "PhoneticPronunciations": {},
         "PublicationDate": convert_to_kobo_timestamp_string(book.pubdate),
-        "Publisher": {"Imprint": "", "Name": get_publisher(book),},
+        "Publisher": {"Imprint": "", "Name": get_publisher(book), },
         "RevisionId": book_uuid,
         "Title": book.title,
         "WorkId": book_uuid,

@@ -504,6 +502,7 @@ def get_metadata(book):

     return metadata

+
 @csrf.exempt
 @kobo.route("/v1/library/tags", methods=["POST", "DELETE"])
 @requires_kobo_auth

@@ -718,7 +717,6 @@ def sync_shelves(sync_token, sync_results, only_kobo_shelves=False):
         *extra_filters
     ).distinct().order_by(func.datetime(ub.Shelf.last_modified).asc())

     for shelf in shelflist:
         if not shelf_lib.check_shelf_view_permissions(shelf):
             continue

@@ -764,6 +762,7 @@ def create_kobo_tag(shelf):
     )
     return {"Tag": tag}

+
 @csrf.exempt
 @kobo.route("/v1/library/<book_uuid>/state", methods=["GET", "PUT"])
 @requires_kobo_auth

@@ -808,7 +807,7 @@ def HandleStateRequest(book_uuid):
             book_read = kobo_reading_state.book_read_link
             new_book_read_status = get_ub_read_status(request_status_info["Status"])
             if new_book_read_status == ub.ReadBook.STATUS_IN_PROGRESS \
                     and new_book_read_status != book_read.read_status:
                 book_read.times_started_reading += 1
                 book_read.last_time_started_reading = datetime.datetime.utcnow()
             book_read.read_status = new_book_read_status

@@ -848,7 +847,7 @@ def get_ub_read_status(kobo_read_status):

 def get_or_create_reading_state(book_id):
     book_read = ub.session.query(ub.ReadBook).filter(ub.ReadBook.book_id == book_id,
                                                      ub.ReadBook.user_id == int(current_user.id)).one_or_none()
     if not book_read:
         book_read = ub.ReadBook(user_id=current_user.id, book_id=book_id)
     if not book_read.kobo_reading_state:

@@ -912,13 +911,12 @@ def get_current_bookmark_response(current_bookmark):
     }
     return resp


 @kobo.route("/<book_uuid>/<width>/<height>/<isGreyscale>/image.jpg", defaults={'Quality': ""})
 @kobo.route("/<book_uuid>/<width>/<height>/<Quality>/<isGreyscale>/image.jpg")
 @requires_kobo_auth
-def HandleCoverImageRequest(book_uuid, width, height,Quality, isGreyscale):
-    book_cover = helper.get_book_cover_with_uuid(
-        book_uuid, use_generic_cover_on_failure=False
-    )
+def HandleCoverImageRequest(book_uuid, width, height, Quality, isGreyscale):
+    book_cover = helper.get_book_cover_with_uuid(book_uuid, resolution=COVER_THUMBNAIL_SMALL)
     if not book_cover:
         if config.config_kobo_proxy:
             log.debug("Cover for unknown book: %s proxied to kobo" % book_uuid)

@@ -991,8 +989,8 @@ def handle_getests():
     if config.config_kobo_proxy:
         return redirect_or_proxy_request()
     else:
-        testkey = request.headers.get("X-Kobo-userkey","")
-        return make_response(jsonify({"Result": "Success", "TestKey":testkey, "Tests": {}}))
+        testkey = request.headers.get("X-Kobo-userkey", "")
+        return make_response(jsonify({"Result": "Success", "TestKey": testkey, "Tests": {}}))


 @csrf.exempt

@@ -1022,7 +1020,7 @@ def make_calibre_web_auth_response():
     content = request.get_json()
     AccessToken = base64.b64encode(os.urandom(24)).decode('utf-8')
     RefreshToken = base64.b64encode(os.urandom(24)).decode('utf-8')
     return make_response(
         jsonify(
             {
                 "AccessToken": AccessToken,

@@ -1160,14 +1158,16 @@ def NATIVE_KOBO_RESOURCES():
         "eula_page": "https://www.kobo.com/termsofuse?style=onestore",
         "exchange_auth": "https://storeapi.kobo.com/v1/auth/exchange",
         "external_book": "https://storeapi.kobo.com/v1/products/books/external/{Ids}",
-        "facebook_sso_page": "https://authorize.kobo.com/signin/provider/Facebook/login?returnUrl=http://store.kobobooks.com/",
+        "facebook_sso_page":
+            "https://authorize.kobo.com/signin/provider/Facebook/login?returnUrl=http://store.kobobooks.com/",
         "featured_list": "https://storeapi.kobo.com/v1/products/featured/{FeaturedListId}",
         "featured_lists": "https://storeapi.kobo.com/v1/products/featured",
         "free_books_page": {
             "EN": "https://www.kobo.com/{region}/{language}/p/free-ebooks",
             "FR": "https://www.kobo.com/{region}/{language}/p/livres-gratuits",
             "IT": "https://www.kobo.com/{region}/{language}/p/libri-gratuiti",
-            "NL": "https://www.kobo.com/{region}/{language}/List/bekijk-het-overzicht-van-gratis-ebooks/QpkkVWnUw8sxmgjSlCbJRg",
+            "NL": "https://www.kobo.com/{region}/{language}/"
+                  "List/bekijk-het-overzicht-van-gratis-ebooks/QpkkVWnUw8sxmgjSlCbJRg",
             "PT": "https://www.kobo.com/{region}/{language}/p/livros-gratis",
         },
         "fte_feedback": "https://storeapi.kobo.com/v1/products/ftefeedback",

@@ -1192,7 +1192,8 @@ def NATIVE_KOBO_RESOURCES():
         "library_stack": "https://storeapi.kobo.com/v1/user/library/stacks/{LibraryItemId}",
         "library_sync": "https://storeapi.kobo.com/v1/library/sync",
         "love_dashboard_page": "https://store.kobobooks.com/{culture}/kobosuperpoints",
-        "love_points_redemption_page": "https://store.kobobooks.com/{culture}/KoboSuperPointsRedemption?productId={ProductId}",
+        "love_points_redemption_page":
+            "https://store.kobobooks.com/{culture}/KoboSuperPointsRedemption?productId={ProductId}",
         "magazine_landing_page": "https://store.kobobooks.com/emagazines",
         "notifications_registration_issue": "https://storeapi.kobo.com/v1/notifications/registration",
         "oauth_host": "https://oauth.kobo.com",

@@ -1208,7 +1209,8 @@ def NATIVE_KOBO_RESOURCES():
         "product_recommendations": "https://storeapi.kobo.com/v1/products/{ProductId}/recommendations",
         "product_reviews": "https://storeapi.kobo.com/v1/products/{ProductIds}/reviews",
         "products": "https://storeapi.kobo.com/v1/products",
-        "provider_external_sign_in_page": "https://authorize.kobo.com/ExternalSignIn/{providerName}?returnUrl=http://store.kobobooks.com/",
+        "provider_external_sign_in_page":
+            "https://authorize.kobo.com/ExternalSignIn/{providerName}?returnUrl=http://store.kobobooks.com/",
         "purchase_buy": "https://www.kobo.com/checkout/createpurchase/",
         "purchase_buy_templated": "https://www.kobo.com/{culture}/checkout/createpurchase/{ProductId}",
         "quickbuy_checkout": "https://storeapi.kobo.com/v1/store/quickbuy/{PurchaseId}/checkout",

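Aside (not in the commit): for Kobo devices the cover handler now always asks for the small thumbnail. The resulting fallback order in HandleCoverImageRequest is roughly:

    # 1. helper.get_book_cover_with_uuid(book_uuid, resolution=COVER_THUMBNAIL_SMALL)
    #    -> cached small thumbnail if present, else the stored cover.jpg
    # 2. if nothing is found and config.config_kobo_proxy is enabled, the request is proxied to Kobo's store
    # 3. otherwise the request fails (the remaining branch is outside this excerpt)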
cps/logger.py

@@ -42,7 +42,7 @@ logging.addLevelName(logging.CRITICAL, "CRIT")


 class _Logger(logging.Logger):

-    def debug_or_exception(self, message, stacklevel=2, *args, **kwargs):
+    def error_or_exception(self, message, stacklevel=2, *args, **kwargs):
         if sys.version_info > (3, 7):
             if is_debug_enabled():
                 self.exception(message, stacklevel=stacklevel, *args, **kwargs)

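Aside (not in the commit): the rename from debug_or_exception to error_or_exception better matches what the call sites in this merge expect: a full traceback while debug logging is on, a single error line otherwise. A rough standalone equivalent using only the standard logging module, not calibre-web's wrapper:

    import logging
    import sys

    log = logging.getLogger(__name__)

    def error_or_exception(message):
        # keep the traceback while debugging, log a plain error in normal operation
        if log.isEnabledFor(logging.DEBUG) and sys.exc_info()[0] is not None:
            log.exception(message)
        else:
            log.error(message)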
cps/metadata_provider/amazon.py

@@ -19,14 +19,22 @@
 import concurrent.futures
 import requests
 from bs4 import BeautifulSoup as BS  # requirement
+from typing import List, Optional

 try:
     import cchardet #optional for better speed
 except ImportError:
     pass
+from cps import logger
 from cps.services.Metadata import MetaRecord, MetaSourceInfo, Metadata
+import cps.logger as logger

 #from time import time
 from operator import itemgetter
+
+log = logger.create()


 class Amazon(Metadata):
     __name__ = "Amazon"

@@ -46,12 +54,16 @@ class Amazon(Metadata):

     def search(
         self, query: str, generic_cover: str = "", locale: str = "en"
-    ):
+    ) -> Optional[List[MetaRecord]]:
         #timer=time()
-        def inner(link,index)->[dict,int]:
+        def inner(link, index) -> [dict, int]:
             with self.session as session:
-                r = session.get(f"https://www.amazon.com/{link}")
-                r.raise_for_status()
+                try:
+                    r = session.get(f"https://www.amazon.com/{link}")
+                    r.raise_for_status()
+                except Exception as ex:
+                    log.warning(ex)
+                    return
                 long_soup = BS(r.text, "lxml")  #~4sec :/
                 soup2 = long_soup.find("div", attrs={"cel_widget_id": "dpx-books-ppd_csm_instrumentation_wrapper"})
                 if soup2 is None:

@@ -65,7 +77,7 @@ class Amazon(Metadata):
                             description="Amazon Books",
                             link="https://amazon.com/"
                         ),
-                        url = f"https://www.amazon.com/{link}",
+                        url = f"https://www.amazon.com{link}",
                         #the more searches the slower, these are too hard to find in reasonable time or might not even exist
                         publisher= "",  # very unreliable
                         publishedDate= "",  # very unreliable

@@ -102,21 +114,28 @@ class Amazon(Metadata):
                     match.cover = ""
                 return match, index
             except Exception as e:
-                print(e)
+                log.error_or_exception(e)
                 return

         val = list()
         if self.active:
-            results = self.session.get(
-                f"https://www.amazon.com/s?k={query.replace(' ', '+')}&i=digital-text&sprefix={query.replace(' ', '+')}"
-                f"%2Cdigital-text&ref=nb_sb_noss",
-                headers=self.headers)
-            results.raise_for_status()
+            try:
+                results = self.session.get(
+                    f"https://www.amazon.com/s?k={query.replace(' ', '+')}&i=digital-text&sprefix={query.replace(' ', '+')}"
+                    f"%2Cdigital-text&ref=nb_sb_noss",
+                    headers=self.headers)
+                results.raise_for_status()
+            except requests.exceptions.HTTPError as e:
+                log.error_or_exception(e)
+                return None
+            except Exception as e:
+                log.warning(e)
+                return None
             soup = BS(results.text, 'html.parser')
             links_list = [next(filter(lambda i: "digital-text" in i["href"], x.findAll("a")))["href"] for x in
                           soup.findAll("div", attrs={"data-component-type": "s-search-result"})]
             with concurrent.futures.ThreadPoolExecutor(max_workers=5) as executor:
                 fut = {executor.submit(inner, link, index) for index, link in enumerate(links_list[:5])}
-                val=list(map(lambda x : x.result() ,concurrent.futures.as_completed(fut)))
-                result=list(filter(lambda x: x, val))
+                val = list(map(lambda x : x.result() ,concurrent.futures.as_completed(fut)))
+                result = list(filter(lambda x: x, val))
             return [x[0] for x in sorted(result, key=itemgetter(1))] #sort by amazons listing order for best relevance

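Aside (not in the commit): every metadata provider touched by this merge gets the same treatment: wrap the outbound request in try/except, log a warning and return None, so a single failing source cannot break the whole metadata search. Reduced to a helper, the shared shape is (illustrative only):

    import logging
    import requests

    log = logging.getLogger(__name__)

    def safe_get_json(url, **kwargs):
        """Fetch url and return the parsed JSON body, or None if the request fails."""
        try:
            response = requests.get(url, timeout=10, **kwargs)
            response.raise_for_status()
            return response.json()
        except Exception as ex:
            log.warning(ex)
            return None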
cps/metadata_provider/comicvine.py

@@ -21,8 +21,11 @@ from typing import Dict, List, Optional
 from urllib.parse import quote

 import requests
+from cps import logger
 from cps.services.Metadata import MetaRecord, MetaSourceInfo, Metadata

+log = logger.create()
+

 class ComicVine(Metadata):
     __name__ = "ComicVine"

@@ -46,10 +49,15 @@ class ComicVine(Metadata):
             if title_tokens:
                 tokens = [quote(t.encode("utf-8")) for t in title_tokens]
                 query = "%20".join(tokens)
-            result = requests.get(
-                f"{ComicVine.BASE_URL}{query}{ComicVine.QUERY_PARAMS}",
-                headers=ComicVine.HEADERS,
-            )
+            try:
+                result = requests.get(
+                    f"{ComicVine.BASE_URL}{query}{ComicVine.QUERY_PARAMS}",
+                    headers=ComicVine.HEADERS,
+                )
+                result.raise_for_status()
+            except Exception as e:
+                log.warning(e)
+                return None
             for result in result.json()["results"]:
                 match = self._parse_search_result(
                     result=result, generic_cover=generic_cover, locale=locale

cps/metadata_provider/douban.py (new file, 206 lines)

@@ -0,0 +1,206 @@ (entire file added)
# -*- coding: utf-8 -*-

# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
# Copyright (C) 2022 xlivevil
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import re
from concurrent import futures
from typing import List, Optional

import requests
from html2text import HTML2Text
from lxml import etree

from cps import logger
from cps.services.Metadata import Metadata, MetaRecord, MetaSourceInfo

log = logger.create()


def html2text(html: str) -> str:

    h2t = HTML2Text()
    h2t.body_width = 0
    h2t.single_line_break = True
    h2t.emphasis_mark = "*"
    return h2t.handle(html)


class Douban(Metadata):
    __name__ = "豆瓣"
    __id__ = "douban"
    DESCRIPTION = "豆瓣"
    META_URL = "https://book.douban.com/"
    SEARCH_URL = "https://www.douban.com/j/search"

    ID_PATTERN = re.compile(r"sid: (?P<id>\d+),")
    AUTHORS_PATTERN = re.compile(r"作者|译者")
    PUBLISHER_PATTERN = re.compile(r"出版社")
    SUBTITLE_PATTERN = re.compile(r"副标题")
    PUBLISHED_DATE_PATTERN = re.compile(r"出版年")
    SERIES_PATTERN = re.compile(r"丛书")
    IDENTIFIERS_PATTERN = re.compile(r"ISBN|统一书号")

    TITTLE_XPATH = "//span[@property='v:itemreviewed']"
    COVER_XPATH = "//a[@class='nbg']"
    INFO_XPATH = "//*[@id='info']//span[@class='pl']"
    TAGS_XPATH = "//a[contains(@class, 'tag')]"
    DESCRIPTION_XPATH = "//div[@id='link-report']//div[@class='intro']"
    RATING_XPATH = "//div[@class='rating_self clearfix']/strong"

    session = requests.Session()
    session.headers = {
        'user-agent':
            'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/98.0.4758.102 Safari/537.36 Edg/98.0.1108.56',
    }

    def search(
        self, query: str, generic_cover: str = "", locale: str = "en"
    ) -> Optional[List[MetaRecord]]:
        if self.active:
            log.debug(f"starting search {query} on douban")
            if title_tokens := list(
                self.get_title_tokens(query, strip_joiners=False)
            ):
                query = "+".join(title_tokens)

            try:
                r = self.session.get(
                    self.SEARCH_URL, params={"cat": 1001, "q": query}
                )
                r.raise_for_status()

            except Exception as e:
                log.warning(e)
                return None

            results = r.json()
            if results["total"] == 0:
                return []

            book_id_list = [
                self.ID_PATTERN.search(item).group("id")
                for item in results["items"][:10] if self.ID_PATTERN.search(item)
            ]

            with futures.ThreadPoolExecutor(max_workers=5) as executor:

                fut = [
                    executor.submit(self._parse_single_book, book_id, generic_cover)
                    for book_id in book_id_list
                ]

                val = [
                    future.result()
                    for future in futures.as_completed(fut) if future.result()
                ]

            return val

    def _parse_single_book(
        self, id: str, generic_cover: str = ""
    ) -> Optional[MetaRecord]:
        url = f"https://book.douban.com/subject/{id}/"

        try:
            r = self.session.get(url)
            r.raise_for_status()
        except Exception as e:
            log.warning(e)
            return None

        match = MetaRecord(
            id=id,
            title="",
            authors=[],
            url=url,
            source=MetaSourceInfo(
                id=self.__id__,
                description=self.DESCRIPTION,
                link=self.META_URL,
            ),
        )

        html = etree.HTML(r.content.decode("utf8"))

        match.title = html.xpath(self.TITTLE_XPATH)[0].text
        match.cover = html.xpath(self.COVER_XPATH)[0].attrib["href"] or generic_cover
        try:
            rating_num = float(html.xpath(self.RATING_XPATH)[0].text.strip())
        except Exception:
            rating_num = 0
        match.rating = int(-1 * rating_num // 2 * -1) if rating_num else 0

        tag_elements = html.xpath(self.TAGS_XPATH)
        if len(tag_elements):
            match.tags = [tag_element.text for tag_element in tag_elements]

        description_element = html.xpath(self.DESCRIPTION_XPATH)
        if len(description_element):
            match.description = html2text(etree.tostring(
                description_element[-1], encoding="utf8").decode("utf8"))

        info = html.xpath(self.INFO_XPATH)

        for element in info:
            text = element.text
            if self.AUTHORS_PATTERN.search(text):
                next = element.getnext()
                while next is not None and next.tag != "br":
                    match.authors.append(next.text)
                    next = next.getnext()
            elif self.PUBLISHER_PATTERN.search(text):
                match.publisher = element.tail.strip()
            elif self.SUBTITLE_PATTERN.search(text):
                match.title = f'{match.title}:' + element.tail.strip()
            elif self.PUBLISHED_DATE_PATTERN.search(text):
                match.publishedDate = self._clean_date(element.tail.strip())
            elif self.SUBTITLE_PATTERN.search(text):
                match.series = element.getnext().text
            elif i_type := self.IDENTIFIERS_PATTERN.search(text):
                match.identifiers[i_type.group()] = element.tail.strip()

        return match


    def _clean_date(self, date: str) -> str:
        """
        Clean up the date string to be in the format YYYY-MM-DD

        Examples of possible patterns:
            '2014-7-16', '1988年4月', '1995-04', '2021-8', '2020-12-1', '1996年',
            '1972', '2004/11/01', '1959年3月北京第1版第1印'
        """
        year = date[:4]
        moon = "01"
        day = "01"

        if len(date) > 5:
            digit = []
            ls = []
            for i in range(5, len(date)):
                if date[i].isdigit():
                    digit.append(date[i])
                elif digit:
                    ls.append("".join(digit) if len(digit)==2 else f"0{digit[0]}")
                    digit = []
            if digit:
                ls.append("".join(digit) if len(digit)==2 else f"0{digit[0]}")

            moon = ls[0]
            if len(ls)>1:
                day = ls[1]

        return f"{year}-{moon}-{day}"

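Aside (not in the commit): _clean_date normalizes the loosely formatted dates Douban returns into YYYY-MM-DD. Tracing the code above against the patterns listed in its docstring gives, for example:

    # _clean_date("2014-7-16")  -> "2014-07-16"
    # _clean_date("1988年4月")   -> "1988-04-01"
    # _clean_date("2021-8")     -> "2021-08-01"
    # _clean_date("1972")       -> "1972-01-01"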
cps/metadata_provider/google.py

@@ -22,9 +22,12 @@ from urllib.parse import quote

 import requests

+from cps import logger
 from cps.isoLanguages import get_lang3, get_language_name
 from cps.services.Metadata import MetaRecord, MetaSourceInfo, Metadata

+log = logger.create()
+

 class Google(Metadata):
     __name__ = "Google"

@@ -45,8 +48,13 @@ class Google(Metadata):
             if title_tokens:
                 tokens = [quote(t.encode("utf-8")) for t in title_tokens]
                 query = "+".join(tokens)
-            results = requests.get(Google.SEARCH_URL + query)
-            for result in results.json()["items"]:
+            try:
+                results = requests.get(Google.SEARCH_URL + query)
+                results.raise_for_status()
+            except Exception as e:
+                log.warning(e)
+                return None
+            for result in results.json().get("items", []):
                 val.append(
                     self._parse_search_result(
                         result=result, generic_cover=generic_cover, locale=locale

cps/metadata_provider/lubimyczytac.py

@@ -27,9 +27,12 @@ from html2text import HTML2Text
 from lxml.html import HtmlElement, fromstring, tostring
 from markdown2 import Markdown

+from cps import logger
 from cps.isoLanguages import get_language_name
 from cps.services.Metadata import MetaRecord, MetaSourceInfo, Metadata

+log = logger.create()
+
 SYMBOLS_TO_TRANSLATE = (
     "öÖüÜóÓőŐúÚéÉáÁűŰíÍąĄćĆęĘłŁńŃóÓśŚźŹżŻ",
     "oOuUoOoOuUeEaAuUiIaAcCeElLnNoOsSzZzZ",

@@ -112,7 +115,12 @@ class LubimyCzytac(Metadata):
         self, query: str, generic_cover: str = "", locale: str = "en"
     ) -> Optional[List[MetaRecord]]:
         if self.active:
-            result = requests.get(self._prepare_query(title=query))
+            try:
+                result = requests.get(self._prepare_query(title=query))
+                result.raise_for_status()
+            except Exception as e:
+                log.warning(e)
+                return None
             root = fromstring(result.text)
             lc_parser = LubimyCzytacParser(root=root, metadata=self)
             matches = lc_parser.parse_search_results()

@@ -200,7 +208,12 @@ class LubimyCzytacParser:
     def parse_single_book(
         self, match: MetaRecord, generic_cover: str, locale: str
     ) -> MetaRecord:
-        response = requests.get(match.url)
+        try:
+            response = requests.get(match.url)
+            response.raise_for_status()
+        except Exception as e:
+            log.warning(e)
+            return None
         self.root = fromstring(response.text)
         match.cover = self._parse_cover(generic_cover=generic_cover)
         match.description = self._parse_description()

@@ -17,7 +17,7 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 import itertools
 from typing import Dict, List, Optional
-from urllib.parse import quote
+from urllib.parse import quote, unquote
 
 try:
     from fake_useragent.errors import FakeUserAgentError
@@ -28,8 +28,12 @@ try:
 except FakeUserAgentError:
     raise ImportError("No module named 'scholarly'")
 
+from cps import logger
 from cps.services.Metadata import MetaRecord, MetaSourceInfo, Metadata
 
+log = logger.create()
+
 
 class scholar(Metadata):
     __name__ = "Google Scholar"
     __id__ = "googlescholar"
@@ -44,10 +48,14 @@ class scholar(Metadata):
             if title_tokens:
                 tokens = [quote(t.encode("utf-8")) for t in title_tokens]
                 query = " ".join(tokens)
-            scholar_gen = itertools.islice(scholarly.search_pubs(query), 10)
+            try:
+                scholar_gen = itertools.islice(scholarly.search_pubs(query), 10)
+            except Exception as e:
+                log.warning(e)
+                return None
             for result in scholar_gen:
                 match = self._parse_search_result(
-                    result=result, generic_cover=generic_cover, locale=locale
+                    result=result, generic_cover="", locale=locale
                 )
                 val.append(match)
         return val
@@ -66,7 +74,7 @@ class scholar(Metadata):
         )
 
         match.cover = result.get("image", {}).get("original_url", generic_cover)
-        match.description = result["bib"].get("abstract", "")
+        match.description = unquote(result["bib"].get("abstract", ""))
         match.publisher = result["bib"].get("venue", "")
         match.publishedDate = result["bib"].get("pub_year") + "-01-01"
         match.identifiers = {"scholar": match.id}
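Note: with these changes a provider's search() either returns a list of MetaRecord objects or None, and the aggregating endpoint later keeps only truthy results (see the "if x" filter further down in this commit). A small sketch of that contract, with a hypothetical provider list:

def collect_metadata(providers, query):
    # A provider that failed (returned None) is simply skipped instead of
    # poisoning the combined result list.
    results = []
    for provider in providers:
        hits = provider.search(query)
        if hits:
            results.extend(hits)
    return results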
@@ -149,7 +149,7 @@ def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider
             log.info("Link to {} Succeeded".format(provider_name))
             return redirect(url_for('web.profile'))
         except Exception as ex:
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
             ub.session.rollback()
     else:
         flash(_(u"Login failed, No User Linked With OAuth Account"), category="error")
@@ -197,7 +197,7 @@ def unlink_oauth(provider):
            flash(_(u"Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
            log.info("Unlink to {} Succeeded".format(oauth_check[provider]))
        except Exception as ex:
-           log.debug_or_exception(ex)
+           log.error_or_exception(ex)
            ub.session.rollback()
            flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
    except NoResultFound:
cps/opds.py
@@ -27,15 +27,14 @@ from functools import wraps
 from flask import Blueprint, request, render_template, Response, g, make_response, abort
 from flask_login import current_user
 from sqlalchemy.sql.expression import func, text, or_, and_, true
+from sqlalchemy.exc import InvalidRequestError, OperationalError
 from werkzeug.security import check_password_hash
-from tornado.httputil import HTTPServerRequest
 from . import constants, logger, config, db, calibre_db, ub, services, get_locale, isoLanguages
 from .helper import get_download_link, get_book_cover
 from .pagination import Pagination
 from .web import render_read_books
 from .usermanagement import load_user_from_request
 from flask_babel import gettext as _
 
 opds = Blueprint('opds', __name__)
 
 log = logger.create()
@@ -86,7 +85,7 @@ def feed_osd():
 @requires_basic_auth_if_no_ano
 def feed_cc_search(query):
     # Handle strange query from Libera Reader with + instead of spaces
-    plus_query = unquote_plus(request.base_url.split('/opds/search/')[1]).strip()
+    plus_query = unquote_plus(request.environ['RAW_URI'].split('/opds/search/')[1]).strip()
     return feed_search(plus_query)
@@ -99,26 +98,7 @@ def feed_normal_search():
 @opds.route("/opds/books")
 @requires_basic_auth_if_no_ano
 def feed_booksindex():
-    shift = 0
-    off = int(request.args.get("offset") or 0)
-    entries = calibre_db.session.query(func.upper(func.substr(db.Books.sort, 1, 1)).label('id'))\
-        .filter(calibre_db.common_filters()).group_by(func.upper(func.substr(db.Books.sort, 1, 1))).all()
-
-    elements = []
-    if off == 0:
-        elements.append({'id': "00", 'name':_("All")})
-        shift = 1
-    for entry in entries[
-                 off + shift - 1:
-                 int(off + int(config.config_books_per_page) - shift)]:
-        elements.append({'id': entry.id, 'name': entry.id})
-
-    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(entries) + 1)
-    return render_xml_template('feed.xml',
-                               letterelements=elements,
-                               folder='opds.feed_letter_books',
-                               pagination=pagination)
+    return render_element_index(db.Books.sort, None, 'opds.feed_letter_books')
 
 
 @opds.route("/opds/books/letter/<book_id>")
@@ -129,7 +109,8 @@ def feed_letter_books(book_id):
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
                                                         db.Books,
                                                         letter,
-                                                        [db.Books.sort])
+                                                        [db.Books.sort],
+                                                        True, config.config_read_column)
 
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
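Note: every feed in this file converts the "offset" query argument into the 1-based page number that fill_indexpage() expects via (int(off) / (int(config.config_books_per_page)) + 1). A quick illustration of that arithmetic, assuming 60 books per page (the per-page value itself is configurable):

books_per_page = 60

def page_from_offset(off):
    # offset 0 -> page 1, offset 60 -> page 2, offset 120 -> page 3, ...
    return int(int(off) / books_per_page) + 1

assert page_from_offset(0) == 1
assert page_from_offset(60) == 2
assert page_from_offset(120) == 3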
@@ -139,15 +120,16 @@ def feed_letter_books(book_id):
 def feed_new():
     off = request.args.get("offset") or 0
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
-                                                        db.Books, True, [db.Books.timestamp.desc()])
+                                                        db.Books, True, [db.Books.timestamp.desc()],
+                                                        True, config.config_read_column)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
 
 
 @opds.route("/opds/discover")
 @requires_basic_auth_if_no_ano
 def feed_discover():
-    entries = calibre_db.session.query(db.Books).filter(calibre_db.common_filters()).order_by(func.random())\
-        .limit(config.config_books_per_page)
+    query = calibre_db.generate_linked_query(config.config_read_column, db.Books)
+    entries = query.filter(calibre_db.common_filters()).order_by(func.random()).limit(config.config_books_per_page)
     pagination = Pagination(1, config.config_books_per_page, int(config.config_books_per_page))
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@@ -158,7 +140,8 @@ def feed_best_rated():
     off = request.args.get("offset") or 0
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
                                                         db.Books, db.Books.ratings.any(db.Ratings.rating > 9),
-                                                        [db.Books.timestamp.desc()])
+                                                        [db.Books.timestamp.desc()],
+                                                        True, config.config_read_column)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@@ -171,43 +154,23 @@ def feed_hot():
     hot_books = all_books.offset(off).limit(config.config_books_per_page)
     entries = list()
     for book in hot_books:
-        downloadBook = calibre_db.get_book(book.Downloads.book_id)
-        if downloadBook:
-            entries.append(
-                calibre_db.get_filtered_book(book.Downloads.book_id)
-            )
+        query = calibre_db.generate_linked_query(config.config_read_column, db.Books)
+        download_book = query.filter(calibre_db.common_filters()).filter(
+            book.Downloads.book_id == db.Books.id).first()
+        if download_book:
+            entries.append(download_book)
         else:
             ub.delete_download(book.Downloads.book_id)
-    numBooks = entries.__len__()
+    num_books = entries.__len__()
     pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1),
-                            config.config_books_per_page, numBooks)
+                            config.config_books_per_page, num_books)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
 
 
 @opds.route("/opds/author")
 @requires_basic_auth_if_no_ano
 def feed_authorindex():
-    shift = 0
-    off = int(request.args.get("offset") or 0)
-    entries = calibre_db.session.query(func.upper(func.substr(db.Authors.sort, 1, 1)).label('id'))\
-        .join(db.books_authors_link).join(db.Books).filter(calibre_db.common_filters())\
-        .group_by(func.upper(func.substr(db.Authors.sort, 1, 1))).all()
-
-    elements = []
-    if off == 0:
-        elements.append({'id': "00", 'name':_("All")})
-        shift = 1
-    for entry in entries[
-                 off + shift - 1:
-                 int(off + int(config.config_books_per_page) - shift)]:
-        elements.append({'id': entry.id, 'name': entry.id})
-
-    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(entries) + 1)
-    return render_xml_template('feed.xml',
-                               letterelements=elements,
-                               folder='opds.feed_letter_author',
-                               pagination=pagination)
+    return render_element_index(db.Authors.sort, db.books_authors_link, 'opds.feed_letter_author')
 
 
 @opds.route("/opds/author/letter/<book_id>")
@@ -228,12 +191,7 @@ def feed_letter_author(book_id):
 @opds.route("/opds/author/<int:book_id>")
 @requires_basic_auth_if_no_ano
 def feed_author(book_id):
-    off = request.args.get("offset") or 0
-    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
-                                                        db.Books,
-                                                        db.Books.authors.any(db.Authors.id == book_id),
-                                                        [db.Books.timestamp.desc()])
-    return render_xml_template('feed.xml', entries=entries, pagination=pagination)
+    return render_xml_dataset(db.Authors, book_id)
 
 
 @opds.route("/opds/publisher")
@@ -254,37 +212,14 @@ def feed_publisherindex():
 @opds.route("/opds/publisher/<int:book_id>")
 @requires_basic_auth_if_no_ano
 def feed_publisher(book_id):
-    off = request.args.get("offset") or 0
-    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
-                                                        db.Books,
-                                                        db.Books.publishers.any(db.Publishers.id == book_id),
-                                                        [db.Books.timestamp.desc()])
-    return render_xml_template('feed.xml', entries=entries, pagination=pagination)
+    return render_xml_dataset(db.Publishers, book_id)
 
 
 @opds.route("/opds/category")
 @requires_basic_auth_if_no_ano
 def feed_categoryindex():
-    shift = 0
-    off = int(request.args.get("offset") or 0)
-    entries = calibre_db.session.query(func.upper(func.substr(db.Tags.name, 1, 1)).label('id'))\
-        .join(db.books_tags_link).join(db.Books).filter(calibre_db.common_filters())\
-        .group_by(func.upper(func.substr(db.Tags.name, 1, 1))).all()
-    elements = []
-    if off == 0:
-        elements.append({'id': "00", 'name':_("All")})
-        shift = 1
-    for entry in entries[
-                 off + shift - 1:
-                 int(off + int(config.config_books_per_page) - shift)]:
-        elements.append({'id': entry.id, 'name': entry.id})
-
-    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(entries) + 1)
-    return render_xml_template('feed.xml',
-                               letterelements=elements,
-                               folder='opds.feed_letter_category',
-                               pagination=pagination)
+    return render_element_index(db.Tags.name, db.books_tags_link, 'opds.feed_letter_category')
 
 @opds.route("/opds/category/letter/<book_id>")
 @requires_basic_auth_if_no_ano
@@ -306,36 +241,14 @@ def feed_letter_category(book_id):
 @opds.route("/opds/category/<int:book_id>")
 @requires_basic_auth_if_no_ano
 def feed_category(book_id):
-    off = request.args.get("offset") or 0
-    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
-                                                        db.Books,
-                                                        db.Books.tags.any(db.Tags.id == book_id),
-                                                        [db.Books.timestamp.desc()])
-    return render_xml_template('feed.xml', entries=entries, pagination=pagination)
+    return render_xml_dataset(db.Tags, book_id)
 
 
 @opds.route("/opds/series")
 @requires_basic_auth_if_no_ano
 def feed_seriesindex():
-    shift = 0
-    off = int(request.args.get("offset") or 0)
-    entries = calibre_db.session.query(func.upper(func.substr(db.Series.sort, 1, 1)).label('id'))\
-        .join(db.books_series_link).join(db.Books).filter(calibre_db.common_filters())\
-        .group_by(func.upper(func.substr(db.Series.sort, 1, 1))).all()
-    elements = []
-    if off == 0:
-        elements.append({'id': "00", 'name':_("All")})
-        shift = 1
-    for entry in entries[
-                 off + shift - 1:
-                 int(off + int(config.config_books_per_page) - shift)]:
-        elements.append({'id': entry.id, 'name': entry.id})
-    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                            len(entries) + 1)
-    return render_xml_template('feed.xml',
-                               letterelements=elements,
-                               folder='opds.feed_letter_series',
-                               pagination=pagination)
+    return render_element_index(db.Series.sort, db.books_series_link, 'opds.feed_letter_series')
 
 @opds.route("/opds/series/letter/<book_id>")
 @requires_basic_auth_if_no_ano
@@ -361,7 +274,8 @@ def feed_series(book_id):
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
                                                         db.Books,
                                                         db.Books.series.any(db.Series.id == book_id),
-                                                        [db.Books.series_index])
+                                                        [db.Books.series_index],
+                                                        True, config.config_read_column)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@@ -370,7 +284,7 @@ def feed_series(book_id):
 def feed_ratingindex():
     off = request.args.get("offset") or 0
     entries = calibre_db.session.query(db.Ratings, func.count('books_ratings_link.book').label('count'),
                                        (db.Ratings.rating / 2).label('name')) \
         .join(db.books_ratings_link)\
         .join(db.Books)\
         .filter(calibre_db.common_filters()) \
@@ -388,12 +302,7 @@ def feed_ratingindex():
 @opds.route("/opds/ratings/<book_id>")
 @requires_basic_auth_if_no_ano
 def feed_ratings(book_id):
-    off = request.args.get("offset") or 0
-    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
-                                                        db.Books,
-                                                        db.Books.ratings.any(db.Ratings.id == book_id),
-                                                        [db.Books.timestamp.desc()])
-    return render_xml_template('feed.xml', entries=entries, pagination=pagination)
+    return render_xml_dataset(db.Ratings, book_id)
 
 
 @opds.route("/opds/formats")
@@ -420,7 +329,8 @@ def feed_format(book_id):
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
                                                         db.Books,
                                                         db.Books.data.any(db.Data.format == book_id.upper()),
-                                                        [db.Books.timestamp.desc()])
+                                                        [db.Books.timestamp.desc()],
+                                                        True, config.config_read_column)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
 
 
@@ -447,7 +357,8 @@ def feed_languages(book_id):
     entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
                                                         db.Books,
                                                         db.Books.languages.any(db.Languages.id == book_id),
-                                                        [db.Books.timestamp.desc()])
+                                                        [db.Books.timestamp.desc()],
+                                                        True, config.config_read_column)
     return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@@ -477,13 +388,25 @@ def feed_shelf(book_id):
     result = list()
     # user is allowed to access shelf
     if shelf:
-        books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == book_id).order_by(
-            ub.BookShelf.order.asc()).all()
-        for book in books_in_shelf:
-            cur_book = calibre_db.get_book(book.book_id)
-            result.append(cur_book)
-        pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
-                                len(result))
+        result, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1),
+                                                           config.config_books_per_page,
+                                                           db.Books,
+                                                           ub.BookShelf.shelf == shelf.id,
+                                                           [ub.BookShelf.order.asc()],
+                                                           True, config.config_read_column,
+                                                           ub.BookShelf, ub.BookShelf.book_id == db.Books.id)
+        # delete shelf entries where book is not existent anymore, can happen if book is deleted outside calibre-web
+        wrong_entries = calibre_db.session.query(ub.BookShelf) \
+            .join(db.Books, ub.BookShelf.book_id == db.Books.id, isouter=True) \
+            .filter(db.Books.id == None).all()
+        for entry in wrong_entries:
+            log.info('Not existing book {} in {} deleted'.format(entry.book_id, shelf))
+            try:
+                ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == entry.book_id).delete()
+                ub.session.commit()
+            except (OperationalError, InvalidRequestError) as e:
+                ub.session.rollback()
+                log.error_or_exception("Settings Database error: {}".format(e))
     return render_xml_template('feed.xml', entries=result, pagination=pagination)
@@ -491,7 +414,7 @@ def feed_shelf(book_id):
 @requires_basic_auth_if_no_ano
 def opds_download_link(book_id, book_format):
     # I gave up with this: With enabled ldap login, the user doesn't get logged in, therefore it's always guest
-    # workaround, loading the user from the request and checking it's download rights here
+    # workaround, loading the user from the request and checking its download rights here
     # in case of anonymous browsing user is None
     user = load_user_from_request(request) or current_user
     if not user.role_download():
@@ -517,48 +440,6 @@ def get_metadata_calibre_companion(uuid, library):
     return ""
 
 
-def feed_search(term):
-    if term:
-        entries, __, ___ = calibre_db.get_search_results(term, config_read_column=config.config_read_column)
-        entries_count = len(entries) if len(entries) > 0 else 1
-        pagination = Pagination(1, entries_count, entries_count)
-        items = [entry[0] for entry in entries]
-        return render_xml_template('feed.xml', searchterm=term, entries=items, pagination=pagination)
-    else:
-        return render_xml_template('feed.xml', searchterm="")
-
-
-def check_auth(username, password):
-    try:
-        username = username.encode('windows-1252')
-    except UnicodeEncodeError:
-        username = username.encode('utf-8')
-    user = ub.session.query(ub.User).filter(func.lower(ub.User.name) ==
-                                            username.decode('utf-8').lower()).first()
-    if bool(user and check_password_hash(str(user.password), password)):
-        return True
-    else:
-        ip_Address = request.headers.get('X-Forwarded-For', request.remote_addr)
-        log.warning('OPDS Login failed for user "%s" IP-address: %s', username.decode('utf-8'), ip_Address)
-        return False
-
-
-def authenticate():
-    return Response(
-        'Could not verify your access level for that URL.\n'
-        'You have to login with proper credentials', 401,
-        {'WWW-Authenticate': 'Basic realm="Login Required"'})
-
-
-def render_xml_template(*args, **kwargs):
-    # ToDo: return time in current timezone similar to %z
-    currtime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S+00:00")
-    xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, *args, **kwargs)
-    response = make_response(xml)
-    response.headers["Content-Type"] = "application/atom+xml; charset=utf-8"
-    return response
-
-
 @opds.route("/opds/thumb_240_240/<book_id>")
 @opds.route("/opds/cover_240_240/<book_id>")
 @opds.route("/opds/cover_90_90/<book_id>")
@@ -582,3 +463,78 @@ def feed_unread_books():
     off = request.args.get("offset") or 0
     result, pagination = render_read_books(int(off) / (int(config.config_books_per_page)) + 1, False, True)
     return render_xml_template('feed.xml', entries=result, pagination=pagination)
+
+
+def feed_search(term):
+    if term:
+        entries, __, ___ = calibre_db.get_search_results(term, config=config)
+        entries_count = len(entries) if len(entries) > 0 else 1
+        pagination = Pagination(1, entries_count, entries_count)
+        return render_xml_template('feed.xml', searchterm=term, entries=entries, pagination=pagination)
+    else:
+        return render_xml_template('feed.xml', searchterm="")
+
+
+def check_auth(username, password):
+    try:
+        username = username.encode('windows-1252')
+    except UnicodeEncodeError:
+        username = username.encode('utf-8')
+    user = ub.session.query(ub.User).filter(func.lower(ub.User.name) ==
+                                            username.decode('utf-8').lower()).first()
+    if bool(user and check_password_hash(str(user.password), password)):
+        return True
+    else:
+        ip_address = request.headers.get('X-Forwarded-For', request.remote_addr)
+        log.warning('OPDS Login failed for user "%s" IP-address: %s', username.decode('utf-8'), ip_address)
+        return False
+
+
+def authenticate():
+    return Response(
+        'Could not verify your access level for that URL.\n'
+        'You have to login with proper credentials', 401,
+        {'WWW-Authenticate': 'Basic realm="Login Required"'})
+
+
+def render_xml_template(*args, **kwargs):
+    # ToDo: return time in current timezone similar to %z
+    currtime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S+00:00")
+    xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, *args, **kwargs)
+    response = make_response(xml)
+    response.headers["Content-Type"] = "application/atom+xml; charset=utf-8"
+    return response
+
+
+def render_xml_dataset(data_table, book_id):
+    off = request.args.get("offset") or 0
+    entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
+                                                        db.Books,
+                                                        getattr(db.Books, data_table.__tablename__).any(data_table.id == book_id),
+                                                        [db.Books.timestamp.desc()],
+                                                        True, config.config_read_column)
+    return render_xml_template('feed.xml', entries=entries, pagination=pagination)
+
+
+def render_element_index(database_column, linked_table, folder):
+    shift = 0
+    off = int(request.args.get("offset") or 0)
+    entries = calibre_db.session.query(func.upper(func.substr(database_column, 1, 1)).label('id'), None, None)
+    # query = calibre_db.generate_linked_query(config.config_read_column, db.Books)
+    if linked_table is not None:
+        entries = entries.join(linked_table).join(db.Books)
+    entries = entries.filter(calibre_db.common_filters()).group_by(func.upper(func.substr(database_column, 1, 1))).all()
+    elements = []
+    if off == 0:
+        elements.append({'id': "00", 'name': _("All")})
+        shift = 1
+    for entry in entries[
+                 off + shift - 1:
+                 int(off + int(config.config_books_per_page) - shift)]:
+        elements.append({'id': entry.id, 'name': entry.id})
+    pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
+                            len(entries) + 1)
+    return render_xml_template('feed.xml',
+                               letterelements=elements,
+                               folder=folder,
+                               pagination=pagination)
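Note: the two helpers added at the end of the file are what let the index and per-item feeds above collapse to one-liners. The resulting mapping, collected from the hunks in this commit:

# Index feeds delegate to render_element_index(column, link table or None, letter-feed endpoint):
#   feed_booksindex()    -> render_element_index(db.Books.sort, None, 'opds.feed_letter_books')
#   feed_authorindex()   -> render_element_index(db.Authors.sort, db.books_authors_link, 'opds.feed_letter_author')
#   feed_categoryindex() -> render_element_index(db.Tags.name, db.books_tags_link, 'opds.feed_letter_category')
#   feed_seriesindex()   -> render_element_index(db.Series.sort, db.books_series_link, 'opds.feed_letter_series')
# Per-item feeds delegate to render_xml_dataset(linked table, id):
#   feed_author(book_id)    -> render_xml_dataset(db.Authors, book_id)
#   feed_publisher(book_id) -> render_xml_dataset(db.Publishers, book_id)
#   feed_category(book_id)  -> render_xml_dataset(db.Tags, book_id)
#   feed_ratings(book_id)   -> render_xml_dataset(db.Ratings, book_id)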
@@ -57,10 +57,10 @@ class Pagination(object):
     def has_next(self):
         return self.page < self.pages
 
-    # right_edge: last right_edges count of all pages are shown as number, means, if 10 pages are paginated -> 9,10 shwn
-    # left_edge: first left_edges count of all pages are shown as number -> 1,2 shwn
-    # left_current: left_current count below current page are shown as number, means if current page 5 -> 3,4 shwn
-    # left_current: right_current count above current page are shown as number, means if current page 5 -> 6,7 shwn
+    # right_edge: last right_edges count of all pages are shown as number, means, if 10 pages are paginated -> 9,10 shown
+    # left_edge: first left_edges count of all pages are shown as number -> 1,2 shown
+    # left_current: left_current count below current page are shown as number, means if current page 5 -> 3,4 shown
+    # left_current: right_current count above current page are shown as number, means if current page 5 -> 6,7 shown
     def iter_pages(self, left_edge=2, left_current=2,
                    right_current=4, right_edge=2):
         last = 0
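Note: the corrected comments describe which page numbers iter_pages() emits. A sketch of that selection rule in isolation (this is an illustration of the documented behaviour, not the method's actual implementation):

def page_is_shown(num, page, pages, left_edge=2, left_current=2, right_current=4, right_edge=2):
    # A page number is printed if it lies in the left edge, the right edge,
    # or the window around the current page; everything else becomes a gap.
    return (num <= left_edge
            or num > pages - right_edge
            or (page - left_current <= num <= page + right_current))

shown = [n for n in range(1, 11) if page_is_shown(n, page=5, pages=10)]
# -> [1, 2, 3, 4, 5, 6, 7, 8, 9, 10], i.e. with only 10 pages nothing is elided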
@@ -22,6 +22,7 @@
 import json
 from datetime import datetime
+from functools import wraps
 
 from flask import Blueprint, request, make_response, abort, url_for, flash, redirect
 from flask_login import login_required, current_user, login_user
@@ -31,10 +32,6 @@ from sqlalchemy.sql.expression import true
 from . import config, logger, ub
 from .render_template import render_title_template
 
-try:
-    from functools import wraps
-except ImportError:
-    pass # We're not using Python 3
 
 remotelogin = Blueprint('remotelogin', __name__)
 log = logger.create()
@@ -16,13 +16,13 @@
 # You should have received a copy of the GNU General Public License
 # along with this program. If not, see <http://www.gnu.org/licenses/>.
 
-from flask import render_template
+from flask import render_template, request
 from flask_babel import gettext as _
-from flask import g
+from flask import g, abort
 from werkzeug.local import LocalProxy
 from flask_login import current_user
 
-from . import config, constants, ub, logger, db, calibre_db
+from . import config, constants, logger
 from .ub import User
 
 
@@ -30,6 +30,8 @@ log = logger.create()
 
 def get_sidebar_config(kwargs=None):
     kwargs = kwargs or []
+    simple = bool([e for e in ['kindle', 'tolino', "kobo", "bookeen"]
+                   if (e in request.headers.get('User-Agent', "").lower())])
     if 'content' in kwargs:
         content = kwargs['content']
         content = isinstance(content, (User, LocalProxy)) and not content.role_anonymous()
@@ -93,14 +95,14 @@ def get_sidebar_config(kwargs=None):
         {"glyph": "glyphicon-trash", "text": _('Archived Books'), "link": 'web.books_list', "id": "archived",
          "visibility": constants.SIDEBAR_ARCHIVED, 'public': (not g.user.is_anonymous), "page": "archived",
          "show_text": _('Show archived books'), "config_show": content})
-    sidebar.append(
-        {"glyph": "glyphicon-th-list", "text": _('Books List'), "link": 'web.books_table', "id": "list",
-         "visibility": constants.SIDEBAR_LIST, 'public': (not g.user.is_anonymous), "page": "list",
-         "show_text": _('Show Books List'), "config_show": content})
+    if not simple:
+        sidebar.append(
+            {"glyph": "glyphicon-th-list", "text": _('Books List'), "link": 'web.books_table', "id": "list",
+             "visibility": constants.SIDEBAR_LIST, 'public': (not g.user.is_anonymous), "page": "list",
+             "show_text": _('Show Books List'), "config_show": content})
+    return sidebar, simple
 
-    return sidebar
-
-def get_readbooks_ids():
+'''def get_readbooks_ids():
     if not config.config_read_column:
         readBooks = ub.session.query(ub.ReadBook).filter(ub.ReadBook.user_id == int(current_user.id))\
             .filter(ub.ReadBook.read_status == ub.ReadBook.STATUS_FINISHED).all()
@@ -110,13 +112,17 @@ def get_readbooks_ids():
         readBooks = calibre_db.session.query(db.cc_classes[config.config_read_column])\
             .filter(db.cc_classes[config.config_read_column].value == True).all()
         return frozenset([x.book for x in readBooks])
-    except (KeyError, AttributeError):
-        log.error("Custom Column No.%d is not existing in calibre database", config.config_read_column)
-        return []
+    except (KeyError, AttributeError, IndexError):
+        log.error("Custom Column No.{} is not existing in calibre database".format(config.config_read_column))
+        return []'''
 
 
 # Returns the template for rendering and includes the instance name
 def render_title_template(*args, **kwargs):
-    sidebar = get_sidebar_config(kwargs)
-    return render_template(instance=config.config_calibre_web_title, sidebar=sidebar,
-                           accept=constants.EXTENSIONS_UPLOAD, read_book_ids=get_readbooks_ids(),
-                           *args, **kwargs)
+    sidebar, simple = get_sidebar_config(kwargs)
+    try:
+        return render_template(instance=config.config_calibre_web_title, sidebar=sidebar, simple=simple,
+                               accept=constants.EXTENSIONS_UPLOAD,
+                               *args, **kwargs)
+    except PermissionError:
+        log.error("No permission to access {} file.".format(args[0]))
+        abort(403)
cps/schedule.py
Normal file
97
cps/schedule.py
Normal file
|
@ -0,0 +1,97 @@
|
||||||
|
# -*- coding: utf-8 -*-
|
||||||
|
|
||||||
|
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
|
||||||
|
# Copyright (C) 2020 mmonkey
|
||||||
|
#
|
||||||
|
# This program is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# This program is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
|
import datetime
|
||||||
|
|
||||||
|
from . import config, constants
|
||||||
|
from .services.background_scheduler import BackgroundScheduler, use_APScheduler
|
||||||
|
from .tasks.database import TaskReconnectDatabase
|
||||||
|
from .tasks.thumbnail import TaskGenerateCoverThumbnails, TaskGenerateSeriesThumbnails, TaskClearCoverThumbnailCache
|
||||||
|
from .services.worker import WorkerThread
|
||||||
|
|
||||||
|
|
||||||
|
def get_scheduled_tasks(reconnect=True):
|
||||||
|
tasks = list()
|
||||||
|
# config.schedule_reconnect or
|
||||||
|
# Reconnect Calibre database (metadata.db)
|
||||||
|
if reconnect:
|
||||||
|
tasks.append([lambda: TaskReconnectDatabase(), 'reconnect', False])
|
||||||
|
|
||||||
|
# Generate all missing book cover thumbnails
|
||||||
|
if config.schedule_generate_book_covers:
|
||||||
|
tasks.append([lambda: TaskClearCoverThumbnailCache(0), 'delete superfluous book covers', True])
|
||||||
|
tasks.append([lambda: TaskGenerateCoverThumbnails(), 'generate book covers', False])
|
||||||
|
|
||||||
|
# Generate all missing series thumbnails
|
||||||
|
if config.schedule_generate_series_covers:
|
||||||
|
tasks.append([lambda: TaskGenerateSeriesThumbnails(), 'generate book covers', False])
|
||||||
|
|
||||||
|
return tasks
|
||||||
|
|
||||||
|
|
||||||
|
def end_scheduled_tasks():
|
||||||
|
worker = WorkerThread.get_instance()
|
||||||
|
for __, __, __, task, __ in worker.tasks:
|
||||||
|
if task.scheduled and task.is_cancellable:
|
||||||
|
worker.end_task(task.id)
|
||||||
|
|
||||||
|
|
||||||
|
def register_scheduled_tasks(reconnect=True):
|
||||||
|
scheduler = BackgroundScheduler()
|
||||||
|
|
||||||
|
if scheduler:
|
||||||
|
# Remove all existing jobs
|
||||||
|
scheduler.remove_all_jobs()
|
||||||
|
|
||||||
|
start = config.schedule_start_time
|
||||||
|
duration = config.schedule_duration
|
||||||
|
|
||||||
|
# Register scheduled tasks
|
||||||
|
scheduler.schedule_tasks(tasks=get_scheduled_tasks(), trigger='cron', hour=start)
|
||||||
|
end_time = calclulate_end_time(start, duration)
|
||||||
|
scheduler.schedule(func=end_scheduled_tasks, trigger='cron', name="end scheduled task", hour=end_time.hour,
|
||||||
|
minute=end_time.minute)
|
||||||
|
|
||||||
|
# Kick-off tasks, if they should currently be running
|
||||||
|
if should_task_be_running(start, duration):
|
||||||
|
scheduler.schedule_tasks_immediately(tasks=get_scheduled_tasks(reconnect))
|
||||||
|
|
||||||
|
|
||||||
|
def register_startup_tasks():
|
||||||
|
scheduler = BackgroundScheduler()
|
||||||
|
|
||||||
|
if scheduler:
|
||||||
|
start = config.schedule_start_time
|
||||||
|
duration = config.schedule_duration
|
||||||
|
|
||||||
|
# Run scheduled tasks immediately for development and testing
|
||||||
|
# Ignore tasks that should currently be running, as these will be added when registering scheduled tasks
|
||||||
|
if constants.APP_MODE in ['development', 'test'] and not should_task_be_running(start, duration):
|
||||||
|
scheduler.schedule_tasks_immediately(tasks=get_scheduled_tasks(False))
|
||||||
|
|
||||||
|
|
||||||
|
def should_task_be_running(start, duration):
|
||||||
|
now = datetime.datetime.now()
|
||||||
|
start_time = datetime.datetime.now().replace(hour=start, minute=0, second=0, microsecond=0)
|
||||||
|
end_time = start_time + datetime.timedelta(hours=duration // 60, minutes=duration % 60)
|
||||||
|
return start_time < now < end_time
|
||||||
|
|
||||||
|
def calclulate_end_time(start, duration):
|
||||||
|
start_time = datetime.datetime.now().replace(hour=start, minute=0)
|
||||||
|
return start_time + datetime.timedelta(hours=duration // 60, minutes=duration % 60)
|
||||||
|
|
|
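Note: the conversion hours=duration // 60, minutes=duration % 60 in both helpers implies that schedule_duration is stored in minutes. A worked example of the resulting maintenance window, assuming a 4 a.m. start and a 150-minute duration:

import datetime

start = 4        # schedule_start_time: hour of day
duration = 150   # schedule_duration: minutes

start_time = datetime.datetime.now().replace(hour=start, minute=0, second=0, microsecond=0)
end_time = start_time + datetime.timedelta(hours=duration // 60, minutes=duration % 60)
# 150 minutes -> 2 hours 30 minutes, so the window runs from 04:00 to 06:30
# and should_task_be_running() is True only between those two instants.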
@@ -23,7 +23,7 @@ import json
 import os
 import sys
 # from time import time
-from dataclasses import asdict
 
 from flask import Blueprint, Response, request, url_for
 from flask_login import current_user
@@ -32,7 +32,7 @@ from sqlalchemy.exc import InvalidRequestError, OperationalError
 from sqlalchemy.orm.attributes import flag_modified
 
 from cps.services.Metadata import Metadata
-from . import constants, get_locale, logger, ub
+from . import constants, get_locale, logger, ub, web_server
 
 # current_milli_time = lambda: int(round(time() * 1000))
 
@@ -40,6 +40,14 @@ meta = Blueprint("metadata", __name__)
 
 log = logger.create()
 
+try:
+    from dataclasses import asdict
+except ImportError:
+    log.info('*** "dataclasses" is needed for calibre-web to run. Please install it using pip: "pip install dataclasses" ***')
+    print('*** "dataclasses" is needed for calibre-web to run. Please install it using pip: "pip install dataclasses" ***')
+    web_server.stop(True)
+    sys.exit(6)
+
 new_list = list()
 meta_dir = os.path.join(constants.BASE_DIR, "cps", "metadata_provider")
 modules = os.listdir(os.path.join(constants.BASE_DIR, "cps", "metadata_provider"))
@@ -49,7 +57,7 @@ for f in modules:
     try:
         importlib.import_module("cps.metadata_provider." + a)
         new_list.append(a)
-    except ImportError as e:
+    except (ImportError, IndentationError, SyntaxError) as e:
         log.error("Import error for metadata source: {} - {}".format(a, e))
         pass
@@ -130,6 +138,6 @@ def metadata_search():
         if active.get(c.__id__, True)
     }
     for future in concurrent.futures.as_completed(meta):
-        data.extend([asdict(x) for x in future.result()])
+        data.extend([asdict(x) for x in future.result() if x])
     # log.info({'Time elapsed {}'.format(current_milli_time()-start)})
     return Response(json.dumps(data), mimetype="application/json")
@@ -25,6 +25,7 @@ import subprocess # nosec
 
 try:
     from gevent.pywsgi import WSGIServer
+    from .gevent_wsgi import MyWSGIHandler
     from gevent.pool import Pool
     from gevent import __version__ as _version
     from greenlet import GreenletExit
@@ -32,7 +33,7 @@ try:
     VERSION = 'Gevent ' + _version
     _GEVENT = True
 except ImportError:
-    from tornado.wsgi import WSGIContainer
+    from .tornado_wsgi import MyWSGIContainer
     from tornado.httpserver import HTTPServer
     from tornado.ioloop import IOLoop
     from tornado import version as _version
@@ -202,7 +203,8 @@ class WebServer(object):
         if output is None:
             output = _readable_listen_address(self.listen_address, self.listen_port)
         log.info('Starting Gevent server on %s', output)
-        self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, spawn=Pool(), **ssl_args)
+        self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, handler_class=MyWSGIHandler,
+                                     spawn=Pool(), **ssl_args)
         if ssl_args:
             wrap_socket = self.wsgiserver.wrap_socket
             def my_wrap_socket(*args, **kwargs):
@@ -225,8 +227,8 @@ class WebServer(object):
             asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
         log.info('Starting Tornado server on %s', _readable_listen_address(self.listen_address, self.listen_port))
 
-        # Max Buffersize set to 200MB )
-        http_server = HTTPServer(WSGIContainer(self.app),
+        # Max Buffersize set to 200MB
+        http_server = HTTPServer(MyWSGIContainer(self.app),
                                  max_buffer_size=209700000,
                                  ssl_options=self.ssl_args)
         http_server.listen(self.listen_port, self.listen_address)
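Note: the swap to the in-repo MyWSGIHandler and MyWSGIContainer classes appears to line up with the earlier feed_cc_search() change, which reads request.environ['RAW_URI']; presumably the custom handlers exist to expose the raw, un-decoded request line to Flask. A defensive sketch for code consuming that key (the fallback behaviour is an assumption, not taken from the commit):

from urllib.parse import unquote_plus

def raw_search_term(environ):
    # Prefer the unmodified request line when the server provides it (RAW_URI),
    # fall back to PATH_INFO otherwise, so '+' in titles is not decoded twice.
    raw = environ.get('RAW_URI', environ.get('PATH_INFO', ''))
    return unquote_plus(raw.split('/opds/search/')[-1]).strip()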
cps/services/background_scheduler.py (new file)
@@ -0,0 +1,84 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2020 mmonkey
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+import atexit
+
+from .. import logger
+from .worker import WorkerThread
+
+try:
+    from apscheduler.schedulers.background import BackgroundScheduler as BScheduler
+    use_APScheduler = True
+except (ImportError, RuntimeError) as e:
+    use_APScheduler = False
+    log = logger.create()
+    log.info('APScheduler not found. Unable to schedule tasks.')
+
+
+class BackgroundScheduler:
+    _instance = None
+
+    def __new__(cls):
+        if not use_APScheduler:
+            return False
+
+        if cls._instance is None:
+            cls._instance = super(BackgroundScheduler, cls).__new__(cls)
+            cls.log = logger.create()
+            cls.scheduler = BScheduler()
+            cls.scheduler.start()
+
+            atexit.register(lambda: cls.scheduler.shutdown())
+
+        return cls._instance
+
+    def schedule(self, func, trigger, name=None, **trigger_args):
+        if use_APScheduler:
+            return self.scheduler.add_job(func=func, trigger=trigger, name=name, **trigger_args)
+
+    # Expects a lambda expression for the task
+    def schedule_task(self, task, user=None, name=None, hidden=False, trigger='cron', **trigger_args):
+        if use_APScheduler:
+            def scheduled_task():
+                worker_task = task()
+                worker_task.scheduled = True
+                WorkerThread.add(user, worker_task, hidden=hidden)
+            return self.schedule(func=scheduled_task, trigger=trigger, name=name, **trigger_args)
+
+    # Expects a list of lambda expressions for the tasks
+    def schedule_tasks(self, tasks, user=None, trigger='cron', **trigger_args):
+        if use_APScheduler:
+            for task in tasks:
+                self.schedule_task(task[0], user=user, trigger=trigger, name=task[1], hidden=task[2], **trigger_args)
+
+    # Expects a lambda expression for the task
+    def schedule_task_immediately(self, task, user=None, name=None, hidden=False):
+        if use_APScheduler:
+            def immediate_task():
+                WorkerThread.add(user, task(), hidden)
+            return self.schedule(func=immediate_task, trigger='date', name=name)
+
+    # Expects a list of lambda expressions for the tasks
+    def schedule_tasks_immediately(self, tasks, user=None):
+        if use_APScheduler:
+            for task in tasks:
+                self.schedule_task_immediately(task[0], user, name="immediately " + task[1], hidden=task[2])
+
+    # Remove all jobs
+    def remove_all_jobs(self):
+        self.scheduler.remove_all_jobs()
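Note: a short usage sketch of the singleton above, mirroring the call shape cps/schedule.py uses (the job function here is a stand-in, not part of the commit):

from cps.services.background_scheduler import BackgroundScheduler

def nightly_housekeeping():
    print("maintenance window opened")

scheduler = BackgroundScheduler()   # evaluates to False when APScheduler is not installed
if scheduler:
    # same keyword arguments schedule.py passes for its "end scheduled task" job
    scheduler.schedule(func=nightly_housekeeping, trigger='cron', name="example job", hour=4, minute=0)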
@@ -25,7 +25,7 @@ from google.oauth2.credentials import Credentials
 from datetime import datetime
 import base64
 from flask_babel import gettext as _
-from ..constants import BASE_DIR
+from ..constants import CONFIG_DIR
 from .. import logger
 
 
@@ -53,11 +53,11 @@ def setup_gmail(token):
     if creds and creds.expired and creds.refresh_token:
         creds.refresh(Request())
     else:
-        cred_file = os.path.join(BASE_DIR, 'gmail.json')
+        cred_file = os.path.join(CONFIG_DIR, 'gmail.json')
         if not os.path.exists(cred_file):
             raise Exception(_("Found no valid gmail.json file with OAuth information"))
         flow = InstalledAppFlow.from_client_secrets_file(
-            os.path.join(BASE_DIR, 'gmail.json'), SCOPES)
+            os.path.join(CONFIG_DIR, 'gmail.json'), SCOPES)
         creds = flow.run_local_server(port=0)
         user_info = get_user_info(creds)
         return {
@@ -37,11 +37,13 @@ STAT_WAITING = 0
 STAT_FAIL = 1
 STAT_STARTED = 2
 STAT_FINISH_SUCCESS = 3
+STAT_ENDED = 4
+STAT_CANCELLED = 5
 
 # Only retain this many tasks in dequeued list
 TASK_CLEANUP_TRIGGER = 20
 
-QueuedTask = namedtuple('QueuedTask', 'num, user, added, task')
+QueuedTask = namedtuple('QueuedTask', 'num, user, added, task, hidden')
 
 
 def _get_main_thread():
@@ -51,7 +53,6 @@ def _get_main_thread():
     raise Exception("main thread not found?!")
 
 
-
 class ImprovedQueue(queue.Queue):
     def to_list(self):
         """
@@ -61,12 +62,13 @@ class ImprovedQueue(queue.Queue):
         with self.mutex:
             return list(self.queue)
 
+
 # Class for all worker tasks in the background
 class WorkerThread(threading.Thread):
     _instance = None
 
     @classmethod
-    def getInstance(cls):
+    def get_instance(cls):
         if cls._instance is None:
             cls._instance = WorkerThread()
         return cls._instance
@@ -82,15 +84,17 @@ class WorkerThread(threading.Thread):
         self.start()
 
     @classmethod
-    def add(cls, user, task):
-        ins = cls.getInstance()
+    def add(cls, user, task, hidden=False):
+        ins = cls.get_instance()
         ins.num += 1
-        log.debug("Add Task for user: {} - {}".format(user, task))
+        username = user if user is not None else 'System'
+        log.debug("Add Task for user: {} - {}".format(username, task))
         ins.queue.put(QueuedTask(
             num=ins.num,
-            user=user,
+            user=username,
             added=datetime.now(),
             task=task,
+            hidden=hidden
         ))
 
     @property
@@ -111,10 +115,10 @@ class WorkerThread(threading.Thread):
         if delta > TASK_CLEANUP_TRIGGER:
             ret = alive
         else:
-            # otherwise, lop off the oldest dead tasks until we hit the target trigger
-            ret = sorted(dead, key=lambda x: x.task.end_time)[-TASK_CLEANUP_TRIGGER:] + alive
+            # otherwise, loop off the oldest dead tasks until we hit the target trigger
+            ret = sorted(dead, key=lambda y: y.task.end_time)[-TASK_CLEANUP_TRIGGER:] + alive
 
-        self.dequeued = sorted(ret, key=lambda x: x.num)
+        self.dequeued = sorted(ret, key=lambda y: y.num)
 
     # Main thread loop starting the different tasks
     def run(self):
|
@ -141,11 +145,21 @@ class WorkerThread(threading.Thread):
|
||||||
|
|
||||||
# sometimes tasks (like Upload) don't actually have work to do and are created as already finished
|
# sometimes tasks (like Upload) don't actually have work to do and are created as already finished
|
||||||
if item.task.stat is STAT_WAITING:
|
if item.task.stat is STAT_WAITING:
|
||||||
# CalibreTask.start() should wrap all exceptions in it's own error handling
|
# CalibreTask.start() should wrap all exceptions in its own error handling
|
||||||
item.task.start(self)
|
item.task.start(self)
|
||||||
|
|
||||||
|
# remove self_cleanup tasks and hidden "System Tasks" from list
|
||||||
|
if item.task.self_cleanup or item.hidden:
|
||||||
|
self.dequeued.remove(item)
|
||||||
|
|
||||||
self.queue.task_done()
|
self.queue.task_done()
|
||||||
|
|
||||||
|
def end_task(self, task_id):
|
||||||
|
ins = self.get_instance()
|
||||||
|
for __, __, __, task, __ in ins.tasks:
|
||||||
|
if str(task.id) == str(task_id) and task.is_cancellable:
|
||||||
|
task.stat = STAT_CANCELLED if task.stat == STAT_WAITING else STAT_ENDED
|
||||||
|
|
||||||
|
|
||||||
class CalibreTask:
|
class CalibreTask:
|
||||||
__metaclass__ = abc.ABCMeta
|
__metaclass__ = abc.ABCMeta
|
||||||
|
@ -158,10 +172,12 @@ class CalibreTask:
|
||||||
self.end_time = None
|
self.end_time = None
|
||||||
self.message = message
|
self.message = message
|
||||||
self.id = uuid.uuid4()
|
self.id = uuid.uuid4()
|
||||||
|
self.self_cleanup = False
|
||||||
|
self._scheduled = False
|
||||||
|
|
||||||
@abc.abstractmethod
|
@abc.abstractmethod
|
||||||
def run(self, worker_thread):
|
def run(self, worker_thread):
|
||||||
"""Provides the caller some human-readable name for this class"""
|
"""The main entry-point for this task"""
|
||||||
raise NotImplementedError
|
raise NotImplementedError
|
||||||
|
|
||||||
@abc.abstractmethod
|
@abc.abstractmethod
|
||||||
|
@ -169,6 +185,11 @@ class CalibreTask:
|
||||||
"""Provides the caller some human-readable name for this class"""
|
"""Provides the caller some human-readable name for this class"""
|
||||||
raise NotImplementedError
|
raise NotImplementedError
|
||||||
|
|
||||||
|
@abc.abstractmethod
|
||||||
|
def is_cancellable(self):
|
||||||
|
"""Does this task gracefully handle being cancelled (STAT_ENDED, STAT_CANCELLED)?"""
|
||||||
|
raise NotImplementedError
|
||||||
|
|
||||||
def start(self, *args):
|
def start(self, *args):
|
||||||
self.start_time = datetime.now()
|
self.start_time = datetime.now()
|
||||||
self.stat = STAT_STARTED
|
self.stat = STAT_STARTED
|
||||||
|
@ -178,7 +199,7 @@ class CalibreTask:
|
||||||
self.run(*args)
|
self.run(*args)
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
self._handleError(str(ex))
|
self._handleError(str(ex))
|
||||||
log.debug_or_exception(ex)
|
log.error_or_exception(ex)
|
||||||
|
|
||||||
self.end_time = datetime.now()
|
self.end_time = datetime.now()
|
||||||
|
|
||||||
|
@ -219,15 +240,23 @@ class CalibreTask:
|
||||||
We have a separate dictating this because there may be certain tasks that want to override this
|
We have a separate dictating this because there may be certain tasks that want to override this
|
||||||
"""
|
"""
|
||||||
# By default, we're good to clean a task if it's "Done"
|
# By default, we're good to clean a task if it's "Done"
|
||||||
return self.stat in (STAT_FINISH_SUCCESS, STAT_FAIL)
|
return self.stat in (STAT_FINISH_SUCCESS, STAT_FAIL, STAT_ENDED, STAT_CANCELLED)
|
||||||
|
|
||||||
'''@progress.setter
|
@property
|
||||||
def progress(self, x):
|
def self_cleanup(self):
|
||||||
if x > 1:
|
return self._self_cleanup
|
||||||
x = 1
|
|
||||||
if x < 0:
|
@self_cleanup.setter
|
||||||
x = 0
|
def self_cleanup(self, is_self_cleanup):
|
||||||
self._progress = x'''
|
self._self_cleanup = is_self_cleanup
|
||||||
|
|
||||||
|
@property
|
||||||
|
def scheduled(self):
|
||||||
|
return self._scheduled
|
||||||
|
|
||||||
|
@scheduled.setter
|
||||||
|
def scheduled(self, is_scheduled):
|
||||||
|
self._scheduled = is_scheduled
|
||||||
|
|
||||||
def _handleError(self, error_message):
|
def _handleError(self, error_message):
|
||||||
self.stat = STAT_FAIL
|
self.stat = STAT_FAIL
|
||||||
|
|
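A minimal sketch of how a background task could use the worker API changed above. Only CalibreTask, WorkerThread.add(), end_task(), is_cancellable and the STAT_CANCELLED/STAT_ENDED states come from this diff; the class name, the work loop and the use of the progress setter are illustrative assumptions, not part of the commit.

# Illustrative only: a cooperative, cancellable task built on the API shown above.
from cps.services.worker import CalibreTask, WorkerThread, STAT_CANCELLED, STAT_ENDED

class TaskExample(CalibreTask):
    def run(self, worker_thread):
        for step in range(100):
            # end_task() flips self.stat, so a cooperative task checks it and stops early
            if self.stat in (STAT_CANCELLED, STAT_ENDED):
                return
        self._handleSuccess()

    @property
    def name(self):
        return "Example"

    @property
    def is_cancellable(self):
        return True

# Queued without a user and hidden, so it runs as a "System Task"
# that never appears in the visible task list.
WorkerThread.add(None, TaskExample("example message"), hidden=True)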
46 cps/shelf.py
@ -94,10 +94,10 @@ def add_to_shelf(shelf_id, book_id):
         try:
             ub.session.merge(shelf)
             ub.session.commit()
-        except (OperationalError, InvalidRequestError):
+        except (OperationalError, InvalidRequestError) as e:
             ub.session.rollback()
-            log.error("Settings DB is not Writeable")
-            flash(_(u"Settings DB is not Writeable"), category="error")
+            log.error_or_exception("Settings Database error: {}".format(e))
+            flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
@ -154,10 +154,10 @@ def search_to_shelf(shelf_id):
             ub.session.merge(shelf)
             ub.session.commit()
             flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
-        except (OperationalError, InvalidRequestError):
+        except (OperationalError, InvalidRequestError) as e:
             ub.session.rollback()
-            log.error("Settings DB is not Writeable")
-            flash(_("Settings DB is not Writeable"), category="error")
+            log.error_or_exception("Settings Database error: {}".format(e))
+            flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
     else:
         log.error("Could not add books to shelf: {}".format(shelf.name))
         flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
@ -197,10 +197,10 @@ def remove_from_shelf(shelf_id, book_id):
             ub.session.delete(book_shelf)
             shelf.last_modified = datetime.utcnow()
             ub.session.commit()
-        except (OperationalError, InvalidRequestError):
+        except (OperationalError, InvalidRequestError) as e:
             ub.session.rollback()
-            log.error("Settings DB is not Writeable")
-            flash(_("Settings DB is not Writeable"), category="error")
+            log.error_or_exception("Settings Database error: {}".format(e))
+            flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
         if "HTTP_REFERER" in request.environ:
             return redirect(request.environ["HTTP_REFERER"])
         else:
@ -273,12 +273,12 @@ def create_edit_shelf(shelf, page_title, page, shelf_id=False):
             return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
         except (OperationalError, InvalidRequestError) as ex:
             ub.session.rollback()
-            log.debug_or_exception(ex)
-            log.error("Settings DB is not Writeable")
-            flash(_("Settings DB is not Writeable"), category="error")
+            log.error_or_exception(ex)
+            log.error_or_exception("Settings Database error: {}".format(ex))
+            flash(_(u"Database error: %(error)s.", error=ex.orig), category="error")
         except Exception as ex:
             ub.session.rollback()
-            log.debug_or_exception(ex)
+            log.error_or_exception(ex)
             flash(_(u"There was an error"), category="error")
     return render_title_template('shelf_edit.html',
                                  shelf=shelf,
@ -337,10 +337,10 @@ def delete_shelf(shelf_id):
             flash(_("Error deleting Shelf"), category="error")
         else:
             flash(_("Shelf successfully deleted"), category="success")
-    except InvalidRequestError:
+    except InvalidRequestError as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        flash(_("Settings DB is not Writeable"), category="error")
+        log.error_or_exception("Settings Database error: {}".format(e))
+        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
     return redirect(url_for('web.index'))


@ -374,10 +374,10 @@ def order_shelf(shelf_id):
     # if order diffrent from before -> shelf.last_modified = datetime.utcnow()
     try:
         ub.session.commit()
-    except (OperationalError, InvalidRequestError):
+    except (OperationalError, InvalidRequestError) as e:
         ub.session.rollback()
-        log.error("Settings DB is not Writeable")
-        flash(_("Settings DB is not Writeable"), category="error")
+        log.error_or_exception("Settings Database error: {}".format(e))
+        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")

     result = list()
     if shelf:
@ -439,7 +439,7 @@ def render_show_shelf(shelf_type, shelf_id, page_no, sort_param):
                                                           db.Books,
                                                           ub.BookShelf.shelf == shelf_id,
                                                           [ub.BookShelf.order.asc()],
-                                                          False, 0,
+                                                          True, config.config_read_column,
                                                           ub.BookShelf, ub.BookShelf.book_id == db.Books.id)
         # delete chelf entries where book is not existent anymore, can happen if book is deleted outside calibre-web
         wrong_entries = calibre_db.session.query(ub.BookShelf) \
@ -450,10 +450,10 @@ def render_show_shelf(shelf_type, shelf_id, page_no, sort_param):
             try:
                 ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == entry.book_id).delete()
                 ub.session.commit()
-            except (OperationalError, InvalidRequestError):
+            except (OperationalError, InvalidRequestError) as e:
                 ub.session.rollback()
-                log.error("Settings DB is not Writeable")
-                flash(_("Settings DB is not Writeable"), category="error")
+                log.error_or_exception("Settings Database error: {}".format(e))
+                flash(_(u"Database error: %(error)s.", error=e.orig), category="error")

     return render_title_template(page,
                                  entries=result,
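The same commit/rollback/flash pattern is repeated at every call site above. A small sketch of what it looks like pulled into one helper, assuming the objects already used in cps/shelf.py (ub.session, the logger's error_or_exception, Flask's flash); the helper name is hypothetical and not part of the commit.

# Sketch only: consolidate the repeated error handling from cps/shelf.py.
from sqlalchemy.exc import OperationalError, InvalidRequestError
from flask import flash
from flask_babel import gettext as _

def commit_or_flash(session, log):
    try:
        session.commit()
        return True
    except (OperationalError, InvalidRequestError) as e:
        session.rollback()
        log.error_or_exception("Settings Database error: {}".format(e))
        flash(_(u"Database error: %(error)s.", error=e.orig), category="error")
        return False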
@ -5150,7 +5150,7 @@ body.login > div.navbar.navbar-default.navbar-static-top > div > div.navbar-head
     pointer-events: none
 }

-#DeleteDomain:hover:before, #RestartDialog:hover:before, #ShutdownDialog:hover:before, #StatusDialog:hover:before, #deleteButton, #deleteModal:hover:before, body.mailset > div.container-fluid > div > div.col-sm-10 > div.discover td > a:hover {
+#DeleteDomain:hover:before, #RestartDialog:hover:before, #ShutdownDialog:hover:before, #StatusDialog:hover:before, #deleteButton, #deleteModal:hover:before, #cancelTaskModal:hover:before, body.mailset > div.container-fluid > div > div.col-sm-10 > div.discover td > a:hover {
     cursor: pointer
 }
@ -5237,7 +5237,11 @@ body.admin > div.container-fluid > div > div.col-sm-10 > div.container-fluid > d
     margin-bottom: 20px
 }

-body.admin:not(.modal-open) .btn-default {
+body.admin > div.container-fluid div.scheduled_tasks_details {
+    margin-bottom: 20px
+}
+
+body.admin .btn-default {
     margin-bottom: 10px
 }
@ -5468,7 +5472,7 @@ body.admin.modal-open .navbar {
     z-index: 0 !important
 }

-#RestartDialog, #ShutdownDialog, #StatusDialog, #deleteModal {
+#RestartDialog, #ShutdownDialog, #StatusDialog, #deleteModal, #cancelTaskModal {
     top: 0;
     overflow: hidden;
     padding-top: 70px;
@ -5478,7 +5482,7 @@ body.admin.modal-open .navbar {
     background: rgba(0, 0, 0, .5)
 }

-#RestartDialog:before, #ShutdownDialog:before, #StatusDialog:before, #deleteModal:before {
+#RestartDialog:before, #ShutdownDialog:before, #StatusDialog:before, #deleteModal:before, #cancelTaskModal:before {
     content: "\E208";
     padding-right: 10px;
     display: block;
@ -5500,18 +5504,18 @@ body.admin.modal-open .navbar {
     z-index: 99
 }

-#RestartDialog.in:before, #ShutdownDialog.in:before, #StatusDialog.in:before, #deleteModal.in:before {
+#RestartDialog.in:before, #ShutdownDialog.in:before, #StatusDialog.in:before, #deleteModal.in:before, #cancelTaskModal.in:before {
     -webkit-transform: translate(0, 0);
     -ms-transform: translate(0, 0);
     transform: translate(0, 0)
 }

-#RestartDialog > .modal-dialog, #ShutdownDialog > .modal-dialog, #StatusDialog > .modal-dialog, #deleteModal > .modal-dialog {
+#RestartDialog > .modal-dialog, #ShutdownDialog > .modal-dialog, #StatusDialog > .modal-dialog, #deleteModal > .modal-dialog, #cancelTaskModal > .modal-dialog {
     width: 450px;
     margin: auto
 }

-#RestartDialog > .modal-dialog > .modal-content, #ShutdownDialog > .modal-dialog > .modal-content, #StatusDialog > .modal-dialog > .modal-content, #deleteModal > .modal-dialog > .modal-content {
+#RestartDialog > .modal-dialog > .modal-content, #ShutdownDialog > .modal-dialog > .modal-content, #StatusDialog > .modal-dialog > .modal-content, #deleteModal > .modal-dialog > .modal-content, #cancelTaskModal > .modal-dialog > .modal-content {
     max-height: calc(100% - 90px);
     -webkit-box-shadow: 0 5px 15px rgba(0, 0, 0, .5);
     box-shadow: 0 5px 15px rgba(0, 0, 0, .5);
@ -5522,7 +5526,7 @@ body.admin.modal-open .navbar {
     width: 450px
 }

-#RestartDialog > .modal-dialog > .modal-content > .modal-header, #ShutdownDialog > .modal-dialog > .modal-content > .modal-header, #StatusDialog > .modal-dialog > .modal-content > .modal-header, #deleteModal > .modal-dialog > .modal-content > .modal-header {
+#RestartDialog > .modal-dialog > .modal-content > .modal-header, #ShutdownDialog > .modal-dialog > .modal-content > .modal-header, #StatusDialog > .modal-dialog > .modal-content > .modal-header, #deleteModal > .modal-dialog > .modal-content > .modal-header, #cancelTaskModal > .modal-dialog > .modal-content > .modal-header {
     padding: 15px 20px;
     border-radius: 3px 3px 0 0;
     line-height: 1.71428571;
@ -5535,7 +5539,7 @@ body.admin.modal-open .navbar {
     text-align: left
 }

-#RestartDialog > .modal-dialog > .modal-content > .modal-header:before, #ShutdownDialog > .modal-dialog > .modal-content > .modal-header:before, #StatusDialog > .modal-dialog > .modal-content > .modal-header:before, #deleteModal > .modal-dialog > .modal-content > .modal-header:before {
+#RestartDialog > .modal-dialog > .modal-content > .modal-header:before, #ShutdownDialog > .modal-dialog > .modal-content > .modal-header:before, #StatusDialog > .modal-dialog > .modal-content > .modal-header:before, #deleteModal > .modal-dialog > .modal-content > .modal-header:before, #cancelTaskModal > .modal-dialog > .modal-content > .modal-header:before {
     padding-right: 10px;
     font-size: 18px;
     color: #999;
@ -5564,6 +5568,11 @@ body.admin.modal-open .navbar {
     font-family: plex-icons-new, serif
 }

+#cancelTaskModal > .modal-dialog > .modal-content > .modal-header:before {
+    content: "\EA6D";
+    font-family: plex-icons-new, serif
+}
+
 #RestartDialog > .modal-dialog > .modal-content > .modal-header:after {
     content: "Restart Calibre-Web";
     display: inline-block;
@ -5588,7 +5597,13 @@ body.admin.modal-open .navbar {
     font-size: 20px
 }

+#cancelTaskModal > .modal-dialog > .modal-content > .modal-header:after {
+    content: "Delete Book";
+    display: inline-block;
+    font-size: 20px
+}
+
-#StatusDialog > .modal-dialog > .modal-content > .modal-header > span, #deleteModal > .modal-dialog > .modal-content > .modal-header > span, #loader > center > img, .rating-mobile {
+#StatusDialog > .modal-dialog > .modal-content > .modal-header > span, #deleteModal > .modal-dialog > .modal-content > .modal-header > span, #cancelTaskModal > .modal-dialog > .modal-content > .modal-header > span, #loader > center > img, .rating-mobile {
     display: none
 }
@ -5602,7 +5617,7 @@ body.admin.modal-open .navbar {
     text-align: left
 }

-#ShutdownDialog > .modal-dialog > .modal-content > .modal-body, #StatusDialog > .modal-dialog > .modal-content > .modal-body, #deleteModal > .modal-dialog > .modal-content > .modal-body {
+#ShutdownDialog > .modal-dialog > .modal-content > .modal-body, #StatusDialog > .modal-dialog > .modal-content > .modal-body, #deleteModal > .modal-dialog > .modal-content > .modal-body, #cancelTaskModal > .modal-dialog > .modal-content > .modal-body {
     padding: 20px 20px 40px;
     font-size: 16px;
     line-height: 1.6em;
@ -5612,7 +5627,7 @@ body.admin.modal-open .navbar {
     text-align: left
 }

-#RestartDialog > .modal-dialog > .modal-content > .modal-body > p, #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > p, #StatusDialog > .modal-dialog > .modal-content > .modal-body > p, #deleteModal > .modal-dialog > .modal-content > .modal-body > p {
+#RestartDialog > .modal-dialog > .modal-content > .modal-body > p, #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > p, #StatusDialog > .modal-dialog > .modal-content > .modal-body > p, #deleteModal > .modal-dialog > .modal-content > .modal-body > p, #cancelTaskModal > .modal-dialog > .modal-content > .modal-body > p {
     padding: 20px 20px 0 0;
     font-size: 16px;
     line-height: 1.6em;
@ -5621,7 +5636,7 @@ body.admin.modal-open .navbar {
     background: #282828
 }

-#RestartDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#restart), #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#shutdown), #deleteModal > .modal-dialog > .modal-content > .modal-footer > .btn-default {
+#RestartDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#restart), #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#shutdown), #deleteModal > .modal-dialog > .modal-content > .modal-footer > .btn-default, #cancelTaskModal > .modal-dialog > .modal-content > .modal-footer > .btn-default {
     float: right;
     z-index: 9;
     position: relative;
@ -5669,6 +5684,18 @@ body.admin.modal-open .navbar {
     border-radius: 3px
 }

+#cancelTaskModal > .modal-dialog > .modal-content > .modal-footer > .btn-danger {
+    float: right;
+    z-index: 9;
+    position: relative;
+    margin: 0 0 0 10px;
+    min-width: 80px;
+    padding: 10px 18px;
+    font-size: 16px;
+    line-height: 1.33;
+    border-radius: 3px
+}
+
 #RestartDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#restart) {
     margin: 25px 0 0 10px
 }
@ -5681,7 +5708,11 @@ body.admin.modal-open .navbar {
     margin: 0 0 0 10px
 }

+#cancelTaskModal > .modal-dialog > .modal-content > .modal-footer > .btn-default {
+    margin: 0 0 0 10px
+}
+
-#RestartDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#restart):hover, #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#shutdown):hover, #deleteModal > .modal-dialog > .modal-content > .modal-footer > .btn-default:hover {
+#RestartDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#restart):hover, #ShutdownDialog > .modal-dialog > .modal-content > .modal-body > .btn-default:not(#shutdown):hover, #deleteModal > .modal-dialog > .modal-content > .modal-footer > .btn-default:hover, #cancelTaskModal > .modal-dialog > .modal-content > .modal-footer > .btn-default:hover {
     background-color: hsla(0, 0%, 100%, .3)
 }
@ -7303,11 +7334,11 @@ body.edituser.admin > div.container-fluid > div.row-fluid > div.col-sm-10 > div.
     background-color: transparent !important
 }

-#RestartDialog > .modal-dialog, #ShutdownDialog > .modal-dialog, #StatusDialog > .modal-dialog, #deleteModal > .modal-dialog {
+#RestartDialog > .modal-dialog, #ShutdownDialog > .modal-dialog, #StatusDialog > .modal-dialog, #deleteModal > .modal-dialog, #cancelTaskModal > .modal-dialog {
     max-width: calc(100vw - 40px)
 }

-#RestartDialog > .modal-dialog > .modal-content, #ShutdownDialog > .modal-dialog > .modal-content, #StatusDialog > .modal-dialog > .modal-content, #deleteModal > .modal-dialog > .modal-content {
+#RestartDialog > .modal-dialog > .modal-content, #ShutdownDialog > .modal-dialog > .modal-content, #StatusDialog > .modal-dialog > .modal-content, #deleteModal > .modal-dialog > .modal-content, #cancelTaskModal > .modal-dialog > .modal-content {
     max-width: calc(100vw - 40px);
     left: 0
 }
@ -7457,7 +7488,7 @@ body.edituser.admin > div.container-fluid > div.row-fluid > div.col-sm-10 > div.
     padding: 30px 15px
 }

-#RestartDialog.in:before, #ShutdownDialog.in:before, #StatusDialog.in:before, #deleteModal.in:before {
+#RestartDialog.in:before, #ShutdownDialog.in:before, #StatusDialog.in:before, #deleteModal.in:before, #cancelTaskModal.in:before {
     left: auto;
     right: 34px
 }
@ -28,14 +28,24 @@ $("#have_read_cb").on("change", function() {
         data: $(this).closest("form").serialize(),
         error: function(response) {
             var data = [{type:"danger", message:response.responseText}]
-            $("#flash_success").remove();
+            // $("#flash_success").parent().remove();
             $("#flash_danger").remove();
+            $(".row-fluid.text-center").remove();
             if (!jQuery.isEmptyObject(data)) {
-                data.forEach(function (item) {
-                    $(".navbar").after('<div class="row-fluid text-center" >' +
-                        '<div id="flash_' + item.type + '" class="alert alert-' + item.type + '">' + item.message + '</div>' +
-                        '</div>');
-                });
+                $("#have_read_cb").prop("checked", !$("#have_read_cb").prop("checked"));
+                if($("#bookDetailsModal").is(":visible")) {
+                    data.forEach(function (item) {
+                        $(".modal-header").after('<div id="flash_' + item.type +
+                            '" class="text-center alert alert-' + item.type + '">' + item.message + '</div>');
+                    });
+                } else
+                {
+                    data.forEach(function (item) {
+                        $(".navbar").after('<div class="row-fluid text-center" >' +
+                            '<div id="flash_' + item.type + '" class="alert alert-' + item.type + '">' + item.message + '</div>' +
+                            '</div>');
+                    });
+                }
             }
         }
     });
@ -33,7 +33,7 @@ $(".datepicker").datepicker({
     if (results) {
         pubDate = new Date(results[1], parseInt(results[2], 10) - 1, results[3]) || new Date(this.value);
         $(this).next('input')
-            .val(pubDate.toLocaleDateString(language))
+            .val(pubDate.toLocaleDateString(language.replaceAll("_","-")))
             .removeClass("hidden");
     }
 }).trigger("change");
@ -92,14 +92,19 @@ $(function () {
             data: {"query": keyword},
             dataType: "json",
             success: function success(data) {
-                $("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>");
-                data.forEach(function(book) {
-                    var $book = $(templates.bookResult(book));
-                    $book.find("img").on("click", function () {
-                        populateForm(book);
+                if (data.length) {
+                    $("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>");
+                    data.forEach(function(book) {
+                        var $book = $(templates.bookResult(book));
+                        $book.find("img").on("click", function () {
+                            populateForm(book);
+                        });
+                        $("#book-list").append($book);
                     });
-                    $("#book-list").append($book);
-                });
+                }
+                else {
+                    $("#meta-info").html("<p class=\"text-danger\">" + msg.no_result + "!</p>" + $("#meta-info")[0].innerHTML)
+                }
             },
             error: function error() {
                 $("#meta-info").html("<p class=\"text-danger\">" + msg.search_error + "!</p>" + $("#meta-info")[0].innerHTML);
@ -381,8 +381,8 @@ $(function() {
         //extraScrollPx: 300
     });
     $loadMore.on( "append.infiniteScroll", function( event, response, path, data ) {
+        $(".pagination").addClass("hidden").html(() => $(response).find(".pagination").html());
         if ($("body").hasClass("blur")) {
-            $(".pagination").addClass("hidden").html(() => $(response).find(".pagination").html());
             $(" a:not(.dropdown-toggle) ")
                 .removeAttr("data-toggle");
         }
@ -474,6 +474,17 @@ $(function() {
             }
         });
     });
+    $("#admin_refresh_cover_cache").click(function() {
+        confirmDialog("admin_refresh_cover_cache", "GeneralChangeModal", 0, function () {
+            $.ajax({
+                method:"post",
+                contentType: "application/json; charset=utf-8",
+                dataType: "json",
+                url: getPath() + "/ajax/updateThumbnails",
+            });
+        });
+    });

     $("#restart_database").click(function() {
         $("#DialogHeader").addClass("hidden");
         $("#DialogFinished").addClass("hidden");
@ -515,6 +526,7 @@ $(function() {

     $("#bookDetailsModal")
         .on("show.bs.modal", function(e) {
+            $("#flash_danger").remove();
            var $modalBody = $(this).find(".modal-body");

            // Prevent static assets from loading multiple times
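The new admin button above posts to "/ajax/updateThumbnails". A server-side counterpart could look roughly like the sketch below; only the route path comes from the JavaScript, while the blueprint, the placeholder task class and its behaviour are assumptions and not necessarily what Calibre-Web implements.

# Hypothetical handler sketch for the "/ajax/updateThumbnails" call above.
from flask import Blueprint, jsonify
from cps.services.worker import CalibreTask, WorkerThread

thumbs = Blueprint('thumbnails_ajax', __name__)

class TaskRefreshCoverCache(CalibreTask):  # placeholder name, not the real task
    def run(self, worker_thread):
        # a real implementation would regenerate the cached cover files here
        self._handleSuccess()

    @property
    def name(self):
        return "Refresh cover cache"

    @property
    def is_cancellable(self):
        return False

@thumbs.route("/ajax/updateThumbnails", methods=['POST'])
def update_thumbnails():
    # queued hidden and without a user, so it shows up as a "System Task" only in logs
    WorkerThread.add(None, TaskRefreshCoverCache("Refreshing cover thumbnails"), hidden=True)
    return jsonify(result="started")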
@ -15,7 +15,7 @@
  * along with this program. If not, see <http://www.gnu.org/licenses/>.
  */

-/* exported TableActions, RestrictionActions, EbookActions, responseHandler */
+/* exported TableActions, RestrictionActions, EbookActions, TaskActions, responseHandler */
 /* global getPath, confirmDialog */

 var selections = [];
@ -42,6 +42,24 @@ $(function() {
         }, 1000);
     }

+    $("#cancel_task_confirm").click(function() {
+        //get data-id attribute of the clicked element
+        var taskId = $(this).data("task-id");
+        $.ajax({
+            method: "post",
+            contentType: "application/json; charset=utf-8",
+            dataType: "json",
+            url: window.location.pathname + "/../ajax/canceltask",
+            data: JSON.stringify({"task_id": taskId}),
+        });
+    });
+    //triggered when modal is about to be shown
+    $("#cancelTaskModal").on("show.bs.modal", function(e) {
+        //get data-id attribute of the clicked element and store in button
+        var taskId = $(e.relatedTarget).data("task-id");
+        $(e.currentTarget).find("#cancel_task_confirm").data("task-id", taskId);
+    });
+
     $("#books-table").on("check.bs.table check-all.bs.table uncheck.bs.table uncheck-all.bs.table",
         function (e, rowsAfter, rowsBefore) {
             var rows = rowsAfter;
@ -107,8 +125,9 @@ $(function() {
             url: window.location.pathname + "/../ajax/simulatemerge",
             data: JSON.stringify({"Merge_books":selections}),
             success: function success(booTitles) {
+                $('#merge_from').empty();
                 $.each(booTitles.from, function(i, item) {
-                    $("<span>- " + item + "</span>").appendTo("#merge_from");
+                    $("<span>- " + item + "</span><p></p>").appendTo("#merge_from");
                 });
                 $("#merge_to").text("- " + booTitles.to);
@ -531,7 +550,7 @@ $(function() {

     $("#user-table").on("click-cell.bs.table", function (field, value, row, $element) {
         if (value === "denied_column_value") {
-            ConfirmDialog("btndeluser", "GeneralDeleteModal", $element.id, user_handle);
+            confirmDialog("btndeluser", "GeneralDeleteModal", $element.id, user_handle);
         }
     });
@ -581,6 +600,7 @@ function handle_header_buttons () {
         $(".header_select").removeAttr("disabled");
     }
 }

 /* Function for deleting domain restrictions */
 function TableActions (value, row) {
     return [
@ -618,6 +638,19 @@ function UserActions (value, row) {
     ].join("");
 }

+/* Function for cancelling tasks */
+function TaskActions (value, row) {
+    var cancellableStats = [0, 1, 2];
+    if (row.task_id && row.is_cancellable && cancellableStats.includes(row.stat)) {
+        return [
+            "<div class=\"danger task-cancel\" data-toggle=\"modal\" data-target=\"#cancelTaskModal\" data-task-id=\"" + row.task_id + "\" title=\"Cancel\">",
+            "<i class=\"glyphicon glyphicon-ban-circle\"></i>",
+            "</div>"
+        ].join("");
+    }
+    return '';
+}
+
 /* Function for keeping checked rows */
 function responseHandler(res) {
     $.each(res.rows, function (i, row) {
@ -811,11 +844,13 @@ function checkboxChange(checkbox, userId, field, field_index) {

 function BookCheckboxChange(checkbox, userId, field) {
     var value = checkbox.checked ? "True" : "False";
+    var element = checkbox;
     $.ajax({
         method: "post",
         url: getPath() + "/ajax/editbooks/" + field,
         data: {"pk": userId, "value": value},
         error: function(data) {
+            element.checked = !element.checked;
             handleListServerResponse([{type:"danger", message:data.responseText}])
         },
         success: handleListServerResponse
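The cancel button above posts {"task_id": ...} to "/ajax/canceltask". A matching server-side handler could be as small as the sketch below, built on the WorkerThread.end_task() method added in cps/services/worker.py in this commit; the blueprint name and route registration are assumptions, and Calibre-Web's real handler may differ.

# Hypothetical handler sketch for the "/ajax/canceltask" call above.
from flask import Blueprint, request, jsonify
from cps.services.worker import WorkerThread

tasks = Blueprint('tasks_ajax', __name__)

@tasks.route("/ajax/canceltask", methods=['POST'])
def cancel_task():
    task_id = request.get_json().get('task_id', None)
    # end_task() only touches tasks whose is_cancellable property is True
    WorkerThread.get_instance().end_task(task_id)
    return jsonify(success=True)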
@ -18,12 +18,12 @@
|
||||||
|
|
||||||
import os
|
import os
|
||||||
import re
|
import re
|
||||||
|
|
||||||
from glob import glob
|
from glob import glob
|
||||||
from shutil import copyfile
|
from shutil import copyfile
|
||||||
from markupsafe import escape
|
from markupsafe import escape
|
||||||
|
|
||||||
from sqlalchemy.exc import SQLAlchemyError
|
from sqlalchemy.exc import SQLAlchemyError
|
||||||
|
from flask_babel import lazy_gettext as N_
|
||||||
|
|
||||||
from cps.services.worker import CalibreTask
|
from cps.services.worker import CalibreTask
|
||||||
from cps import db
|
from cps import db
|
||||||
|
@ -35,14 +35,16 @@ from cps.ub import init_db_thread
|
||||||
|
|
||||||
from cps.tasks.mail import TaskEmail
|
from cps.tasks.mail import TaskEmail
|
||||||
from cps import gdriveutils
|
from cps import gdriveutils
|
||||||
|
|
||||||
|
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
|
||||||
|
|
||||||
class TaskConvert(CalibreTask):
|
class TaskConvert(CalibreTask):
|
||||||
def __init__(self, file_path, bookid, taskMessage, settings, kindle_mail, user=None):
|
def __init__(self, file_path, book_id, task_message, settings, kindle_mail, user=None):
|
||||||
super(TaskConvert, self).__init__(taskMessage)
|
super(TaskConvert, self).__init__(task_message)
|
||||||
self.file_path = file_path
|
self.file_path = file_path
|
||||||
self.bookid = bookid
|
self.book_id = book_id
|
||||||
self.title = ""
|
self.title = ""
|
||||||
self.settings = settings
|
self.settings = settings
|
||||||
self.kindle_mail = kindle_mail
|
self.kindle_mail = kindle_mail
|
||||||
|
@ -54,9 +56,9 @@ class TaskConvert(CalibreTask):
|
||||||
self.worker_thread = worker_thread
|
self.worker_thread = worker_thread
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
worker_db = db.CalibreDB(expire_on_commit=False)
|
worker_db = db.CalibreDB(expire_on_commit=False)
|
||||||
cur_book = worker_db.get_book(self.bookid)
|
cur_book = worker_db.get_book(self.book_id)
|
||||||
self.title = cur_book.title
|
self.title = cur_book.title
|
||||||
data = worker_db.get_book_format(self.bookid, self.settings['old_book_format'])
|
data = worker_db.get_book_format(self.book_id, self.settings['old_book_format'])
|
||||||
df = gdriveutils.getFileFromEbooksFolder(cur_book.path,
|
df = gdriveutils.getFileFromEbooksFolder(cur_book.path,
|
||||||
data.name + "." + self.settings['old_book_format'].lower())
|
data.name + "." + self.settings['old_book_format'].lower())
|
||||||
if df:
|
if df:
|
||||||
|
@ -87,7 +89,7 @@ class TaskConvert(CalibreTask):
|
||||||
# if we're sending to kindle after converting, create a one-off task and run it immediately
|
# if we're sending to kindle after converting, create a one-off task and run it immediately
|
||||||
# todo: figure out how to incorporate this into the progress
|
# todo: figure out how to incorporate this into the progress
|
||||||
try:
|
try:
|
||||||
EmailText = _(u"%(book)s send to Kindle", book=escape(self.title))
|
EmailText = N_(u"%(book)s send to Kindle", book=escape(self.title))
|
||||||
worker_thread.add(self.user, TaskEmail(self.settings['subject'],
|
worker_thread.add(self.user, TaskEmail(self.settings['subject'],
|
||||||
self.results["path"],
|
self.results["path"],
|
||||||
filename,
|
filename,
|
||||||
|
@ -104,7 +106,7 @@ class TaskConvert(CalibreTask):
|
||||||
error_message = None
|
error_message = None
|
||||||
local_db = db.CalibreDB(expire_on_commit=False)
|
local_db = db.CalibreDB(expire_on_commit=False)
|
||||||
file_path = self.file_path
|
file_path = self.file_path
|
||||||
book_id = self.bookid
|
book_id = self.book_id
|
||||||
format_old_ext = u'.' + self.settings['old_book_format'].lower()
|
format_old_ext = u'.' + self.settings['old_book_format'].lower()
|
||||||
format_new_ext = u'.' + self.settings['new_book_format'].lower()
|
format_new_ext = u'.' + self.settings['new_book_format'].lower()
|
||||||
|
|
||||||
|
@ -112,7 +114,7 @@ class TaskConvert(CalibreTask):
|
||||||
# if it does - mark the conversion task as complete and return a success
|
# if it does - mark the conversion task as complete and return a success
|
||||||
# this will allow send to kindle workflow to continue to work
|
# this will allow send to kindle workflow to continue to work
|
||||||
if os.path.isfile(file_path + format_new_ext) or\
|
if os.path.isfile(file_path + format_new_ext) or\
|
||||||
local_db.get_book_format(self.bookid, self.settings['new_book_format']):
|
local_db.get_book_format(self.book_id, self.settings['new_book_format']):
|
||||||
log.info("Book id %d already converted to %s", book_id, format_new_ext)
|
log.info("Book id %d already converted to %s", book_id, format_new_ext)
|
||||||
cur_book = local_db.get_book(book_id)
|
cur_book = local_db.get_book(book_id)
|
||||||
self.title = cur_book.title
|
self.title = cur_book.title
|
||||||
|
@ -131,7 +133,7 @@ class TaskConvert(CalibreTask):
|
||||||
local_db.session.rollback()
|
local_db.session.rollback()
|
||||||
log.error("Database error: %s", e)
|
log.error("Database error: %s", e)
|
||||||
local_db.session.close()
|
local_db.session.close()
|
||||||
self._handleError(error_message)
|
self._handleError(N_("Database error: %(error)s.", error=e))
|
||||||
return
|
return
|
||||||
self._handleSuccess()
|
self._handleSuccess()
|
||||||
local_db.session.close()
|
local_db.session.close()
|
||||||
|
@ -148,8 +150,7 @@ class TaskConvert(CalibreTask):
|
||||||
else:
|
else:
|
||||||
# check if calibre converter-executable is existing
|
# check if calibre converter-executable is existing
|
||||||
if not os.path.exists(config.config_converterpath):
|
if not os.path.exists(config.config_converterpath):
|
||||||
# ToDo Text is not translated
|
self._handleError(N_(u"Calibre ebook-convert %(tool)s not found", tool=config.config_converterpath))
|
||||||
self._handleError(_(u"Calibre ebook-convert %(tool)s not found", tool=config.config_converterpath))
|
|
||||||
return
|
return
|
||||||
check, error_message = self._convert_calibre(file_path, format_old_ext, format_new_ext)
|
check, error_message = self._convert_calibre(file_path, format_old_ext, format_new_ext)
|
||||||
|
|
||||||
|
@ -182,11 +183,11 @@ class TaskConvert(CalibreTask):
|
||||||
self._handleSuccess()
|
self._handleSuccess()
|
||||||
return os.path.basename(file_path + format_new_ext)
|
return os.path.basename(file_path + format_new_ext)
|
||||||
else:
|
else:
|
||||||
error_message = _('%(format)s format not found on disk', format=format_new_ext.upper())
|
error_message = N_('%(format)s format not found on disk', format=format_new_ext.upper())
|
||||||
local_db.session.close()
|
local_db.session.close()
|
||||||
log.info("ebook converter failed with error while converting book")
|
log.info("ebook converter failed with error while converting book")
|
||||||
if not error_message:
|
if not error_message:
|
||||||
error_message = _('Ebook converter failed with unknown error')
|
error_message = N_('Ebook converter failed with unknown error')
|
||||||
self._handleError(error_message)
|
self._handleError(error_message)
|
||||||
return
|
return
|
||||||
|
|
||||||
|
@ -196,7 +197,7 @@ class TaskConvert(CalibreTask):
|
||||||
try:
|
try:
|
||||||
p = process_open(command, quotes)
|
p = process_open(command, quotes)
|
||||||
except OSError as e:
|
except OSError as e:
|
||||||
return 1, _(u"Kepubify-converter failed: %(error)s", error=e)
|
return 1, N_(u"Kepubify-converter failed: %(error)s", error=e)
|
||||||
self.progress = 0.01
|
self.progress = 0.01
|
||||||
while True:
|
while True:
|
||||||
nextline = p.stdout.readlines()
|
nextline = p.stdout.readlines()
|
||||||
|
@ -217,7 +218,7 @@ class TaskConvert(CalibreTask):
|
||||||
copyfile(converted_file[0], (file_path + format_new_ext))
|
copyfile(converted_file[0], (file_path + format_new_ext))
|
||||||
os.unlink(converted_file[0])
|
os.unlink(converted_file[0])
|
||||||
else:
|
else:
|
||||||
return 1, _(u"Converted file not found or more than one file in folder %(folder)s",
|
return 1, N_(u"Converted file not found or more than one file in folder %(folder)s",
|
||||||
folder=os.path.dirname(file_path))
|
folder=os.path.dirname(file_path))
|
||||||
return check, None
|
return check, None
|
||||||
|
|
||||||
|
@ -241,7 +242,7 @@ class TaskConvert(CalibreTask):
|
||||||
|
|
||||||
p = process_open(command, quotes, newlines=False)
|
p = process_open(command, quotes, newlines=False)
|
||||||
except OSError as e:
|
except OSError as e:
|
||||||
return 1, _(u"Ebook-converter failed: %(error)s", error=e)
|
return 1, N_(u"Ebook-converter failed: %(error)s", error=e)
|
||||||
|
|
||||||
while p.poll() is None:
|
while p.poll() is None:
|
||||||
nextline = p.stdout.readline()
|
nextline = p.stdout.readline()
|
||||||
|
@ -264,12 +265,16 @@ class TaskConvert(CalibreTask):
|
||||||
ele = ele.decode('utf-8', errors="ignore").strip('\n')
|
ele = ele.decode('utf-8', errors="ignore").strip('\n')
|
||||||
log.debug(ele)
|
log.debug(ele)
|
||||||
if not ele.startswith('Traceback') and not ele.startswith(' File'):
|
if not ele.startswith('Traceback') and not ele.startswith(' File'):
|
||||||
error_message = _("Calibre failed with error: %(error)s", error=ele)
|
error_message = N_("Calibre failed with error: %(error)s", error=ele)
|
||||||
return check, error_message
|
return check, error_message
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def name(self):
|
def name(self):
|
||||||
return "Convert"
|
return N_("Convert")
|
||||||
|
|
||||||
def __str__(self):
|
def __str__(self):
|
||||||
return "Convert {} {}".format(self.bookid, self.kindle_mail)
|
return "Convert {} {}".format(self.book_id, self.kindle_mail)
|
||||||
|
|
||||||
|
@property
|
||||||
|
def is_cancellable(self):
|
||||||
|
return False
|
||||||
|
|
51
cps/tasks/database.py
Normal file
51
cps/tasks/database.py
Normal file
|
@ -0,0 +1,51 @@
|
||||||
|
# -*- coding: utf-8 -*-
|
||||||
|
|
||||||
|
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
|
||||||
|
# Copyright (C) 2020 mmonkey
|
||||||
|
#
|
||||||
|
# This program is free software: you can redistribute it and/or modify
|
||||||
|
# it under the terms of the GNU General Public License as published by
|
||||||
|
# the Free Software Foundation, either version 3 of the License, or
|
||||||
|
# (at your option) any later version.
|
||||||
|
#
|
||||||
|
# This program is distributed in the hope that it will be useful,
|
||||||
|
# but WITHOUT ANY WARRANTY; without even the implied warranty of
|
||||||
|
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
|
||||||
|
# GNU General Public License for more details.
|
||||||
|
#
|
||||||
|
# You should have received a copy of the GNU General Public License
|
||||||
|
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
|
from urllib.request import urlopen
|
||||||
|
|
||||||
|
from flask_babel import lazy_gettext as N_
|
||||||
|
|
||||||
|
from cps import config, logger
|
||||||
|
from cps.services.worker import CalibreTask
|
||||||
|
|
||||||
|
|
||||||
|
class TaskReconnectDatabase(CalibreTask):
|
||||||
|
def __init__(self, task_message=N_('Reconnecting Calibre database')):
|
||||||
|
super(TaskReconnectDatabase, self).__init__(task_message)
|
||||||
|
self.log = logger.create()
|
||||||
|
self.listen_address = config.get_config_ipaddress()
|
||||||
|
self.listen_port = config.config_port
|
||||||
|
|
||||||
|
|
||||||
|
def run(self, worker_thread):
|
||||||
|
address = self.listen_address if self.listen_address else 'localhost'
|
||||||
|
port = self.listen_port if self.listen_port else 8083
|
||||||
|
|
||||||
|
try:
|
||||||
|
urlopen('http://' + address + ':' + str(port) + '/reconnect')
|
||||||
|
self._handleSuccess()
|
||||||
|
except Exception as ex:
|
||||||
|
self._handleError('Unable to reconnect Calibre database: ' + str(ex))
|
||||||
|
|
||||||
|
@property
|
||||||
|
def name(self):
|
||||||
|
return "Reconnect Database"
|
||||||
|
|
||||||
|
@property
|
||||||
|
def is_cancellable(self):
|
||||||
|
return False
|
107
cps/tasks/mail.py
Normal file → Executable file
107
cps/tasks/mail.py
Normal file → Executable file
|
@ -16,28 +16,18 @@
|
||||||
# You should have received a copy of the GNU General Public License
|
# You should have received a copy of the GNU General Public License
|
||||||
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
# along with this program. If not, see <http://www.gnu.org/licenses/>.
|
||||||
|
|
||||||
import sys
|
|
||||||
import os
|
import os
|
||||||
import smtplib
|
import smtplib
|
||||||
import threading
|
import threading
|
||||||
import socket
|
import socket
|
||||||
import mimetypes
|
import mimetypes
|
||||||
|
|
||||||
try:
|
from io import StringIO
|
||||||
from StringIO import StringIO
|
from email.message import EmailMessage
|
||||||
from email.MIMEBase import MIMEBase
|
from email.utils import parseaddr
|
||||||
from email.MIMEMultipart import MIMEMultipart
|
|
||||||
from email.MIMEText import MIMEText
|
|
||||||
except ImportError:
|
|
||||||
from io import StringIO
|
|
||||||
from email.mime.base import MIMEBase
|
|
||||||
from email.mime.multipart import MIMEMultipart
|
|
||||||
from email.mime.text import MIMEText
|
|
||||||
|
|
||||||
|
from flask_babel import lazy_gettext as N_
|
||||||
|
from email.utils import formatdate
|
||||||
from email import encoders
|
|
||||||
from email.utils import formatdate, make_msgid
|
|
||||||
from email.generator import Generator
|
from email.generator import Generator
|
||||||
|
|
||||||
from cps.services.worker import CalibreTask
|
from cps.services.worker import CalibreTask
|
||||||
|
@ -45,6 +35,7 @@ from cps.services import gmail
|
||||||
from cps import logger, config
|
from cps import logger, config
|
||||||
|
|
||||||
from cps import gdriveutils
|
from cps import gdriveutils
|
||||||
|
import uuid
|
||||||
|
|
||||||
log = logger.create()
|
log = logger.create()
|
||||||
|
|
||||||
|
@ -119,31 +110,48 @@ class EmailSSL(EmailBase, smtplib.SMTP_SSL):
|
||||||
|
|
||||||
|
|
||||||
class TaskEmail(CalibreTask):
|
class TaskEmail(CalibreTask):
|
||||||
def __init__(self, subject, filepath, attachment, settings, recipient, taskMessage, text, internal=False):
|
def __init__(self, subject, filepath, attachment, settings, recipient, task_message, text, internal=False):
|
||||||
super(TaskEmail, self).__init__(taskMessage)
|
super(TaskEmail, self).__init__(task_message)
|
||||||
self.subject = subject
|
self.subject = subject
|
||||||
self.attachment = attachment
|
self.attachment = attachment
|
||||||
self.settings = settings
|
self.settings = settings
|
||||||
self.filepath = filepath
|
self.filepath = filepath
|
||||||
self.recipent = recipient
|
self.recipient = recipient
|
||||||
self.text = text
|
self.text = text
|
||||||
self.asyncSMTP = None
|
self.asyncSMTP = None
|
||||||
self.results = dict()
|
self.results = dict()
|
||||||
|
|
||||||
|
# from calibre code:
|
||||||
|
# https://github.com/kovidgoyal/calibre/blob/731ccd92a99868de3e2738f65949f19768d9104c/src/calibre/utils/smtp.py#L60
|
||||||
|
def get_msgid_domain(self):
|
||||||
|
try:
|
||||||
|
# Parse out the address from the From line, and then the domain from that
|
||||||
|
from_email = parseaddr(self.settings["mail_from"])[1]
|
||||||
|
msgid_domain = from_email.partition('@')[2].strip()
|
||||||
|
# This can sometimes sneak through parseaddr if the input is malformed
|
||||||
|
msgid_domain = msgid_domain.rstrip('>').strip()
|
||||||
|
except Exception:
|
||||||
|
msgid_domain = ''
|
||||||
|
return msgid_domain or 'calibre-web.com'
|
||||||
|
|
||||||
def prepare_message(self):
|
def prepare_message(self):
|
||||||
message = MIMEMultipart()
|
message = EmailMessage()
|
||||||
message['to'] = self.recipent
|
# message = MIMEMultipart()
|
||||||
message['from'] = self.settings["mail_from"]
|
message['From'] = self.settings["mail_from"]
|
||||||
message['subject'] = self.subject
|
message['To'] = self.recipient
|
||||||
message['Message-Id'] = make_msgid('calibre-web')
|
message['Subject'] = self.subject
|
||||||
message['Date'] = formatdate(localtime=True)
|
message['Date'] = formatdate(localtime=True)
|
||||||
text = self.text
|
message['Message-Id'] = "{}@{}".format(uuid.uuid4(), self.get_msgid_domain()) # f"<{uuid.uuid4()}@{get_msgid_domain(from_)}>" # make_msgid('calibre-web')
|
||||||
msg = MIMEText(text.encode('UTF-8'), 'plain', 'UTF-8')
|
message.set_content(self.text.encode('UTF-8'), "text", "plain")
|
||||||
message.attach(msg)
|
|
||||||
if self.attachment:
|
if self.attachment:
|
||||||
result = self._get_attachment(self.filepath, self.attachment)
|
data = self._get_attachment(self.filepath, self.attachment)
|
||||||
if result:
|
if data:
|
||||||
message.attach(result)
|
# Set mimetype
|
||||||
|
content_type, encoding = mimetypes.guess_type(self.attachment)
|
||||||
|
if content_type is None or encoding is not None:
|
||||||
|
content_type = 'application/octet-stream'
|
||||||
|
main_type, sub_type = content_type.split('/', 1)
|
||||||
|
message.add_attachment(data, maintype=main_type, subtype=sub_type, filename=self.attachment)
|
||||||
else:
|
else:
|
||||||
self._handleError(u"Attachment not found")
|
self._handleError(u"Attachment not found")
|
||||||
return
|
return
|
||||||
|
@ -158,10 +166,10 @@ class TaskEmail(CalibreTask):
|
||||||
else:
|
else:
|
||||||
self.send_gmail_email(msg)
|
self.send_gmail_email(msg)
|
||||||
except MemoryError as e:
|
except MemoryError as e:
|
||||||
log.debug_or_exception(e, stacklevel=3)
|
log.error_or_exception(e, stacklevel=3)
|
||||||
self._handleError(u'MemoryError sending e-mail: {}'.format(str(e)))
|
self._handleError(u'MemoryError sending e-mail: {}'.format(str(e)))
|
||||||
except (smtplib.SMTPException, smtplib.SMTPAuthenticationError) as e:
|
except (smtplib.SMTPException, smtplib.SMTPAuthenticationError) as e:
|
||||||
log.debug_or_exception(e, stacklevel=3)
|
log.error_or_exception(e, stacklevel=3)
|
||||||
if hasattr(e, "smtp_error"):
|
if hasattr(e, "smtp_error"):
|
||||||
text = e.smtp_error.decode('utf-8').replace("\n", '. ')
|
text = e.smtp_error.decode('utf-8').replace("\n", '. ')
|
||||||
elif hasattr(e, "message"):
|
elif hasattr(e, "message"):
|
||||||
|
@ -172,10 +180,10 @@ class TaskEmail(CalibreTask):
|
||||||
text = ''
|
text = ''
|
||||||
self._handleError(u'Smtplib Error sending e-mail: {}'.format(text))
|
self._handleError(u'Smtplib Error sending e-mail: {}'.format(text))
|
||||||
except (socket.error) as e:
|
except (socket.error) as e:
|
||||||
log.debug_or_exception(e, stacklevel=3)
|
log.error_or_exception(e, stacklevel=3)
|
||||||
self._handleError(u'Socket Error sending e-mail: {}'.format(e.strerror))
|
self._handleError(u'Socket Error sending e-mail: {}'.format(e.strerror))
|
||||||
except Exception as ex:
|
except Exception as ex:
|
||||||
log.debug_or_exception(ex, stacklevel=3)
|
log.error_or_exception(ex, stacklevel=3)
|
||||||
self._handleError(u'Error sending e-mail: {}'.format(ex))
|
self._handleError(u'Error sending e-mail: {}'.format(ex))
|
||||||
|
|
||||||
def send_standard_email(self, msg):
|
def send_standard_email(self, msg):
|
||||||
|
@ -203,7 +211,7 @@ class TaskEmail(CalibreTask):
|
||||||
gen = Generator(fp, mangle_from_=False)
|
gen = Generator(fp, mangle_from_=False)
|
||||||
gen.flatten(msg)
|
gen.flatten(msg)
|
||||||
|
|
||||||
self.asyncSMTP.sendmail(self.settings["mail_from"], self.recipent, fp.getvalue())
|
self.asyncSMTP.sendmail(self.settings["mail_from"], self.recipient, fp.getvalue())
|
||||||
self.asyncSMTP.quit()
|
self.asyncSMTP.quit()
|
||||||
self._handleSuccess()
|
self._handleSuccess()
|
||||||
log.debug("E-mail send successfully")
|
log.debug("E-mail send successfully")
|
||||||
|
@ -226,15 +234,15 @@ class TaskEmail(CalibreTask):
|
||||||
self._progress = x
|
self._progress = x
|
||||||
|
|
||||||
@classmethod
|
@classmethod
|
||||||
def _get_attachment(cls, bookpath, filename):
|
def _get_attachment(cls, book_path, filename):
|
||||||
"""Get file as MIMEBase message"""
|
"""Get file as MIMEBase message"""
|
||||||
calibre_path = config.config_calibre_dir
|
calibre_path = config.config_calibre_dir
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
df = gdriveutils.getFileFromEbooksFolder(bookpath, filename)
|
df = gdriveutils.getFileFromEbooksFolder(book_path, filename)
|
||||||
if df:
|
if df:
|
||||||
datafile = os.path.join(calibre_path, bookpath, filename)
|
datafile = os.path.join(calibre_path, book_path, filename)
|
||||||
if not os.path.exists(os.path.join(calibre_path, bookpath)):
|
if not os.path.exists(os.path.join(calibre_path, book_path)):
|
||||||
os.makedirs(os.path.join(calibre_path, bookpath))
|
os.makedirs(os.path.join(calibre_path, book_path))
|
||||||
df.GetContentFile(datafile)
|
df.GetContentFile(datafile)
|
||||||
else:
|
else:
|
||||||
return None
|
return None
|
||||||
|
@ -244,27 +252,22 @@ class TaskEmail(CalibreTask):
|
||||||
os.remove(datafile)
|
os.remove(datafile)
|
||||||
else:
|
else:
|
||||||
try:
|
try:
|
||||||
file_ = open(os.path.join(calibre_path, bookpath, filename), 'rb')
|
file_ = open(os.path.join(calibre_path, book_path, filename), 'rb')
|
||||||
data = file_.read()
|
data = file_.read()
|
||||||
file_.close()
|
file_.close()
|
||||||
except IOError as e:
|
except IOError as e:
|
||||||
log.debug_or_exception(e, stacklevel=3)
|
log.error_or_exception(e, stacklevel=3)
|
||||||
log.error(u'The requested file could not be read. Maybe wrong permissions?')
|
log.error(u'The requested file could not be read. Maybe wrong permissions?')
|
||||||
return None
|
return None
|
||||||
# Set mimetype
|
return data
|
||||||
content_type, encoding = mimetypes.guess_type(filename)
|
|
||||||
if content_type is None or encoding is not None:
|
|
||||||
content_type = 'application/octet-stream'
|
|
||||||
main_type, sub_type = content_type.split('/', 1)
|
|
||||||
attachment = MIMEBase(main_type, sub_type)
|
|
||||||
attachment.set_payload(data)
|
|
||||||
encoders.encode_base64(attachment)
|
|
||||||
attachment.add_header('Content-Disposition', 'attachment', filename=filename)
|
|
||||||
return attachment
|
|
||||||
|
|
||||||
@property
|
@property
|
||||||
def name(self):
|
def name(self):
|
||||||
return "E-mail"
|
return N_("E-mail")
|
||||||
|
|
||||||
|
@property
|
||||||
|
def is_cancellable(self):
|
||||||
|
return False
|
||||||
|
|
||||||
def __str__(self):
|
def __str__(self):
|
||||||
return "E-mail {}, {}".format(self.name, self.subject)
|
return "E-mail {}, {}".format(self.name, self.subject)
|
||||||
|
|
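Note: after this change `_get_attachment` returns the raw file bytes (`return data`) instead of a ready-made MIME part; the mimetype guessing and base64 wrapping that used to live here were removed. Presumably the caller now builds the attachment itself. A minimal sketch of that wrapping, reconstructed from the deleted lines (the helper name `wrap_attachment` is illustrative and not code from this commit):

    import mimetypes
    from email import encoders
    from email.mime.base import MIMEBase

    def wrap_attachment(data, filename):
        # Guess the MIME type from the file name; fall back to a generic binary
        # type, mirroring the logic removed from _get_attachment above.
        content_type, encoding = mimetypes.guess_type(filename)
        if content_type is None or encoding is not None:
            content_type = 'application/octet-stream'
        main_type, sub_type = content_type.split('/', 1)
        attachment = MIMEBase(main_type, sub_type)
        attachment.set_payload(data)  # raw bytes as returned by _get_attachment
        encoders.encode_base64(attachment)
        attachment.add_header('Content-Disposition', 'attachment', filename=filename)
        return attachment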
cps/tasks/thumbnail.py — 514 lines (new file)
@@ -0,0 +1,514 @@
# -*- coding: utf-8 -*-

# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
# Copyright (C) 2020 monkey
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.

import os
from urllib.request import urlopen

from .. import constants
from cps import config, db, fs, gdriveutils, logger, ub
from cps.services.worker import CalibreTask, STAT_CANCELLED, STAT_ENDED
from datetime import datetime
from sqlalchemy import func, text, or_
from flask_babel import lazy_gettext as N_

try:
    from wand.image import Image
    use_IM = True
except (ImportError, RuntimeError) as e:
    use_IM = False


def get_resize_height(resolution):
    return int(225 * resolution)


def get_resize_width(resolution, original_width, original_height):
    height = get_resize_height(resolution)
    percent = (height / float(original_height))
    width = int((float(original_width) * float(percent)))
    return width if width % 2 == 0 else width + 1


def get_best_fit(width, height, image_width, image_height):
    resize_width = int(width / 2.0)
    resize_height = int(height / 2.0)
    aspect_ratio = image_width / image_height

    # If this image's aspect ratio is different from the first image, then resize this image
    # to fill the width and height of the first image
    if aspect_ratio < width / height:
        resize_width = int(width / 2.0)
        resize_height = image_height * int(width / 2.0) / image_width

    elif aspect_ratio > width / height:
        resize_width = image_width * int(height / 2.0) / image_height
        resize_height = int(height / 2.0)

    return {'width': resize_width, 'height': resize_height}


class TaskGenerateCoverThumbnails(CalibreTask):
    def __init__(self, book_id=-1, task_message=''):
        super(TaskGenerateCoverThumbnails, self).__init__(task_message)
        self.log = logger.create()
        self.book_id = book_id
        self.app_db_session = ub.get_new_session_instance()
        self.calibre_db = db.CalibreDB(expire_on_commit=False)
        self.cache = fs.FileSystem()
        self.resolutions = [
            constants.COVER_THUMBNAIL_SMALL,
            constants.COVER_THUMBNAIL_MEDIUM
        ]

    def run(self, worker_thread):
        if self.calibre_db.session and use_IM and self.stat != STAT_CANCELLED and self.stat != STAT_ENDED:
            self.message = 'Scanning Books'
            books_with_covers = self.get_books_with_covers(self.book_id)
            count = len(books_with_covers)

            total_generated = 0
            for i, book in enumerate(books_with_covers):

                # Generate new thumbnails for missing covers
                generated = self.create_book_cover_thumbnails(book)

                # Increment the progress
                self.progress = (1.0 / count) * i

                if generated > 0:
                    total_generated += generated
                    self.message = N_(u'Generated %(count)s cover thumbnails', count=total_generated)

                # Check if job has been cancelled or ended
                if self.stat == STAT_CANCELLED:
                    self.log.info(f'GenerateCoverThumbnails task has been cancelled.')
                    return

                if self.stat == STAT_ENDED:
                    self.log.info(f'GenerateCoverThumbnails task has been ended.')
                    return

            if total_generated == 0:
                self.self_cleanup = True

        self._handleSuccess()
        self.app_db_session.remove()

    def get_books_with_covers(self, book_id=-1):
        filter_exp = (db.Books.id == book_id) if book_id != -1 else True
        return self.calibre_db.session \
            .query(db.Books) \
            .filter(db.Books.has_cover == 1) \
            .filter(filter_exp) \
            .all()

    def get_book_cover_thumbnails(self, book_id):
        return self.app_db_session \
            .query(ub.Thumbnail) \
            .filter(ub.Thumbnail.type == constants.THUMBNAIL_TYPE_COVER) \
            .filter(ub.Thumbnail.entity_id == book_id) \
            .filter(or_(ub.Thumbnail.expiration.is_(None), ub.Thumbnail.expiration > datetime.utcnow())) \
            .all()

    def create_book_cover_thumbnails(self, book):
        generated = 0
        book_cover_thumbnails = self.get_book_cover_thumbnails(book.id)

        # Generate new thumbnails for missing covers
        resolutions = list(map(lambda t: t.resolution, book_cover_thumbnails))
        missing_resolutions = list(set(self.resolutions).difference(resolutions))
        for resolution in missing_resolutions:
            generated += 1
            self.create_book_cover_single_thumbnail(book, resolution)

        # Replace outdated or missing thumbnails
        for thumbnail in book_cover_thumbnails:
            if book.last_modified > thumbnail.generated_at:
                generated += 1
                self.update_book_cover_thumbnail(book, thumbnail)

            elif not self.cache.get_cache_file_exists(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS):
                generated += 1
                self.update_book_cover_thumbnail(book, thumbnail)
        return generated

    def create_book_cover_single_thumbnail(self, book, resolution):
        thumbnail = ub.Thumbnail()
        thumbnail.type = constants.THUMBNAIL_TYPE_COVER
        thumbnail.entity_id = book.id
        thumbnail.format = 'jpeg'
        thumbnail.resolution = resolution

        self.app_db_session.add(thumbnail)
        try:
            self.app_db_session.commit()
            self.generate_book_thumbnail(book, thumbnail)
        except Exception as ex:
            self.log.info('Error creating book thumbnail: ' + str(ex))
            self._handleError('Error creating book thumbnail: ' + str(ex))
            self.app_db_session.rollback()

    def update_book_cover_thumbnail(self, book, thumbnail):
        thumbnail.generated_at = datetime.utcnow()

        try:
            self.app_db_session.commit()
            self.cache.delete_cache_file(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS)
            self.generate_book_thumbnail(book, thumbnail)
        except Exception as ex:
            self.log.info('Error updating book thumbnail: ' + str(ex))
            self._handleError('Error updating book thumbnail: ' + str(ex))
            self.app_db_session.rollback()

    def generate_book_thumbnail(self, book, thumbnail):
        if book and thumbnail:
            if config.config_use_google_drive:
                if not gdriveutils.is_gdrive_ready():
                    raise Exception('Google Drive is configured but not ready')

                web_content_link = gdriveutils.get_cover_via_gdrive(book.path)
                if not web_content_link:
                    raise Exception('Google Drive cover url not found')

                stream = None
                try:
                    stream = urlopen(web_content_link)
                    with Image(file=stream) as img:
                        height = get_resize_height(thumbnail.resolution)
                        if img.height > height:
                            width = get_resize_width(thumbnail.resolution, img.width, img.height)
                            img.resize(width=width, height=height, filter='lanczos')
                        img.format = thumbnail.format
                        filename = self.cache.get_cache_file_path(thumbnail.filename,
                                                                  constants.CACHE_TYPE_THUMBNAILS)
                        img.save(filename=filename)
                except Exception as ex:
                    # Bubble exception to calling function
                    self.log.info('Error generating thumbnail file: ' + str(ex))
                    raise ex
                finally:
                    if stream is not None:
                        stream.close()
            else:
                book_cover_filepath = os.path.join(config.config_calibre_dir, book.path, 'cover.jpg')
                if not os.path.isfile(book_cover_filepath):
                    raise Exception('Book cover file not found')

                with Image(filename=book_cover_filepath) as img:
                    height = get_resize_height(thumbnail.resolution)
                    if img.height > height:
                        width = get_resize_width(thumbnail.resolution, img.width, img.height)
                        img.resize(width=width, height=height, filter='lanczos')
                    img.format = thumbnail.format
                    filename = self.cache.get_cache_file_path(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS)
                    img.save(filename=filename)

    @property
    def name(self):
        return N_('Cover Thumbnails')

    def __str__(self):
        if self.book_id > 0:
            return "Add Cover Thumbnails for Book {}".format(self.book_id)
        else:
            return "Generate Cover Thumbnails"

    @property
    def is_cancellable(self):
        return True


class TaskGenerateSeriesThumbnails(CalibreTask):
    def __init__(self, task_message=''):
        super(TaskGenerateSeriesThumbnails, self).__init__(task_message)
        self.log = logger.create()
        self.app_db_session = ub.get_new_session_instance()
        self.calibre_db = db.CalibreDB(expire_on_commit=False)
        self.cache = fs.FileSystem()
        self.resolutions = [
            constants.COVER_THUMBNAIL_SMALL,
            constants.COVER_THUMBNAIL_MEDIUM,
        ]

    def run(self, worker_thread):
        if self.calibre_db.session and use_IM and self.stat != STAT_CANCELLED and self.stat != STAT_ENDED:
            self.message = 'Scanning Series'
            all_series = self.get_series_with_four_plus_books()
            count = len(all_series)

            total_generated = 0
            for i, series in enumerate(all_series):
                generated = 0
                series_thumbnails = self.get_series_thumbnails(series.id)
                series_books = self.get_series_books(series.id)

                # Generate new thumbnails for missing covers
                resolutions = list(map(lambda t: t.resolution, series_thumbnails))
                missing_resolutions = list(set(self.resolutions).difference(resolutions))
                for resolution in missing_resolutions:
                    generated += 1
                    self.create_series_thumbnail(series, series_books, resolution)

                # Replace outdated or missing thumbnails
                for thumbnail in series_thumbnails:
                    if any(book.last_modified > thumbnail.generated_at for book in series_books):
                        generated += 1
                        self.update_series_thumbnail(series_books, thumbnail)

                    elif not self.cache.get_cache_file_exists(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS):
                        generated += 1
                        self.update_series_thumbnail(series_books, thumbnail)

                # Increment the progress
                self.progress = (1.0 / count) * i

                if generated > 0:
                    total_generated += generated
                    self.message = N_('Generated {0} series thumbnails').format(total_generated)

                # Check if job has been cancelled or ended
                if self.stat == STAT_CANCELLED:
                    self.log.info(f'GenerateSeriesThumbnails task has been cancelled.')
                    return

                if self.stat == STAT_ENDED:
                    self.log.info(f'GenerateSeriesThumbnails task has been ended.')
                    return

            if total_generated == 0:
                self.self_cleanup = True

        self._handleSuccess()
        self.app_db_session.remove()

    def get_series_with_four_plus_books(self):
        return self.calibre_db.session \
            .query(db.Series) \
            .join(db.books_series_link) \
            .join(db.Books) \
            .filter(db.Books.has_cover == 1) \
            .group_by(text('books_series_link.series')) \
            .having(func.count('book_series_link') > 3) \
            .all()

    def get_series_books(self, series_id):
        return self.calibre_db.session \
            .query(db.Books) \
            .join(db.books_series_link) \
            .join(db.Series) \
            .filter(db.Books.has_cover == 1) \
            .filter(db.Series.id == series_id) \
            .all()

    def get_series_thumbnails(self, series_id):
        return self.app_db_session \
            .query(ub.Thumbnail) \
            .filter(ub.Thumbnail.type == constants.THUMBNAIL_TYPE_SERIES) \
            .filter(ub.Thumbnail.entity_id == series_id) \
            .filter(or_(ub.Thumbnail.expiration.is_(None), ub.Thumbnail.expiration > datetime.utcnow())) \
            .all()

    def create_series_thumbnail(self, series, series_books, resolution):
        thumbnail = ub.Thumbnail()
        thumbnail.type = constants.THUMBNAIL_TYPE_SERIES
        thumbnail.entity_id = series.id
        thumbnail.format = 'jpeg'
        thumbnail.resolution = resolution

        self.app_db_session.add(thumbnail)
        try:
            self.app_db_session.commit()
            self.generate_series_thumbnail(series_books, thumbnail)
        except Exception as ex:
            self.log.info('Error creating book thumbnail: ' + str(ex))
            self._handleError('Error creating book thumbnail: ' + str(ex))
            self.app_db_session.rollback()

    def update_series_thumbnail(self, series_books, thumbnail):
        thumbnail.generated_at = datetime.utcnow()

        try:
            self.app_db_session.commit()
            self.cache.delete_cache_file(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS)
            self.generate_series_thumbnail(series_books, thumbnail)
        except Exception as ex:
            self.log.info('Error updating book thumbnail: ' + str(ex))
            self._handleError('Error updating book thumbnail: ' + str(ex))
            self.app_db_session.rollback()

    def generate_series_thumbnail(self, series_books, thumbnail):
        # Get the last four books in the series based on series_index
        books = sorted(series_books, key=lambda b: float(b.series_index), reverse=True)[:4]

        top = 0
        left = 0
        width = 0
        height = 0
        with Image() as canvas:
            for book in books:
                if config.config_use_google_drive:
                    if not gdriveutils.is_gdrive_ready():
                        raise Exception('Google Drive is configured but not ready')

                    web_content_link = gdriveutils.get_cover_via_gdrive(book.path)
                    if not web_content_link:
                        raise Exception('Google Drive cover url not found')

                    stream = None
                    try:
                        stream = urlopen(web_content_link)
                        with Image(file=stream) as img:
                            # Use the first image in this set to determine the width and height to scale the
                            # other images in this set
                            if width == 0 or height == 0:
                                width = get_resize_width(thumbnail.resolution, img.width, img.height)
                                height = get_resize_height(thumbnail.resolution)
                                canvas.blank(width, height)

                            dimensions = get_best_fit(width, height, img.width, img.height)

                            # resize and crop the image
                            img.resize(width=int(dimensions['width']), height=int(dimensions['height']),
                                       filter='lanczos')
                            img.crop(width=int(width / 2.0), height=int(height / 2.0), gravity='center')

                            # add the image to the canvas
                            canvas.composite(img, left, top)

                    except Exception as ex:
                        self.log.info('Error generating thumbnail file: ' + str(ex))
                        raise ex
                    finally:
                        if stream is not None:
                            stream.close()
                else:
                    book_cover_filepath = os.path.join(config.config_calibre_dir, book.path, 'cover.jpg')
                    if not os.path.isfile(book_cover_filepath):
                        raise Exception('Book cover file not found')

                    with Image(filename=book_cover_filepath) as img:
                        # Use the first image in this set to determine the width and height to scale the
                        # other images in this set
                        if width == 0 or height == 0:
                            width = get_resize_width(thumbnail.resolution, img.width, img.height)
                            height = get_resize_height(thumbnail.resolution)
                            canvas.blank(width, height)

                        dimensions = get_best_fit(width, height, img.width, img.height)

                        # resize and crop the image
                        img.resize(width=int(dimensions['width']), height=int(dimensions['height']), filter='lanczos')
                        img.crop(width=int(width / 2.0), height=int(height / 2.0), gravity='center')

                        # add the image to the canvas
                        canvas.composite(img, left, top)

                # set the coordinates for the next iteration
                if left == 0 and top == 0:
                    left = int(width / 2.0)
                elif left == int(width / 2.0) and top == 0:
                    left = 0
                    top = int(height / 2.0)
                else:
                    left = int(width / 2.0)

            canvas.format = thumbnail.format
            filename = self.cache.get_cache_file_path(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS)
            canvas.save(filename=filename)

    @property
    def name(self):
        return N_('Cover Thumbnails')

    def __str__(self):
        return "GenerateSeriesThumbnails"

    @property
    def is_cancellable(self):
        return True


class TaskClearCoverThumbnailCache(CalibreTask):
    def __init__(self, book_id, task_message=N_('Clearing cover thumbnail cache')):
        super(TaskClearCoverThumbnailCache, self).__init__(task_message)
        self.log = logger.create()
        self.book_id = book_id
        self.calibre_db = db.CalibreDB(expire_on_commit=False)
        self.app_db_session = ub.get_new_session_instance()
        self.cache = fs.FileSystem()

    def run(self, worker_thread):
        if self.app_db_session:
            if self.book_id == 0:  # delete superfluous thumbnails
                thumbnails = (self.calibre_db.session.query(ub.Thumbnail)
                              .join(db.Books, ub.Thumbnail.entity_id == db.Books.id, isouter=True)
                              .filter(db.Books.id == None)
                              .all())
            elif self.book_id > 0:  # make sure single book is selected
                thumbnails = self.get_thumbnails_for_book(self.book_id)
            if self.book_id < 0:
                self.delete_all_thumbnails()
            else:
                for thumbnail in thumbnails:
                    self.delete_thumbnail(thumbnail)
        self._handleSuccess()
        self.app_db_session.remove()

    def get_thumbnails_for_book(self, book_id):
        return self.app_db_session \
            .query(ub.Thumbnail) \
            .filter(ub.Thumbnail.type == constants.THUMBNAIL_TYPE_COVER) \
            .filter(ub.Thumbnail.entity_id == book_id) \
            .all()

    def delete_thumbnail(self, thumbnail):
        try:
            self.cache.delete_cache_file(thumbnail.filename, constants.CACHE_TYPE_THUMBNAILS)
            self.app_db_session \
                .query(ub.Thumbnail) \
                .filter(ub.Thumbnail.type == constants.THUMBNAIL_TYPE_COVER) \
                .filter(ub.Thumbnail.entity_id == thumbnail.entity_id) \
                .delete()
            self.app_db_session.commit()
        except Exception as ex:
            self.log.info('Error deleting book thumbnail: ' + str(ex))
            self._handleError('Error deleting book thumbnail: ' + str(ex))

    def delete_all_thumbnails(self):
        try:
            self.app_db_session.query(ub.Thumbnail).filter(ub.Thumbnail.type == constants.THUMBNAIL_TYPE_COVER).delete()
            self.app_db_session.commit()
            self.cache.delete_cache_dir(constants.CACHE_TYPE_THUMBNAILS)
        except Exception as ex:
            self.log.info('Error deleting thumbnail directory: ' + str(ex))
            self._handleError('Error deleting thumbnail directory: ' + str(ex))

    @property
    def name(self):
        return N_('Cover Thumbnails')

    # needed for logging
    def __str__(self):
        if self.book_id > 0:
            return "Replace/Delete Cover Thumbnails for book " + str(self.book_id)
        else:
            return "Delete Thumbnail cache directory"

    @property
    def is_cancellable(self):
        return False
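Note on the sizing helpers at the top of the new task file: a thumbnail is 225 px tall per resolution step, the width is scaled to keep the aspect ratio and rounded up to an even number, and `get_best_fit` sizes a cover to fill one quadrant of the 2x2 series collage before it is center-cropped. A small worked example; the cover dimensions are made up, and the assumption that COVER_THUMBNAIL_MEDIUM equals 2 is not verified against cps/constants.py:

    from cps.tasks.thumbnail import get_best_fit, get_resize_height, get_resize_width

    print(get_resize_height(2))           # 450 -> a "medium" thumbnail is 450 px tall
    print(get_resize_width(2, 600, 900))  # 450/900 = 0.5, 600 * 0.5 = 300 (already even)

    # An 800x900 cover on a 300x450 series canvas: each cover occupies one
    # 150x225 quadrant, so the image is first scaled to the quadrant height.
    print(get_best_fit(300, 450, 800, 900))
    # {'width': 200.0, 'height': 225} -> wider than the 150 px quadrant, so
    # generate_series_thumbnail crops it back to 150x225 with gravity='center'.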
@@ -17,11 +17,14 @@
 # along with this program. If not, see <http://www.gnu.org/licenses/>.

 from datetime import datetime
+
+from flask_babel import lazy_gettext as N_
+
 from cps.services.worker import CalibreTask, STAT_FINISH_SUCCESS


 class TaskUpload(CalibreTask):
-    def __init__(self, taskMessage, book_title):
-        super(TaskUpload, self).__init__(taskMessage)
+    def __init__(self, task_message, book_title):
+        super(TaskUpload, self).__init__(task_message)
         self.start_time = self.end_time = datetime.now()
         self.stat = STAT_FINISH_SUCCESS
         self.progress = 1
@@ -32,7 +35,11 @@ class TaskUpload(CalibreTask):

     @property
     def name(self):
-        return "Upload"
+        return N_("Upload")

     def __str__(self):
         return "Upload {}".format(self.book_title)
+
+    @property
+    def is_cancellable(self):
+        return False
@@ -47,7 +47,9 @@
 {% endfor %}
 </table>
 {% endif %}
+{% if not simple %}
 <a class="btn btn-default" id="admin_user_table" href="{{url_for('admin.edit_user_table')}}">{{_('Edit Users')}}</a>
+{% endif %}
 <a class="btn btn-default" id="admin_new_user" href="{{url_for('admin.new_user')}}">{{_('Add New User')}}</a>
 {% if (config.config_login_type == 1) %}
 <div class="btn btn-default" id="import_ldap_users" data-toggle="modal" data-target="#StatusDialog">{{_('Import LDAP Users')}}</div>
@@ -159,16 +161,51 @@
 <a class="btn btn-default" id="view_config" href="{{url_for('admin.view_configuration')}}">{{_('Edit UI Configuration')}}</a>
 </div>
 </div>
+{% if feature_support['scheduler'] %}
+<div class="row">
+<div class="col">
+<h2>{{_('Scheduled Tasks')}}</h2>
+<div class="col-xs-12 col-sm-12 scheduled_tasks_details">
+<div class="row">
+<div class="col-xs-6 col-sm-3">{{_('Time at which tasks start to run')}}</div>
+<div class="col-xs-6 col-sm-3">{{schedule_time}}</div>
+</div>
+<div class="row">
+<div class="col-xs-6 col-sm-3">{{_('Maximum tasks duration')}}</div>
+<div class="col-xs-6 col-sm-3">{{schedule_duration}}</div>
+</div>
+<div class="row">
+<div class="col-xs-6 col-sm-3">{{_('Generate book cover thumbnails')}}</div>
+<div class="col-xs-6 col-sm-3">{{ display_bool_setting(config.schedule_generate_book_covers) }}</div>
+</div>
+<!--div class="row">
+<div class="col-xs-6 col-sm-3">{{_('Generate series cover thumbnails')}}</div>
+<div class="col-xs-6 col-sm-3">{{ display_bool_setting(config.schedule_generate_series_covers) }}</div>
+</div-->
+<div class="row">
+<div class="col-xs-6 col-sm-3">{{_('Reconnect to Calibre Library')}}</div>
+<div class="col-xs-6 col-sm-3">{{ display_bool_setting(config.schedule_reconnect) }}</div>
+</div>
+</div>
+<a class="btn btn-default scheduledtasks" id="admin_edit_scheduled_tasks" href="{{url_for('admin.edit_scheduledtasks')}}">{{_('Edit Scheduled Tasks Settings')}}</a>
+{% if config.schedule_generate_book_covers %}
+<a class="btn btn-default" id="admin_refresh_cover_cache">{{_('Refresh Thumbnail Cover Cache')}}</a>
+{% endif %}
+</div>
+</div>
+{% endif %}
 <div class="row form-group">
 <h2>{{_('Administration')}}</h2>
 <a class="btn btn-default" id="debug" href="{{url_for('admin.download_debug')}}">{{_('Download Debug Package')}}</a>
 <a class="btn btn-default" id="logfile" href="{{url_for('admin.view_logfile')}}">{{_('View Logs')}}</a>
 </div>
 <div class="row form-group">
 <div class="btn btn-default" id="restart_database" data-toggle="modal" data-target="#StatusDialog">{{_('Reconnect Calibre Database')}}</div>
+</div>
+<div class="row form-group">
 <div class="btn btn-default" id="admin_restart" data-toggle="modal" data-target="#RestartDialog">{{_('Restart')}}</div>
 <div class="btn btn-default" id="admin_stop" data-toggle="modal" data-target="#ShutdownDialog">{{_('Shutdown')}}</div>
 </div>

 <div class="row">
@@ -250,3 +287,6 @@
 </div>
 </div>
 {% endblock %}
+{% block modal %}
+{{ change_confirm_modal() }}
+{% endblock %}
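The new "Refresh Thumbnail Cover Cache" button and the scheduler settings above presumably hand work to the background worker as instances of the new task classes. A hedged sketch of such calls; `WorkerThread.add()` and its signature are assumed from cps/services/worker.py and are not shown in this diff:

    from cps.services.worker import WorkerThread
    from cps.tasks.thumbnail import TaskClearCoverThumbnailCache, TaskGenerateCoverThumbnails

    # Regenerate thumbnails for every book with a cover, or for a single book.
    WorkerThread.add(None, TaskGenerateCoverThumbnails())
    WorkerThread.add(None, TaskGenerateCoverThumbnails(book_id=42))

    # TaskClearCoverThumbnailCache keys its behaviour off book_id (see its run() above):
    WorkerThread.add(None, TaskClearCoverThumbnailCache(0))    # 0: drop thumbnails of deleted books
    WorkerThread.add(None, TaskClearCoverThumbnailCache(42))   # >0: drop thumbnails for one book
    WorkerThread.add(None, TaskClearCoverThumbnailCache(-1))   # <0: wipe the whole thumbnail cache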
@@ -31,28 +31,27 @@
 <a id="pub_old" data-toggle="tooltip" title="{{_('Sort according to publishing date, oldest first')}}" class="btn btn-primary{% if order == "pubold" %} active{% endif%}" href="{{url_for('web.books_list', data='author', book_id=id, sort_param='pubold')}}"><span class="glyphicon glyphicon-calendar"></span><span class="glyphicon glyphicon-sort-by-order-alt"></span></a>
 </div>
 <div class="row display-flex">
-{% if entries[0] %}
 {% for entry in entries %}
 <div id="books" class="col-sm-3 col-lg-2 col-xs-6 book">
 <div class="cover">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}">
-<span class="img" title="{{entry.title}}">
-<img src="{{ url_for('web.get_cover', book_id=entry.id) }}" />
-{% if entry.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<span class="img" title="{{entry.Books.title}}">
+{{ image.book_cover(entry.Books, alt=author.name|safe) }}
+{% if entry[2] == True %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
 </span>
 </a>
 </div>
 <div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}">
-<p title="{{ entry.title }}" class="title">{{entry.title|shortentitle}}</p>
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<p title="{{ entry.Books.title }}" class="title">{{entry.Books.title|shortentitle}}</p>
 </a>
 <p class="author">
-{% for author in entry.authors %}
+{% for author in entry.Books.authors %}
 {% if loop.index > g.config_authors_max and g.config_authors_max != 0 %}
 {% if not loop.first %}
 <span class="author-hidden-divider">&amp;</span>
 {% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% if loop.last %}
 <a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
 {% endif %}
@@ -60,26 +59,26 @@
 {% if not loop.first %}
 <span>&amp;</span>
 {% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% endif %}
 {% endfor %}
-{% for format in entry.data %}
+{% for format in entry.Books.data %}
 {% if format.format|lower in g.constants.EXTENSIONS_AUDIO %}
 <span class="glyphicon glyphicon-music"></span>
 {% endif %}
 {% endfor %}
 </p>
-{% if entry.series.__len__() > 0 %}
+{% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
+{{entry.Books.series[0].name}}
 </a>
-({{entry.series_index|formatseriesindex}})
+({{entry.Books.series_index|formatseriesindex}})
 </p>
 {% endif %}
-{% if entry.ratings.__len__() > 0 %}
+{% if entry.Books.ratings.__len__() > 0 %}
 <div class="rating">
-{% for number in range((entry.ratings[0].rating/2)|int(2)) %}
+{% for number in range((entry.Books.ratings[0].rating/2)|int(2)) %}
 <span class="glyphicon glyphicon-star good"></span>
 {% if loop.last and loop.index < 5 %}
 {% for numer in range(5 - loop.index) %}
@@ -92,7 +91,6 @@
 </div>
 </div>
 {% endfor %}
-{% endif %}
 </div>
 </div>

@@ -123,7 +121,7 @@
 </p>
 {% if entry.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.series[0].id )}}">
 {{entry.series[0].name}}
 </a>
 ({{entry.series_index|formatseriesindex}})
@@ -3,7 +3,8 @@
 {% if book %}
 <div class="col-sm-3 col-lg-3 col-xs-12">
 <div class="cover">
-<img id="detailcover" title="{{book.title}}" src="{{ url_for('web.get_cover', book_id=book.id, edit=1|uuidfilter) }}" alt="{{ book.title }}"/>
+<!-- Always use full-sized image for the book edit page -->
+<img id="detailcover" title="{{book.title}}" src="{{url_for('web.get_cover', book_id=book.id, resolution='og', c=book|last_modified)}}" />
 </div>
 {% if g.user.role_delete_books() %}
 <div class="text-center">
@@ -22,7 +23,7 @@

 {% if source_formats|length > 0 and conversion_formats|length > 0 %}
 <div class="text-center more-stuff"><h4>{{_('Convert book format:')}}</h4>
-<form class="padded-bottom" action="{{ url_for('editbook.convert_bookformat', book_id=book.id) }}" method="post" id="book_convert_frm">
+<form class="padded-bottom" action="{{ url_for('edit-book.convert_bookformat', book_id=book.id) }}" method="post" id="book_convert_frm">
 <input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
 <div class="form-group">
 <div class="text-left">
@@ -48,7 +49,7 @@
 {% endif %}

 </div>
-<form role="form" action="{{ url_for('editbook.edit_book', book_id=book.id) }}" method="post" enctype="multipart/form-data" id="book_edit_frm">
+<form role="form" action="{{ url_for('edit-book.edit_book', book_id=book.id) }}" method="post" enctype="multipart/form-data" id="book_edit_frm">
 <input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
 <div class="col-sm-9 col-xs-12">
 <div class="form-group">
@@ -1,3 +1,3 @@
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
+<a href="{{ url_for('web.show_book', book_id=entry.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
 <span class="title">{{entry.title|shortentitle}}</span>
 </a>
@@ -6,7 +6,7 @@
 data-escape="true"
 {% if g.user.role_edit() %}
 data-editable-type="text"
-data-editable-url="{{ url_for('editbook.edit_list_book', param=parameter)}}"
+data-editable-url="{{ url_for('edit-book.edit_list_book', param=parameter)}}"
 data-editable-title="{{ edit_text }}"
 data-edit="true"
 {% if validate %}data-edit-validate="{{ _('This Field is Required') }}" {% endif %}
@@ -66,30 +66,30 @@
 {{ text_table_row('authors', _('Enter Authors'),_('Authors'), true, true) }}
 {{ text_table_row('tags', _('Enter Categories'),_('Categories'), false, true) }}
 {{ text_table_row('series', _('Enter Series'),_('Series'), false, true) }}
-<th data-field="series_index" id="series_index" data-visible="{{visiblility.get('series_index')}}" data-edit-validate="{{ _('This Field is Required') }}" data-sortable="true" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.01" data-editable-min="0" data-editable-url="{{ url_for('editbook.edit_list_book', param='series_index')}}" data-edit="true" data-editable-title="{{_('Enter Title')}}"{% endif %}>{{_('Series Index')}}</th>
+<th data-field="series_index" id="series_index" data-visible="{{visiblility.get('series_index')}}" data-edit-validate="{{ _('This Field is Required') }}" data-sortable="true" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.01" data-editable-min="0" data-editable-url="{{ url_for('edit-book.edit_list_book', param='series_index')}}" data-edit="true" data-editable-title="{{_('Enter Title')}}"{% endif %}>{{_('Series Index')}}</th>
 {{ text_table_row('languages', _('Enter Languages'),_('Languages'), false, true) }}
 <!--th data-field="pubdate" data-type="date" data-visible="{{visiblility.get('pubdate')}}" data-viewformat="dd.mm.yyyy" id="pubdate" data-sortable="true">{{_('Publishing Date')}}</th-->
 {{ text_table_row('publishers', _('Enter Publishers'),_('Publishers'), false, true) }}
-<th data-field="comments" id="comments" data-escape="true" data-editable-mode="popup" data-visible="{{visiblility.get('comments')}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="wysihtml5" data-editable-url="{{ url_for('editbook.edit_list_book', param='comments')}}" data-edit="true" data-editable-title="{{_('Enter comments')}}"{% endif %}>{{_('Comments')}}</th>
+<th data-field="comments" id="comments" data-escape="true" data-editable-mode="popup" data-visible="{{visiblility.get('comments')}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="wysihtml5" data-editable-url="{{ url_for('edit-book.edit_list_book', param='comments')}}" data-edit="true" data-editable-title="{{_('Enter comments')}}"{% endif %}>{{_('Comments')}}</th>
 {% if g.user.check_visibility(32768) %}
 {{ book_checkbox_row('is_archived', _('Archiv Status'), false)}}
 {% endif %}
 {{ book_checkbox_row('read_status', _('Read Status'), false)}}
 {% for c in cc %}
 {% if c.datatype == "int" %}
-<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="1" data-editable-url="{{ url_for('editbook.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
+<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="1" data-editable-url="{{ url_for('edit-book.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
 {% elif c.datatype == "rating" %}
-<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-formatter="ratingFormatter" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.5" data-editable-step="1" data-editable-min="1" data-editable-max="5" data-editable-url="{{ url_for('editbook.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
+<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-formatter="ratingFormatter" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.5" data-editable-step="1" data-editable-min="1" data-editable-max="5" data-editable-url="{{ url_for('edit-book.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
 {% elif c.datatype == "float" %}
-<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.01" data-editable-url="{{ url_for('editbook.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
+<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="number" data-editable-placeholder="1" data-editable-step="0.01" data-editable-url="{{ url_for('edit-book.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
 {% elif c.datatype == "enumeration" %}
-<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="select" data-editable-source={{ url_for('editbook.table_get_custom_enum', c_id=c.id) }} data-editable-url="{{ url_for('editbook.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
+<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="select" data-editable-source={{ url_for('edit-book.table_get_custom_enum', c_id=c.id) }} data-editable-url="{{ url_for('edit-book.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
 {% elif c.datatype in ["datetime"] %}
 <!-- missing -->
 {% elif c.datatype == "text" %}
 {{ text_table_row('custom_column_' + c.id|string, _('Enter ') + c.name, c.name, false, false) }}
 {% elif c.datatype == "comments" %}
-<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-escape="true" data-editable-mode="popup" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="wysihtml5" data-editable-url="{{ url_for('editbook.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
+<th data-field="custom_column_{{ c.id|string }}" id="custom_column_{{ c.id|string }}" data-escape="true" data-editable-mode="popup" data-visible="{{visiblility.get('custom_column_'+ c.id|string)}}" data-sortable="false" {% if g.user.role_edit() %} data-editable-type="wysihtml5" data-editable-url="{{ url_for('edit-book.edit_list_book', param='custom_column_'+ c.id|string)}}" data-edit="true" data-editable-title="{{_('Enter ') + c.name}}"{% endif %}>{{c.name}}</th>
 {% elif c.datatype == "bool" %}
 {{ book_checkbox_row('custom_column_' + c.id|string, c.name, false)}}
 {% else %}
@@ -162,8 +162,10 @@
 <input type="checkbox" name="Show_detail_random" id="Show_detail_random" {% if conf.show_detail_random() %}checked{% endif %}>
 <label for="Show_detail_random">{{_('Show Random Books in Detail View')}}</label>
 </div>
+{% if not simple %}
 <a href="#" id="get_tags" data-id="0" class="btn btn-default" data-toggle="modal" data-target="#restrictModal">{{_('Add Allowed/Denied Tags')}}</a>
 <a href="#" id="get_column_values" data-id="0" class="btn btn-default" data-toggle="modal" data-target="#restrictModal">{{_('Add Allowed/Denied custom column values')}}</a>
+{% endif %}
 </div>
 </div>
 </div>
@@ -4,7 +4,8 @@
 <div class="row">
 <div class="col-sm-3 col-lg-3 col-xs-5">
 <div class="cover">
-<img id="detailcover" title="{{entry.title}}" src="{{ url_for('web.get_cover', book_id=entry.id, edit=1|uuidfilter) }}" alt="{{ entry.title }}" />
+<!-- Always use full-sized image for the detail page -->
+<img id="detailcover" title="{{entry.title}}" src="{{url_for('web.get_cover', book_id=entry.id, resolution='og', c=entry|last_modified)}}" />
 </div>
 </div>
 <div class="col-sm-9 col-lg-9 book-meta">
@@ -99,7 +100,7 @@
 </div>
 <h2 id="title">{{entry.title}}</h2>
 <p class="author">
-{% for author in entry.authors %}
+{% for author in entry.ordered_authors %}
 <a href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id ) }}">{{author.name.replace('|',',')}}</a>
 {% if not loop.last %}
 &amp;
@@ -138,7 +139,7 @@
 <p>
 <span class="glyphicon glyphicon-link"></span>
 {% for identifier in entry.identifiers %}
-<a href="{{identifier}}" target="_blank" class="btn btn-xs btn-success" role="button">{{identifier.formatType()}}</a>
+<a href="{{identifier}}" target="_blank" class="btn btn-xs btn-success" role="button">{{identifier.format_type()}}</a>
 {%endfor%}
 </p>
 </div>
@@ -295,7 +296,7 @@
 {% if g.user.role_edit() %}
 <div class="btn-toolbar" role="toolbar">
 <div class="btn-group" role="group" aria-label="Edit/Delete book">
-<a href="{{ url_for('editbook.edit_book', book_id=entry.id) }}" class="btn btn-sm btn-primary" id="edit_book" role="button"><span class="glyphicon glyphicon-edit"></span> {{_('Edit Metadata')}}</a>
+<a href="{{ url_for('edit-book.show_edit_book', book_id=entry.id) }}" class="btn btn-sm btn-primary" id="edit_book" role="button"><span class="glyphicon glyphicon-edit"></span> {{_('Edit Metadata')}}</a>
 </div>
 </div>
 {% endif %}
@@ -1,65 +0,0 @@
-{% extends "layout.html" %}
-{% block body %}
-<div class="discover load-more">
-<h2>{{title}}</h2>
-<div class="row display-flex">
-{% for entry in entries %}
-<div class="col-sm-3 col-lg-2 col-xs-6 book">
-<div class="cover">
-{% if entry.has_cover is defined %}
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<span class="img" title="{{entry.title}}">
-<img src="{{ url_for('web.get_cover', book_id=entry.id) }}" alt="{{ entry.title }}" />
-{% if entry.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
-</span>
-</a>
-{% endif %}
-</div>
-<div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<p title="{{ entry.title }}" class="title">{{entry.title|shortentitle}}</p>
-</a>
-<p class="author">
-{% for author in entry.authors %}
-{% if loop.index > g.config_authors_max and g.config_authors_max != 0 %}
-{% if not loop.first %}
-<span class="author-hidden-divider">&amp;</span>
-{% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
-{% if loop.last %}
-<a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
-{% endif %}
-{% else %}
-{% if not loop.first %}
-<span>&amp;</span>
-{% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
-{% endif %}
-{% endfor %}
-</p>
-{% if entry.series.__len__() > 0 %}
-<p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
-</a>
-({{entry.series_index|formatseriesindex}})
-</p>
-{% endif %}
-{% if entry.ratings.__len__() > 0 %}
-<div class="rating">
-{% for number in range((entry.ratings[0].rating/2)|int(2)) %}
-<span class="glyphicon glyphicon-star good"></span>
-{% if loop.last and loop.index < 5 %}
-{% for numer in range(5 - loop.index) %}
-<span class="glyphicon glyphicon-star-empty"></span>
-{% endfor %}
-{% endif %}
-{% endfor %}
-</div>
-{% endif %}
-</div>
-</div>
-{% endfor %}
-</div>
-</div>
-{% endblock %}
@@ -69,7 +69,7 @@
 {% endif %}
 <a href="{{ url_for('admin.admin') }}" id="email_back" class="btn btn-default">{{_('Back')}}</a>
 </form>
-{% if g.allow_registration %}
+{% if g.allow_registration and not simple%}
 <div class="col-md-10 col-lg-6">
 <h2>{{_('Allowed Domains (Whitelist)')}}</h2>
 <form id="domain_add_allow" action="{{ url_for('admin.add_domain',allow=1)}}" method="POST">
@@ -40,35 +40,35 @@
 {% if entries and entries[0] %}
 {% for entry in entries %}
 <entry>
-<title>{{entry.title}}</title>
-<id>urn:uuid:{{entry.uuid}}</id>
-<updated>{{entry.atom_timestamp}}</updated>
-{% if entry.authors.__len__() > 0 %}
+<title>{{entry.Books.title}}</title>
+<id>urn:uuid:{{entry.Books.uuid}}</id>
+<updated>{{entry.Books.atom_timestamp}}</updated>
+{% if entry.Books.authors.__len__() > 0 %}
 <author>
-<name>{{entry.authors[0].name}}</name>
+<name>{{entry.Books.authors[0].name}}</name>
 </author>
 {% endif %}
-{% if entry.publishers.__len__() > 0 %}
+{% if entry.Books.publishers.__len__() > 0 %}
 <publisher>
-<name>{{entry.publishers[0].name}}</name>
+<name>{{entry.Books.publishers[0].name}}</name>
 </publisher>
 {% endif %}
-{% for lang in entry.languages %}
+{% for lang in entry.Books.languages %}
 <dcterms:language>{{lang.lang_code}}</dcterms:language>
 {% endfor %}
-{% for tag in entry.tags %}
+{% for tag in entry.Books.tags %}
 <category scheme="http://www.bisg.org/standards/bisac_subject/index.html"
 term="{{tag.name}}"
 label="{{tag.name}}"/>
 {% endfor %}
-{% if entry.comments[0] %}<summary>{{entry.comments[0].text|striptags}}</summary>{% endif %}
-{% if entry.has_cover %}
-<link type="image/jpeg" href="{{url_for('opds.feed_get_cover', book_id=entry.id)}}" rel="http://opds-spec.org/image"/>
-<link type="image/jpeg" href="{{url_for('opds.feed_get_cover', book_id=entry.id)}}" rel="http://opds-spec.org/image/thumbnail"/>
+{% if entry.Books.comments[0] %}<summary>{{entry.Books.comments[0].text|striptags}}</summary>{% endif %}
+{% if entry.Books.has_cover %}
+<link type="image/jpeg" href="{{url_for('opds.feed_get_cover', book_id=entry.Books.id)}}" rel="http://opds-spec.org/image"/>
+<link type="image/jpeg" href="{{url_for('opds.feed_get_cover', book_id=entry.Books.id)}}" rel="http://opds-spec.org/image/thumbnail"/>
 {% endif %}
-{% for format in entry.data %}
-<link rel="http://opds-spec.org/acquisition" href="{{ url_for('opds.opds_download_link', book_id=entry.id, book_format=format.format|lower)}}"
-length="{{format.uncompressed_size}}" mtime="{{entry.atom_timestamp}}" type="{{format.format|lower|mimetype}}"/>
+{% for format in entry.Books.data %}
+<link rel="http://opds-spec.org/acquisition" href="{{ url_for('opds.opds_download_link', book_id=entry.Books.id, book_format=format.format|lower)}}"
+length="{{format.uncompressed_size}}" mtime="{{entry.Books.atom_timestamp}}" type="{{format.format|lower|mimetype}}"/>
 {% endfor %}
 </entry>
 {% endfor %}
@@ -1,3 +1,4 @@
+{% import 'image.html' as image %}
 <div class="container-fluid">
 {% block body %}{% endblock %}
 </div>
@@ -1,3 +1,4 @@
+{% import 'image.html' as image %}
 {% extends "layout.html" %}
 {% block body %}
 <h1 class="{{page}}">{{_(title)}}</h1>
@@ -27,7 +28,7 @@
 <div class="cover">
 <a href="{{url_for('web.books_list', data=data, sort_param='stored', book_id=entry[0].series[0].id )}}">
 <span class="img" title="{{entry[0].series[0].name}}">
-<img src="{{ url_for('web.get_cover', book_id=entry[3]) }}" alt="{{ entry[0].series[0].name }}"/>
+{{ image.series(entry[0].series[0], alt=entry[0].series[0].name|shortentitle) }}
 <span class="badge">{{entry.count}}</span>
 </span>
 </a>
20 cps/templates/image.html Normal file
@@ -0,0 +1,20 @@
+{% macro book_cover(book, alt=None) -%}
+{%- set image_title = book.title if book.title else book.name -%}
+{%- set image_alt = alt if alt else image_title -%}
+{% set srcset = book|get_cover_srcset %}
+<img
+srcset="{{ srcset }}"
+src="{{ url_for('web.get_cover', book_id=book.id, resolution='og', c=book|last_modified) }}"
+alt="{{ image_alt }}"
+/>
+{%- endmacro %}
+
+{% macro series(series, alt=None) -%}
+{%- set image_alt = alt if alt else image_title -%}
+{% set srcset = series|get_series_srcset %}
+<img
+srcset="{{ srcset }}"
+src="{{ url_for('web.get_series_cover', series_id=series.id, resolution='og', c='day'|cache_timestamp) }}"
+alt="{{ book_title }}"
+/>
+{%- endmacro %}
@@ -1,30 +1,31 @@
+{% import 'image.html' as image %}
 {% extends "layout.html" %}
 {% block body %}
-{% if g.user.show_detail_random() %}
+{% if g.user.show_detail_random() and page != "discover" %}
 <div class="discover random-books">
 <h2 class="random-books">{{_('Discover (Random Books)')}}</h2>
 <div class="row display-flex">
 {% for entry in random %}
 <div class="col-sm-3 col-lg-2 col-xs-6 book" id="books_rand">
 <div class="cover">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<span class="img" title="{{ entry.title }}">
-<img src="{{ url_for('web.get_cover', book_id=entry.id) }}" alt="{{ entry.title }}" />
-{% if entry.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<span class="img" title="{{ entry.Books.title }}">
+{{ image.book_cover(entry.Books) }}
+{% if entry[2] == True %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
 </span>
 </a>
 </div>
 <div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<p title="{{entry.title}}" class="title">{{entry.title|shortentitle}}</p>
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<p title="{{entry.Books.title}}" class="title">{{entry.Books.title|shortentitle}}</p>
 </a>
 <p class="author">
-{% for author in entry.authors %}
+{% for author in entry.Books.authors %}
 {% if loop.index > g.config_authors_max and g.config_authors_max != 0 %}
 {% if not loop.first %}
 <span class="author-hidden-divider">&</span>
 {% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% if loop.last %}
 <a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
 {% endif %}
@@ -32,21 +33,21 @@
 {% if not loop.first %}
 <span>&</span>
 {% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% endif %}
 {% endfor %}
 </p>
-{% if entry.series.__len__() > 0 %}
+{% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
+{{entry.Books.series[0].name}}
 </a>
-({{entry.series_index|formatseriesindex}})
+({{entry.Books.series_index|formatseriesindex}})
 </p>
 {% endif %}
-{% if entry.ratings.__len__() > 0 %}
+{% if entry.Books.ratings.__len__() > 0 %}
 <div class="rating">
-{% for number in range((entry.ratings[0].rating/2)|int(2)) %}
+{% for number in range((entry.Books.ratings[0].rating/2)|int(2)) %}
 <span class="glyphicon glyphicon-star good"></span>
 {% if loop.last and loop.index < 5 %}
 {% for numer in range(5 - loop.index) %}
@@ -64,6 +65,7 @@
 {% endif %}
 <div class="discover load-more">
 <h2 class="{{title}}">{{title}}</h2>
+{% if page != 'discover' %}
 <div class="filterheader hidden-xs">
 {% if page == 'hot' %}
 <a data-toggle="tooltip" title="{{_('Sort ascending according to download count')}}" id="hot_asc" class="btn btn-primary{% if order == "hotasc" %} active{% endif%}" href="{{url_for('web.books_list', data=page, book_id=id, sort_param='hotasc')}}"><span class="glyphicon glyphicon-sort-by-order"></span></a>
@@ -83,30 +85,30 @@
 {% endif %}
 {% endif %}
 </div>
+{% endif %}
 <div class="row display-flex">
 {% if entries[0] %}
 {% for entry in entries %}
 <div class="col-sm-3 col-lg-2 col-xs-6 book" id="books">
 <div class="cover">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<span class="img" title="{{ entry.title }}">
-<img src="{{ url_for('web.get_cover', book_id=entry.id) }}" alt="{{ entry.title }}"/>
-{% if entry.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<span class="img" title="{{ entry.Books.title }}">
+{{ image.book_cover(entry.Books) }}
+{% if entry[2] == True %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
 </span>
 </a>
 </div>
 <div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<p title="{{ entry.title }}" class="title">{{entry.title|shortentitle}}</p>
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<p title="{{ entry.Books.title }}" class="title">{{entry.Books.title|shortentitle}}</p>
 </a>
 <p class="author">
-{% for author in entry.authors %}
+{% for author in entry.Books.authors %}
 {% if loop.index > g.config_authors_max and g.config_authors_max != 0 %}
 {% if not loop.first %}
 <span class="author-hidden-divider">&</span>
 {% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', book_id=author.id, sort_param='new') }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', book_id=author.id, sort_param='stored') }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% if loop.last %}
 <a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
 {% endif %}
@@ -114,26 +116,30 @@
 {% if not loop.first %}
 <span>&</span>
 {% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', book_id=author.id, sort_param='new') }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name" href="{{url_for('web.books_list', data='author', book_id=author.id, sort_param='stored') }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% endif %}
 {% endfor %}
-{% for format in entry.data %}
+{% for format in entry.Books.data %}
 {% if format.format|lower in g.constants.EXTENSIONS_AUDIO %}
 <span class="glyphicon glyphicon-music"></span>
 {% endif %}
 {%endfor%}
 </p>
-{% if entry.series.__len__() > 0 %}
+{% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
+{% if page != "series" %}
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
+{{entry.Books.series[0].name}}
 </a>
-({{entry.series_index|formatseriesindex}})
+{% else %}
+<span>{{entry.Books.series[0].name}}</span>
+{% endif %}
+({{entry.Books.series_index|formatseriesindex}})
 </p>
 {% endif %}
-{% if entry.ratings.__len__() > 0 %}
+{% if entry.Books.ratings.__len__() > 0 %}
 <div class="rating">
-{% for number in range((entry.ratings[0].rating/2)|int(2)) %}
+{% for number in range((entry.Books.ratings[0].rating/2)|int(2)) %}
 <span class="glyphicon glyphicon-star good"></span>
 {% if loop.last and loop.index < 5 %}
 {% for numer in range(5 - loop.index) %}
@@ -1,35 +0,0 @@
-{% extends "layout.html" %}
-{% block body %}
-<h1>{{title}}</h1>
-<div class="filterheader hidden-xs">
-<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
-<div id="asc" data-order="{{ order }}" data-id="{{ data }}" class="btn btn-primary {% if order == 1 %} active{% endif%}"><span class="glyphicon glyphicon-sort-by-alphabet"></span></div>
-<div id="desc" data-id="{{ data }}" class="btn btn-primary{% if order == 0 %} active{% endif%}"><span class="glyphicon glyphicon-sort-by-alphabet-alt"></span></div>
-{% if charlist|length %}
-<div id="all" class="active btn btn-primary {% if charlist|length > 9 %}hidden-sm{% endif %}">{{_('All')}}</div>
-{% endif %}
-<div class="btn-group character {% if charlist|length > 9 %}hidden-sm{% endif %}" role="group">
-{% for char in charlist%}
-<div class="btn btn-primary char">{{char}}</div>
-{% endfor %}
-</div>
-</div>
-<div class="container">
-<div div id="list" class="col-xs-12 col-sm-6">
-{% for lang in languages %}
-{% if loop.index0 == (loop.length/2)|int and loop.length > 20 %}
-</div>
-<div id="second" class="col-xs-12 col-sm-6">
-{% endif %}
-<div class="row" data-id="{{lang[0].name}}">
-<div class="col-xs-2 col-sm-2 col-md-1" align="left"><span class="badge">{{lang[1]}}</span></div>
-<div class="col-xs-10 col-sm-10 col-md-11"><a id="list_{{loop.index0}}" href="{{url_for('web.books_list', book_id=lang[0].lang_code, data=data, sort_param='new')}}">{{lang[0].name}}</a></div>
-</div>
-{% endfor %}
-</div>
-</div>
-{% endblock %}
-{% block js %}
-<script src="{{ url_for('static', filename='js/filter_list.js') }}"></script>
-{% endblock %}
@@ -1,4 +1,5 @@
 {% from 'modal_dialogs.html' import restrict_modal, delete_book, filechooser_modal, delete_confirm_modal, change_confirm_modal %}
+{% import 'image.html' as image %}
 <!DOCTYPE html>
 <html lang="{{ g.user.locale }}">
 <head>
@@ -60,7 +61,7 @@
 {% if g.user.is_authenticated or g.allow_anonymous %}
 {% if g.user.role_upload() and g.allow_upload %}
 <li>
-<form id="form-upload" class="navbar-form" action="{{ url_for('editbook.upload') }}" data-title="{{_('Uploading...')}}" data-footer="{{_('Close')}}" data-failed="{{_('Error')}}" data-message="{{_('Upload done, processing, please wait...')}}" method="post" enctype="multipart/form-data">
+<form id="form-upload" class="navbar-form" action="{{ url_for('edit-book.upload') }}" data-title="{{_('Uploading...')}}" data-footer="{{_('Close')}}" data-failed="{{_('Error')}}" data-message="{{_('Upload done, processing, please wait...')}}" method="post" enctype="multipart/form-data">
 <input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
 <div class="form-group">
 <span class="btn btn-default btn-file">{{_('Upload')}}<input id="btn-upload" name="btn-upload"
@@ -69,7 +70,7 @@
 </form>
 </li>
 {% endif %}
-{% if not g.user.is_anonymous %}
+{% if not g.user.is_anonymous and not simple%}
 <li><a id="top_tasks" href="{{url_for('web.get_tasks_status')}}"><span class="glyphicon glyphicon-tasks"></span> <span class="hidden-sm">{{_('Tasks')}}</span></a></li>
 {% endif %}
 {% if g.user.role_admin() %}
@@ -14,7 +14,7 @@
 {% endif %}
 <div class="btn-group character {% if charlist|length > 9 %}hidden-sm{% endif %}" role="group">
 {% for char in charlist%}
-<div class="btn btn-primary char">{{char.char}}</div>
+<div class="btn btn-primary char">{{char[0]}}</div>
 {% endfor %}
 </div>

@@ -29,8 +29,8 @@
 </div>
 <div id="second" class="col-xs-12 col-sm-6">
 {% endif %}
-<div class="row" {% if entry[0].sort %}data-name="{{entry[0].name}}"{% endif %} data-id="{% if entry[0].sort %}{{entry[0].sort}}{% else %}{% if entry.name %}{{entry.name}}{% else %}{{entry[0].name}}{% endif %}{% endif %}">
-<div class="col-xs-2 col-sm-2 col-md-1" align="left"><span class="badge">{{entry.count}}</span></div>
+<div class="row" {% if entry[0].sort %}data-name="{{entry[0].name}}"{% endif %} data-id="{% if entry[0].sort %}{{entry[0].sort}}{% else %}{% if entry[0].format %}{{entry[0].format}}{% else %}{% if entry[0].rating %}{{entry[0].rating}}{% else %}{{entry[0].name}}{% endif %}{% endif %}{% endif %}">
+<div class="col-xs-2 col-sm-2 col-md-1" align="left"><span class="badge">{{entry[1]}}</span></div>
 <div class="col-xs-10 col-sm-10 col-md-11"><a id="list_{{loop.index0}}" href="{% if entry.format %}{{url_for('web.books_list', data=data, sort_param='stored', book_id=entry.format )}}{% else %}{{url_for('web.books_list', data=data, sort_param='stored', book_id=entry[0].id )}}{% endif %}">
 {% if entry.name %}
 <div class="rating">
@@ -105,7 +105,7 @@

 <div class="sm2-playlist-wrapper">
 <ul class="sm2-playlist-bd">
-<li><a href="{{ url_for('web.serve_book', book_id=mp3file,book_format=audioformat)}}"><b>{% for author in entry.authors %}{{author.name.replace('|',',')}}
+<li><a href="{{ url_for('web.serve_book', book_id=mp3file,book_format=audioformat)}}"><b>{% for author in entry.ordered_authors %}{{author.name.replace('|',',')}}
 {% if not loop.last %} & {% endif %} {% endfor %}</b> - {{entry.title}}</a></li>
 </ul>
 </div>
@@ -134,7 +134,7 @@ window.calibre = {
 filePath: "{{ url_for('static', filename='js/libs/') }}",
 cssPath: "{{ url_for('static', filename='css/') }}",
 bookUrl: "{{ url_for('static', filename=mp3file) }}/",
-bookmarkUrl: "{{ url_for('web.bookmark', book_id=mp3file, book_format=audioformat.upper()) }}",
+bookmarkUrl: "{{ url_for('web.set_bookmark', book_id=mp3file, book_format=audioformat.upper()) }}",
 bookmark: "{{ bookmark.bookmark_key if bookmark != None }}",
 useBookmarks: "{{ g.user.is_authenticated | tojson }}"
 };
@@ -86,7 +86,7 @@
 window.calibre = {
 filePath: "{{ url_for('static', filename='js/libs/') }}",
 cssPath: "{{ url_for('static', filename='css/') }}",
-bookmarkUrl: "{{ url_for('web.bookmark', book_id=bookid, book_format='EPUB') }}",
+bookmarkUrl: "{{ url_for('web.set_bookmark', book_id=bookid, book_format='EPUB') }}",
 bookUrl: "{{ url_for('web.serve_book', book_id=bookid, book_format='epub', anyname='file.epub') }}",
 bookmark: "{{ bookmark.bookmark_key if bookmark != None }}",
 useBookmarks: "{{ g.user.is_authenticated | tojson }}"
44 cps/templates/schedule_edit.html Normal file
@@ -0,0 +1,44 @@
+{% extends "layout.html" %}
+{% block header %}
+<link href="{{ url_for('static', filename='css/libs/bootstrap-table.min.css') }}" rel="stylesheet">
+<link href="{{ url_for('static', filename='css/libs/bootstrap-editable.css') }}" rel="stylesheet">
+{% endblock %}
+{% block body %}
+<div class="discover">
+<h1>{{title}}</h1>
+<form role="form" class="col-md-10 col-lg-6" method="POST" autocomplete="off">
+<input type="hidden" name="csrf_token" value="{{ csrf_token() }}">
+<div class="form-group">
+<label for="schedule_start_time">{{_('Time at which tasks start to run')}}</label>
+<select name="schedule_start_time" id="schedule_start_time" class="form-control">
+{% for n in starttime %}
+<option value="{{n[0]}}" {% if config.schedule_start_time == n[0] %}selected{% endif %}>{{n[1]}}</option>
+{% endfor %}
+</select>
+</div>
+<div class="form-group">
+<label for="schedule_duration">{{_('Maximum tasks duration')}}</label>
+<select name="schedule_duration" id="schedule_duration" class="form-control">
+{% for n in duration %}
+<option value="{{n[0]}}" {% if config.schedule_duration == n[0] %}selected{% endif %}>{{n[1]}}</option>
+{% endfor %}
+</select>
+</div>
+<div class="form-group">
+<input type="checkbox" id="schedule_generate_book_covers" name="schedule_generate_book_covers" {% if config.schedule_generate_book_covers %}checked{% endif %}>
+<label for="schedule_generate_book_covers">{{_('Generate Book Cover Thumbnails')}}</label>
+</div>
+<!--div class="form-group">
+<input type="checkbox" id="schedule_generate_series_covers" name="schedule_generate_series_covers" {% if config.schedule_generate_series_covers %}checked{% endif %}>
+<label for="schedule_generate_series_covers">{{_('Generate Series Cover Thumbnails')}}</label>
+</div-->
+<div class="form-group">
+<input type="checkbox" id="schedule_reconnect" name="schedule_reconnect" {% if config.schedule_generate_book_covers %}checked{% endif %}>
+<label for="schedule_reconnect">{{_('Reconnect to Calibre Library')}}</label>
+</div>
+
+<button type="submit" name="submit" value="submit" class="btn btn-default">{{_('Save')}}</button>
+<a href="{{ url_for('admin.admin') }}" id="email_back" class="btn btn-default">{{_('Cancel')}}</a>
+</form>
+</div>
+{% endblock %}
@@ -1,3 +1,4 @@
+{% import 'image.html' as image %}
 {% extends "layout.html" %}
 {% block body %}
 <div class="discover">
@@ -43,16 +44,16 @@
 <div class="col-sm-3 col-lg-2 col-xs-6 book">
 <div class="cover">
 {% if entry.Books.has_cover is defined %}
-<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
 <span class="img" title="{{entry.Books.title}}" >
-<img src="{{ url_for('web.get_cover', book_id=entry.Books.id) }}" alt="{{ entry.Books.title }}" />
-{% if entry.Books.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
+{{ image.book_cover(entry.Books) }}
+{% if entry[2] == True %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
 </span>
 </a>
 {% endif %}
 </div>
 <div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
 <p title="{{entry.Books.title}}" class="title">{{entry.Books.title|shortentitle}}</p>
 </a>
 <p class="author">
@@ -61,7 +62,7 @@
 {% if not loop.first %}
 <span class="author-hidden-divider">&</span>
 {% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% if loop.last %}
 <a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
 {% endif %}
@@ -69,7 +70,7 @@
 {% if not loop.first %}
 <span>&</span>
 {% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% endif %}
 {% endfor %}
 {% for format in entry.Books.data %}
@@ -80,7 +81,7 @@
 </p>
 {% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.Books.series[0].id )}}">
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
 {{entry.Books.series[0].name}}
 </a>
 ({{entry.Books.series_index|formatseriesindex}})
@@ -1,3 +1,4 @@
+{% import 'image.html' as image %}
 {% extends "layout.html" %}
 {% block body %}
 <div class="discover">
@@ -32,24 +33,24 @@
 {% for entry in entries %}
 <div class="col-sm-3 col-lg-2 col-xs-6 book">
 <div class="cover">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<span class="img" title="{{entry.title}}" >
-<img src="{{ url_for('web.get_cover', book_id=entry.id) }}" alt="{{ entry.title }}" />
-{% if entry.id in read_book_ids %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<span class="img" title="{{entry.Books.title}}" >
+{{ image.book_cover(entry.Books) }}
+{% if entry[2] == True %}<span class="badge read glyphicon glyphicon-ok"></span>{% endif %}
 </span>
 </a>
 </div>
 <div class="meta">
-<a href="{{ url_for('web.show_book', book_id=entry.id) }}" data-toggle="modal" data-target="#bookDetailsModal" data-remote="false">
-<p title="{{entry.title}}" class="title">{{entry.title|shortentitle}}</p>
+<a href="{{ url_for('web.show_book', book_id=entry.Books.id) }}" {% if simple==false %}data-toggle="modal" data-target="#bookDetailsModal" data-remote="false"{% endif %}>
+<p title="{{entry.Books.title}}" class="title">{{entry.Books.title|shortentitle}}</p>
 </a>
 <p class="author">
-{% for author in entry.authors %}
+{% for author in entry.Books.authors %}
 {% if loop.index > g.config_authors_max and g.config_authors_max != 0 %}
 {% if not loop.first %}
 <span class="author-hidden-divider">&</span>
 {% endif %}
-<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name author-hidden" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% if loop.last %}
 <a href="#" class="author-expand" data-authors-max="{{g.config_authors_max}}" data-collapse-caption="({{_('reduce')}})">(...)</a>
 {% endif %}
@@ -57,21 +58,21 @@
 {% if not loop.first %}
 <span>&</span>
 {% endif %}
-<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
+<a class="author-name" href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')|shortentitle(30)}}</a>
 {% endif %}
 {% endfor %}
 </p>
-{% if entry.series.__len__() > 0 %}
+{% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
+{{entry.Books.series[0].name}}
 </a>
-({{entry.series_index|formatseriesindex}})
+({{entry.Books.series_index|formatseriesindex}})
 </p>
 {% endif %}
-{% if entry.ratings.__len__() > 0 %}
+{% if entry.Books.ratings.__len__() > 0 %}
 <div class="rating">
-{% for number in range((entry.ratings[0].rating/2)|int(2)) %}
+{% for number in range((entry.Books.ratings[0].rating/2)|int(2)) %}
 <span class="glyphicon glyphicon-star good"></span>
 {% if loop.last and loop.index < 5 %}
 {% for numer in range(5 - loop.index) %}
@@ -35,31 +35,31 @@
 <div class="col-sm-3 col-lg-2 col-xs-6 book">

 <div class="meta">
-<p title="{{entry.title}}" class="title">{{entry.title|shortentitle}}</p>
+<p title="{{entry.Books.title}}" class="title">{{entry.Books.title|shortentitle}}</p>
 <p class="author">
-{% for author in entry.authors %}
-<a href="{{url_for('web.books_list', data='author', sort_param='new', book_id=author.id) }}">{{author.name.replace('|',',')}}</a>
+{% for author in entry.Books.authors %}
+<a href="{{url_for('web.books_list', data='author', sort_param='stored', book_id=author.id) }}">{{author.name.replace('|',',')}}</a>
 {% if not loop.last %}
 &
 {% endif %}
 {% endfor %}
 </p>
-{% if entry.series.__len__() > 0 %}
+{% if entry.Books.series.__len__() > 0 %}
 <p class="series">
-<a href="{{url_for('web.books_list', data='series', sort_param='new', book_id=entry.series[0].id )}}">
-{{entry.series[0].name}}
+<a href="{{url_for('web.books_list', data='series', sort_param='stored', book_id=entry.Books.series[0].id )}}">
+{{entry.Books.series[0].name}}
 </a>
-({{entry.series_index}})
+({{entry.Books.series_index}})
 </p>
 {% endif %}
 </div>

 <div class="btn-group" role="group" aria-label="Download, send to Kindle, reading">
 {% if g.user.role_download() %}
-{% if entry.data|length %}
+{% if entry.Books.data|length %}
 <div class="btn-group" role="group">
-{% for format in entry.data %}
-<a href="{{ url_for('web.download_link', book_id=entry.id, book_format=format.format|lower, anyname=entry.id|string+'.'+format.format|lower) }}" id="btnGroupDrop{{entry.id}}{{format.format|lower}}" class="btn btn-primary" role="button">
+{% for format in entry.Books.data %}
+<a href="{{ url_for('web.download_link', book_id=entry.Books.id, book_format=format.format|lower, anyname=entry.Books.id|string+'.'+format.format|lower) }}" id="btnGroupDrop{{entry.Books.id}}{{format.format|lower}}" class="btn btn-primary" role="button">
 <span class="glyphicon glyphicon-download"></span>{{format.format}} ({{ format.uncompressed_size|filesizeformat }})
 </a>
 {% endfor %}
@@ -39,7 +39,7 @@
 {% if version %}
 <tr>
 <th>{{library}}</th>
-<td>{{_(version)}}</td>
+<td>{{version}}</td>
 </tr>
 {% endif %}
 {% endfor %}
@@ -16,6 +16,9 @@
 <th data-halign="right" data-align="right" data-field="progress" data-sortable="true" data-sorter="elementSorter">{{_('Progress')}}</th>
 <th data-halign="right" data-align="right" data-field="runtime" data-sortable="true" data-sort-name="rt">{{_('Run Time')}}</th>
 <th data-halign="right" data-align="right" data-field="starttime" data-sortable="true" data-sort-name="id">{{_('Start Time')}}</th>
+{% if g.user.role_admin() %}
+<th data-halign="right" data-align="right" data-formatter="TaskActions" data-switchable="false">{{_('Actions')}}</th>
+{% endif %}
 <th data-field="id" data-visible="false"></th>
 <th data-field="rt" data-visible="false"></th>
 </tr>
@@ -23,6 +26,30 @@
 </table>
 </div>
 {% endblock %}
+{% block modal %}
+{{ delete_book() }}
+{% if g.user.role_admin() %}
+<div class="modal fade" id="cancelTaskModal" role="dialog" aria-labelledby="metaCancelTaskLabel">
+<div class="modal-dialog">
+<div class="modal-content">
+<div class="modal-header bg-danger text-center">
+<span>{{_('Are you really sure?')}}</span>
+</div>
+<div class="modal-body text-center">
+<p>
+<span>{{_('This task will be cancelled. Any progress made by this task will be saved.')}}</span>
+<span>{{_('If this is a scheduled task, it will be re-ran during the next scheduled time.')}}</span>
+</p>
+</div>
+<div class="modal-footer">
+<input type="button" class="btn btn-danger" value="{{_('Ok')}}" name="cancel_task_confirm" id="cancel_task_confirm" data-dismiss="modal">
+<button type="button" class="btn btn-default" data-dismiss="modal">{{_('Cancel')}}</button>
+</div>
+</div>
+</div>
+</div>
+{% endif %}
+{% endblock %}
 {% block js %}
 <script src="{{ url_for('static', filename='js/libs/bootstrap-table/bootstrap-table.min.js') }}"></script>
 <script src="{{ url_for('static', filename='js/libs/bootstrap-table/bootstrap-table-locale-all.min.js') }}"></script>
@@ -83,7 +83,7 @@
 <input type="checkbox" name="Show_detail_random" id="Show_detail_random" {% if content.show_detail_random() %}checked{% endif %}>
 <label for="Show_detail_random">{{_('Show Random Books in Detail View')}}</label>
 </div>
-{% if ( g.user and g.user.role_admin() and not new_user ) %}
+{% if ( g.user and g.user.role_admin() and not new_user ) and not simple %}
 <a href="#" id="get_user_tags" class="btn btn-default" data-id="{{content.id}}" data-toggle="modal" data-target="#restrictModal">{{_('Add Allowed/Denied Tags')}}</a>
 <a href="#" id="get_user_column_values" data-id="{{content.id}}" class="btn btn-default" data-toggle="modal" data-target="#restrictModal">{{_('Add allowed/Denied Custom Column Values')}}</a>
 {% endif %}
@@ -131,7 +131,7 @@
 </div>
 {% endif %}
 {% endif %}
-{% if kobo_support and not content.role_anonymous() %}
+{% if kobo_support and not content.role_anonymous() and not simple%}
 <div class="form-group">
 <input type="checkbox" name="kobo_only_shelves_sync" id="kobo_only_shelves_sync" {% if content.kobo_only_shelves_sync %}checked{% endif %}>
 <label for="kobo_only_shelves_sync">{{_('Sync only books in selected shelves with Kobo')}}</label>
94 cps/tornado_wsgi.py Normal file
@@ -0,0 +1,94 @@
+# -*- coding: utf-8 -*-
+
+# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
+# Copyright (C) 2022 OzzieIsaacs
+#
+# This program is free software: you can redistribute it and/or modify
+# it under the terms of the GNU General Public License as published by
+# the Free Software Foundation, either version 3 of the License, or
+# (at your option) any later version.
+#
+# This program is distributed in the hope that it will be useful,
+# but WITHOUT ANY WARRANTY; without even the implied warranty of
+# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE.  See the
+# GNU General Public License for more details.
+#
+# You should have received a copy of the GNU General Public License
+# along with this program. If not, see <http://www.gnu.org/licenses/>.
+
+
+from tornado.wsgi import WSGIContainer
+import tornado
+
+from tornado import escape
+from tornado import httputil
+
+from typing import List, Tuple, Optional, Callable, Any, Dict, Text
+from types import TracebackType
+import typing
+
+if typing.TYPE_CHECKING:
+    from typing import Type  # noqa: F401
+    from wsgiref.types import WSGIApplication as WSGIAppType  # noqa: F401
+
+
+class MyWSGIContainer(WSGIContainer):
+
+    def __call__(self, request: httputil.HTTPServerRequest) -> None:
+        data = {}  # type: Dict[str, Any]
+        response = []  # type: List[bytes]
+
+        def start_response(
+            status: str,
+            headers: List[Tuple[str, str]],
+            exc_info: Optional[
+                Tuple[
+                    "Optional[Type[BaseException]]",
+                    Optional[BaseException],
+                    Optional[TracebackType],
+                ]
+            ] = None,
+        ) -> Callable[[bytes], Any]:
+            data["status"] = status
+            data["headers"] = headers
+            return response.append
+
+        app_response = self.wsgi_application(
+            MyWSGIContainer.environ(request), start_response
+        )
+        try:
+            response.extend(app_response)
+            body = b"".join(response)
+        finally:
+            if hasattr(app_response, "close"):
+                app_response.close()  # type: ignore
+        if not data:
+            raise Exception("WSGI app did not call start_response")
+
+        status_code_str, reason = data["status"].split(" ", 1)
+        status_code = int(status_code_str)
+        headers = data["headers"]  # type: List[Tuple[str, str]]
+        header_set = set(k.lower() for (k, v) in headers)
+        body = escape.utf8(body)
+        if status_code != 304:
+            if "content-length" not in header_set:
+                headers.append(("Content-Length", str(len(body))))
+            if "content-type" not in header_set:
+                headers.append(("Content-Type", "text/html; charset=UTF-8"))
+        if "server" not in header_set:
+            headers.append(("Server", "TornadoServer/%s" % tornado.version))
+
+        start_line = httputil.ResponseStartLine("HTTP/1.1", status_code, reason)
+        header_obj = httputil.HTTPHeaders()
+        for key, value in headers:
+            header_obj.add(key, value)
+        assert request.connection is not None
+        request.connection.write_headers(start_line, header_obj, chunk=body)
+        request.connection.finish()
+        self._log(status_code, request)
+
+    @staticmethod
+    def environ(request: httputil.HTTPServerRequest) -> Dict[Text, Any]:
+        environ = WSGIContainer.environ(request)
+        environ['RAW_URI'] = request.path
+        return environ
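The new cps/tornado_wsgi.py wraps the Flask (WSGI) application for Tornado and additionally exposes the raw request path to the WSGI environ as RAW_URI. Below is a minimal usage sketch, not part of this commit: the create_app import and the port number are assumptions made for illustration; Calibre-Web's real startup logic lives in its server module and differs from this.

# Sketch only (assumptions marked): serving a Flask WSGI app through Tornado
# using the MyWSGIContainer defined above.
from tornado.httpserver import HTTPServer
from tornado.ioloop import IOLoop

from cps import create_app                  # hypothetical app-factory import for illustration
from cps.tornado_wsgi import MyWSGIContainer

flask_app = create_app()                    # the WSGI application to serve
container = MyWSGIContainer(flask_app)      # adds RAW_URI to the WSGI environ
server = HTTPServer(container)              # Tornado serves the wrapped WSGI app
server.listen(8083)                         # assumed port for the example
IOLoop.current().start()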
Several binary files are not shown, and some file diffs were suppressed because they are too large. Some files were not shown because too many files have changed in this diff.