Merge remote-tracking branch 'upstream/master'
commit 2f69e3141e

.github/ISSUE_TEMPLATE/bug_report.md (vendored) | 15
@@ -6,6 +6,7 @@ labels: ''
 assignees: ''
 
 ---
 
+<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
 
 **Describe the bug/problem**
 A clear and concise description of what the bug is. If you are asking for support, please check our [Wiki](https://github.com/janeczku/calibre-web/wiki) if your question is already answered there.
@@ -27,12 +28,12 @@ A clear and concise description of what you expected to happen.
 If applicable, add screenshots to help explain your problem.
 
 **Environment (please complete the following information):**
-- OS: [e.g. Windows 10/raspian]
-- Python version [e.g. python2.7]
-- Calibre-Web version [e.g. 0.6.5 or master@16.02.20, 19:55 ]:
-- Docker container [ None/Technosoft2000/Linuxuser]:
-- Special Hardware [e.g. Rasperry Pi Zero]
-- Browser [e.g. chrome, safari]
+- OS: [e.g. Windows 10/Raspberry Pi OS]
+- Python version: [e.g. python2.7]
+- Calibre-Web version: [e.g. 0.6.8 or 087c4c59 (git rev-parse --short HEAD)]:
+- Docker container: [None/Technosoft2000/Linuxuser]:
+- Special Hardware: [e.g. Rasperry Pi Zero]
+- Browser: [e.g. Chrome 83.0.4103.97, Safari 13.3.7, Firefox 68.0.1 ESR]
 
 **Additional context**
-Add any other context about the problem here. [e.g. access via reverse proxy]
+Add any other context about the problem here. [e.g. access via reverse proxy, database background sync, special database location]
 
.github/ISSUE_TEMPLATE/feature_request.md (vendored) | 2
@@ -7,6 +7,8 @@ assignees: ''
 
 ---
 
+<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
+
 **Is your feature request related to a problem? Please describe.**
 A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
 
CONTRIBUTING.md
@@ -1,30 +1,30 @@
 ## How to contribute to Calibre-Web
 
-First of all, we would like to thank you for reading this text. we are happy you are willing to contribute to Calibre-Web
+First of all, we would like to thank you for reading this text. We are happy you are willing to contribute to Calibre-Web.
 
 ### **General**
 
-**Communication language** is english. Google translated texts are not as bad as you might think, they are usually understandable, so don't worry if you generate your post that way.
+**Communication language** is English. Google translated texts are not as bad as you might think, they are usually understandable, so don't worry if you generate your post that way.
 
 **Calibre-Web** is not **Calibre**. If you are having a question regarding Calibre please post this at their [repository](https://github.com/kovidgoyal/calibre).
 
-**Docker-Containers** of Calibre-Web are maintained by different persons than the people who drive Calibre-Web. If we come to the conclusion during our analysis that the problem is related to Docker, we might ask you to open a new issue at the reprository of the Docker Container.
+**Docker-Containers** of Calibre-Web are maintained by different persons than the people who drive Calibre-Web. If we come to the conclusion during our analysis that the problem is related to Docker, we might ask you to open a new issue at the repository of the Docker Container.
 
-If you are having **Basic Installation Problems** with python or it's dependencies, please consider using your favorite search engine to find a solution. In case you can't find a solution, we are happy to help you.
+If you are having **Basic Installation Problems** with python or its dependencies, please consider using your favorite search engine to find a solution. In case you can't find a solution, we are happy to help you.
 
 We can offer only very limited support regarding configuration of **Reverse-Proxy Installations**, **OPDS-Reader** or other programs in combination with Calibre-Web.
 
 ### **Translation**
 
-Some of the user languages in Calibre-Web having missing translations. We are happy to add the missing texts if you translate them. Create a Pull Request, create an issue with the .po file attached, or write an email to "ozzie.fernandez.isaacs@googlemail.com" with attached translation file. To display all book languages in your native language an additional file is used (iso_language_names.py). The content of this file is autogenerated with the corresponding translations of Calibre, please do not edit this file on your own.
+Some of the user languages in Calibre-Web having missing translations. We are happy to add the missing texts if you translate them. Create a Pull Request, create an issue with the .po file attached, or write an email to "ozzie.fernandez.isaacs@googlemail.com" with attached translation file. To display all book languages in your native language an additional file is used (iso_language_names.py). The content of this file is auto-generated with the corresponding translations of Calibre, please do not edit this file on your own.
 
 ### **Documentation**
 
-The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/janeczku/calibre-web/wiki). The Wiki is open to everybody, if you find a problem, feel free to correct it. If information is missing, you are welcome to add it. The content will be reviewed time by time. Please try to be consitent with the form with the other Wiki pages (e.g. the project name is Calibre-Web with 2 capital letters and a dash in between).
+The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/janeczku/calibre-web/wiki). The Wiki is open to everybody, if you find a problem, feel free to correct it. If information is missing, you are welcome to add it. The content will be reviewed time by time. Please try to be consistent with the form with the other Wiki pages (e.g. the project name is Calibre-Web with 2 capital letters and a dash in between).
 
 ### **Reporting a bug**
 
-Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Please write intead an email to "ozzie.fernandez.isaacs@googlemail.com".
+Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Instead, please write an email to "ozzie.fernandez.isaacs@googlemail.com".
 
 Ensure the ***bug was not already reported** by searching on GitHub under [Issues](https://github.com/janeczku/calibre-web/issues). Please also check if a solution for your problem can be found in the [wiki](https://github.com/janeczku/calibre-web/wiki).
 
@@ -33,17 +33,14 @@ If you're unable to find an **open issue** addressing the problem, open a [new o
 ### **Feature Request**
 
 If there is a feature missing in Calibre-Web and you can't find a feature request in the [Issues](https://github.com/janeczku/calibre-web/issues) section, you could create a [feature request](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=feature_request.md&title=).
-We will not extend Calibre-Web with any more login abilitys or add further files storages, or file syncing ability. Furthermore Calibre-Web is made for home usage for company inhouse usage, so requests regarding any sorts of social interaction capability, payment routines, search engine or web site analytics integration will not be implemeted.
+We will not extend Calibre-Web with any more login abilities or add further files storages, or file syncing ability. Furthermore Calibre-Web is made for home usage for company in-house usage, so requests regarding any sorts of social interaction capability, payment routines, search engine or web site analytics integration will not be implemented.
 
 ### **Contributing code to Calibre-Web**
 
 Open a new GitHub pull request with the patch. Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.
 
-In case your code enhances features of Calibre-Web: Create your pull request for the development branch if your enhancement consits of more than some lines of code in a local section of Calibre-Webs code. This makes it easier to test it and check all implication before it's made public.
+In case your code enhances features of Calibre-Web: Create your pull request for the development branch if your enhancement consists of more than some lines of code in a local section of Calibre-Webs code. This makes it easier to test it and check all implication before it's made public.
 
 Please check if your code runs on Python 2.7 (still necessary in 2020) and mainly on python 3. If possible and the feature is related to operating system functions, try to check it on Windows and Linux.
-Calibre-Web is automatically tested on Linux in combination with python 3.7. The code for testing is in a [seperate repo](https://github.com/OzzieIsaacs/calibre-web-test) on Github. It uses unittest and performs real system tests with selenium, would be great if you could consider also writing some tests.
+Calibre-Web is automatically tested on Linux in combination with python 3.7. The code for testing is in a [separate repo](https://github.com/OzzieIsaacs/calibre-web-test) on Github. It uses unit tests and performs real system tests with selenium; it would be great if you could consider also writing some tests.
 A static code analysis is done by Codacy, but it's partly broken and doesn't run automatically. You could check your code with ESLint before contributing, a configuration file can be found in the projects root folder.
 
README.md
@@ -61,14 +61,14 @@ Optionally, to enable on-the-fly conversion from one ebook format to another whe
 Pre-built Docker images are available in these Docker Hub repositories:
 
 #### **Technosoft2000 - x64**
-+ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web/](https://hub.docker.com/r/technosoft2000/calibre-web/)
++ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web](https://hub.docker.com/r/technosoft2000/calibre-web)
 + Github - [https://github.com/Technosoft2000/docker-calibre-web](https://github.com/Technosoft2000/docker-calibre-web)
 
 Includes the Calibre `ebook-convert` binary.
 + The "path to convertertool" should be set to `/opt/calibre/ebook-convert`
 
 #### **LinuxServer - x64, armhf, aarch64**
-+ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web/](https://hub.docker.com/r/linuxserver/calibre-web/)
++ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web](https://hub.docker.com/r/linuxserver/calibre-web)
 + Github - [https://github.com/linuxserver/docker-calibre-web](https://github.com/linuxserver/docker-calibre-web)
 + Github - (Optional Calibre layer) - [https://github.com/linuxserver/docker-calibre-web/tree/calibre](https://github.com/linuxserver/docker-calibre-web/tree/calibre)
 
@@ -83,3 +83,7 @@ Pre-built Docker images are available in these Docker Hub repositories:
 # Wiki
 
 For further information, How To's and FAQ please check the [Wiki](https://github.com/janeczku/calibre-web/wiki)
+
+# Contributing to Calibre-Web
+
+Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)
cps/admin.py | 32
@@ -34,7 +34,7 @@ from flask import Blueprint, flash, redirect, url_for, abort, request, make_resp
 from flask_login import login_required, current_user, logout_user
 from flask_babel import gettext as _
 from sqlalchemy import and_
-from sqlalchemy.exc import IntegrityError
+from sqlalchemy.exc import IntegrityError, OperationalError, InvalidRequestError
 from sqlalchemy.sql.expression import func
 
 from . import constants, logger, helper, services
@@ -99,7 +99,7 @@ def shutdown():
 
     if task == 2:
         log.warning("reconnecting to calibre database")
-        calibre_db.setup_db(config, ub.app_DB_path)
+        calibre_db.reconnect_db(config, ub.app_DB_path)
         showtext['text'] = _(u'Reconnect successful')
         return json.dumps(showtext)
 
@@ -132,6 +132,7 @@ def admin():
     allUser = ub.session.query(ub.User).all()
     email_settings = config.get_mail_settings()
     return render_title_template("admin.html", allUser=allUser, email=email_settings, config=config, commit=commit,
+                                 feature_support=feature_support,
                                  title=_(u"Admin page"), page="admin")
 
 
@@ -603,7 +604,7 @@ def _configuration_ldap_helper(to_save, gdriveError):
         return reboot_required, _configuration_result(_('LDAP User Object Filter Has Unmatched Parenthesis'),
                                                       gdriveError)
 
-    if config.config_ldap_cert_path and not os.path.isdir(config.config_ldap_cert_path):
+    if config.config_ldap_cert_path and not os.path.isfile(config.config_ldap_cert_path):
         return reboot_required, _configuration_result(_('LDAP Certificate Location is not Valid, Please Enter Correct Path'),
                                                       gdriveError)
     return reboot_required, None
@@ -613,8 +614,10 @@ def _configuration_update_helper():
     reboot_required = False
     db_change = False
     to_save = request.form.to_dict()
+    gdriveError = None
 
     to_save['config_calibre_dir'] = re.sub('[\\/]metadata\.db$', '', to_save['config_calibre_dir'], flags=re.IGNORECASE)
+    try:
         db_change |= _config_string(to_save, "config_calibre_dir")
 
         # Google drive setup
@@ -635,10 +638,14 @@ def _configuration_update_helper():
         _config_checkbox_int(to_save, "config_public_reg")
         _config_checkbox_int(to_save, "config_register_email")
         reboot_required |= _config_checkbox_int(to_save, "config_kobo_sync")
+        _config_int(to_save, "config_external_port")
         _config_checkbox_int(to_save, "config_kobo_proxy")
 
+        if "config_upload_formats" in to_save:
+            to_save["config_upload_formats"] = ','.join(
+                helper.uniq([x.lstrip().rstrip().lower() for x in to_save["config_upload_formats"].split(',')]))
         _config_string(to_save, "config_upload_formats")
-        constants.EXTENSIONS_UPLOAD = [x.lstrip().rstrip() for x in config.config_upload_formats.split(',')]
+        constants.EXTENSIONS_UPLOAD = config.config_upload_formats.split(',')
 
         _config_string(to_save, "config_calibre")
         _config_string(to_save, "config_converterpath")
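The hunk above trims, lowercases, and de-duplicates the comma-separated upload-format list before it is saved. A minimal standalone sketch of that normalization; the real `uniq` lives in `cps/helper.py`, so this re-implementation is an assumption about its behavior, not the project's exact code:

```python
def uniq(inpt):
    # Order-preserving de-duplication, as the helper.uniq call above implies.
    seen = set()
    return [x for x in inpt if not (x in seen or seen.add(x))]


def normalize_formats(raw):
    # Trim whitespace, lowercase, and de-duplicate a comma-separated list,
    # e.g. "EPUB, mobi,epub " -> "epub,mobi".
    return ','.join(uniq([x.strip().lower() for x in raw.split(',')]))
```

This mirrors why the patched `constants.EXTENSIONS_UPLOAD` line no longer needs to strip each entry: the list is already clean when it is written back to the config.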
@@ -654,6 +661,7 @@ def _configuration_update_helper():
         reboot_required |= reboot
 
         # Remote login configuration
         _config_checkbox(to_save, "config_remote_login")
         if not config.config_remote_login:
             ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.token_type==0).delete()
@@ -687,6 +695,9 @@ def _configuration_update_helper():
         unrar_status = helper.check_unrar(config.config_rarfile_location)
         if unrar_status:
             return _configuration_result(unrar_status, gdriveError)
+    except (OperationalError, InvalidRequestError):
+        ub.session.rollback()
+        _configuration_result(_(u"Settings DB is not Writeable"), gdriveError)
 
     try:
         metadata_db = os.path.join(config.config_calibre_dir, "metadata.db")
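The new `except (OperationalError, InvalidRequestError)` branches in this file all follow the same database pattern: when a write to the settings DB fails, roll the transaction back so the connection stays usable for the next request. A hedged sketch of the pattern using the stdlib `sqlite3` module as a stand-in for the app's SQLAlchemy session:

```python
import sqlite3


def save_setting(conn, key, value):
    # Mirror of the commit's pattern: attempt the write, and on any DB error
    # roll back so the connection is left in a clean, reusable state.
    try:
        conn.execute("INSERT INTO settings(key, value) VALUES (?, ?)", (key, value))
        conn.commit()
        return True
    except sqlite3.Error:
        conn.rollback()
        return False


conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE settings (key TEXT PRIMARY KEY, value TEXT)")
```

Without the rollback, a failed commit would leave the session in an aborted state and every later query would fail too, which is the failure mode the added handlers guard against.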
@@ -719,7 +730,7 @@ def _configuration_result(error_flash=None, gdriveError=None):
         gdriveError = _(gdriveError)
     else:
         # if config.config_use_google_drive and\
-        if not gdrive_authenticate:
+        if not gdrive_authenticate and gdrive_support:
             gdrivefolders = gdriveutils.listRootFolders()
 
     show_back_button = current_user.is_authenticated
@@ -783,6 +794,9 @@ def _handle_new_user(to_save, content,languages, translations, kobo_support):
     except IntegrityError:
         ub.session.rollback()
         flash(_(u"Found an existing account for this e-mail address or nickname."), category="error")
+    except OperationalError:
+        ub.session.rollback()
+        flash(_(u"Settings DB is not Writeable"), category="error")
 
 
 def _handle_edit_user(to_save, content,languages, translations, kobo_support, downloads):
@@ -872,6 +886,9 @@ def _handle_edit_user(to_save, content,languages, translations, kobo_support, do
     except IntegrityError:
         ub.session.rollback()
         flash(_(u"An unknown error occured."), category="error")
+    except OperationalError:
+        ub.session.rollback()
+        flash(_(u"Settings DB is not Writeable"), category="error")
 
 
 @admi.route("/admin/user/new", methods=["GET", "POST"])
@@ -916,7 +933,12 @@ def update_mailsettings():
     _config_string(to_save, "mail_password")
     _config_string(to_save, "mail_from")
     _config_int(to_save, "mail_size", lambda y: int(y)*1024*1024)
+    try:
         config.save()
+    except (OperationalError, InvalidRequestError):
+        ub.session.rollback()
+        flash(_(u"Settings DB is not Writeable"), category="error")
+        return edit_mailsettings()
 
     if to_save.get("test"):
         if current_user.email:
cps/comic.py | 24
@@ -74,10 +74,10 @@ def _cover_processing(tmp_file_name, img, extension):
 
 
 
-def _extractCover(tmp_file_name, original_file_extension, rarExceutable):
+def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
     cover_data = extension = None
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_name)
+        archive = ComicArchive(tmp_file_name, rar_exe_path=rarExecutable)
         for index, name in enumerate(archive.getPageNameList()):
             ext = os.path.splitext(name)
             if len(ext) > 1:
@@ -106,7 +106,7 @@ def _extractCover(tmp_file_name, original_file_extension, rarExceutable):
                     break
     elif original_file_extension.upper() == '.CBR' and use_rarfile:
         try:
-            rarfile.UNRAR_TOOL = rarExceutable
+            rarfile.UNRAR_TOOL = rarExecutable
             cf = rarfile.RarFile(tmp_file_name)
             for name in cf.getnames():
                 ext = os.path.splitext(name)
@@ -120,9 +120,9 @@ def _extractCover(tmp_file_name, original_file_extension, rarExceutable):
     return _cover_processing(tmp_file_name, cover_data, extension)
 
 
-def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExceutable):
+def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExecutable):
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_path, rar_exe_path=rarExceutable)
+        archive = ComicArchive(tmp_file_path, rar_exe_path=rarExecutable)
         if archive.seemsToBeAComicArchive():
             if archive.hasMetadata(MetaDataStyle.CIX):
                 style = MetaDataStyle.CIX
@@ -134,21 +134,15 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r
             # if style is not None:
             loadedMetadata = archive.readMetadata(style)
 
-            lang = loadedMetadata.language
-            if lang:
-                if len(lang) == 2:
-                    loadedMetadata.language = isoLanguages.get(part1=lang).name
-                elif len(lang) == 3:
-                    loadedMetadata.language = isoLanguages.get(part3=lang).name
-            else:
-                loadedMetadata.language = ""
+            lang = loadedMetadata.language or ""
+            loadedMetadata.language = isoLanguages.get_lang3(lang)
 
             return BookMeta(
                 file_path=tmp_file_path,
                 extension=original_file_extension,
                 title=loadedMetadata.title or original_file_name,
                 author=" & ".join([credit["person"] for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
-                cover=_extractCover(tmp_file_path, original_file_extension, rarExceutable),
+                cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
                 description=loadedMetadata.comments or "",
                 tags="",
                 series=loadedMetadata.series or "",
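The removed branch logic chose a `part1` or `part3` lookup depending on whether the comic's language code had two or three letters; the replacement delegates that to a single `isoLanguages.get_lang3(lang)` call. A sketch of what such a helper plausibly does; the tiny lookup table here is an illustration only (the real module is backed by full ISO 639 data), and the exact fallback behavior of `get_lang3` is an assumption:

```python
# Hypothetical stand-in table; the real cps/isoLanguages.py uses ISO 639 data.
_PART1_TO_PART3 = {"en": "eng", "de": "deu", "fr": "fra"}
_KNOWN_PART3 = {"eng", "deu", "fra"}


def get_lang3(lang):
    # Map a 2-letter code to its 3-letter form, pass known 3-letter codes
    # through unchanged, and fall back to "" for anything unrecognized,
    # collapsing the removed if/elif/else branches into one place.
    if len(lang) == 2:
        return _PART1_TO_PART3.get(lang, "")
    if len(lang) == 3 and lang in _KNOWN_PART3:
        return lang
    return ""
```

Note the companion fix on the caller side: `loadedMetadata.language or ""` ensures the helper always receives a string, even when the archive metadata has no language at all.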
@@ -160,7 +154,7 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension, r
             extension=original_file_extension,
             title=original_file_name,
             author=u'Unknown',
-            cover=_extractCover(tmp_file_path, original_file_extension, rarExceutable),
+            cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
             description="",
             tags="",
             series="",
cps/config_sql.py
@@ -22,7 +22,7 @@ import os
 import json
 import sys
 
-from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB
+from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
 from sqlalchemy.ext.declarative import declarative_base
 
 from . import constants, cli, logger, ub
@@ -57,6 +57,7 @@ class _Settings(_Base):
 
     config_calibre_dir = Column(String)
     config_port = Column(Integer, default=constants.DEFAULT_PORT)
+    config_external_port = Column(Integer, default=constants.DEFAULT_PORT)
     config_certfile = Column(String)
     config_keyfile = Column(String)
 
@@ -92,7 +93,7 @@ class _Settings(_Base):
 
     config_use_google_drive = Column(Boolean, default=False)
     config_google_drive_folder = Column(String)
-    config_google_drive_watch_changes_response = Column(String)
+    config_google_drive_watch_changes_response = Column(JSON, default={})
 
     config_use_goodreads = Column(Boolean, default=False)
     config_goodreads_api_key = Column(String)
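Switching `config_google_drive_watch_changes_response` from `String` to a `JSON` column lets the ORM serialize the dict on write and deserialize it on read, which is why the manual `json.loads`/`json.dumps` blocks are deleted later in this file. A hedged sketch of the round-trip the column type now automates, done by hand with the stdlib (these helper names are illustrative, not part of the project):

```python
import json


def dump_watch_response(value):
    # What a JSON column does on write: serialize Python data to text.
    return json.dumps(value)


def load_watch_response(text):
    # ...and on read: deserialize back, defaulting to {} like the
    # column's new default in this commit.
    return json.loads(text) if text else {}
```

Centralizing the conversion in the column type removes the risk the old code had: forgetting one of the paired load/save hooks left the attribute holding a raw string in some code paths and a dict in others.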
@@ -102,7 +103,6 @@ class _Settings(_Base):
 
     config_kobo_proxy = Column(Boolean, default=False)
 
-
     config_ldap_provider_url = Column(String, default='example.org')
     config_ldap_port = Column(SmallInteger, default=389)
     config_ldap_authentication = Column(SmallInteger, default=constants.LDAP_AUTH_SIMPLE)
@ -215,20 +215,20 @@ class _ConfigSQL(object):
|
||||||
return self.show_element_new_user(constants.DETAIL_RANDOM)
|
return self.show_element_new_user(constants.DETAIL_RANDOM)
|
||||||
|
|
||||||
def list_denied_tags(self):
|
def list_denied_tags(self):
|
||||||
mct = self.config_denied_tags.split(",")
|
mct = self.config_denied_tags or ""
|
||||||
return [t.strip() for t in mct]
|
return [t.strip() for t in mct.split(",")]
|
||||||
|
|
||||||
def list_allowed_tags(self):
|
def list_allowed_tags(self):
|
||||||
mct = self.config_allowed_tags.split(",")
|
mct = self.config_allowed_tags or ""
|
||||||
return [t.strip() for t in mct]
|
return [t.strip() for t in mct.split(",")]
|
||||||
|
|
||||||
def list_denied_column_values(self):
|
def list_denied_column_values(self):
|
||||||
mct = self.config_denied_column_value.split(",")
|
mct = self.config_denied_column_value or ""
|
||||||
return [t.strip() for t in mct]
|
return [t.strip() for t in mct.split(",")]
|
||||||
|
|
||||||
def list_allowed_column_values(self):
|
def list_allowed_column_values(self):
|
||||||
mct = self.config_allowed_column_value.split(",")
|
mct = self.config_allowed_column_value or ""
|
||||||
return [t.strip() for t in mct]
|
return [t.strip() for t in mct.split(",")]
|
||||||
|
|
||||||
def get_log_level(self):
|
def get_log_level(self):
|
||||||
return logger.get_level_name(self.config_log_level)
|
return logger.get_level_name(self.config_log_level)
|
||||||
|
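The hunk above guards against an unset (`None`) column value before splitting, which the old `self.config_denied_tags.split(",")` would crash on. A minimal standalone sketch of the pattern (the helper name here is hypothetical, not from the commit):

```python
def list_tags(raw):
    # Treat None (an unset column) as an empty string before splitting,
    # mirroring the `self.config_denied_tags or ""` guard in the hunk above.
    mct = raw or ""
    return [t.strip() for t in mct.split(",")]

print(list_tags(None))
print(list_tags("a, b ,c"))
```

Note that splitting an empty string still yields one empty element, so callers that need a truly empty list have to filter it themselves.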
@@ -281,10 +281,6 @@ class _ConfigSQL(object):
                 v = column.default.arg
             setattr(self, k, v)

-        if self.config_google_drive_watch_changes_response:
-            self.config_google_drive_watch_changes_response = \
-                json.loads(self.config_google_drive_watch_changes_response)
-
         have_metadata_db = bool(self.config_calibre_dir)
         if have_metadata_db:
             if not self.config_use_google_drive:

@@ -303,10 +299,6 @@ class _ConfigSQL(object):
         '''Apply all configuration values to the underlying storage.'''
         s = self._read_from_storage()  # type: _Settings

-        if self.config_google_drive_watch_changes_response:
-            self.config_google_drive_watch_changes_response = json.dumps(
-                self.config_google_drive_watch_changes_response)
-
         for k, v in self.__dict__.items():
             if k[0] == '_':
                 continue

@@ -361,10 +353,10 @@ def _migrate_table(session, orm_class):

 def autodetect_calibre_binary():
     if sys.platform == "win32":
-        calibre_path = ["C:\\program files\calibre\ebook-convert.exe",
-                        "C:\\program files(x86)\calibre\ebook-convert.exe",
-                        "C:\\program files(x86)\calibre2\ebook-convert.exe",
-                        "C:\\program files\calibre2\ebook-convert.exe"]
+        calibre_path = ["C:\\program files\\calibre\\ebook-convert.exe",
+                        "C:\\program files(x86)\\calibre\\ebook-convert.exe",
+                        "C:\\program files(x86)\\calibre2\\ebook-convert.exe",
+                        "C:\\program files\\calibre2\\ebook-convert.exe"]
     else:
         calibre_path = ["/opt/calibre/ebook-convert"]
     for element in calibre_path:
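The path fix above matters because `\c` and `\e` are not recognized escape sequences: Python currently keeps the backslash (with a warning in newer interpreters), so the old strings happened to produce the right characters but relied on deprecated behavior. A quick illustration of why all three spellings compare equal today, and why only the escaped or raw forms are safe:

```python
# "\c" and "\e" are invalid escapes: Python keeps the backslash for now,
# but this is deprecated (SyntaxWarning in recent versions) and fragile.
broken = "C:\\program files\calibre\ebook-convert.exe"
fixed = "C:\\program files\\calibre\\ebook-convert.exe"
raw = r"C:\program files\calibre\ebook-convert.exe"

# All three currently hold the same characters; only `fixed` and `raw`
# are guaranteed to keep doing so in future Python versions.
print(broken == fixed == raw)
```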
@@ -128,7 +128,7 @@ def selected_roles(dictionary):
 BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
                       'series_id, languages')

-STABLE_VERSION = {'version': '0.6.9 Beta'}
+STABLE_VERSION = {'version': '0.6.10 Beta'}

 NIGHTLY_VERSION = {}
 NIGHTLY_VERSION[0] = '$Format:%H$'

120 cps/db.py

@@ -32,8 +32,10 @@ from sqlalchemy import String, Integer, Boolean, TIMESTAMP, Float
 from sqlalchemy.orm import relationship, sessionmaker, scoped_session
 from sqlalchemy.ext.declarative import declarative_base
 from sqlalchemy.exc import OperationalError
+from sqlalchemy.pool import StaticPool
 from flask_login import current_user
 from sqlalchemy.sql.expression import and_, true, false, text, func, or_
+from sqlalchemy.ext.associationproxy import association_proxy
 from babel import Locale as LC
 from babel.core import UnknownLocaleError
 from flask_babel import gettext as _

@@ -98,41 +100,61 @@ class Identifiers(Base):
         self.book = book

     def formatType(self):
-        if self.type == "amazon":
+        format_type = self.type.lower()
+        if format_type == 'amazon':
             return u"Amazon"
-        elif self.type == "isbn":
+        elif format_type.startswith("amazon_"):
+            return u"Amazon.{0}".format(format_type[7:])
+        elif format_type == "isbn":
             return u"ISBN"
-        elif self.type == "doi":
+        elif format_type == "doi":
             return u"DOI"
-        elif self.type == "goodreads":
+        elif format_type == "douban":
+            return u"Douban"
+        elif format_type == "goodreads":
             return u"Goodreads"
-        elif self.type == "google":
+        elif format_type == "google":
             return u"Google Books"
-        elif self.type == "kobo":
+        elif format_type == "kobo":
             return u"Kobo"
-        if self.type == "lubimyczytac":
+        elif format_type == "litres":
+            return u"ЛитРес"
+        elif format_type == "issn":
+            return u"ISSN"
+        elif format_type == "isfdb":
+            return u"ISFDB"
+        if format_type == "lubimyczytac":
             return u"Lubimyczytac"
         else:
             return self.type

     def __repr__(self):
-        if self.type == "amazon" or self.type == "asin":
-            return u"https://amzn.com/{0}".format(self.val)
-        elif self.type == "isbn":
+        format_type = self.type.lower()
+        if format_type == "amazon" or format_type == "asin":
+            return u"https://amazon.com/dp/{0}".format(self.val)
+        elif format_type.startswith('amazon_'):
+            return u"https://amazon.{0}/dp/{1}".format(format_type[7:], self.val)
+        elif format_type == "isbn":
             return u"https://www.worldcat.org/isbn/{0}".format(self.val)
-        elif self.type == "doi":
+        elif format_type == "doi":
             return u"https://dx.doi.org/{0}".format(self.val)
-        elif self.type == "goodreads":
+        elif format_type == "goodreads":
             return u"https://www.goodreads.com/book/show/{0}".format(self.val)
-        elif self.type == "douban":
+        elif format_type == "douban":
             return u"https://book.douban.com/subject/{0}".format(self.val)
-        elif self.type == "google":
+        elif format_type == "google":
             return u"https://books.google.com/books?id={0}".format(self.val)
-        elif self.type == "kobo":
+        elif format_type == "kobo":
             return u"https://www.kobo.com/ebook/{0}".format(self.val)
-        elif self.type == "lubimyczytac":
-            return u" https://lubimyczytac.pl/ksiazka/{0}".format(self.val)
-        elif self.type == "url":
+        elif format_type == "lubimyczytac":
+            return u" https://lubimyczytac.pl/ksiazka/{0}/ksiazka".format(self.val)
+        elif format_type == "litres":
+            return u"https://www.litres.ru/{0}".format(self.val)
+        elif format_type == "issn":
+            return u"https://portal.issn.org/resource/ISSN/{0}".format(self.val)
+        elif format_type == "isfdb":
+            return u"http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
+        elif format_type == "url":
             return u"{0}".format(self.val)
         else:
             return u""
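The new `amazon_` branches above map a localized identifier such as `amazon_de` to `Amazon.de` by slicing off the 7-character `amazon_` prefix. A standalone sketch of that dispatch (simplified; the real method covers many more identifier types):

```python
def format_identifier_type(id_type):
    # Lowercase once, then branch -- mirrors the new formatType() logic.
    format_type = id_type.lower()
    if format_type == "amazon":
        return "Amazon"
    if format_type.startswith("amazon_"):
        # len("amazon_") == 7, so [7:] keeps just the country suffix.
        return "Amazon.{0}".format(format_type[7:])
    if format_type == "isbn":
        return "ISBN"
    return id_type  # fall through: show the raw type unchanged

print(format_identifier_type("amazon_de"))
```

Lowercasing up front is also what makes the lookups tolerant of identifiers typed as `ISBN` or `Amazon`.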
@@ -280,7 +302,7 @@ class Books(Base):
     flags = Column(Integer, nullable=False, default=1)

     authors = relationship('Authors', secondary=books_authors_link, backref='books')
-    tags = relationship('Tags', secondary=books_tags_link, backref='books',order_by="Tags.name")
+    tags = relationship('Tags', secondary=books_tags_link, backref='books', order_by="Tags.name")
     comments = relationship('Comments', backref='books')
     data = relationship('Data', backref='books')
     series = relationship('Series', secondary=books_series_link, backref='books')

@@ -370,7 +392,6 @@ class CalibreDB(threading.Thread):
     def setup_db(self, config, app_db_path):
         self.config = config
         self.dispose()
-        # global engine

         if not config.config_calibre_dir:
             config.invalidate()

@@ -382,11 +403,11 @@ class CalibreDB(threading.Thread):
             return False

         try:
-            #engine = create_engine('sqlite:///{0}'.format(dbpath),
             self.engine = create_engine('sqlite://',
                                         echo=False,
                                         isolation_level="SERIALIZABLE",
-                                        connect_args={'check_same_thread': False})
+                                        connect_args={'check_same_thread': False},
+                                        poolclass=StaticPool)
             self.engine.execute("attach database '{}' as calibre;".format(dbpath))
             self.engine.execute("attach database '{}' as app_settings;".format(app_db_path))
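The `poolclass=StaticPool` change makes SQLAlchemy hand every thread the same single connection. That is what keeps the `ATTACH DATABASE` statements visible everywhere: each freshly pooled connection to `sqlite://` would otherwise be a brand-new, empty in-memory database with no attachments. A stdlib-only sketch of the same idea, using `sqlite3` directly rather than SQLAlchemy:

```python
import sqlite3

# One shared in-memory connection; with a pool of fresh connections each
# ':memory:' database would be empty and the ATTACHed schemas would vanish.
conn = sqlite3.connect(":memory:", check_same_thread=False)
conn.execute("ATTACH DATABASE ':memory:' AS calibre")
conn.execute("CREATE TABLE calibre.books (id INTEGER PRIMARY KEY, title TEXT)")
conn.execute("INSERT INTO calibre.books (title) VALUES ('test')")

# The attached schema stays reachable for as long as this one connection lives.
rows = conn.execute("SELECT title FROM calibre.books").fetchall()
print(rows)
```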
@@ -406,34 +427,46 @@ class CalibreDB(threading.Thread):
         books_custom_column_links = {}
         for row in cc:
             if row.datatype not in cc_exceptions:
-                books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link', Base.metadata,
+                if row.datatype == 'series':
+                    dicttable = {'__tablename__': 'books_custom_column_' + str(row.id) + '_link',
+                                 'id': Column(Integer, primary_key=True),
+                                 'book': Column(Integer, ForeignKey('books.id'),
+                                                primary_key=True),
+                                 'map_value': Column('value', Integer,
+                                                     ForeignKey('custom_column_' +
+                                                                str(row.id) + '.id'),
+                                                     primary_key=True),
+                                 'extra': Column(Float),
+                                 'asoc': relationship('custom_column_' + str(row.id), uselist=False),
+                                 'value': association_proxy('asoc', 'value')
+                                 }
+                    books_custom_column_links[row.id] = type(str('books_custom_column_' + str(row.id) + '_link'),
+                                                             (Base,), dicttable)
+                else:
+                    books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link',
+                                                              Base.metadata,
                                                               Column('book', Integer, ForeignKey('books.id'),
                                                                      primary_key=True),
                                                               Column('value', Integer,
-                                                                     ForeignKey('custom_column_' + str(row.id) + '.id'),
+                                                                     ForeignKey('custom_column_' +
+                                                                                str(row.id) + '.id'),
                                                                      primary_key=True)
                                                               )
                 cc_ids.append([row.id, row.datatype])
-                if row.datatype == 'bool':
-                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                              'id': Column(Integer, primary_key=True),
-                              'book': Column(Integer, ForeignKey('books.id')),
-                              'value': Column(Boolean)}
-                elif row.datatype == 'int':
-                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                              'id': Column(Integer, primary_key=True),
-                              'book': Column(Integer, ForeignKey('books.id')),
-                              'value': Column(Integer)}
-                elif row.datatype == 'float':
-                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                              'id': Column(Integer, primary_key=True),
-                              'book': Column(Integer, ForeignKey('books.id')),
-                              'value': Column(Float)}
-                else:
-                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                              'id': Column(Integer, primary_key=True),
-                              'value': Column(String)}
-                cc_classes[row.id] = type(str('Custom_Column_' + str(row.id)), (Base,), ccdict)
+                ccdict = {'__tablename__': 'custom_column_' + str(row.id),
+                          'id': Column(Integer, primary_key=True)}
+                if row.datatype == 'float':
+                    ccdict['value'] = Column(Float)
+                elif row.datatype == 'int':
+                    ccdict['value'] = Column(Integer)
+                elif row.datatype == 'bool':
+                    ccdict['value'] = Column(Boolean)
+                else:
+                    ccdict['value'] = Column(String)
+                if row.datatype in ['float', 'int', 'bool']:
+                    ccdict['book'] = Column(Integer, ForeignKey('books.id'))
+                cc_classes[row.id] = type(str('custom_column_' + str(row.id)), (Base,), ccdict)

         for cc_id in cc_ids:
             if (cc_id[1] == 'bool') or (cc_id[1] == 'int') or (cc_id[1] == 'float'):
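Both the link tables and the per-column classes in the hunk above are generated at runtime with `type()`, because the set of `custom_column_<n>` tables is only known after reading the Calibre database. The bare mechanism, shown here without SQLAlchemy (plain `object` stands in for the declarative `Base`):

```python
def make_custom_column_class(col_id, datatype):
    # Build the class dict first, then create the class by name --
    # the same type(str(name), (Base,), ccdict) pattern as the hunk above.
    ccdict = {'__tablename__': 'custom_column_' + str(col_id),
              'datatype': datatype}
    return type(str('custom_column_' + str(col_id)), (object,), ccdict)

cls = make_custom_column_class(3, 'int')
print(cls.__name__, cls.__tablename__)
```

With a declarative `Base` as the parent, SQLAlchemy's metaclass would additionally register the generated class as a mapped table.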
@@ -443,6 +476,11 @@ class CalibreDB(threading.Thread):
                                          primaryjoin=(
                                              Books.id == cc_classes[cc_id[0]].book),
                                          backref='books'))
+            elif (cc_id[1] == 'series'):
+                setattr(Books,
+                        'custom_column_' + str(cc_id[0]),
+                        relationship(books_custom_column_links[cc_id[0]],
+                                     backref='books'))
             else:
                 setattr(Books,
                         'custom_column_' + str(cc_id[0]),

cps/editbook.py

@@ -151,8 +151,11 @@ def modify_identifiers(input_identifiers, db_identifiers, db_session):
     input_identifiers is a list of read-to-persist Identifiers objects.
     db_identifiers is a list of already persisted list of Identifiers objects."""
     changed = False
-    input_dict = dict([ (identifier.type.lower(), identifier) for identifier in input_identifiers ])
-    db_dict = dict([ (identifier.type.lower(), identifier) for identifier in db_identifiers ])
+    error = False
+    input_dict = dict([(identifier.type.lower(), identifier) for identifier in input_identifiers])
+    if len(input_identifiers) != len(input_dict):
+        error = True
+    db_dict = dict([(identifier.type.lower(), identifier) for identifier in db_identifiers ])
     # delete db identifiers not present in input or modify them with input val
     for identifier_type, identifier in db_dict.items():
         if identifier_type not in input_dict.keys():
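The new `error` flag above works by comparing lengths: keying a dict on the lowercased identifier type collapses case-only duplicates, so a shorter dict means two input identifiers differed only in case. The check in isolation:

```python
def has_case_insensitive_duplicates(types):
    # Building a dict keyed on the lowercased value drops duplicates,
    # so a length mismatch reveals case-only collisions ("ISBN" vs "isbn").
    lowered = {t.lower(): t for t in types}
    return len(types) != len(lowered)

print(has_case_insensitive_duplicates(["ISBN", "isbn"]))
print(has_case_insensitive_duplicates(["isbn", "doi"]))
```

In the hunk itself the colliding entry is silently overwritten in `input_dict`; the flag only exists so the caller can warn the user about it.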
@@ -167,7 +170,7 @@ def modify_identifiers(input_identifiers, db_identifiers, db_session):
         if identifier_type not in db_dict.keys():
             db_session.add(identifier)
             changed = True
-    return changed
+    return changed, error


 @editbook.route("/delete/<int:book_id>/", defaults={'book_format': ""})

@@ -354,7 +357,10 @@ def edit_book_comments(comments, book):
 def edit_book_languages(languages, book, upload=False):
     input_languages = languages.split(',')
     unknown_languages = []
-    input_l = isoLanguages.get_language_codes(get_locale(), input_languages, unknown_languages)
+    if not upload:
+        input_l = isoLanguages.get_language_codes(get_locale(), input_languages, unknown_languages)
+    else:
+        input_l = isoLanguages.get_valid_language_codes(get_locale(), input_languages, unknown_languages)
     for l in unknown_languages:
         log.error('%s is not a valid language', l)
         flash(_(u"%(langname)s is not a valid language", langname=l), category="warning")

@@ -375,7 +381,8 @@ def edit_book_publisher(to_save, book):
     if to_save["publisher"]:
         publisher = to_save["publisher"].rstrip().strip()
         if len(book.publishers) == 0 or (len(book.publishers) > 0 and publisher != book.publishers[0].name):
-            changed |= modify_database_object([publisher], book.publishers, db.Publishers, calibre_db.session, 'publisher')
+            changed |= modify_database_object([publisher], book.publishers, db.Publishers, calibre_db.session,
+                                              'publisher')
     elif len(book.publishers):
         changed |= modify_database_object([], book.publishers, db.Publishers, calibre_db.session, 'publisher')
     return changed

@@ -462,9 +469,11 @@ def upload_single_file(request, book, book_id):
     requested_file = request.files['btn-upload-format']
     # check for empty request
     if requested_file.filename != '':
+        if not current_user.role_upload():
+            abort(403)
         if '.' in requested_file.filename:
             file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
-            if file_ext not in constants.EXTENSIONS_UPLOAD:
+            if file_ext not in constants.EXTENSIONS_UPLOAD and '' not in constants.EXTENSIONS_UPLOAD:
                 flash(_("File extension '%(ext)s' is not allowed to be uploaded to this server", ext=file_ext),
                       category="error")
                 return redirect(url_for('web.show_book', book_id=book.id))
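The added `'' not in constants.EXTENSIONS_UPLOAD` clause is an opt-out: an empty entry in the allow-list disables extension filtering entirely. Sketched standalone (the allow-lists here are hypothetical; the real set lives in `cps/constants.py`):

```python
def upload_allowed(filename, allowed_extensions):
    # rsplit from the right so 'book.tar.gz' yields 'gz', then lowercase
    # to make the check case-insensitive.
    file_ext = filename.rsplit('.', 1)[-1].lower()
    # An empty string in the set turns the check off entirely.
    return file_ext in allowed_extensions or '' in allowed_extensions

print(upload_allowed("book.EPUB", {"epub", "pdf"}))
print(upload_allowed("book.exe", {"epub", "pdf"}))
print(upload_allowed("book.exe", {"epub", ""}))
```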
|
@ -522,6 +531,8 @@ def upload_cover(request, book):
|
||||||
requested_file = request.files['btn-upload-cover']
|
requested_file = request.files['btn-upload-cover']
|
||||||
# check for empty request
|
# check for empty request
|
||||||
if requested_file.filename != '':
|
if requested_file.filename != '':
|
||||||
|
if not current_user.role_upload():
|
||||||
|
abort(403)
|
||||||
ret, message = helper.save_cover(requested_file, book.path)
|
ret, message = helper.save_cover(requested_file, book.path)
|
||||||
if ret is True:
|
if ret is True:
|
||||||
return True
|
return True
|
||||||
|
@ -601,7 +612,10 @@ def edit_book(book_id):
|
||||||
error = helper.update_dir_stucture(edited_books_id, config.config_calibre_dir, input_authors[0])
|
error = helper.update_dir_stucture(edited_books_id, config.config_calibre_dir, input_authors[0])
|
||||||
|
|
||||||
if not error:
|
if not error:
|
||||||
|
if "cover_url" in to_save:
|
||||||
if to_save["cover_url"]:
|
if to_save["cover_url"]:
|
||||||
|
if not current_user.role_upload():
|
||||||
|
return "", (403)
|
||||||
result, error = helper.save_cover_from_url(to_save["cover_url"], book.path)
|
result, error = helper.save_cover_from_url(to_save["cover_url"], book.path)
|
||||||
if result is True:
|
if result is True:
|
||||||
book.has_cover = 1
|
book.has_cover = 1
|
||||||
|
@ -617,8 +631,10 @@ def edit_book(book_id):
|
||||||
|
|
||||||
# Handle identifiers
|
# Handle identifiers
|
||||||
input_identifiers = identifier_list(to_save, book)
|
input_identifiers = identifier_list(to_save, book)
|
||||||
modif_date |= modify_identifiers(input_identifiers, book.identifiers, calibre_db.session)
|
modification, warning = modify_identifiers(input_identifiers, book.identifiers, calibre_db.session)
|
||||||
|
if warning:
|
||||||
|
flash(_("Identifiers are not Case Sensitive, Overwriting Old Identifier"), category="warning")
|
||||||
|
modif_date |= modification
|
||||||
# Handle book tags
|
# Handle book tags
|
||||||
modif_date |= edit_book_tags(to_save['tags'], book)
|
modif_date |= edit_book_tags(to_save['tags'], book)
|
||||||
|
|
||||||
|
@ -647,6 +663,7 @@ def edit_book(book_id):
|
||||||
|
|
||||||
if modif_date:
|
if modif_date:
|
||||||
book.last_modified = datetime.utcnow()
|
book.last_modified = datetime.utcnow()
|
||||||
|
calibre_db.session.merge(book)
|
||||||
calibre_db.session.commit()
|
calibre_db.session.commit()
|
||||||
if config.config_use_google_drive:
|
if config.config_use_google_drive:
|
||||||
gdriveutils.updateGdriveCalibreFromLocal()
|
gdriveutils.updateGdriveCalibreFromLocal()
|
||||||
|
@ -690,7 +707,7 @@ def identifier_list(to_save, book):
|
||||||
val_key = id_val_prefix + type_key[len(id_type_prefix):]
|
val_key = id_val_prefix + type_key[len(id_type_prefix):]
|
||||||
if val_key not in to_save.keys():
|
if val_key not in to_save.keys():
|
||||||
continue
|
continue
|
||||||
result.append( db.Identifiers(to_save[val_key], type_value, book.id) )
|
result.append(db.Identifiers(to_save[val_key], type_value, book.id))
|
||||||
return result
|
return result
|
||||||
|
|
||||||
@editbook.route("/upload", methods=["GET", "POST"])
|
@editbook.route("/upload", methods=["GET", "POST"])
|
||||||
|
@ -710,7 +727,7 @@ def upload():
|
||||||
# check if file extension is correct
|
# check if file extension is correct
|
||||||
if '.' in requested_file.filename:
|
if '.' in requested_file.filename:
|
||||||
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
file_ext = requested_file.filename.rsplit('.', 1)[-1].lower()
|
||||||
if file_ext not in constants.EXTENSIONS_UPLOAD:
|
if file_ext not in constants.EXTENSIONS_UPLOAD and '' not in constants.EXTENSIONS_UPLOAD:
|
||||||
flash(
|
flash(
|
||||||
_("File extension '%(ext)s' is not allowed to be uploaded to this server",
|
_("File extension '%(ext)s' is not allowed to be uploaded to this server",
|
||||||
ext=file_ext), category="error")
|
ext=file_ext), category="error")
|
||||||
|
@ -833,8 +850,8 @@ def upload():
|
||||||
|
|
||||||
# move cover to final directory, including book id
|
# move cover to final directory, including book id
|
||||||
if has_cover:
|
if has_cover:
|
||||||
try:
|
|
||||||
new_coverpath = os.path.join(config.config_calibre_dir, db_book.path, "cover.jpg")
|
new_coverpath = os.path.join(config.config_calibre_dir, db_book.path, "cover.jpg")
|
||||||
|
try:
|
||||||
copyfile(meta.cover, new_coverpath)
|
copyfile(meta.cover, new_coverpath)
|
||||||
os.unlink(meta.cover)
|
os.unlink(meta.cover)
|
||||||
except OSError as e:
|
except OSError as e:
|
||||||
|
|
13 cps/epub.py

@@ -22,6 +22,7 @@ import zipfile
 from lxml import etree

 from . import isoLanguages
+from .helper import split_authors
 from .constants import BookMeta

@@ -64,7 +65,7 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
             tmp = p.xpath('dc:%s/text()' % s, namespaces=ns)
             if len(tmp) > 0:
                 if s == 'creator':
-                    epub_metadata[s] = ' & '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
+                    epub_metadata[s] = ' & '.join(split_authors(p.xpath('dc:%s/text()' % s, namespaces=ns)))
                 elif s == 'subject':
                     epub_metadata[s] = ', '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
                 else:

@@ -82,16 +83,8 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
     else:
         epub_metadata['description'] = ""

-    if epub_metadata['language'] == u'Unknown':
-        epub_metadata['language'] = ""
-    else:
-        lang = epub_metadata['language'].split('-', 1)[0].lower()
-        if len(lang) == 2:
-            epub_metadata['language'] = isoLanguages.get(part1=lang).name
-        elif len(lang) == 3:
-            epub_metadata['language'] = isoLanguages.get(part3=lang).name
-        else:
-            epub_metadata['language'] = ""
+    lang = epub_metadata['language'].split('-', 1)[0].lower()
+    epub_metadata['language'] = isoLanguages.get_lang3(lang)

     series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content", namespaces=ns)
     if len(series) > 0:

cps/gdrive.py

@@ -34,18 +34,17 @@ from flask import Blueprint, flash, request, redirect, url_for, abort
 from flask_babel import gettext as _
 from flask_login import login_required

-try:
-    from googleapiclient.errors import HttpError
-except ImportError:
-    pass
-
 from . import logger, gdriveutils, config, ub, calibre_db
 from .web import admin_required


 gdrive = Blueprint('gdrive', __name__)
 log = logger.create()

+try:
+    from googleapiclient.errors import HttpError
+except ImportError as err:
+    log.debug("Cannot import googleapiclient, using GDrive will not work: %s", err)
+
 current_milli_time = lambda: int(round(time() * 1000))

 gdrive_watch_callback_token = 'target=calibreweb-watch_files'

@@ -73,7 +72,7 @@ def google_drive_callback():
         credentials = gdriveutils.Gauth.Instance().auth.flow.step2_exchange(auth_code)
         with open(gdriveutils.CREDENTIALS, 'w') as f:
             f.write(credentials.to_json())
-    except ValueError as error:
+    except (ValueError, AttributeError) as error:
         log.error(error)
     return redirect(url_for('admin.configuration'))

@@ -94,8 +93,7 @@ def watch_gdrive():
     try:
         result = gdriveutils.watchChange(gdriveutils.Gdrive.Instance().drive, notification_id,
                                          'web_hook', address, gdrive_watch_callback_token,
                                          current_milli_time() + 604800*1000)
-        config.config_google_drive_watch_changes_response = json.dumps(result)
-        # after save(), config_google_drive_watch_changes_response will be a json object, not string
+        config.config_google_drive_watch_changes_response = result
         config.save()
     except HttpError as e:
         reason=json.loads(e.content)['error']['errors'][0]

@@ -118,7 +116,7 @@ def revoke_watch_gdrive():
                               last_watch_response['resourceId'])
         except HttpError:
             pass
-        config.config_google_drive_watch_changes_response = None
+        config.config_google_drive_watch_changes_response = {}
         config.save()
     return redirect(url_for('admin.configuration'))

@@ -155,7 +153,7 @@ def on_received_watch_confirmation():
             log.info('Setting up new DB')
             # prevent error on windows, as os.rename does on exisiting files
             move(os.path.join(tmpDir, "tmp_metadata.db"), dbpath)
-            calibre_db.setup_db(config, ub.app_DB_path)
+            calibre_db.reconnect_db(config, ub.app_DB_path)
         except Exception as e:
             log.exception(e)
         updateMetaData()

cps/gdriveutils.py

@@ -27,14 +27,18 @@ from sqlalchemy import Column, UniqueConstraint
 from sqlalchemy import String, Integer
 from sqlalchemy.orm import sessionmaker, scoped_session
 from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.exc import OperationalError, InvalidRequestError

 try:
     from pydrive.auth import GoogleAuth
     from pydrive.drive import GoogleDrive
     from pydrive.auth import RefreshError
     from apiclient import errors
+    from httplib2 import ServerNotFoundError
     gdrive_support = True
-except ImportError:
+    importError = None
+except ImportError as err:
+    importError = err
     gdrive_support = False

 from . import logger, cli, config
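Capturing the exception object in `importError` above lets the module defer reporting until a logger exists (the `log.debug` call added in the next hunk), instead of silently swallowing the failure as the old bare `except ImportError:` did. The pattern in isolation (the module name here is deliberately nonexistent):

```python
import importlib

# Try the optional dependency; keep the error object around instead of
# discarding it, so it can be logged later once logging is configured.
try:
    importlib.import_module("nonexistent_optional_dep")
    support = True
    import_error = None
except ImportError as err:
    import_error = err
    support = False

print(support, import_error is not None)
```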
@@ -50,6 +54,8 @@ if gdrive_support:
     logger.get('googleapiclient.discovery_cache').setLevel(logger.logging.ERROR)
     if not logger.is_debug_enabled():
         logger.get('googleapiclient.discovery').setLevel(logger.logging.ERROR)
+else:
+    log.debug("Cannot import pydrive,httplib2, using gdrive will not work: %s", importError)
 
 
 class Singleton:
@@ -97,7 +103,11 @@ class Singleton:
 @Singleton
 class Gauth:
     def __init__(self):
-        self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
+        try:
+            self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
+        except NameError as error:
+            log.error(error)
+            self.auth = None
 
 
 @Singleton
@@ -192,14 +202,18 @@ def getDrive(drive=None, gauth=None):
     return drive
 
 def listRootFolders():
-    drive = getDrive(Gdrive.Instance().drive)
-    folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
-    fileList = drive.ListFile({'q': folder}).GetList()
+    try:
+        drive = getDrive(Gdrive.Instance().drive)
+        folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
+        fileList = drive.ListFile({'q': folder}).GetList()
+    except ServerNotFoundError as e:
+        log.info("GDrive Error %s" % e)
+        fileList = []
     return fileList
 
 
 def getEbooksFolder(drive):
-    return getFolderInFolder('root',config.config_google_drive_folder,drive)
+    return getFolderInFolder('root', config.config_google_drive_folder, drive)
 
 
 def getFolderInFolder(parentId, folderName, drive):
@@ -229,7 +243,7 @@ def getEbooksFolderId(drive=None):
         gDriveId.path = '/'
         session.merge(gDriveId)
     session.commit()
-    return
+    return gDriveId.gdrive_id
 
 
 def getFile(pathId, fileName, drive):
@@ -474,8 +488,13 @@ def getChangeById (drive, change_id):
 
 # Deletes the local hashes database to force search for new folder names
 def deleteDatabaseOnChange():
-    session.query(GdriveId).delete()
-    session.commit()
+    try:
+        session.query(GdriveId).delete()
+        session.commit()
+    except (OperationalError, InvalidRequestError):
+        session.rollback()
+        log.info(u"GDrive DB is not Writeable")
 
 
 def updateGdriveCalibreFromLocal():
     copyToDrive(Gdrive.Instance().drive, config.config_calibre_dir, False, True)
@@ -583,8 +602,12 @@ def get_error_text(client_secrets=None):
     if not os.path.isfile(CLIENT_SECRETS):
         return 'client_secrets.json is missing or not readable'
 
-    with open(CLIENT_SECRETS, 'r') as settings:
-        filedata = json.load(settings)
+    try:
+        with open(CLIENT_SECRETS, 'r') as settings:
+            filedata = json.load(settings)
+    except PermissionError:
+        return 'client_secrets.json is missing or not readable'
 
     if 'web' not in filedata:
         return 'client_secrets.json is not configured for web application'
     if 'redirect_uris' not in filedata['web']:
@@ -21,7 +21,6 @@ from __future__ import division, print_function, unicode_literals
 import sys
 import os
 import io
-import json
 import mimetypes
 import re
 import shutil
@@ -36,7 +35,7 @@ from babel.units import format_unit
 from flask import send_from_directory, make_response, redirect, abort
 from flask_babel import gettext as _
 from flask_login import current_user
-from sqlalchemy.sql.expression import true, false, and_, or_, text, func
+from sqlalchemy.sql.expression import true, false, and_, text, func
 from werkzeug.datastructures import Headers
 from werkzeug.security import generate_password_hash
 from . import calibre_db
@@ -59,10 +58,9 @@ try:
 except ImportError:
     use_PIL = False
 
-from . import logger, config, get_locale, db, ub, isoLanguages, worker
+from . import logger, config, get_locale, db, ub, worker
 from . import gdriveutils as gd
 from .constants import STATIC_DIR as _STATIC_DIR
-from .pagination import Pagination
 from .subproc_wrapper import process_wait
 from .worker import STAT_WAITING, STAT_FAIL, STAT_STARTED, STAT_FINISH_SUCCESS
 from .worker import TASK_EMAIL, TASK_CONVERT, TASK_UPLOAD, TASK_CONVERT_ANY
@@ -100,10 +98,10 @@ def convert_book_format(book_id, calibrepath, old_book_format, new_book_format,
             # text = _(u"%(format)s: %(book)s", format=new_book_format, book=book.title)
         else:
             settings = dict()
-        text = (u"%s -> %s: %s" % (old_book_format, new_book_format, book.title))
+        txt = (u"%s -> %s: %s" % (old_book_format, new_book_format, book.title))
         settings['old_book_format'] = old_book_format
         settings['new_book_format'] = new_book_format
-        worker.add_convert(file_path, book.id, user_id, text, settings, kindle_mail)
+        worker.add_convert(file_path, book.id, user_id, txt, settings, kindle_mail)
         return None
     else:
         error_message = _(u"%(format)s not found: %(fn)s",
@@ -239,22 +237,22 @@ def get_valid_filename(value, replace_whitespace=True):
         value = value[:-1]+u'_'
     value = value.replace("/", "_").replace(":", "_").strip('\0')
     if use_unidecode:
-        value = (unidecode.unidecode(value)).strip()
+        value = (unidecode.unidecode(value))
     else:
         value = value.replace(u'§', u'SS')
         value = value.replace(u'ß', u'ss')
         value = unicodedata.normalize('NFKD', value)
         re_slugify = re.compile(r'[\W\s-]', re.UNICODE)
         if isinstance(value, str):  # Python3 str, Python2 unicode
-            value = re_slugify.sub('', value).strip()
+            value = re_slugify.sub('', value)
         else:
-            value = unicode(re_slugify.sub('', value).strip())
+            value = unicode(re_slugify.sub('', value))
     if replace_whitespace:
         # *+:\"/<>? are replaced by _
-        value = re.sub(r'[\*\+:\\\"/<>\?]+', u'_', value, flags=re.U)
+        value = re.sub(r'[*+:\\\"/<>?]+', u'_', value, flags=re.U)
         # pipe has to be replaced with comma
-        value = re.sub(r'[\|]+', u',', value, flags=re.U)
-    value = value[:128]
+        value = re.sub(r'[|]+', u',', value, flags=re.U)
+    value = value[:128].strip()
     if not value:
         raise ValueError("Filename cannot be empty")
     if sys.version_info.major == 3:
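The hunk above also moves the `.strip()` calls: stripping now happens once, after the 128-character truncation, instead of inside each branch. A minimal standalone replica of the changed tail of `get_valid_filename` (the name `sanitize` and the omission of the unidecode/NFKD branches are simplifications for illustration):

```python
import re

def sanitize(value, replace_whitespace=True):
    # Special characters are mapped to '_' / ',' exactly as in the hunk;
    # whitespace is stripped once, after truncation to 128 characters.
    if replace_whitespace:
        value = re.sub(r'[*+:\\\"/<>?]+', u'_', value, flags=re.U)
        value = re.sub(r'[|]+', u',', value, flags=re.U)
    value = value[:128].strip()
    if not value:
        raise ValueError("Filename cannot be empty")
    return value

sanitize('What is this? ')  # the trailing space left by the '?' -> '_' replacement is removed
sanitize('a|b')             # pipes become commas
```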
@@ -263,6 +261,22 @@ def get_valid_filename(value, replace_whitespace=True):
         return value.decode('utf-8')
 
 
+def split_authors(values):
+    authors_list = []
+    for value in values:
+        authors = re.split('[&;]', value)
+        for author in authors:
+            commas = author.count(',')
+            if commas == 1:
+                author_split = author.split(',')
+                authors_list.append(author_split[1].strip() + ' ' + author_split[0].strip())
+            elif commas > 1:
+                authors_list.extend([x.strip() for x in author.split(',')])
+            else:
+                authors_list.append(author.strip())
+    return authors_list
+
+
 def get_sorted_author(value):
     try:
         if ',' not in value:
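The new `split_authors` helper is pure string handling and can be exercised standalone: `&` and `;` separate distinct authors, exactly one comma is treated as "Last, First", and two or more commas as a flat list of names.

```python
import re

def split_authors(values):
    # Verbatim replica of the helper added in the hunk above.
    authors_list = []
    for value in values:
        authors = re.split('[&;]', value)
        for author in authors:
            commas = author.count(',')
            if commas == 1:
                author_split = author.split(',')
                authors_list.append(author_split[1].strip() + ' ' + author_split[0].strip())
            elif commas > 1:
                authors_list.extend([x.strip() for x in author.split(',')])
            else:
                authors_list.append(author.strip())
    return authors_list

split_authors(['Doe, John'])           # "Last, First" is flipped
split_authors(['John Doe & Jane Roe']) # '&' separates authors
```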
@@ -270,7 +284,10 @@ def get_sorted_author(value):
             combined = "(" + ")|(".join(regexes) + ")"
             value = value.split(" ")
             if re.match(combined, value[-1].upper()):
-                value2 = value[-2] + ", " + " ".join(value[:-2]) + " " + value[-1]
+                if len(value) > 1:
+                    value2 = value[-2] + ", " + " ".join(value[:-2]) + " " + value[-1]
+                else:
+                    value2 = value[0]
             elif len(value) == 1:
                 value2 = value[0]
             else:
@@ -279,6 +296,9 @@ def get_sorted_author(value):
             value2 = value
     except Exception as ex:
         log.error("Sorting author %s failed: %s", value, ex)
-        value2 = value
+        if isinstance(list, value2):
+            value2 = value[0]
+        else:
+            value2 = value
     return value2
@@ -295,15 +315,16 @@ def delete_book_file(book, calibrepath, book_format=None):
         return True, None
     else:
         if os.path.isdir(path):
-            if len(next(os.walk(path))[1]):
-                log.error("Deleting book %s failed, path has subfolders: %s", book.id, book.path)
-                return False , _("Deleting book %(id)s failed, path has subfolders: %(path)s",
-                                 id=book.id,
-                                 path=book.path)
             try:
-                for root, __, files in os.walk(path):
+                for root, folders, files in os.walk(path):
                     for f in files:
                         os.unlink(os.path.join(root, f))
+                    if len(folders):
+                        log.warning("Deleting book {} failed, path {} has subfolders: {}".format(book.id,
+                                    book.path, folders))
+                        return True, _("Deleting bookfolder for book %(id)s failed, path has subfolders: %(path)s",
+                                       id=book.id,
+                                       path=book.path)
                 shutil.rmtree(path)
             except (IOError, OSError) as e:
                 log.error("Deleting book %s failed: %s", book.id, e)
@@ -339,13 +360,13 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
     new_title_path = os.path.join(os.path.dirname(path), new_titledir)
     try:
         if not os.path.exists(new_title_path):
-            os.renames(path, new_title_path)
+            os.renames(os.path.normcase(path), os.path.normcase(new_title_path))
         else:
             log.info("Copying title: %s into existing: %s", path, new_title_path)
             for dir_name, __, file_list in os.walk(path):
                 for file in file_list:
-                    os.renames(os.path.join(dir_name, file),
-                               os.path.join(new_title_path + dir_name[len(path):], file))
+                    os.renames(os.path.normcase(os.path.join(dir_name, file)),
+                               os.path.normcase(os.path.join(new_title_path + dir_name[len(path):], file)))
         path = new_title_path
         localbook.path = localbook.path.split('/')[0] + '/' + new_titledir
     except OSError as ex:
@@ -356,7 +377,7 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
     if authordir != new_authordir:
         new_author_path = os.path.join(calibrepath, new_authordir, os.path.basename(path))
         try:
-            os.renames(path, new_author_path)
+            os.renames(os.path.normcase(path), os.path.normcase(new_author_path))
             localbook.path = new_authordir + '/' + localbook.path.split('/')[1]
         except OSError as ex:
             log.error("Rename author from: %s to %s: %s", path, new_author_path, ex)
@@ -365,12 +386,14 @@ def update_dir_structure_file(book_id, calibrepath, first_author):
                       src=path, dest=new_author_path, error=str(ex))
     # Rename all files from old names to new names
     if authordir != new_authordir or titledir != new_titledir:
+        new_name = ""
        try:
             new_name = get_valid_filename(localbook.title) + ' - ' + get_valid_filename(new_authordir)
             path_name = os.path.join(calibrepath, new_authordir, os.path.basename(path))
             for file_format in localbook.data:
-                os.renames(os.path.join(path_name, file_format.name + '.' + file_format.format.lower()),
-                           os.path.join(path_name, new_name + '.' + file_format.format.lower()))
+                os.renames(os.path.normcase(
+                    os.path.join(path_name, file_format.name + '.' + file_format.format.lower())),
+                    os.path.normcase(os.path.join(path_name, new_name + '.' + file_format.format.lower())))
                 file_format.name = new_name
         except OSError as ex:
             log.error("Rename file in path %s to %s: %s", path, new_name, ex)
@@ -466,17 +489,20 @@ def reset_password(user_id):
 def generate_random_password():
     s = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
     passlen = 8
-    return "".join(s[c % len(s)] for c in os.urandom(passlen))
+    if sys.version_info < (3, 0):
+        return "".join(s[ord(c) % len(s)] for c in os.urandom(passlen))
+    else:
+        return "".join(s[c % len(s)] for c in os.urandom(passlen))
 
 
-def uniq(input):
+def uniq(inpt):
     output = []
-    for x in input:
+    for x in inpt:
         if x not in output:
             output.append(x)
     return output
 
-################################## External interface
+# ################################# External interface #################################
 
 
 def update_dir_stucture(book_id, calibrepath, first_author=None):
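The branching added to `generate_random_password` above exists because iterating `os.urandom()` yields 1-byte strings under Python 2 (hence `ord()`) but ints under Python 3. A runnable sketch of the helper (the `passlen` parameter is a small liberty for the demo; the original hardcodes 8):

```python
import os
import sys

def generate_random_password(passlen=8):
    # Same logic as the hunk above: index into the charset with each
    # random byte, modulo the charset length.
    s = "abcdefghijklmnopqrstuvwxyz01234567890ABCDEFGHIJKLMNOPQRSTUVWXYZ!@#$%&*()?"
    if sys.version_info < (3, 0):
        return "".join(s[ord(c) % len(s)] for c in os.urandom(passlen))
    return "".join(s[c % len(s)] for c in os.urandom(passlen))

pw = generate_random_password()
```

Note the modulo bias: because 256 is not a multiple of the 74-character set, early characters are slightly more likely, which is acceptable for a reset password but not for cryptographic keys.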
@@ -553,7 +579,6 @@ def save_cover_from_url(url, book_path):
         return False, _("Cover Format Error")
 
 
-
 def save_cover_from_filestorage(filepath, saved_filename, img):
     if hasattr(img, '_content'):
         f = open(os.path.join(filepath, saved_filename), "wb")
@@ -612,7 +637,6 @@ def save_cover(img, book_path):
         return save_cover_from_filestorage(os.path.join(config.config_calibre_dir, book_path), "cover.jpg", img)
 
 
-
 def do_download_file(book, book_format, client, data, headers):
     if config.config_use_google_drive:
         startTime = time.time()
@@ -713,7 +737,7 @@ def render_task_status(tasklist):
             task['runtime'] = format_runtime(task['formRuntime'])
 
             # localize the task status
-            if isinstance( task['stat'], int):
+            if isinstance(task['stat'], int):
                 if task['stat'] == STAT_WAITING:
                     task['status'] = _(u'Waiting')
                 elif task['stat'] == STAT_FAIL:
@@ -726,7 +750,7 @@ def render_task_status(tasklist):
                     task['status'] = _(u'Unknown Status')
 
             # localize the task type
-            if isinstance( task['taskType'], int):
+            if isinstance(task['taskType'], int):
                 if task['taskType'] == TASK_EMAIL:
                     task['taskMessage'] = _(u'E-mail: ') + task['taskMess']
                 elif task['taskType'] == TASK_CONVERT:
@@ -782,6 +806,7 @@ def get_cc_columns(filter_config_custom_read=False):
 
     return cc
 
+
 def get_download_link(book_id, book_format, client):
     book_format = book_format.split(".")[0]
     book = calibre_db.get_filtered_book(book_id)
@@ -66,3 +66,27 @@ def get_language_codes(locale, language_names, remainder=None):
     if remainder is not None:
         remainder.extend(language_names)
     return languages
+
+
+def get_valid_language_codes(locale, language_names, remainder=None):
+    languages = list()
+    if "" in language_names:
+        language_names.remove("")
+    for k, v in get_language_names(locale).items():
+        if k in language_names:
+            languages.append(k)
+            language_names.remove(k)
+    if remainder is not None and len(language_names):
+        remainder.extend(language_names)
+    return languages
+
+
+def get_lang3(lang):
+    try:
+        if len(lang) == 2:
+            ret_value = get(part1=lang).part3
+        elif len(lang) == 3:
+            ret_value = lang
+        else:
+            ret_value = ""
+    except KeyError:
+        ret_value = lang
+    return ret_value
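The new `get_lang3` normalizes an ISO 639-1 two-letter code to its three-letter form and passes three-letter codes through. Its mapping logic can be checked with a stub in place of the module's real lookup (the `get` used above comes from the iso-639 package; the two-entry `_TABLE` below is a hypothetical stand-in):

```python
class _Lang:
    def __init__(self, part3):
        self.part3 = part3

# Hypothetical stand-in for the iso-639 lookup table.
_TABLE = {'en': _Lang('eng'), 'de': _Lang('deu')}

def get(part1):
    return _TABLE[part1]  # raises KeyError for unknown codes, like the real lookup

def get_lang3(lang):
    # Verbatim replica of the function added in the hunk above.
    try:
        if len(lang) == 2:
            ret_value = get(part1=lang).part3
        elif len(lang) == 3:
            ret_value = lang
        else:
            ret_value = ""
    except KeyError:
        ret_value = lang
    return ret_value
```

Note the fallback: an unknown two-letter code is returned unchanged rather than rejected, and anything that is not two or three characters long collapses to the empty string.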
@@ -111,3 +111,10 @@ def timestamptodate(date, fmt=None):
 @jinjia.app_template_filter('yesno')
 def yesno(value, yes, no):
     return yes if value else no
+
+
+@jinjia.app_template_filter('formatfloat')
+def formatfloat(value, decimals=1):
+    formatedstring = '%d' % value
+    if (value % 1) != 0:
+        formatedstring = ('%s.%d' % (formatedstring, (value % 1) * 10**decimals)).rstrip('0')
+    return formatedstring
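The new `formatfloat` filter renders whole numbers without a decimal part, appends the scaled fraction otherwise, and strips trailing zeros. Stripped of the Jinja decorator, it is a plain function:

```python
def formatfloat(value, decimals=1):
    # Replica of the filter body from the hunk above.
    formatedstring = '%d' % value
    if (value % 1) != 0:
        formatedstring = ('%s.%d' % (formatedstring, (value % 1) * 10**decimals)).rstrip('0')
    return formatedstring

formatfloat(3.0)      # integer values get no decimal point
formatfloat(2.5)      # fraction appended
formatfloat(2.5, 2)   # trailing zero stripped after scaling
```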
cps/kobo.py (48 changes)
@@ -19,8 +19,6 @@
 
 import base64
 import datetime
-import itertools
-import json
 import sys
 import os
 import uuid
@@ -131,7 +129,7 @@ def HandleSyncRequest():
     sync_token = SyncToken.SyncToken.from_headers(request.headers)
     log.info("Kobo library sync request received.")
     if not current_app.wsgi_app.is_proxied:
-        log.debug('Kobo: Received unproxied request, changed request port to server port')
+        log.debug('Kobo: Received unproxied request, changed request port to external server port')
 
     # TODO: Limit the number of books return per sync call, and rely on the sync-continuatation header
     # instead so that the device triggers another sync.
@@ -254,7 +252,7 @@ def generate_sync_response(sync_token, sync_results):
 @download_required
 def HandleMetadataRequest(book_uuid):
     if not current_app.wsgi_app.is_proxied:
-        log.debug('Kobo: Received unproxied request, changed request port to server port')
+        log.debug('Kobo: Received unproxied request, changed request port to external server port')
     log.info("Kobo library metadata request received for book %s" % book_uuid)
     book = calibre_db.get_book_by_uuid(book_uuid)
     if not book or not book.data:
@@ -267,14 +265,15 @@ def HandleMetadataRequest(book_uuid):
 
 def get_download_url_for_book(book, book_format):
     if not current_app.wsgi_app.is_proxied:
-        if ':' in request.host and not request.host.endswith(']') :
+        if ':' in request.host and not request.host.endswith(']'):
             host = "".join(request.host.split(':')[:-1])
         else:
             host = request.host
 
         return "{url_scheme}://{url_base}:{url_port}/download/{book_id}/{book_format}".format(
             url_scheme=request.scheme,
             url_base=host,
-            url_port=config.config_port,
+            url_port=config.config_external_port,
             book_id=book.id,
             book_format=book_format.lower()
         )
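The host handling above drops a trailing `:port` from `request.host` unless the host is a bare IPv6 literal like `[::1]` (which ends with `]`). Extracted as a pure helper (the name `strip_port` is ours, for illustration):

```python
def strip_port(host):
    # Same condition and "".join idiom as the hunk above. Note that for a
    # bracketed IPv6 host *with* a port (e.g. '[::1]:8083') the "".join
    # would also remove the interior colons -- only the port-less literal
    # case is guarded by the endswith(']') check.
    if ':' in host and not host.endswith(']'):
        return "".join(host.split(':')[:-1])
    return host

strip_port('example.com:8083')  # port stripped
strip_port('[::1]')             # bare IPv6 literal passes through
```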
@@ -317,8 +316,15 @@ def get_description(book):
 # TODO handle multiple authors
 def get_author(book):
     if not book.authors:
-        return None
-    return book.authors[0].name
+        return {"Contributors": None}
+    if len(book.authors) > 1:
+        author_list = []
+        autor_roles = []
+        for author in book.authors:
+            autor_roles.append({"Name":author.name, "Role":"Author"})
+            author_list.append(author.name)
+        return {"ContributorRoles": autor_roles, "Contributors":author_list}
+    return {"ContributorRoles": [{"Name":book.authors[0].name, "Role":"Author"}], "Contributors": book.authors[0].name}
 
 
 def get_publisher(book):
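`get_author` now returns a dict fragment that `get_metadata` merges via `metadata.update(...)` instead of a bare name. Its shape can be checked with hypothetical stand-ins for the SQLAlchemy rows (only `.authors` and `.name` are needed):

```python
from collections import namedtuple

# Hypothetical minimal stand-ins for the ORM Book/Author objects.
Author = namedtuple('Author', 'name')
Book = namedtuple('Book', 'authors')

def get_author(book):
    # Verbatim replica of the reshaped function from the hunk above.
    if not book.authors:
        return {"Contributors": None}
    if len(book.authors) > 1:
        author_list = []
        autor_roles = []
        for author in book.authors:
            autor_roles.append({"Name": author.name, "Role": "Author"})
            author_list.append(author.name)
        return {"ContributorRoles": autor_roles, "Contributors": author_list}
    return {"ContributorRoles": [{"Name": book.authors[0].name, "Role": "Author"}],
            "Contributors": book.authors[0].name}
```

Note the asymmetry the Kobo API apparently expects: a single author yields a plain string under `"Contributors"`, multiple authors yield a list.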
@@ -357,7 +363,7 @@ def get_metadata(book):
     book_uuid = book.uuid
     metadata = {
         "Categories": ["00000000-0000-0000-0000-000000000001",],
-        "Contributors": get_author(book),
+        # "Contributors": get_author(book),
         "CoverImageId": book_uuid,
         "CrossRevisionId": book_uuid,
         "CurrentDisplayPrice": {"CurrencyCode": "USD", "TotalAmount": 0},
@@ -381,6 +387,7 @@ def get_metadata(book):
         "Title": book.title,
         "WorkId": book_uuid,
     }
+    metadata.update(get_author(book))
 
     if get_series(book):
         if sys.version_info < (3, 0):
@@ -399,7 +406,7 @@ def get_metadata(book):
 
 
 @kobo.route("/v1/library/tags", methods=["POST", "DELETE"])
-@login_required
+@requires_kobo_auth
 # Creates a Shelf with the given items, and returns the shelf's uuid.
 def HandleTagCreate():
     # catch delete requests, otherwise the are handeld in the book delete handler
@@ -434,6 +441,7 @@ def HandleTagCreate():
 
 
 @kobo.route("/v1/library/tags/<tag_id>", methods=["DELETE", "PUT"])
+@requires_kobo_auth
 def HandleTagUpdate(tag_id):
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.uuid == tag_id,
                                               ub.Shelf.user_id == current_user.id).one_or_none()
|
||||||
|
|
||||||
|
|
||||||
@kobo.route("/v1/library/tags/<tag_id>/items", methods=["POST"])
|
@kobo.route("/v1/library/tags/<tag_id>/items", methods=["POST"])
|
||||||
@login_required
|
@requires_kobo_auth
|
||||||
def HandleTagAddItem(tag_id):
|
def HandleTagAddItem(tag_id):
|
||||||
items = None
|
items = None
|
||||||
try:
|
try:
|
||||||
|
@ -518,7 +526,7 @@ def HandleTagAddItem(tag_id):
|
||||||
|
|
||||||
|
|
||||||
@kobo.route("/v1/library/tags/<tag_id>/items/delete", methods=["POST"])
|
@kobo.route("/v1/library/tags/<tag_id>/items/delete", methods=["POST"])
|
||||||
@login_required
|
@requires_kobo_auth
|
||||||
def HandleTagRemoveItem(tag_id):
|
def HandleTagRemoveItem(tag_id):
|
||||||
items = None
|
items = None
|
||||||
try:
|
try:
|
||||||
|
@ -627,7 +635,7 @@ def create_kobo_tag(shelf):
|
||||||
|
|
||||||
|
|
||||||
@kobo.route("/v1/library/<book_uuid>/state", methods=["GET", "PUT"])
|
@kobo.route("/v1/library/<book_uuid>/state", methods=["GET", "PUT"])
|
||||||
@login_required
|
@requires_kobo_auth
|
||||||
def HandleStateRequest(book_uuid):
|
def HandleStateRequest(book_uuid):
|
||||||
book = calibre_db.get_book_by_uuid(book_uuid)
|
book = calibre_db.get_book_by_uuid(book_uuid)
|
||||||
if not book or not book.data:
|
if not book or not book.data:
|
||||||
|
@ -801,7 +809,7 @@ def TopLevelEndpoint():
|
||||||
|
|
||||||
|
|
||||||
@kobo.route("/v1/library/<book_uuid>", methods=["DELETE"])
|
@kobo.route("/v1/library/<book_uuid>", methods=["DELETE"])
|
||||||
@login_required
|
@requires_kobo_auth
|
||||||
def HandleBookDeletionRequest(book_uuid):
|
def HandleBookDeletionRequest(book_uuid):
|
||||||
log.info("Kobo book deletion request received for book %s" % book_uuid)
|
log.info("Kobo book deletion request received for book %s" % book_uuid)
|
||||||
book = calibre_db.get_book_by_uuid(book_uuid)
|
book = calibre_db.get_book_by_uuid(book_uuid)
|
||||||
|
@ -917,7 +925,7 @@ def HandleInitRequest():
|
||||||
kobo_resources = NATIVE_KOBO_RESOURCES()
|
kobo_resources = NATIVE_KOBO_RESOURCES()
|
||||||
|
|
||||||
if not current_app.wsgi_app.is_proxied:
|
if not current_app.wsgi_app.is_proxied:
|
||||||
log.debug('Kobo: Received unproxied request, changed request port to server port')
|
log.debug('Kobo: Received unproxied request, changed request port to external server port')
|
||||||
if ':' in request.host and not request.host.endswith(']'):
|
if ':' in request.host and not request.host.endswith(']'):
|
||||||
host = "".join(request.host.split(':')[:-1])
|
host = "".join(request.host.split(':')[:-1])
|
||||||
else:
|
else:
|
||||||
|
@@ -925,8 +933,9 @@ def HandleInitRequest():
         calibre_web_url = "{url_scheme}://{url_base}:{url_port}".format(
             url_scheme=request.scheme,
             url_base=host,
-            url_port=config.config_port
+            url_port=config.config_external_port
         )
+        log.debug('Kobo: Received unproxied request, changed request url to %s', calibre_web_url)
         kobo_resources["image_host"] = calibre_web_url
         kobo_resources["image_url_quality_template"] = unquote(calibre_web_url +
                                                                url_for("kobo.HandleCoverImageRequest",
@@ -935,16 +944,14 @@ def HandleInitRequest():
                                                                        width="{width}",
                                                                        height="{height}",
                                                                        Quality='{Quality}',
-                                                                       isGreyscale='isGreyscale'
-                                                                       ))
+                                                                       isGreyscale='isGreyscale'))
         kobo_resources["image_url_template"] = unquote(calibre_web_url +
                                                        url_for("kobo.HandleCoverImageRequest",
                                                                auth_token=kobo_auth.get_auth_token(),
                                                                book_uuid="{ImageId}",
                                                                width="{width}",
                                                                height="{height}",
-                                                               isGreyscale='false'
-                                                               ))
+                                                               isGreyscale='false'))
     else:
         kobo_resources["image_host"] = url_for("web.index", _external=True).strip("/")
         kobo_resources["image_url_quality_template"] = unquote(url_for("kobo.HandleCoverImageRequest",
@ -963,7 +970,6 @@ def HandleInitRequest():
|
||||||
isGreyscale='false',
|
isGreyscale='false',
|
||||||
_external=True))
|
_external=True))
|
||||||
|
|
||||||
|
|
||||||
response = make_response(jsonify({"Resources": kobo_resources}))
|
response = make_response(jsonify({"Resources": kobo_resources}))
|
||||||
response.headers["x-kobo-apitoken"] = "e30="
|
response.headers["x-kobo-apitoken"] = "e30="
|
||||||
|
|
||||||
|
|
|
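The port-rewriting logic in the hunk above can be sketched standalone; `build_server_url` is a hypothetical helper name, and the host/scheme values come from Flask's `request` object in the real code:

```python
def build_server_url(host_header, scheme, external_port):
    """Rebuild the public URL for an unproxied Kobo request."""
    # Strip a trailing :port from the Host header; the endswith(']') check
    # leaves a bare IPv6 literal such as "[::1]" untouched. As in the diff,
    # this assumes at most one colon (i.e. no IPv6 address carrying a port).
    if ':' in host_header and not host_header.endswith(']'):
        host = "".join(host_header.split(':')[:-1])
    else:
        host = host_header
    return "{url_scheme}://{url_base}:{url_port}".format(
        url_scheme=scheme, url_base=host, url_port=external_port)

print(build_server_url("calibre.local:8083", "http", 443))  # http://calibre.local:443
```

With `config_external_port` distinct from `config_port`, a Kobo device behind port-forwarding receives URLs it can actually reach, instead of the internal listen port.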
@@ -126,11 +126,11 @@ def setup(log_file, log_level=None):
         file_handler.baseFilename = log_file
     else:
         try:
-            file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
+            file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
         except IOError:
             if log_file == DEFAULT_LOG_FILE:
                 raise
-            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2)
+            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2, encoding='utf-8')
             log_file = ""
     file_handler.setFormatter(FORMATTER)
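The effect of the added `encoding='utf-8'` can be demonstrated in isolation; the path and logger name below are illustrative:

```python
import logging
import os
import tempfile
from logging.handlers import RotatingFileHandler

def make_rotating_handler(path):
    # Same arguments as in the diff above; without encoding='utf-8', a
    # non-UTF-8 system locale can raise UnicodeEncodeError on book titles.
    handler = RotatingFileHandler(path, maxBytes=50000, backupCount=2, encoding='utf-8')
    handler.setFormatter(logging.Formatter('[%(asctime)s] %(levelname)s: %(message)s'))
    return handler

path = os.path.join(tempfile.mkdtemp(), 'calibre-web.log')
log = logging.getLogger('encoding-demo')
log.addHandler(make_rotating_handler(path))
log.warning(u'Added book: Le Comte de Monte-Cristo \u2013 caf\u00e9')
logging.shutdown()  # flush the handler before reading the file back
with open(path, encoding='utf-8') as fh:
    print(fh.read())
```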
@@ -152,11 +152,11 @@ def create_access_log(log_file, log_name, formatter):
     access_log.propagate = False
     access_log.setLevel(logging.INFO)
     try:
-        file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
+        file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
     except IOError:
         if log_file == DEFAULT_ACCESS_LOG:
             raise
-        file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2)
+        file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2, encoding='utf-8')
         log_file = ""

     file_handler.setFormatter(formatter)
@@ -122,10 +122,10 @@ if ub.oauth_support:
         ele2 = dict(provider_name='google',
                     id=oauth_ids[1].id,
                     active=oauth_ids[1].active,
-                    scope=["https://www.googleapis.com/auth/plus.me", "https://www.googleapis.com/auth/userinfo.email"],
+                    scope=["https://www.googleapis.com/auth/userinfo.email"],
                     oauth_client_id=oauth_ids[1].oauth_client_id,
                     oauth_client_secret=oauth_ids[1].oauth_client_secret,
-                    obtain_link='https://github.com/settings/developers')
+                    obtain_link='https://console.developers.google.com/apis/credentials')
         oauthblueprints.append(ele1)
         oauthblueprints.append(ele2)
@@ -287,7 +287,7 @@ if ub.oauth_support:
             flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
         except NoResultFound:
             log.warning("oauth %s for user %d not found", provider, current_user.id)
-            flash(_(u"Not Linked to %(oauth)s.", oauth=oauth_check[provider]), category="error")
+            flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
         return redirect(url_for('web.profile'))
@@ -355,4 +355,4 @@ if ub.oauth_support:
     @oauth.route('/unlink/google', methods=["GET"])
     @login_required
     def google_login_unlink():
-        return unlink_oauth(oauthblueprints[1]['blueprint'].name)
+        return unlink_oauth(oauthblueprints[1]['id'])
@@ -77,6 +77,7 @@ class ReverseProxied(object):
         servr = environ.get('HTTP_X_FORWARDED_HOST', '')
         if servr:
             environ['HTTP_HOST'] = servr
+            self.proxied = True
         return self.app(environ, start_response)

     @property
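A minimal standalone sketch of the middleware change: the new `self.proxied = True` line is what lets `is_proxied` (queried by the Kobo code earlier in this commit) report whether any request arrived through a reverse proxy. The toy WSGI app below is illustrative:

```python
class ReverseProxied(object):
    def __init__(self, app):
        self.app = app
        self.proxied = False

    def __call__(self, environ, start_response):
        servr = environ.get('HTTP_X_FORWARDED_HOST', '')
        if servr:
            environ['HTTP_HOST'] = servr
            self.proxied = True  # the line added by the diff; sticky once set
        return self.app(environ, start_response)

    @property
    def is_proxied(self):
        return self.proxied

def app(environ, start_response):
    # Toy WSGI app that just echoes the effective Host header.
    return [environ.get('HTTP_HOST', '').encode()]

wrapped = ReverseProxied(app)
body = wrapped({'HTTP_X_FORWARDED_HOST': 'books.example.org'}, None)
print(body, wrapped.is_proxied)
```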
@@ -27,6 +27,8 @@ try:
     from gevent.pywsgi import WSGIServer
     from gevent.pool import Pool
     from gevent import __version__ as _version
+    from greenlet import GreenletExit
+    import ssl
     VERSION = 'Gevent ' + _version
     _GEVENT = True
 except ImportError:
@@ -143,6 +145,16 @@ class WebServer(object):
             output = _readable_listen_address(self.listen_address, self.listen_port)
             log.info('Starting Gevent server on %s', output)
             self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, spawn=Pool(), **ssl_args)
+            if ssl_args:
+                wrap_socket = self.wsgiserver.wrap_socket
+
+                def my_wrap_socket(*args, **kwargs):
+                    try:
+                        return wrap_socket(*args, **kwargs)
+                    except (ssl.SSLError) as ex:
+                        log.warning('Gevent SSL Error: %s', ex)
+                        raise GreenletExit
+
+                self.wsgiserver.wrap_socket = my_wrap_socket
             self.wsgiserver.serve_forever()
         finally:
             if self.unix_socket_file:
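The handshake guard can be sketched without gevent installed; `HandshakeAbort` stands in for `GreenletExit`, and `flaky_wrap_socket` simulates a client that fails the TLS handshake:

```python
import ssl

class HandshakeAbort(Exception):
    """Stand-in for gevent's GreenletExit in this sketch."""

def guard_wrap_socket(wrap_socket, log_warning):
    # Same shape as my_wrap_socket in the diff: an SSL failure on one
    # connection is logged and unwinds only that connection's greenlet,
    # instead of crashing the server's accept loop.
    def guarded(*args, **kwargs):
        try:
            return wrap_socket(*args, **kwargs)
        except ssl.SSLError as ex:
            log_warning('Gevent SSL Error: %s' % ex)
            raise HandshakeAbort()
    return guarded

def flaky_wrap_socket(sock):
    raise ssl.SSLError('handshake failure')

messages = []
wrapped = guard_wrap_socket(flaky_wrap_socket, messages.append)
try:
    wrapped(None)
except HandshakeAbort:
    print(messages)
```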
@@ -194,7 +206,7 @@ class WebServer(object):
         os.execv(sys.executable, arguments)
         return True

-    def _killServer(self, ignored_signum, ignored_frame):
+    def _killServer(self, __, ___):
         self.stop()

     def stop(self, restart=False):
@@ -27,7 +27,10 @@ except ImportError:
     from urllib.parse import unquote

 from flask import json
-from .. import logger as log
+from .. import logger
+
+
+log = logger.create()


 def b64encode_json(json_data):
@@ -45,7 +48,8 @@ def to_epoch_timestamp(datetime_object):
 def get_datetime_from_json(json_object, field_name):
     try:
         return datetime.utcfromtimestamp(json_object[field_name])
-    except KeyError:
+    except (KeyError, OSError, OverflowError):
+        # OSError is thrown on Windows if timestamp is <1970 or >2038
         return datetime.min
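The widened except clause matters because `datetime.utcfromtimestamp` does not only fail on a missing key: on Windows, pre-1970 timestamps raise OSError, and values far out of range raise OverflowError. A self-contained version of the function:

```python
from datetime import datetime

def get_datetime_from_json(json_object, field_name):
    try:
        return datetime.utcfromtimestamp(json_object[field_name])
    except (KeyError, OSError, OverflowError):
        # OSError is thrown on Windows if the timestamp is <1970 or >2038;
        # fall back to the oldest representable value in every failure case.
        return datetime.min

print(get_datetime_from_json({'ts': 0}, 'ts'))  # the Unix epoch
print(get_datetime_from_json({}, 'ts'))         # missing key -> datetime.min
```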
@@ -64,9 +64,11 @@ def init_app(app, config):
     app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap)
     app.config['LDAP_GROUP_OBJECT_FILTER'] = config.config_ldap_group_object_filter
     app.config['LDAP_GROUP_MEMBERS_FIELD'] = config.config_ldap_group_members_field
-    # app.config['LDAP_CUSTOM_OPTIONS'] = {'OPT_NETWORK_TIMEOUT': 10}

-    _ldap.init_app(app)
+    try:
+        _ldap.init_app(app)
+    except RuntimeError as e:
+        log.error(e)


 def get_object_details(user=None, group=None, query_filter=None, dn_only=False):
49
cps/shelf.py
@@ -27,9 +27,10 @@ from flask import Blueprint, request, flash, redirect, url_for
 from flask_babel import gettext as _
 from flask_login import login_required, current_user
 from sqlalchemy.sql.expression import func
+from sqlalchemy.exc import OperationalError, InvalidRequestError

-from . import logger, ub, searched_ids, db, calibre_db
-from .web import render_title_template
+from . import logger, ub, searched_ids, calibre_db
+from .web import login_required_if_no_ano, render_title_template


 shelf = Blueprint('shelf', __name__)
@@ -91,8 +92,16 @@ def add_to_shelf(shelf_id, book_id):

     shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book_id, order=maxOrder + 1))
     shelf.last_modified = datetime.utcnow()
-    ub.session.merge(shelf)
-    ub.session.commit()
+    try:
+        ub.session.merge(shelf)
+        ub.session.commit()
+    except (OperationalError, InvalidRequestError):
+        ub.session.rollback()
+        flash(_(u"Settings DB is not Writeable"), category="error")
+        if "HTTP_REFERER" in request.environ:
+            return redirect(request.environ["HTTP_REFERER"])
+        else:
+            return redirect(url_for('web.index'))
     if not xhr:
         flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
         if "HTTP_REFERER" in request.environ:
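The commit/rollback pattern introduced throughout this file can be sketched with stdlib sqlite3 standing in for the SQLAlchemy session (`OperationalError` exists in both; the table, helper name, and message below mirror the diff but are illustrative):

```python
import sqlite3

def add_book_to_shelf(conn, shelf_id, book_id):
    # Same shape as the diff: try to commit, and on a database error roll
    # back and return a user-facing message instead of raising a 500.
    try:
        conn.execute("INSERT INTO book_shelf (shelf, book_id) VALUES (?, ?)",
                     (shelf_id, book_id))
        conn.commit()
        return None
    except sqlite3.OperationalError:
        conn.rollback()
        return "Settings DB is not Writeable"

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE book_shelf (shelf INTEGER, book_id INTEGER)")
print(add_book_to_shelf(conn, 1, 42))  # None: the insert committed
conn.execute("DROP TABLE book_shelf")  # simulate an unusable settings DB
print(add_book_to_shelf(conn, 1, 43))  # the flashed error message
```

The rollback keeps the session usable for the next request, which is why the diff adds it even in the generic `except Exception` branches.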
@@ -143,9 +152,13 @@ def search_to_shelf(shelf_id):
             maxOrder = maxOrder + 1
             shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book, order=maxOrder))
         shelf.last_modified = datetime.utcnow()
-        ub.session.merge(shelf)
-        ub.session.commit()
-        flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
+        try:
+            ub.session.merge(shelf)
+            ub.session.commit()
+            flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
+        except (OperationalError, InvalidRequestError):
+            ub.session.rollback()
+            flash(_(u"Settings DB is not Writeable"), category="error")
     else:
         flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
     return redirect(url_for('web.index'))
@@ -180,10 +193,17 @@ def remove_from_shelf(shelf_id, book_id):
                 return redirect(url_for('web.index'))
             return "Book already removed from shelf", 410

-        ub.session.delete(book_shelf)
-        shelf.last_modified = datetime.utcnow()
-        ub.session.commit()
+        try:
+            ub.session.delete(book_shelf)
+            shelf.last_modified = datetime.utcnow()
+            ub.session.commit()
+        except (OperationalError, InvalidRequestError):
+            ub.session.rollback()
+            flash(_(u"Settings DB is not Writeable"), category="error")
+            if "HTTP_REFERER" in request.environ:
+                return redirect(request.environ["HTTP_REFERER"])
+            else:
+                return redirect(url_for('web.index'))
         if not xhr:
             flash(_(u"Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success")
             if "HTTP_REFERER" in request.environ:
@@ -235,7 +255,11 @@ def create_shelf():
             ub.session.commit()
             flash(_(u"Shelf %(title)s created", title=to_save["title"]), category="success")
             return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
+        except (OperationalError, InvalidRequestError):
+            ub.session.rollback()
+            flash(_(u"Settings DB is not Writeable"), category="error")
         except Exception:
+            ub.session.rollback()
             flash(_(u"There was an error"), category="error")
         return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
     else:
@@ -280,7 +304,11 @@ def edit_shelf(shelf_id):
         try:
             ub.session.commit()
             flash(_(u"Shelf %(title)s changed", title=to_save["title"]), category="success")
+        except (OperationalError, InvalidRequestError):
+            ub.session.rollback()
+            flash(_(u"Settings DB is not Writeable"), category="error")
         except Exception:
+            ub.session.rollback()
             flash(_(u"There was an error"), category="error")
         return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
     else:
@@ -298,17 +326,22 @@ def delete_shelf_helper(cur_shelf):
     log.info("successfully deleted %s", cur_shelf)


 @shelf.route("/shelf/delete/<int:shelf_id>")
 @login_required
 def delete_shelf(shelf_id):
     cur_shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
-    delete_shelf_helper(cur_shelf)
+    try:
+        delete_shelf_helper(cur_shelf)
+    except (OperationalError, InvalidRequestError):
+        ub.session.rollback()
+        flash(_(u"Settings DB is not Writeable"), category="error")
     return redirect(url_for('web.index'))


 @shelf.route("/shelf/<int:shelf_id>", defaults={'shelf_type': 1})
 @shelf.route("/shelf/<int:shelf_id>/<int:shelf_type>")
-@login_required
+@login_required_if_no_ano
 def show_shelf(shelf_type, shelf_id):
     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
@@ -327,8 +360,12 @@ def show_shelf(shelf_type, shelf_id):
             cur_book = calibre_db.get_book(book.book_id)
             if not cur_book:
                 log.info('Not existing book %s in %s deleted', book.book_id, shelf)
-                ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
-                ub.session.commit()
+                try:
+                    ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
+                    ub.session.commit()
+                except (OperationalError, InvalidRequestError):
+                    ub.session.rollback()
+                    flash(_(u"Settings DB is not Writeable"), category="error")
         return render_title_template(page, entries=result, title=_(u"Shelf: '%(name)s'", name=shelf.name),
                                      shelf=shelf, page="shelf")
     else:
@@ -348,7 +385,11 @@ def order_shelf(shelf_id):
             setattr(book, 'order', to_save[str(book.book_id)])
             counter += 1
             # if order diffrent from before -> shelf.last_modified = datetime.utcnow()
-        ub.session.commit()
+        try:
+            ub.session.commit()
+        except (OperationalError, InvalidRequestError):
+            ub.session.rollback()
+            flash(_(u"Settings DB is not Writeable"), category="error")

     shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
     result = list()
1
cps/static/css/caliBlur.min.css
vendored
@@ -2949,7 +2949,6 @@ body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-
 #bookDetailsModal > .modal-dialog.modal-lg > .modal-content > .modal-body > div > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover, body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover {
     margin: 0;
     width: 100%;
-    height: 100%
 }

 #bookDetailsModal > .modal-dialog.modal-lg > .modal-content > .modal-body > div > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover > img, body > div.container-fluid > div > div.col-sm-10 > div > div > div.col-sm-3.col-lg-3.col-xs-5 > div.cover > img {
11
cps/static/css/libs/bootstrap-table.min.css
vendored
File diff suppressed because one or more lines are too long
@@ -74,8 +74,8 @@ $(function () {
         $("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>");
     }
     if ((ggDone === 3 || (ggDone === 1 && ggResults.length === 0)) &&
-        (dbDone === 3 || (ggDone === 1 && dbResults.length === 0)) &&
-        (cvDone === 3 || (ggDone === 1 && cvResults.length === 0))) {
+        (dbDone === 3 || (dbDone === 1 && dbResults.length === 0)) &&
+        (cvDone === 3 || (cvDone === 1 && cvResults.length === 0))) {
         $("#meta-info").html("<p class=\"text-danger\">" + msg.no_result + "</p>");
         return;
     }
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fi-FI.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fr-CH.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-fr-LU.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-nl-BE.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-sr-Cyrl-RS.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
10
cps/static/js/libs/bootstrap-table/locale/bootstrap-table-sr-Latn-RS.min.js
vendored
Normal file
File diff suppressed because one or more lines are too long
@@ -88,6 +88,12 @@
       <div class="col-xs-6 col-sm-6">{{_('Port')}}</div>
       <div class="col-xs-6 col-sm-6">{{config.config_port}}</div>
     </div>
+    {% if feature_support['kobo'] and config.config_port != config.config_external_port %}
+    <div class="row">
+      <div class="col-xs-6 col-sm-6">{{_('External Port')}}</div>
+      <div class="col-xs-6 col-sm-6">{{config.config_external_port}}</div>
+    </div>
+    {% endif %}
   </div>
   <div class="col-xs-12 col-sm-6">
     <div class="row">
@@ -92,6 +92,8 @@
           <label for="rating">{{_('Rating')}}</label>
           <input type="number" name="rating" id="rating" class="rating input-lg" data-clearable="" value="{% if book.ratings %}{{(book.ratings[0].rating / 2)|int}}{% endif %}">
         </div>
+        {% if g.user.role_upload() or g.user.role_admin()%}
+        {% if g.allow_upload %}
         <div class="form-group">
           <label for="cover_url">{{_('Fetch Cover from URL (JPEG - Image will be downloaded and stored in database)')}}</label>
           <input type="text" class="form-control" name="cover_url" id="cover_url" value="">
@@ -101,6 +103,8 @@
           <div class="upload-cover-input-text" id="upload-cover"></div>
           <input id="btn-upload-cover" name="btn-upload-cover" type="file" accept=".jpg, .jpeg, .png, .webp">
         </div>
+        {% endif %}
+        {% endif %}
         <div class="form-group">
           <label for="pubdate">{{_('Published Date')}}</label>
           <div style="position: relative">
@@ -132,19 +136,20 @@
               <input type="number" step="{% if c.datatype == 'float' %}0.01{% else %}1{% endif %}" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}" value="{% if book['custom_column_' ~ c.id]|length > 0 %}{{ book['custom_column_' ~ c.id][0].value }}{% endif %}">
             {% endif %}

-            {% if c.datatype in ['text', 'series'] and not c.is_multiple %}
-              <input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
-                {% if book['custom_column_' ~ c.id]|length > 0 %}
-                value="{{ book['custom_column_' ~ c.id][0].value }}"
-                {% endif %}>
-            {% endif %}
-
-            {% if c.datatype in ['text', 'series'] and c.is_multiple %}
+            {% if c.datatype == 'text' %}
               <input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
                 {% if book['custom_column_' ~ c.id]|length > 0 %}
                 value="{% for column in book['custom_column_' ~ c.id] %}{{ column.value.strip() }}{% if not loop.last %}, {% endif %}{% endfor %}"{% endif %}>
             {% endif %}

+            {% if c.datatype == 'series' %}
+              <input type="text" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
+                {% if book['custom_column_' ~ c.id]|length > 0 %}
+                value="{% for column in book['custom_column_' ~ c.id] %} {{ '%s [%s]' % (book['custom_column_' ~ c.id][0].value, book['custom_column_' ~ c.id][0].extra|formatfloat(2)) }}{% if not loop.last %}, {% endif %}{% endfor %}"
+                {% endif %}>
+            {% endif %}
+
             {% if c.datatype == 'enumeration' %}
               <select class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
                 <option></option>
@@ -159,9 +164,9 @@
             {% endif %}

             {% if c.datatype == 'rating' %}
-              <input type="number" min="1" max="5" step="1" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
+              <input type="number" min="1" max="5" step="0.5" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}"
                 {% if book['custom_column_' ~ c.id]|length > 0 %}
-                value="{{ '%d' % (book['custom_column_' ~ c.id][0].value / 2) }}"
+                value="{{ '%.1f' % (book['custom_column_' ~ c.id][0].value / 2) }}"
                 {% endif %}>
             {% endif %}
           </div>
@@ -30,20 +30,20 @@
     <div data-related="gdrive_settings">
       {% if gdriveError %}
       <div class="form-group">
-        <label>
+        <label id="gdrive_error">
           {{_('Google Drive config problem')}}: {{ gdriveError }}
         </label>
       </div>
       {% else %}
        {% if show_authenticate_google_drive and g.user.is_authenticated and config.config_use_google_drive %}
        <div class="form-group required">
-          <a href="{{ url_for('gdrive.authenticate_google_drive') }}" class="btn btn-primary">{{_('Authenticate Google Drive')}}</a>
+          <a href="{{ url_for('gdrive.authenticate_google_drive') }}" id="gdrive_auth" class="btn btn-primary">{{_('Authenticate Google Drive')}}</a>
        </div>
        {% else %}
          {% if show_authenticate_google_drive and g.user.is_authenticated and not config.config_use_google_drive %}
-          <div >{{_('Please hit submit to continue with setup')}}</div>
+          <div >{{_('Please hit save to continue with setup')}}</div>
          {% endif %}
-          {% if not g.user.is_authenticated %}
+          {% if not g.user.is_authenticated and show_login_button %}
          <div >{{_('Please finish Google Drive setup after login')}}</div>
          {% endif %}
          {% if g.user.is_authenticated %}
@@ -194,10 +194,14 @@
                 <label for="config_kobo_sync">{{_('Enable Kobo sync')}}</label>
               </div>
               <div data-related="kobo-settings">
-                <div class="form-group" style="text-indent:10px;">
+                <div class="form-group" style="margin-left:10px;">
                   <input type="checkbox" id="config_kobo_proxy" name="config_kobo_proxy" {% if config.config_kobo_proxy %}checked{% endif %}>
                   <label for="config_kobo_proxy">{{_('Proxy unknown requests to Kobo Store')}}</label>
                 </div>
+                <div class="form-group" style="margin-left:10px;">
+                  <label for="config_external_port">{{_('Server External Port (for port forwarded API calls)')}}</label>
+                  <input type="number" min="1" max="65535" class="form-control" name="config_external_port" id="config_external_port" value="{% if config.config_external_port != None %}{{ config.config_external_port }}{% endif %}" autocomplete="off" required>
+                </div>
               </div>
             {% endif %}
             {% if feature_support['goodreads'] %}
@@ -81,7 +81,7 @@
             {% for format in entry.data %}
               {% if format.format|lower in audioentries %}
-                <li><a target="_blank" href="{{ url_for('web.read_book', book_id=entry.id, book_format=format.format|lower) }}">{{format.format}}</a></li>
+                <li><a target="_blank" href="{{ url_for('web.read_book', book_id=entry.id, book_format=format.format|lower) }}">{{format.format|lower }}</a></li>
               {% endif %}
             {% endfor %}
           </ul>
@@ -174,7 +174,7 @@
             {{ c.name }}:
             {% for column in entry['custom_column_' ~ c.id] %}
               {% if c.datatype == 'rating' %}
-                {{ '%d' % (column.value / 2) }}
+                {{ (column.value / 2)|formatfloat }}
               {% else %}
                 {% if c.datatype == 'bool' %}
                   {% if column.value == true %}
@@ -182,10 +182,18 @@
                   {% else %}
                     <span class="glyphicon glyphicon-remove"></span>
                   {% endif %}
+                {% else %}
+                  {% if c.datatype == 'float' %}
+                    {{ column.value|formatfloat(2) }}
+                  {% else %}
+                    {% if c.datatype == 'series' %}
+                      {{ '%s [%s]' % (column.value, column.extra|formatfloat(2)) }}
                 {% else %}
                   {{ column.value }}
                 {% endif %}
               {% endif %}
+                  {% endif %}
+                {% endif %}
             {% endfor %}
           {% endif %}
         </div>
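The display hunk replaces the integer `'%d'` format with the project's `formatfloat` Jinja filter, so half-star ratings and float columns render with their fractional part. A sketch of what such a filter typically does (the real calibre-web implementation may differ in details):

```python
def formatfloat(value, decimals=1):
    """Render a float with the requested precision, then trim
    trailing zeros and a dangling decimal point:
    3.50 -> "3.5", 4.0 -> "4"."""
    text = "%.*f" % (decimals, value)
    return text.rstrip("0").rstrip(".")

print(formatfloat(3.5))      # → 3.5
print(formatfloat(4.0))      # → 4
print(formatfloat(2.25, 2))  # → 2.25
```

Registered on the Jinja environment as a filter, this lets templates write `{{ (column.value / 2)|formatfloat }}` as seen above.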
@@ -37,7 +37,7 @@
       </div>
       <label for="mail_size">{{_('Attachment Size Limit')}}</label>
       <div class="form-group input-group">
-        <input type="number" min="1" max="600" step="1" class="form-control" name="mail_size" id="mail_size" value="{% if content.mail_size != None %}{{ (content.mail_size / 1024 / 1024)|int }}{% endif %}">
+        <input type="number" min="1" max="600" step="1" class="form-control" name="mail_size" id="mail_size" value="{% if content.mail_size != None %}{{ (content.mail_size / 1024 / 1024)|int }}{% endif %}" required>
        <span class="input-group-btn">
          <button type="button" id="attachement_size" class="btn btn-default" disabled>MB</button>
        </span>
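The attachment-size field displays megabytes while the database stores bytes; the template divides by 1024 twice on the way out, and the save handler must multiply on the way back in. A sketch of the round trip (function names are illustrative, not the actual view code):

```python
def bytes_to_mb(size_bytes: int) -> int:
    # What the template's (mail_size / 1024 / 1024)|int computes.
    return size_bytes // (1024 * 1024)

def mb_to_bytes(size_mb: int) -> int:
    # The inverse conversion the save handler needs to apply.
    return size_mb * 1024 * 1024

print(bytes_to_mb(26214400))  # → 25
print(mb_to_bytes(25))        # → 26214400
```

Adding `required` prevents the form from submitting an empty value that would skip this conversion entirely.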
@@ -5,7 +5,7 @@
     {{_('Open the .kobo/Kobo eReader.conf file in a text editor and add (or edit):')}}</a>
   </p>
   <p>
-    {% if not warning %}'api_endpoint='{{kobo_auth_url}}{% else %}{{warning}}{% endif %}</a>
+    {% if not warning %}api_endpoint={{kobo_auth_url}}{% else %}{{warning}}{% endif %}</a>
   </p>
   <p>
 </div>
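This fix drops the stray quotes around `api_endpoint=` so the line users copy into `Kobo eReader.conf` is valid as-is. The edited section on the device would look roughly like this (the section name, host, and token are illustrative of a typical calibre-web setup, not taken from this diff):

```ini
[OneStoreServices]
api_endpoint=http://192.168.0.10:8083/kobo/<your_auth_token>
```

With the quotes included, the Kobo firmware would read the literal quoted string and the sync endpoint would never resolve.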
@@ -64,7 +64,7 @@
       <form id="form-upload" class="navbar-form" action="{{ url_for('editbook.upload') }}" method="post" enctype="multipart/form-data">
         <div class="form-group">
           <span class="btn btn-default btn-file">{{_('Upload')}}<input id="btn-upload" name="btn-upload"
-            type="file" accept="{% for format in accept %}.{{format}}{{ ',' if not loop.last }}{% endfor %}" multiple></span>
+            type="file" accept="{% for format in accept %}.{% if format != ''%}{{format}}{% else %}*{% endif %}{{ ',' if not loop.last }}{% endfor %}" multiple></span>
         </div>
       </form>
     </li>
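The revised `accept` loop turns each non-empty format into a `.ext` entry and substitutes `*` for an empty entry (the literal `.` sits outside the `{% if %}`, so an empty format yields `.*`). A hypothetical Python equivalent of the template expression:

```python
def build_accept(formats):
    # "." + name for a real extension; the empty-string entry
    # (any format allowed) becomes ".*" because the dot is
    # emitted before the if-branch in the template.
    return ",".join("." + (f if f != "" else "*") for f in formats)

print(build_accept(["epub", "mobi", "pdf"]))  # → .epub,.mobi,.pdf
print(build_accept([""]))                     # → .*
```

Before this change, an empty format produced a bare `.` in the accept list, which filtered out every file in the browser's picker.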
@@ -165,7 +165,7 @@
             {% endif %}

             {% if c.datatype == 'rating' %}
-              <input type="number" min="1" max="5" step="1" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
+              <input type="number" min="1" max="5" step="0.5" class="form-control" name="{{ 'custom_column_' ~ c.id }}" id="{{ 'custom_column_' ~ c.id }}">
             {% endif %}
           </div>
         {% endfor %}
@@ -17,7 +17,7 @@
           {{entry['series_index']}} - {{entry['series'][0].name}}
         {% endif %}
         <br>
-        {% for author in entry['authors'] %}
+        {% for author in entry['author'] %}
           {{author.name.replace('|',',')}}
           {% if not loop.last %}
             &
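Besides fixing the attribute name (`entry['author']`), this loop shows the pipe convention: Calibre stores a comma inside an author name as `|` in the database, and the template swaps it back for display. A small sketch of that substitution:

```python
def display_author(db_name: str) -> str:
    """Restore commas that Calibre encodes as '|' in stored
    author names, e.g. "Doe| John" -> "Doe, John"."""
    return db_name.replace("|", ",")

print(display_author("Doe| John"))  # → Doe, John
print(display_author("John Doe"))   # → John Doe
```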
@@ -25,6 +25,7 @@
       </tr>
     </tbody>
   </table>
+  {% if g.user.role_admin() %}
   <h3>{{_('Linked Libraries')}}</h3>
   <table id="libs" class="table">
     <thead>
@@ -44,4 +45,5 @@
     {% endfor %}
     </tbody>
   </table>
+  {% endif %}
 {% endblock %}
Binary file not shown.
File diff suppressed because it is too large.
Some files were not shown because too many files have changed in this diff.