Merge pull request #21 from janeczku/master

merge from janeczku/master
Ethan Lin 2020-12-29 10:31:41 +08:00 committed by GitHub
commit ee28e3346b
No known key found for this signature in database
GPG Key ID: 4AEE18F83AFDEB23
330 changed files with 80772 additions and 39219 deletions

39
.github/ISSUE_TEMPLATE/bug_report.md vendored Normal file

@@ -0,0 +1,39 @@
---
name: Bug/Problem report
about: Create a report to help us improve Calibre-Web
title: ''
labels: ''
assignees: ''
---
<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
**Describe the bug/problem**
A clear and concise description of what the bug is. If you are asking for support, please check our [Wiki](https://github.com/janeczku/calibre-web/wiki) first to see if your question is already answered there.
**To Reproduce**
Steps to reproduce the behavior:
1. Go to '...'
2. Click on '....'
3. Scroll down to '....'
4. See error
**Logfile**
Add the content of the calibre-web.log file or the relevant error message. Try to reproduce your problem with the "debug" log level to get more output.
**Expected behavior**
A clear and concise description of what you expected to happen.
**Screenshots**
If applicable, add screenshots to help explain your problem.
**Environment (please complete the following information):**
- OS: [e.g. Windows 10/Raspberry Pi OS]
- Python version: [e.g. python2.7]
- Calibre-Web version: [e.g. 0.6.8 or 087c4c59 (git rev-parse --short HEAD)]:
- Docker container: [None/Technosoft2000/Linuxuser]:
- Special Hardware: [e.g. Raspberry Pi Zero]
- Browser: [e.g. Chrome 83.0.4103.97, Safari 13.3.7, Firefox 68.0.1 ESR]
**Additional context**
Add any other context about the problem here. [e.g. access via reverse proxy, database background sync, special database location]


@@ -0,0 +1,22 @@
---
name: Feature request
about: Suggest an idea for Calibre-Web
title: ''
labels: ''
assignees: ''
---
<!-- Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md) -->
**Is your feature request related to a problem? Please describe.**
A clear and concise description of what the problem is. Ex. I'm always frustrated when [...]
**Describe the solution you'd like**
A clear and concise description of what you want to happen.
**Describe alternatives you've considered**
A clear and concise description of any alternative solutions or features you've considered.
**Additional context**
Add any other context or screenshots about the feature request here.

8
.gitignore vendored

@@ -7,9 +7,11 @@ __pycache__/
 # Distribution / packaging
 .Python
 env/
+venv/
 eggs/
 dist/
 build/
+vendor/
 .eggs/
 *.egg-info/
 .installed.cfg
@@ -19,16 +21,12 @@ build/
 # calibre-web
 *.db
 *.log
-config.ini
-cps/static/[0-9]*
 .idea/
 *.bak
 *.log.*
-tags
 settings.yaml
 gdrive_credentials
-vendor
 client_secrets.json

46
CONTRIBUTING.md Normal file

@@ -0,0 +1,46 @@
## How to contribute to Calibre-Web
First of all, we would like to thank you for reading this text. We are happy you are willing to contribute to Calibre-Web.
### **General**
**Communication language** is English. Google-translated texts are not as bad as you might think; they are usually understandable, so don't worry if you generate your post that way.
**Calibre-Web** is not **Calibre**. If you have a question regarding Calibre itself, please post it at their [repository](https://github.com/kovidgoyal/calibre).
**Docker containers** of Calibre-Web are maintained by people other than those who drive Calibre-Web. If our analysis concludes that the problem is related to Docker, we might ask you to open a new issue at the repository of the Docker container.
If you are having **Basic Installation Problems** with python or its dependencies, please consider using your favorite search engine to find a solution. In case you can't find a solution, we are happy to help you.
We can offer only very limited support regarding configuration of **Reverse-Proxy Installations**, **OPDS-Reader** or other programs in combination with Calibre-Web.
### **Translation**
Some of the user interface languages in Calibre-Web have missing translations. We are happy to add the missing texts if you translate them: create a Pull Request, create an issue with the .po file attached, or write an email to "ozzie.fernandez.isaacs@googlemail.com" with the translation file attached. An additional file (iso_language_names.py) is used to display all book languages in your native language. The content of this file is auto-generated from the corresponding Calibre translations; please do not edit it on your own.
### **Documentation**
The Calibre-Web documentation is hosted in the Github [Wiki](https://github.com/janeczku/calibre-web/wiki). The Wiki is open to everybody; if you find a problem, feel free to correct it. If information is missing, you are welcome to add it. The content will be reviewed from time to time. Please try to be consistent in form with the other Wiki pages (e.g. the project name is Calibre-Web, with two capital letters and a hyphen in between).
### **Reporting a bug**
Do not open up a GitHub issue if the bug is a **security vulnerability** in Calibre-Web. Instead, please write an email to "ozzie.fernandez.isaacs@googlemail.com".
Ensure the **bug was not already reported** by searching on GitHub under [Issues](https://github.com/janeczku/calibre-web/issues). Please also check whether a solution for your problem can be found in the [wiki](https://github.com/janeczku/calibre-web/wiki).
If you're unable to find an **open issue** addressing the problem, open a [new one](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=bug_report.md&title=). Be sure to include a **title** and a **clear description** with as much relevant information as possible; the **issue form** helps you provide the right information. Deleting the form and just pasting the stack trace doesn't speed up fixing the problem. Once your issue is resolved, please consider closing it.
### **Feature Request**
If there is a feature missing in Calibre-Web and you can't find a feature request in the [Issues](https://github.com/janeczku/calibre-web/issues) section, you could create a [feature request](https://github.com/janeczku/calibre-web/issues/new?assignees=&labels=&template=feature_request.md&title=).
We will not extend Calibre-Web with any more login methods, additional file storages, or file syncing abilities. Furthermore, Calibre-Web is made for home or company in-house usage, so requests regarding any sort of social interaction capability, payment routines, or search engine or web site analytics integration will not be implemented.
### **Contributing code to Calibre-Web**
Open a new GitHub pull request with the patch. Ensure the PR description clearly describes the problem and solution. Include the relevant issue number if applicable.
In case your code enhances features of Calibre-Web: create your pull request against the development branch if your enhancement consists of more than a few lines of code in a local section of Calibre-Web's code. This makes it easier to test it and check all implications before it's made public.
Please check that your code runs on Python 2.7 (still necessary in 2020) and, primarily, on Python 3. If possible, and if the feature is related to operating system functions, try to check it on Windows and Linux.
Calibre-Web is automatically tested on Linux in combination with python 3.7. The code for testing is in a [separate repo](https://github.com/OzzieIsaacs/calibre-web-test) on Github. It uses unit tests and performs real system tests with selenium; it would be great if you could consider also writing some tests.
A static code analysis is done by Codacy, but it's partly broken and doesn't run automatically. You could check your code with ESLint before contributing; a configuration file can be found in the project's root folder.


@@ -12,7 +12,7 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d
 - full graphical setup
 - User management with fine-grained per-user permissions
 - Admin interface
-- User Interface in dutch, english, french, german, hungarian, italian, japanese, khmer, polish, russian, simplified chinese, spanish, swedish, ukrainian
+- User Interface in czech, dutch, english, finnish, french, german, greek, hungarian, italian, japanese, khmer, polish, russian, simplified chinese, spanish, swedish, turkish, ukrainian
 - OPDS feed for eBook reader apps
 - Filter and search by titles, authors, tags, series and language
 - Create a custom book collection (shelves)
@@ -21,16 +21,18 @@ Calibre-Web is a web app providing a clean interface for browsing, reading and d
 - Restrict eBook download to logged-in users
 - Support for public user registration
 - Send eBooks to Kindle devices with the click of a button
+- Sync your Kobo devices through Calibre-Web with your Calibre library
 - Support for reading eBooks directly in the browser (.txt, .epub, .pdf, .cbr, .cbt, .cbz)
-- Upload new books in many formats
-- Support for Calibre custom columns
-- Ability to hide content based on categories for certain users
+- Upload new books in many formats, including audio formats (.mp3, .m4a, .m4b)
+- Support for Calibre Custom Columns
+- Ability to hide content based on categories and Custom Column content per user
 - Self-update capability
 - "Magic Link" login to make it easy to log on eReaders
+- Login via LDAP, google/github oauth and via proxy authentication

 ## Quick start

-1. Install dependencies by running `pip install --target vendor -r requirements.txt`.
+1. Install dependencies by running `pip3 install --target vendor -r requirements.txt` (python3.x) or `pip install --target vendor -r requirements.txt` (python2.7).
 2. Execute the command: `python cps.py` (or `nohup python cps.py` - recommended if you want to exit the terminal window)
 3. Point your browser to `http://localhost:8083` or `http://localhost:8083/opds` for the OPDS catalog
 4. Set `Location of Calibre database` to the path of the folder where your Calibre library (metadata.db) lives, push "submit" button\
@@ -46,29 +48,27 @@ Please note that running the above install command can fail on some versions of
 ## Requirements

-Python 2.7+, python 3.x+
+python 3.x+, (Python 2.7+)

 Optionally, to enable on-the-fly conversion from one ebook format to another when using the send-to-kindle feature, or during editing of ebooks metadata:

 [Download and install](https://calibre-ebook.com/download) the Calibre desktop program for your platform and enter the folder including program name (normally /opt/calibre/ebook-convert, or C:\Program Files\calibre\ebook-convert.exe) in the field "calibre's converter tool" on the setup page.

-\*** DEPRECATED \*** Support will be removed in future releases
-[Download](http://www.amazon.com/gp/feature.html?docId=1000765211) Amazon's KindleGen tool for your platform and place the binary named `kindlegen` in the `vendor` folder.
+[Download](https://github.com/pgaskin/kepubify/releases/latest) Kepubify tool for your platform and place the binary starting with `kepubify` in Linux: `\opt\kepubify` Windows: `C:\Program Files\kepubify`.

 ## Docker Images

 Pre-built Docker images are available in these Docker Hub repositories:

 #### **Technosoft2000 - x64**
-+ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web/](https://hub.docker.com/r/technosoft2000/calibre-web/)
++ Docker Hub - [https://hub.docker.com/r/technosoft2000/calibre-web](https://hub.docker.com/r/technosoft2000/calibre-web)
 + Github - [https://github.com/Technosoft2000/docker-calibre-web](https://github.com/Technosoft2000/docker-calibre-web)

 Includes the Calibre `ebook-convert` binary.
 + The "path to convertertool" should be set to `/opt/calibre/ebook-convert`

 #### **LinuxServer - x64, armhf, aarch64**
-+ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web/](https://hub.docker.com/r/linuxserver/calibre-web/)
++ Docker Hub - [https://hub.docker.com/r/linuxserver/calibre-web](https://hub.docker.com/r/linuxserver/calibre-web)
 + Github - [https://github.com/linuxserver/docker-calibre-web](https://github.com/linuxserver/docker-calibre-web)
 + Github - (Optional Calibre layer) - [https://github.com/linuxserver/docker-calibre-web/tree/calibre](https://github.com/linuxserver/docker-calibre-web/tree/calibre)
@@ -83,3 +83,7 @@ Pre-built Docker images are available in these Docker Hub repositories:
 # Wiki

 For further information, How To's and FAQ please check the [Wiki](https://github.com/janeczku/calibre-web/wiki)
+
+# Contributing to Calibre-Web
+Please have a look at our [Contributing Guidelines](https://github.com/janeczku/calibre-web/blob/master/CONTRIBUTING.md)

16
cps.py

@@ -31,7 +31,7 @@ else:
     sys.path.append(os.path.join(os.path.dirname(os.path.abspath(__file__)), 'vendor'))

-from cps import create_app
+from cps import create_app, config
 from cps import web_server
 from cps.opds import opds
 from cps.web import web
@@ -41,6 +41,14 @@ from cps.shelf import shelf
 from cps.admin import admi
 from cps.gdrive import gdrive
 from cps.editbooks import editbook
+
+try:
+    from cps.kobo import kobo, get_kobo_activated
+    from cps.kobo_auth import kobo_auth
+    kobo_available = get_kobo_activated()
+except ImportError:
+    kobo_available = False
+
 try:
     from cps.oauth_bb import oauth
     oauth_available = True
@@ -56,8 +64,12 @@ def main():
     app.register_blueprint(about)
     app.register_blueprint(shelf)
     app.register_blueprint(admi)
-    app.register_blueprint(gdrive)
+    if config.config_use_google_drive:
+        app.register_blueprint(gdrive)
     app.register_blueprint(editbook)
+    if kobo_available:
+        app.register_blueprint(kobo)
+        app.register_blueprint(kobo_auth)
     if oauth_available:
         app.register_blueprint(oauth)
     success = web_server.start()

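Reviewer note: the cps.py hunk above stops registering the gdrive blueprint unconditionally and instead registers optional blueprints (Google Drive, Kobo, OAuth) only when the feature is enabled and its import succeeded. A minimal, self-contained sketch of that pattern follows; it is not the actual cps.py, and `optional_feature`/`feature_bp` are hypothetical names.

```python
# Sketch only: an optional Flask blueprint is registered when its import works.
from flask import Blueprint, Flask

try:
    from optional_feature import feature_bp   # hypothetical optional module
    feature_available = True
except ImportError:
    feature_bp = None
    feature_available = False


def create_app():
    app = Flask(__name__)
    core = Blueprint('core', __name__)
    app.register_blueprint(core)               # always registered
    if feature_available:                      # only when the optional import succeeded
        app.register_blueprint(feature_bp)
    return app
```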
32
cps/__init__.py Executable file → Normal file

@@ -33,7 +33,7 @@ from flask_login import LoginManager
 from flask_babel import Babel
 from flask_principal import Principal

-from . import logger, cache_buster, cli, config_sql, ub, db, services
+from . import config_sql, logger, cache_buster, cli, ub, db
 from .reverseproxy import ReverseProxied
 from .server import WebServer
@@ -57,17 +57,22 @@ mimetypes.add_type('application/ogg', '.ogg')
 mimetypes.add_type('application/ogg', '.oga')

 app = Flask(__name__)
+app.config.update(
+    SESSION_COOKIE_HTTPONLY=True,
+    SESSION_COOKIE_SAMESITE='Lax',
+    REMEMBER_COOKIE_SAMESITE='Lax',  # will be available in flask-login 0.5.1 earliest
+)

 lm = LoginManager()
 lm.login_view = 'web.login'
 lm.anonymous_user = ub.Anonymous
+lm.session_protection = 'strong'

 ub.init_db(cli.settingspath)
 # pylint: disable=no-member
 config = config_sql.load_configuration(ub.session)
-searched_ids = {}

 web_server = WebServer()

 babel = Babel()
@@ -75,6 +80,11 @@ _BABEL_TRANSLATIONS = set()
 log = logger.create()

+from . import services
+db.CalibreDB.setup_db(config, cli.settingspath)
+
+calibre_db = db.CalibreDB()
+

 def create_app():
     app.wsgi_app = ReverseProxied(app.wsgi_app)
@@ -82,17 +92,16 @@ def create_app():
     if sys.version_info < (3, 0):
         app.static_folder = app.static_folder.decode('utf-8')
         app.root_path = app.root_path.decode('utf-8')
-        app.instance_path = app.instance_path .decode('utf-8')
+        app.instance_path = app.instance_path.decode('utf-8')

     cache_buster.init_cache_busting(app)

     log.info('Starting Calibre Web...')
     Principal(app)
     lm.init_app(app)
-    app.secret_key = os.getenv('SECRET_KEY', 'A0Zr98j/3yX R~XHH!jmN]LWX/,?RT')
+    app.secret_key = os.getenv('SECRET_KEY', config_sql.get_flask_session_key(ub.session))

     web_server.init_app(app, config)
-    db.setup_db(config)

     babel.init_app(app)
     _BABEL_TRANSLATIONS.update(str(item) for item in babel.list_translations())
@@ -116,14 +125,13 @@ def get_locale():
         if user.nickname != 'Guest':   # if the account is the guest account bypass the config lang settings
             return user.locale

-    preferred = set()
+    preferred = list()
     if request.accept_languages:
         for x in request.accept_languages.values():
             try:
-                preferred.add(str(LC.parse(x.replace('-', '_'))))
+                preferred.append(str(LC.parse(x.replace('-', '_'))))
             except (UnknownLocaleError, ValueError) as e:
-                log.warning('Could not parse locale "%s": %s', x, e)
-                # preferred.append('en')
+                log.debug('Could not parse locale "%s": %s', x, e)

     return negotiate_locale(preferred or ['en'], _BABEL_TRANSLATIONS)
@@ -135,6 +143,4 @@ def get_timezone():

 from .updater import Updater
 updater_thread = Updater()
-updater_thread.start()
-
-__all__ = ['app']

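Reviewer note: in get_locale() above, `preferred` changes from a set to a list. Babel's negotiate_locale() honours the order of the preferred sequence, so the client's Accept-Language ranking is only respected when the parsed locales keep their order; a set would discard it. A small sketch using only the public Babel API:

```python
# Sketch: negotiate_locale() picks the first preferred locale that is available,
# falling back to the bare language code (e.g. 'de' for 'de_DE').
from babel.core import negotiate_locale

available_translations = ['en', 'de', 'fr']
preferred = ['de_DE', 'fr_FR']                                # order as sent by the browser
print(negotiate_locale(preferred, available_translations))    # -> 'de'
```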

@@ -22,6 +22,7 @@
 from __future__ import division, print_function, unicode_literals
 import sys
+import platform
 import sqlite3
 from collections import OrderedDict
@@ -29,7 +30,7 @@ import babel, pytz, requests, sqlalchemy
 import werkzeug, flask, flask_login, flask_principal, jinja2
 from flask_babel import gettext as _

-from . import db, converter, uploader, server, isoLanguages
+from . import db, calibre_db, converter, uploader, server, isoLanguages, constants
 from .web import render_title_template
 try:
     from flask_login import __version__ as flask_loginVersion
@@ -42,13 +43,22 @@ try:
 except ImportError:
     unidecode_version = _(u'not installed')

+try:
+    from flask_dance import __version__ as flask_danceVersion
+except ImportError:
+    flask_danceVersion = None
+
 from . import services

 about = flask.Blueprint('about', __name__)

 _VERSIONS = OrderedDict(
+    Platform = '{0[0]} {0[2]} {0[3]} {0[4]} {0[5]}'.format(platform.uname()),
     Python=sys.version,
+    Calibre_Web=constants.STABLE_VERSION['version'] + ' - '
+                + constants.NIGHTLY_VERSION[0].replace('%','%%') + ' - '
+                + constants.NIGHTLY_VERSION[1].replace('%','%%'),
     WebServer=server.VERSION,
     Flask=flask.__version__,
     Flask_Login=flask_loginVersion,
@@ -63,19 +73,29 @@ _VERSIONS = OrderedDict(
     iso639=isoLanguages.__version__,
     pytz=pytz.__version__,
     Unidecode = unidecode_version,
-    Flask_SimpleLDAP = u'installed' if bool(services.ldap) else u'not installed',
-    Goodreads = u'installed' if bool(services.goodreads_support) else u'not installed',
+    Flask_SimpleLDAP = u'installed' if bool(services.ldap) else None,
+    python_LDAP = services.ldapVersion if bool(services.ldapVersion) else None,
+    Goodreads = u'installed' if bool(services.goodreads_support) else None,
+    jsonschema = services.SyncToken.__version__ if bool(services.SyncToken) else None,
+    flask_dance = flask_danceVersion
 )
 _VERSIONS.update(uploader.get_versions())


+def collect_stats():
+    _VERSIONS['ebook converter'] = _(converter.get_calibre_version())
+    _VERSIONS['unrar'] = _(converter.get_unrar_version())
+    _VERSIONS['kepubify'] = _(converter.get_kepubify_version())
+    return _VERSIONS
+
+
 @about.route("/stats")
 @flask_login.login_required
 def stats():
-    counter = db.session.query(db.Books).count()
-    authors = db.session.query(db.Authors).count()
-    categorys = db.session.query(db.Tags).count()
-    series = db.session.query(db.Series).count()
-    _VERSIONS['ebook converter'] = _(converter.get_version())
-    return render_title_template('stats.html', bookcounter=counter, authorcounter=authors, versions=_VERSIONS,
+    counter = calibre_db.session.query(db.Books).count()
+    authors = calibre_db.session.query(db.Authors).count()
+    categorys = calibre_db.session.query(db.Tags).count()
+    series = calibre_db.session.query(db.Series).count()
+    return render_title_template('stats.html', bookcounter=counter, authorcounter=authors, versions=collect_stats(),
                                  categorycounter=categorys, seriecounter=series, title=_(u"Statistics"), page="stat")

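Reviewer note: the /stats changes above defer the converter/unrar/kepubify version probes into collect_stats() and report optional packages as None when they are missing. A rough sketch of that convention (not the real about.py):

```python
# Sketch: optional dependencies are probed with try/except, and entries that are
# None can simply be skipped when rendering the statistics page.
import sys
from collections import OrderedDict

try:
    from flask_dance import __version__ as flask_dance_version   # optional package
except ImportError:
    flask_dance_version = None

versions = OrderedDict(Python=sys.version, flask_dance=flask_dance_version)
installed = {k: v for k, v in versions.items() if v is not None}
print(installed)
```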
File diff suppressed because it is too large.


@@ -1,3 +1,5 @@
+# -*- coding: utf-8 -*-
+
 # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
 # Copyright (C) 2016-2019 jkrehm andy29485 OzzieIsaacs
 #


@@ -34,7 +34,7 @@ def version_info():
 parser = argparse.ArgumentParser(description='Calibre Web is a web app'
                                  ' providing a interface for browsing, reading and downloading eBooks\n', prog='cps.py')
 parser.add_argument('-p', metavar='path', help='path and name to settings db, e.g. /opt/cw.db')
 parser.add_argument('-g', metavar='path', help='path and name to gdrive db, e.g. /opt/gd.db')
 parser.add_argument('-c', metavar='path',
@@ -43,7 +43,7 @@ parser.add_argument('-k', metavar='path',
                     help='path and name to SSL keyfile, e.g. /opt/test.key, works only in combination with certfile')
 parser.add_argument('-v', '--version', action='version', help='Shows version number and exits Calibre-web',
                     version=version_info())
-parser.add_argument('-i', metavar='ip-adress', help='Server IP-Adress to listen')
+parser.add_argument('-i', metavar='ip-address', help='Server IP-Address to listen')
 parser.add_argument('-s', metavar='user:pass', help='Sets specific username to new password')

 args = parser.parse_args()

119
cps/comic.py Executable file → Normal file

@@ -1,4 +1,3 @@
-#!/usr/bin/env python
 # -*- coding: utf-8 -*-

 # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@@ -19,10 +18,17 @@
 from __future__ import division, print_function, unicode_literals
 import os
+import io

 from . import logger, isoLanguages
 from .constants import BookMeta

+try:
+    from PIL import Image as PILImage
+    use_PIL = True
+except ImportError as e:
+    use_PIL = False
+
 log = logger.create()
@@ -30,22 +36,51 @@ log = logger.create()
 try:
     from comicapi.comicarchive import ComicArchive, MetaDataStyle
     use_comic_meta = True
-except ImportError as e:
-    log.debug('cannot import comicapi, extracting comic metadata will not work: %s', e)
+    try:
+        from comicapi import __version__ as comic_version
+    except ImportError:
+        comic_version = ''
+except (ImportError, LookupError) as e:
+    log.debug('Cannot import comicapi, extracting comic metadata will not work: %s', e)
     import zipfile
     import tarfile
+    try:
+        import rarfile
+        use_rarfile = True
+    except (ImportError, SyntaxError) as e:
+        log.debug('Cannot import rarfile, extracting cover files from rar files will not work: %s', e)
+        use_rarfile = False
     use_comic_meta = False


+def _cover_processing(tmp_file_name, img, extension):
+    if use_PIL:
+        # convert to jpg because calibre only supports jpg
+        if extension in ('.png', '.webp'):
+            imgc = PILImage.open(io.BytesIO(img))
+            im = imgc.convert('RGB')
+            tmp_bytesio = io.BytesIO()
+            im.save(tmp_bytesio, format='JPEG')
+            img = tmp_bytesio.getvalue()
+
+    if not img:
+        return None
+    tmp_cover_name = os.path.join(os.path.dirname(tmp_file_name), 'cover.jpg')
+    with open(tmp_cover_name, 'wb') as f:
+        f.write(img)
+    return tmp_cover_name
+
+
-def extractCover(tmp_file_name, original_file_extension):
+def _extractCover(tmp_file_name, original_file_extension, rarExecutable):
+    cover_data = extension = None
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_name)
-        cover_data = None
+        archive = ComicArchive(tmp_file_name, rar_exe_path=rarExecutable)
         for index, name in enumerate(archive.getPageNameList()):
             ext = os.path.splitext(name)
             if len(ext) > 1:
                 extension = ext[1].lower()
-                if extension == '.jpg' or extension == '.jpeg':
+                if extension in ('.jpg', '.jpeg', '.png', '.webp'):
                     cover_data = archive.getPage(index)
                     break
     else:
@@ -55,7 +90,7 @@ def extractCover(tmp_file_name, original_file_extension):
                 ext = os.path.splitext(name)
                 if len(ext) > 1:
                     extension = ext[1].lower()
-                    if extension == '.jpg' or extension == '.jpeg':
+                    if extension in ('.jpg', '.jpeg', '.png', '.webp'):
                         cover_data = cf.read(name)
                         break
         elif original_file_extension.upper() == '.CBT':
@@ -64,23 +99,28 @@ def extractCover(tmp_file_name, original_file_extension):
                 ext = os.path.splitext(name)
                 if len(ext) > 1:
                     extension = ext[1].lower()
-                    if extension == '.jpg' or extension == '.jpeg':
+                    if extension in ('.jpg', '.jpeg', '.png', '.webp'):
                         cover_data = cf.extractfile(name).read()
                         break
-    prefix = os.path.dirname(tmp_file_name)
-    if cover_data:
-        tmp_cover_name = prefix + '/cover' + extension
-        image = open(tmp_cover_name, 'wb')
-        image.write(cover_data)
-        image.close()
-    else:
-        tmp_cover_name = None
-    return tmp_cover_name
+        elif original_file_extension.upper() == '.CBR' and use_rarfile:
+            try:
+                rarfile.UNRAR_TOOL = rarExecutable
+                cf = rarfile.RarFile(tmp_file_name)
+                for name in cf.getnames():
+                    ext = os.path.splitext(name)
+                    if len(ext) > 1:
+                        extension = ext[1].lower()
+                        if extension in ('.jpg', '.jpeg', '.png', '.webp'):
+                            cover_data = cf.read(name)
+                            break
+            except Exception as e:
+                log.debug('Rarfile failed with error: %s', e)
+    return _cover_processing(tmp_file_name, cover_data, extension)


-def get_comic_info(tmp_file_path, original_file_name, original_file_extension):
+def get_comic_info(tmp_file_path, original_file_name, original_file_extension, rarExecutable):
     if use_comic_meta:
-        archive = ComicArchive(tmp_file_path)
+        archive = ComicArchive(tmp_file_path, rar_exe_path=rarExecutable)
         if archive.seemsToBeAComicArchive():
             if archive.hasMetadata(MetaDataStyle.CIX):
                 style = MetaDataStyle.CIX
@@ -92,36 +132,29 @@ def get_comic_info(tmp_file_path, original_file_name, original_file_extension):
             # if style is not None:
             loadedMetadata = archive.readMetadata(style)
-            lang = loadedMetadata.language
-            if lang:
-                if len(lang) == 2:
-                    loadedMetadata.language = isoLanguages.get(part1=lang).name
-                elif len(lang) == 3:
-                    loadedMetadata.language = isoLanguages.get(part3=lang).name
-            else:
-                loadedMetadata.language = ""
+            lang = loadedMetadata.language or ""
+            loadedMetadata.language = isoLanguages.get_lang3(lang)

             return BookMeta(
                 file_path=tmp_file_path,
                 extension=original_file_extension,
                 title=loadedMetadata.title or original_file_name,
-                author=" & ".join([credit["person"] for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u"Unknown",
-                cover=extractCover(tmp_file_path, original_file_extension),
+                author=" & ".join([credit["person"] for credit in loadedMetadata.credits if credit["role"] == "Writer"]) or u'Unknown',
+                cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
                 description=loadedMetadata.comments or "",
                 tags="",
                 series=loadedMetadata.series or "",
                 series_id=loadedMetadata.issue or "",
                 languages=loadedMetadata.language)
-    else:

     return BookMeta(
         file_path=tmp_file_path,
         extension=original_file_extension,
         title=original_file_name,
-        author=u"Unknown",
-        cover=extractCover(tmp_file_path, original_file_extension),
+        author=u'Unknown',
+        cover=_extractCover(tmp_file_path, original_file_extension, rarExecutable),
         description="",
         tags="",
         series="",
         series_id="",
         languages="")

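Reviewer note: _cover_processing() above re-encodes PNG/WebP comic pages as JPEG because Calibre stores covers as cover.jpg. A standalone sketch of that conversion step, assuming Pillow is installed:

```python
# Sketch: re-encode raw PNG/WebP image bytes as JPEG with Pillow.
import io

from PIL import Image


def to_jpeg_bytes(image_bytes):
    im = Image.open(io.BytesIO(image_bytes)).convert('RGB')   # JPEG has no alpha channel
    out = io.BytesIO()
    im.save(out, format='JPEG')
    return out.getvalue()
```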

@@ -19,18 +19,26 @@
 from __future__ import division, print_function, unicode_literals
 import os
-import json
 import sys

-from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean
+from sqlalchemy import exc, Column, String, Integer, SmallInteger, Boolean, BLOB, JSON
 from sqlalchemy.ext.declarative import declarative_base

-from . import constants, cli, logger
+from . import constants, cli, logger, ub


 log = logger.create()
 _Base = declarative_base()

+
+class _Flask_Settings(_Base):
+    __tablename__ = 'flask_settings'
+    id = Column(Integer, primary_key=True)
+    flask_session_key = Column(BLOB, default="")
+
+    def __init__(self, key):
+        self.flask_session_key = key
+

 # Baseclass for representing settings in app.db with email server settings and Calibre database settings
 # (application settings)
@@ -38,15 +46,17 @@ class _Settings(_Base):
     __tablename__ = 'settings'

     id = Column(Integer, primary_key=True)
-    mail_server = Column(String, default='mail.example.org')
+    mail_server = Column(String, default=constants.DEFAULT_MAIL_SERVER)
     mail_port = Column(Integer, default=25)
     mail_use_ssl = Column(SmallInteger, default=0)
     mail_login = Column(String, default='mail@example.com')
     mail_password = Column(String, default='mypassword')
     mail_from = Column(String, default='automailer <mail@example.com>')
+    mail_size = Column(Integer, default=25*1024*1024)

     config_calibre_dir = Column(String)
     config_port = Column(Integer, default=constants.DEFAULT_PORT)
+    config_external_port = Column(Integer, default=constants.DEFAULT_PORT)
     config_certfile = Column(String)
     config_keyfile = Column(String)
@@ -68,44 +78,58 @@ class _Settings(_Base):
     config_anonbrowse = Column(SmallInteger, default=0)
     config_public_reg = Column(SmallInteger, default=0)
     config_remote_login = Column(Boolean, default=False)
+    config_kobo_sync = Column(Boolean, default=False)
     config_default_role = Column(SmallInteger, default=0)
-    config_default_show = Column(SmallInteger, default=6143)
+    config_default_show = Column(SmallInteger, default=constants.ADMIN_USER_SIDEBAR)
     config_columns_to_ignore = Column(String)
+    config_denied_tags = Column(String, default="")
+    config_allowed_tags = Column(String, default="")
+    config_restricted_column = Column(SmallInteger, default=0)
+    config_denied_column_value = Column(String, default="")
+    config_allowed_column_value = Column(String, default="")
     config_use_google_drive = Column(Boolean, default=False)
     config_google_drive_folder = Column(String)
-    config_google_drive_watch_changes_response = Column(String)
+    config_google_drive_watch_changes_response = Column(JSON, default={})
     config_use_goodreads = Column(Boolean, default=False)
     config_goodreads_api_key = Column(String)
     config_goodreads_api_secret = Column(String)
+    config_register_email = Column(Boolean, default=False)
     config_login_type = Column(Integer, default=0)
-    # config_oauth_provider = Column(Integer)
+    config_kobo_proxy = Column(Boolean, default=False)
-    config_ldap_provider_url = Column(String, default='localhost')
+    config_ldap_provider_url = Column(String, default='example.org')
     config_ldap_port = Column(SmallInteger, default=389)
-    config_ldap_schema = Column(String, default='ldap')
+    config_ldap_authentication = Column(SmallInteger, default=constants.LDAP_AUTH_SIMPLE)
-    config_ldap_serv_username = Column(String)
+    config_ldap_serv_username = Column(String, default='cn=admin,dc=example,dc=org')
-    config_ldap_serv_password = Column(String)
+    config_ldap_serv_password = Column(String, default="")
-    config_ldap_use_ssl = Column(Boolean, default=False)
+    config_ldap_encryption = Column(SmallInteger, default=0)
-    config_ldap_use_tls = Column(Boolean, default=False)
+    config_ldap_cacert_path = Column(String, default="")
-    config_ldap_require_cert = Column(Boolean, default=False)
+    config_ldap_cert_path = Column(String, default="")
-    config_ldap_cert_path = Column(String)
+    config_ldap_key_path = Column(String, default="")
-    config_ldap_dn = Column(String)
+    config_ldap_dn = Column(String, default='dc=example,dc=org')
-    config_ldap_user_object = Column(String)
+    config_ldap_user_object = Column(String, default='uid=%s')
-    config_ldap_openldap = Column(Boolean, default=False)
+    config_ldap_member_user_object = Column(String, default='')
+    config_ldap_openldap = Column(Boolean, default=True)
+    config_ldap_group_object_filter = Column(String, default='(&(objectclass=posixGroup)(cn=%s))')
+    config_ldap_group_members_field = Column(String, default='memberUid')
+    config_ldap_group_name = Column(String, default='calibreweb')
-    config_ebookconverter = Column(Integer, default=0)
+    config_kepubifypath = Column(String, default=None)
-    config_converterpath = Column(String)
+    config_converterpath = Column(String, default=None)
     config_calibre = Column(String)
-    config_rarfile_location = Column(String)
+    config_rarfile_location = Column(String, default=None)
+    config_upload_formats = Column(String, default=','.join(constants.EXTENSIONS_UPLOAD))

     config_updatechannel = Column(Integer, default=constants.UPDATE_STABLE)

+    config_reverse_proxy_login_header_name = Column(String)
+    config_allow_reverse_proxy_header_login = Column(Boolean, default=False)
+
     def __repr__(self):
         return self.__class__.__name__
@@ -120,6 +144,22 @@ class _ConfigSQL(object):
         self.config_calibre_dir = None
         self.load()

+        change = False
+        if self.config_converterpath == None:
+            change = True
+            self.config_converterpath = autodetect_calibre_binary()
+
+        if self.config_kepubifypath == None:
+            change = True
+            self.config_kepubifypath = autodetect_kepubify_binary()
+
+        if self.config_rarfile_location == None:
+            change = True
+            self.config_rarfile_location = autodetect_unrar_binary()
+        if change:
+            self.save()
+
     def _read_from_storage(self):
         if self._settings is None:
             log.debug("_ConfigSQL._read_from_storage")
@@ -176,12 +216,21 @@ class _ConfigSQL(object):
     def show_detail_random(self):
         return self.show_element_new_user(constants.DETAIL_RANDOM)

-    def show_mature_content(self):
-        return self.show_element_new_user(constants.MATURE_CONTENT)
-
-    def mature_content_tags(self):
-        mct = self.config_mature_content_tags.split(",")
-        return [t.strip() for t in mct]
+    def list_denied_tags(self):
+        mct = self.config_denied_tags or ""
+        return [t.strip() for t in mct.split(",")]
+
+    def list_allowed_tags(self):
+        mct = self.config_allowed_tags or ""
+        return [t.strip() for t in mct.split(",")]
+
+    def list_denied_column_values(self):
+        mct = self.config_denied_column_value or ""
+        return [t.strip() for t in mct.split(",")]
+
+    def list_allowed_column_values(self):
+        mct = self.config_allowed_column_value or ""
+        return [t.strip() for t in mct.split(",")]

     def get_log_level(self):
         return logger.get_level_name(self.config_log_level)
@@ -189,7 +238,11 @@ class _ConfigSQL(object):
     def get_mail_settings(self):
         return {k:v for k, v in self.__dict__.items() if k.startswith('mail_')}

-    def set_from_dictionary(self, dictionary, field, convertor=None, default=None):
+    def get_mail_server_configured(self):
+        return not bool(self.mail_server == constants.DEFAULT_MAIL_SERVER)
+
+    def set_from_dictionary(self, dictionary, field, convertor=None, default=None, encode=None):
         '''Possibly updates a field of this object.
         The new value, if present, is grabbed from the given dictionary, and optionally passed through a convertor.
@@ -205,7 +258,10 @@ class _ConfigSQL(object):
             return False

         if convertor is not None:
-            new_value = convertor(new_value)
+            if encode:
+                new_value = convertor(new_value.encode(encode))
+            else:
+                new_value = convertor(new_value)

         current_value = self.__dict__.get(field)
         if current_value == new_value:
@@ -227,17 +283,19 @@ class _ConfigSQL(object):
                 v = column.default.arg
             setattr(self, k, v)

-        if self.config_google_drive_watch_changes_response:
-            self.config_google_drive_watch_changes_response = json.loads(self.config_google_drive_watch_changes_response)
-
         have_metadata_db = bool(self.config_calibre_dir)
         if have_metadata_db:
             if not self.config_use_google_drive:
                 db_file = os.path.join(self.config_calibre_dir, 'metadata.db')
                 have_metadata_db = os.path.isfile(db_file)
         self.db_configured = have_metadata_db
+        constants.EXTENSIONS_UPLOAD = [x.lstrip().rstrip().lower() for x in self.config_upload_formats.split(',')]

-        logger.setup(self.config_logfile, self.config_log_level)
+        logfile = logger.setup(self.config_logfile, self.config_log_level)
+        if logfile != self.config_logfile:
+            log.warning("Log path %s not valid, falling back to default", self.config_logfile)
+            self.config_logfile = logfile
+            self._session.merge(s)
+            self._session.commit()

     def save(self):
         '''Apply all configuration values to the underlying storage.'''
@@ -246,8 +304,7 @@ class _ConfigSQL(object):
         for k, v in self.__dict__.items():
             if k[0] == '_':
                 continue
-            if hasattr(s, k):  # and getattr(s, k, None) != v:
-                # log.debug("_Settings save '%s' = %r", k, v)
+            if hasattr(s, k):
                 setattr(s, k, v)

         log.debug("_ConfigSQL updating storage")
@@ -255,7 +312,9 @@ class _ConfigSQL(object):
         self._session.commit()
         self.load()

-    def invalidate(self):
+    def invalidate(self, error=None):
+        if error:
+            log.error(error)
         log.warning("invalidating configuration")
         self.db_configured = False
         self.config_calibre_dir = None
@@ -270,12 +329,18 @@ def _migrate_table(session, orm_class):
         try:
             session.query(column).first()
         except exc.OperationalError as err:
-            log.debug("%s: %s", column_name, err)
+            log.debug("%s: %s", column_name, err.args[0])
             if column.default is not None:
                 if sys.version_info < (3, 0):
-                    if isinstance(column.default.arg,unicode):
+                    if isinstance(column.default.arg, unicode):
                         column.default.arg = column.default.arg.encode('utf-8')
-            column_default = "" if column.default is None else ("DEFAULT %r" % column.default.arg)
+            if column.default is None:
+                column_default = ""
+            else:
+                if isinstance(column.default.arg, bool):
+                    column_default = ("DEFAULT %r" % int(column.default.arg))
+                else:
+                    column_default = ("DEFAULT %r" % column.default.arg)
             alter_table = "ALTER TABLE %s ADD COLUMN `%s` %s %s" % (orm_class.__tablename__,
                                                                     column_name,
                                                                     column.type,
@@ -287,22 +352,47 @@ def _migrate_table(session, orm_class):
     if changed:
         session.commit()


 def autodetect_calibre_binary():
     if sys.platform == "win32":
-        calibre_path = ["C:\\program files\calibre\calibre-convert.exe",
-                        "C:\\program files(x86)\calibre\calibre-convert.exe"]
+        calibre_path = ["C:\\program files\\calibre\\ebook-convert.exe",
+                        "C:\\program files(x86)\\calibre\\ebook-convert.exe",
+                        "C:\\program files(x86)\\calibre2\\ebook-convert.exe",
+                        "C:\\program files\\calibre2\\ebook-convert.exe"]
     else:
         calibre_path = ["/opt/calibre/ebook-convert"]
     for element in calibre_path:
         if os.path.isfile(element) and os.access(element, os.X_OK):
             return element
-    return None
+    return ""
+
+
+def autodetect_unrar_binary():
+    if sys.platform == "win32":
+        calibre_path = ["C:\\program files\\WinRar\\unRAR.exe",
+                        "C:\\program files(x86)\\WinRar\\unRAR.exe"]
+    else:
+        calibre_path = ["/usr/bin/unrar"]
+    for element in calibre_path:
+        if os.path.isfile(element) and os.access(element, os.X_OK):
+            return element
+    return ""
+
+
+def autodetect_kepubify_binary():
+    if sys.platform == "win32":
+        calibre_path = ["C:\\program files\\kepubify\\kepubify-windows-64Bit.exe",
+                        "C:\\program files(x86)\\kepubify\\kepubify-windows-64Bit.exe"]
+    else:
+        calibre_path = ["/opt/kepubify/kepubify-linux-64bit", "/opt/kepubify/kepubify-linux-32bit"]
+    for element in calibre_path:
+        if os.path.isfile(element) and os.access(element, os.X_OK):
+            return element
+    return ""
+

 def _migrate_database(session):
     # make sure the table is created, if it does not exist
     _Base.metadata.create_all(session.bind)
     _migrate_table(session, _Settings)
+    _migrate_table(session, _Flask_Settings)

@@ -311,5 +401,20 @@ def load_configuration(session):
     if not session.query(_Settings).count():
         session.add(_Settings())
         session.commit()
+    conf = _ConfigSQL(session)
+    # Migrate from global restrictions to user based restrictions
+    if bool(conf.config_default_show & constants.MATURE_CONTENT) and conf.config_denied_tags == "":
+        conf.config_denied_tags = conf.config_mature_content_tags
+        conf.save()
+        session.query(ub.User).filter(ub.User.mature_content != True). \
+            update({"denied_tags": conf.config_mature_content_tags}, synchronize_session=False)
+        session.commit()
+    return conf

-    return _ConfigSQL(session)
+
+def get_flask_session_key(session):
+    flask_settings = session.query(_Flask_Settings).one_or_none()
+    if flask_settings == None:
+        flask_settings = _Flask_Settings(os.urandom(32))
+        session.add(flask_settings)
+        session.commit()
+    return flask_settings.flask_session_key

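Reviewer note: get_flask_session_key() above replaces the hard-coded fallback secret with a random key that is generated once and persisted in app.db, so sessions survive restarts unless SECRET_KEY is set in the environment. A condensed sketch of the same idea; table and helper names here are stand-ins, not the real Calibre-Web schema:

```python
# Sketch: persist a random Flask secret key in a settings database and reuse it.
import os

from sqlalchemy import BLOB, Column, Integer, create_engine
from sqlalchemy.ext.declarative import declarative_base
from sqlalchemy.orm import sessionmaker

Base = declarative_base()


class FlaskSettings(Base):                      # stand-in for _Flask_Settings
    __tablename__ = 'flask_settings'
    id = Column(Integer, primary_key=True)
    flask_session_key = Column(BLOB)


def get_session_key(session):
    row = session.query(FlaskSettings).one_or_none()
    if row is None:                             # first start: create and persist the key
        row = FlaskSettings(flask_session_key=os.urandom(32))
        session.add(row)
        session.commit()
    return row.flask_session_key


engine = create_engine('sqlite://')             # in-memory database just for the sketch
Base.metadata.create_all(engine)
session = sessionmaker(bind=engine)()
print(len(get_session_key(session)))            # -> 32
```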

@@ -80,9 +80,12 @@ MATURE_CONTENT = 1 << 11
 SIDEBAR_PUBLISHER = 1 << 12
 SIDEBAR_RATING = 1 << 13
 SIDEBAR_FORMAT = 1 << 14
+SIDEBAR_ARCHIVED = 1 << 15
+SIDEBAR_DOWNLOAD = 1 << 16
+SIDEBAR_LIST = 1 << 17

-ADMIN_USER_ROLES = sum(r for r in ALL_ROLES.values()) & ~ROLE_EDIT_SHELFS & ~ROLE_ANONYMOUS
-ADMIN_USER_SIDEBAR = (SIDEBAR_FORMAT << 1) - 1
+ADMIN_USER_ROLES = sum(r for r in ALL_ROLES.values()) & ~ROLE_ANONYMOUS
+ADMIN_USER_SIDEBAR = (SIDEBAR_LIST << 1) - 1

 UPDATE_STABLE = 0 << 0
 AUTO_UPDATE_STABLE = 1 << 0
@@ -92,25 +95,28 @@ AUTO_UPDATE_NIGHTLY = 1 << 2
 LOGIN_STANDARD = 0
 LOGIN_LDAP = 1
 LOGIN_OAUTH = 2
-# LOGIN_OAUTH_GOOGLE = 3
+
+LDAP_AUTH_ANONYMOUS = 0
+LDAP_AUTH_UNAUTHENTICATE = 1
+LDAP_AUTH_SIMPLE = 0
+
+DEFAULT_MAIL_SERVER = "mail.example.org"

 DEFAULT_PASSWORD = "admin123"
 DEFAULT_PORT = 8083
-env_CALIBRE_PORT = os.environ.get("CALIBRE_PORT", DEFAULT_PORT)
 try:
+    env_CALIBRE_PORT = os.environ.get("CALIBRE_PORT", DEFAULT_PORT)
     DEFAULT_PORT = int(env_CALIBRE_PORT)
 except ValueError:
     print('Environment variable CALIBRE_PORT has invalid value (%s), faling back to default (8083)' % env_CALIBRE_PORT)
 del env_CALIBRE_PORT

-EXTENSIONS_AUDIO = {'mp3', 'm4a', 'm4b'}
-EXTENSIONS_CONVERT = {'pdf', 'epub', 'mobi', 'azw3', 'docx', 'rtf', 'fb2', 'lit', 'lrf', 'txt', 'htmlz', 'rtf', 'odt'}
-EXTENSIONS_UPLOAD = {'txt', 'pdf', 'epub', 'mobi', 'azw', 'azw3', 'cbr', 'cbz', 'cbt', 'djvu', 'prc', 'doc', 'docx',
-                     'fb2', 'html', 'rtf', 'odt', 'mp3', 'm4a', 'm4b'}
-# EXTENSIONS_READER = set(['txt', 'pdf', 'epub', 'zip', 'cbz', 'tar', 'cbt'] +
-#                         (['rar','cbr'] if feature_support['rar'] else []))
+EXTENSIONS_AUDIO = {'mp3', 'mp4', 'ogg', 'opus', 'wav', 'flac', 'm4a', 'm4b'}
+EXTENSIONS_CONVERT_FROM = ['pdf', 'epub', 'mobi', 'azw3', 'docx', 'rtf', 'fb2', 'lit', 'lrf', 'txt', 'htmlz', 'rtf', 'odt','cbz','cbr']
+EXTENSIONS_CONVERT_TO = ['pdf', 'epub', 'mobi', 'azw3', 'docx', 'rtf', 'fb2', 'lit', 'lrf', 'txt', 'htmlz', 'rtf', 'odt']
+EXTENSIONS_UPLOAD = {'txt', 'pdf', 'epub', 'kepub', 'mobi', 'azw', 'azw3', 'cbr', 'cbz', 'cbt', 'djvu', 'prc', 'doc', 'docx',
+                     'fb2', 'html', 'rtf', 'lit', 'odt', 'mp3', 'mp4', 'ogg', 'opus', 'wav', 'flac', 'm4a', 'm4b'}
@@ -124,7 +130,7 @@ def selected_roles(dictionary):
 BookMeta = namedtuple('BookMeta', 'file_path, extension, title, author, cover, description, tags, series, '
                                   'series_id, languages')

-STABLE_VERSION = {'version': '0.6.5 Beta'}
+STABLE_VERSION = {'version': '0.6.10 Beta'}

 NIGHTLY_VERSION = {}
 NIGHTLY_VERSION[0] = '$Format:%H$'

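Reviewer note: the new SIDEBAR_ARCHIVED/SIDEBAR_DOWNLOAD/SIDEBAR_LIST entries above extend the sidebar bit mask, and ADMIN_USER_SIDEBAR is recomputed from the highest flag. A quick sketch of how these flags compose and are tested; has_flag mirrors the helper already defined in constants.py:

```python
# Sketch: each sidebar view is one bit; the admin default enables every bit up
# to the highest flag, and has_flag() tests a single bit.
SIDEBAR_FORMAT = 1 << 14
SIDEBAR_ARCHIVED = 1 << 15
SIDEBAR_DOWNLOAD = 1 << 16
SIDEBAR_LIST = 1 << 17
ADMIN_USER_SIDEBAR = (SIDEBAR_LIST << 1) - 1      # all bits 0..17 set


def has_flag(value, bit_flag):
    return bool(value & bit_flag)


print(has_flag(ADMIN_USER_SIDEBAR, SIDEBAR_DOWNLOAD))   # True
print(has_flag(SIDEBAR_FORMAT, SIDEBAR_LIST))            # False
```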

@@ -29,8 +29,8 @@ log = logger.create()
 # _() necessary to make babel aware of string for translation
 _NOT_CONFIGURED = _('not configured')
-_NOT_INSTALLED = 'not installed'
-_EXECUTION_ERROR = 'Execution permissions missing'
+_NOT_INSTALLED = _('not installed')
+_EXECUTION_ERROR = _('Execution permissions missing')


 def _get_command_version(path, pattern, argument=None):
@@ -48,10 +48,15 @@ def _get_command_version(path, pattern, argument=None):
     return _NOT_INSTALLED


-def get_version():
-    version = None
-    if config.config_ebookconverter == 1:
-        version = _get_command_version(config.config_converterpath, r'Amazon kindlegen\(')
-    elif config.config_ebookconverter == 2:
-        version = _get_command_version(config.config_converterpath, r'ebook-convert.*\(calibre', '--version')
-    return version or _NOT_CONFIGURED
+def get_calibre_version():
+    return _get_command_version(config.config_converterpath, r'ebook-convert.*\(calibre', '--version') \
+        or _NOT_CONFIGURED
+
+
+def get_unrar_version():
+    return _get_command_version(config.config_rarfile_location, r'UNRAR.*\d') or _NOT_CONFIGURED
+
+
+def get_kepubify_version():
+    return _get_command_version(config.config_kepubifypath, r'kepubify\s', '--version') or _NOT_CONFIGURED

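Reviewer note: the refactor above splits get_version() into get_calibre_version(), get_unrar_version() and get_kepubify_version(), all built on _get_command_version(), which runs the configured binary and matches its output against a regular expression. A hedged sketch of that probing approach; the path and pattern below are examples only, not the exact Calibre-Web configuration:

```python
# Sketch: run a binary (optionally with an argument such as '--version') and
# extract a version string from its output with a regular expression.
import re
import subprocess


def probe_version(path, pattern, argument=None):
    cmd = [path] + ([argument] if argument else [])
    try:
        output = subprocess.check_output(cmd, stderr=subprocess.STDOUT)
    except (OSError, subprocess.CalledProcessError):
        return None                                   # not installed or not executable
    match = re.search(pattern, output.decode('utf-8', errors='replace'))
    return match.group(0) if match else None


print(probe_version('/opt/calibre/ebook-convert', r'ebook-convert.*\(calibre', '--version'))
```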
702
cps/db.py Executable file → Normal file

@ -22,59 +22,77 @@ import sys
import os import os
import re import re
import ast import ast
import json
from datetime import datetime
from sqlalchemy import create_engine from sqlalchemy import create_engine
from sqlalchemy import Table, Column, ForeignKey from sqlalchemy import Table, Column, ForeignKey, CheckConstraint
from sqlalchemy import String, Integer, Boolean from sqlalchemy import String, Integer, Boolean, TIMESTAMP, Float
from sqlalchemy.orm import relationship, sessionmaker, scoped_session from sqlalchemy.orm import relationship, sessionmaker, scoped_session
from sqlalchemy.ext.declarative import declarative_base from sqlalchemy.orm.collections import InstrumentedList
from sqlalchemy.ext.declarative import declarative_base, DeclarativeMeta
from sqlalchemy.pool import StaticPool
from flask_login import current_user
from sqlalchemy.sql.expression import and_, true, false, text, func, or_
from sqlalchemy.ext.associationproxy import association_proxy
from babel import Locale as LC
from babel.core import UnknownLocaleError
from flask_babel import gettext as _
from . import logger, ub, isoLanguages
from .pagination import Pagination
session = None from weakref import WeakSet
cc_exceptions = ['datetime', 'comments', 'float', 'composite', 'series']
try:
import unidecode
use_unidecode = True
except ImportError:
use_unidecode = False
cc_exceptions = ['datetime', 'comments', 'composite', 'series']
cc_classes = {} cc_classes = {}
Base = declarative_base() Base = declarative_base()
books_authors_link = Table('books_authors_link', Base.metadata,
                           Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                           Column('author', Integer, ForeignKey('authors.id'), primary_key=True)
                           )

books_tags_link = Table('books_tags_link', Base.metadata,
                        Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                        Column('tag', Integer, ForeignKey('tags.id'), primary_key=True)
                        )

books_series_link = Table('books_series_link', Base.metadata,
                          Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                          Column('series', Integer, ForeignKey('series.id'), primary_key=True)
                          )

books_ratings_link = Table('books_ratings_link', Base.metadata,
                           Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                           Column('rating', Integer, ForeignKey('ratings.id'), primary_key=True)
                           )

books_languages_link = Table('books_languages_link', Base.metadata,
                             Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                             Column('lang_code', Integer, ForeignKey('languages.id'), primary_key=True)
                             )

books_publishers_link = Table('books_publishers_link', Base.metadata,
                              Column('book', Integer, ForeignKey('books.id'), primary_key=True),
                              Column('publisher', Integer, ForeignKey('publishers.id'), primary_key=True)
                              )


class Identifiers(Base):
    __tablename__ = 'identifiers'

    id = Column(Integer, primary_key=True)
-   type = Column(String)
-   val = Column(String)
-   book = Column(Integer, ForeignKey('books.id'))
+   type = Column(String(collation='NOCASE'), nullable=False, default="isbn")
+   val = Column(String(collation='NOCASE'), nullable=False)
+   book = Column(Integer, ForeignKey('books.id'), nullable=False)

    def __init__(self, val, id_type, book):
        self.val = val
@@ -82,41 +100,61 @@ class Identifiers(Base):
        self.book = book

    def formatType(self):
-       if self.type == "amazon":
+       format_type = self.type.lower()
+       if format_type == 'amazon':
            return u"Amazon"
-       elif self.type == "isbn":
+       elif format_type.startswith("amazon_"):
+           return u"Amazon.{0}".format(format_type[7:])
+       elif format_type == "isbn":
            return u"ISBN"
-       elif self.type == "doi":
+       elif format_type == "doi":
            return u"DOI"
-       elif self.type == "goodreads":
+       elif format_type == "douban":
+           return u"Douban"
+       elif format_type == "goodreads":
            return u"Goodreads"
-       elif self.type == "google":
+       elif format_type == "google":
            return u"Google Books"
-       elif self.type == "kobo":
+       elif format_type == "kobo":
            return u"Kobo"
-       if self.type == "lubimyczytac":
+       elif format_type == "litres":
+           return u"ЛитРес"
+       elif format_type == "issn":
+           return u"ISSN"
+       elif format_type == "isfdb":
+           return u"ISFDB"
+       if format_type == "lubimyczytac":
            return u"Lubimyczytac"
        else:
            return self.type

    def __repr__(self):
-       if self.type == "amazon":
-           return u"https://amzn.com/{0}".format(self.val)
-       elif self.type == "isbn":
-           return u"http://www.worldcat.org/isbn/{0}".format(self.val)
-       elif self.type == "doi":
-           return u"http://dx.doi.org/{0}".format(self.val)
-       elif self.type == "goodreads":
-           return u"http://www.goodreads.com/book/show/{0}".format(self.val)
-       elif self.type == "douban":
+       format_type = self.type.lower()
+       if format_type == "amazon" or format_type == "asin":
+           return u"https://amazon.com/dp/{0}".format(self.val)
+       elif format_type.startswith('amazon_'):
+           return u"https://amazon.{0}/dp/{1}".format(format_type[7:], self.val)
+       elif format_type == "isbn":
+           return u"https://www.worldcat.org/isbn/{0}".format(self.val)
+       elif format_type == "doi":
+           return u"https://dx.doi.org/{0}".format(self.val)
+       elif format_type == "goodreads":
+           return u"https://www.goodreads.com/book/show/{0}".format(self.val)
+       elif format_type == "douban":
            return u"https://book.douban.com/subject/{0}".format(self.val)
-       elif self.type == "google":
+       elif format_type == "google":
            return u"https://books.google.com/books?id={0}".format(self.val)
-       elif self.type == "kobo":
+       elif format_type == "kobo":
            return u"https://www.kobo.com/ebook/{0}".format(self.val)
-       elif self.type == "lubimyczytac":
-           return u" http://lubimyczytac.pl/ksiazka/{0}".format(self.val)
-       elif self.type == "url":
+       elif format_type == "lubimyczytac":
+           return u"https://lubimyczytac.pl/ksiazka/{0}/ksiazka".format(self.val)
+       elif format_type == "litres":
+           return u"https://www.litres.ru/{0}".format(self.val)
+       elif format_type == "issn":
+           return u"https://portal.issn.org/resource/ISSN/{0}".format(self.val)
+       elif format_type == "isfdb":
+           return u"http://www.isfdb.org/cgi-bin/pl.cgi?{0}".format(self.val)
+       elif format_type == "url":
            return u"{0}".format(self.val)
        else:
            return u""
@@ -126,13 +164,16 @@ class Comments(Base):
    __tablename__ = 'comments'

    id = Column(Integer, primary_key=True)
-   text = Column(String)
-   book = Column(Integer, ForeignKey('books.id'))
+   text = Column(String(collation='NOCASE'), nullable=False)
+   book = Column(Integer, ForeignKey('books.id'), nullable=False)

    def __init__(self, text, book):
        self.text = text
        self.book = book

+   def get(self):
+       return self.text
+
    def __repr__(self):
        return u"<Comments({0})>".format(self.text)

@@ -141,11 +182,14 @@ class Tags(Base):
    __tablename__ = 'tags'

    id = Column(Integer, primary_key=True, autoincrement=True)
-   name = Column(String)
+   name = Column(String(collation='NOCASE'), unique=True, nullable=False)

    def __init__(self, name):
        self.name = name

+   def get(self):
+       return self.name
+
    def __repr__(self):
        return u"<Tags('{0})>".format(self.name)

@@ -154,15 +198,18 @@ class Authors(Base):
    __tablename__ = 'authors'

    id = Column(Integer, primary_key=True)
-   name = Column(String)
-   sort = Column(String)
-   link = Column(String)
+   name = Column(String(collation='NOCASE'), unique=True, nullable=False)
+   sort = Column(String(collation='NOCASE'))
+   link = Column(String, nullable=False, default="")

    def __init__(self, name, sort, link):
        self.name = name
        self.sort = sort
        self.link = link

+   def get(self):
+       return self.name
+
    def __repr__(self):
        return u"<Authors('{0},{1}{2}')>".format(self.name, self.sort, self.link)

@@ -171,13 +218,16 @@ class Series(Base):
    __tablename__ = 'series'

    id = Column(Integer, primary_key=True)
-   name = Column(String)
-   sort = Column(String)
+   name = Column(String(collation='NOCASE'), unique=True, nullable=False)
+   sort = Column(String(collation='NOCASE'))

    def __init__(self, name, sort):
        self.name = name
        self.sort = sort

+   def get(self):
+       return self.name
+
    def __repr__(self):
        return u"<Series('{0},{1}')>".format(self.name, self.sort)

@@ -186,11 +236,14 @@ class Ratings(Base):
    __tablename__ = 'ratings'

    id = Column(Integer, primary_key=True)
-   rating = Column(Integer)
+   rating = Column(Integer, CheckConstraint('rating>-1 AND rating<11'), unique=True)

    def __init__(self, rating):
        self.rating = rating

+   def get(self):
+       return self.rating
+
    def __repr__(self):
        return u"<Ratings('{0}')>".format(self.rating)

@@ -199,11 +252,17 @@ class Languages(Base):
    __tablename__ = 'languages'

    id = Column(Integer, primary_key=True)
-   lang_code = Column(String)
+   lang_code = Column(String(collation='NOCASE'), nullable=False, unique=True)

    def __init__(self, lang_code):
        self.lang_code = lang_code

+   def get(self):
+       if self.language_name:
+           return self.language_name
+       else:
+           return self.lang_code
+
    def __repr__(self):
        return u"<Languages('{0}')>".format(self.lang_code)

@@ -212,25 +271,29 @@ class Publishers(Base):
    __tablename__ = 'publishers'

    id = Column(Integer, primary_key=True)
-   name = Column(String)
-   sort = Column(String)
+   name = Column(String(collation='NOCASE'), nullable=False, unique=True)
+   sort = Column(String(collation='NOCASE'))

    def __init__(self, name, sort):
        self.name = name
        self.sort = sort

+   def get(self):
+       return self.name
+
    def __repr__(self):
        return u"<Publishers('{0},{1}')>".format(self.name, self.sort)


class Data(Base):
    __tablename__ = 'data'
+   __table_args__ = {'schema': 'calibre'}

    id = Column(Integer, primary_key=True)
-   book = Column(Integer, ForeignKey('books.id'))
-   format = Column(String)
-   uncompressed_size = Column(Integer)
-   name = Column(String)
+   book = Column(Integer, ForeignKey('books.id'), nullable=False)
+   format = Column(String(collation='NOCASE'), nullable=False)
+   uncompressed_size = Column(Integer, nullable=False)
+   name = Column(String, nullable=False)

    def __init__(self, book, book_format, uncompressed_size, name):
        self.book = book

@@ -238,6 +301,10 @@ class Data(Base):
        self.uncompressed_size = uncompressed_size
        self.name = name

+   # ToDo: Check
+   def get(self):
+       return self.name
+
    def __repr__(self):
        return u"<Data('{0},{1}{2}{3}')>".format(self.book, self.format, self.uncompressed_size, self.name)
@@ -245,22 +312,25 @@ class Data(Base):

class Books(Base):
    __tablename__ = 'books'

-   DEFAULT_PUBDATE = "0101-01-01 00:00:00+00:00"
+   DEFAULT_PUBDATE = datetime(101, 1, 1, 0, 0, 0, 0)  # ("0101-01-01 00:00:00+00:00")

-   id = Column(Integer, primary_key=True)
-   title = Column(String)
-   sort = Column(String)
-   author_sort = Column(String)
-   timestamp = Column(String)
-   pubdate = Column(String)
-   series_index = Column(String)
-   last_modified = Column(String)
-   path = Column(String)
-   has_cover = Column(Integer)
+   id = Column(Integer, primary_key=True, autoincrement=True)
+   title = Column(String(collation='NOCASE'), nullable=False, default='Unknown')
+   sort = Column(String(collation='NOCASE'))
+   author_sort = Column(String(collation='NOCASE'))
+   timestamp = Column(TIMESTAMP, default=datetime.utcnow)
+   pubdate = Column(TIMESTAMP, default=DEFAULT_PUBDATE)
+   series_index = Column(String, nullable=False, default="1.0")
+   last_modified = Column(TIMESTAMP, default=datetime.utcnow)
+   path = Column(String, default="", nullable=False)
+   has_cover = Column(Integer, default=0)
    uuid = Column(String)
+   isbn = Column(String(collation='NOCASE'), default="")
+   # Iccn = Column(String(collation='NOCASE'), default="")
+   flags = Column(Integer, nullable=False, default=1)

    authors = relationship('Authors', secondary=books_authors_link, backref='books')
-   tags = relationship('Tags', secondary=books_tags_link, backref='books',order_by="Tags.name")
+   tags = relationship('Tags', secondary=books_tags_link, backref='books', order_by="Tags.name")
    comments = relationship('Comments', backref='books')
    data = relationship('Data', backref='books')
    series = relationship('Series', secondary=books_series_link, backref='books')

@@ -279,7 +349,8 @@ class Books(Base):
        self.series_index = series_index
        self.last_modified = last_modified
        self.path = path
-       self.has_cover = has_cover
+       self.has_cover = (has_cover != None)

    def __repr__(self):
        return u"<Books('{0},{1}{2}{3}{4}{5}{6}{7}{8}')>".format(self.title, self.sort, self.author_sort,

@@ -288,7 +359,8 @@ class Books(Base):
    @property
    def atom_timestamp(self):
-       return (self.timestamp or '').replace(' ', 'T')
+       return (self.timestamp.strftime('%Y-%m-%dT%H:%M:%S+00:00') or '')


class Custom_Columns(Base):
    __tablename__ = 'custom_columns'

@@ -310,121 +382,387 @@ class Custom_Columns(Base):
    return display_dict


-def update_title_sort(config, conn=None):
-    # user defined sort function for calibre databases (Series, etc.)
-    def _title_sort(title):
-        # calibre sort stuff
-        title_pat = re.compile(config.config_title_regex, re.IGNORECASE)
-        match = title_pat.search(title)
-        if match:
-            prep = match.group(1)
-            title = title.replace(prep, '') + ', ' + prep
-        return title.strip()
-
-    conn = conn or session.connection().connection.connection
-    conn.create_function("title_sort", 1, _title_sort)
+class AlchemyEncoder(json.JSONEncoder):
+
+    def default(self, obj):
+        if isinstance(obj.__class__, DeclarativeMeta):
+            # an SQLAlchemy class
+            fields = {}
+            for field in [x for x in dir(obj) if not x.startswith('_') and x != 'metadata']:
+                if field == 'books':
+                    continue
+                data = obj.__getattribute__(field)
+                try:
+                    if isinstance(data, str):
+                        data = data.replace("'", "\'")
+                    elif isinstance(data, InstrumentedList):
+                        el = list()
+                        for ele in data:
+                            if ele.get:
+                                el.append(ele.get())
+                            else:
+                                el.append(json.dumps(ele, cls=AlchemyEncoder))
+                        data = ",".join(el)
+                        if data == '[]':
+                            data = ""
+                    else:
+                        json.dumps(data)
+                    fields[field] = data
+                except:
+                    fields[field] = ""
+            # a json-encodable dict
+            return fields
+
+        return json.JSONEncoder.default(self, obj)
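Aside: a minimal usage sketch for the encoder above, assuming an already initialized CalibreDB instance named calibre_db (not part of the commit):

import json

book = calibre_db.get_book(1)                   # hypothetical book id
payload = json.dumps(book, cls=AlchemyEncoder)  # related rows are flattened via their get() helpers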
-def setup_db(config):
-    dispose()
+class CalibreDB():
+    _init = False
engine = None
config = None
session_factory = None
# This is a WeakSet so that references here don't keep other CalibreDB
# instances alive once they reach the end of their respective scopes
instances = WeakSet()
-    if not config.config_calibre_dir:
-        config.invalidate()
-        return False
+    def __init__(self):
+        """ Initialize a new CalibreDB session
+        """
self.session = None
if self._init:
self.initSession()
+        self.instances.add(self)
-    dbpath = os.path.join(config.config_calibre_dir, "metadata.db")
-    if not os.path.exists(dbpath):
-        config.invalidate()
-        return False
-    try:
-        engine = create_engine('sqlite:///{0}'.format(dbpath),
-                               echo=False,
-                               isolation_level="SERIALIZABLE",
-                               connect_args={'check_same_thread': False})
-        conn = engine.connect()
-    except:
-        config.invalidate()
-        return False
-    config.db_configured = True
-    update_title_sort(config, conn.connection)
-    # conn.connection.create_function('lower', 1, lcase)
-    # conn.connection.create_function('upper', 1, ucase)
-    if not cc_classes:
-        cc = conn.execute("SELECT id, datatype FROM custom_columns")
+    def initSession(self):
+        self.session = self.session_factory()
+        self.update_title_sort(self.config)

+    @classmethod
+    def setup_db(cls, config, app_db_path):
cls.config = config
cls.dispose()
if not config.config_calibre_dir:
config.invalidate()
return False
dbpath = os.path.join(config.config_calibre_dir, "metadata.db")
if not os.path.exists(dbpath):
config.invalidate()
return False
try:
cls.engine = create_engine('sqlite://',
echo=False,
isolation_level="SERIALIZABLE",
connect_args={'check_same_thread': False},
poolclass=StaticPool)
cls.engine.execute("attach database '{}' as calibre;".format(dbpath))
cls.engine.execute("attach database '{}' as app_settings;".format(app_db_path))
conn = cls.engine.connect()
# conn.text_factory = lambda b: b.decode(errors = 'ignore') possible fix for #1302
except Exception as e:
config.invalidate(e)
return False
config.db_configured = True
if not cc_classes:
cc = conn.execute("SELECT id, datatype FROM custom_columns")
cc_ids = []
books_custom_column_links = {}
for row in cc:
if row.datatype not in cc_exceptions:
if row.datatype == 'series':
dicttable = {'__tablename__': 'books_custom_column_' + str(row.id) + '_link',
'id': Column(Integer, primary_key=True),
'book': Column(Integer, ForeignKey('books.id'),
primary_key=True),
'map_value': Column('value', Integer,
ForeignKey('custom_column_' +
str(row.id) + '.id'),
primary_key=True),
'extra': Column(Float),
'asoc': relationship('custom_column_' + str(row.id), uselist=False),
'value': association_proxy('asoc', 'value')
}
books_custom_column_links[row.id] = type(str('books_custom_column_' + str(row.id) + '_link'),
(Base,), dicttable)
else:
books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link',
Base.metadata,
Column('book', Integer, ForeignKey('books.id'),
primary_key=True),
Column('value', Integer,
ForeignKey('custom_column_' +
str(row.id) + '.id'),
primary_key=True)
)
cc_ids.append([row.id, row.datatype])
cc_ids = []
books_custom_column_links = {}
for row in cc:
if row.datatype not in cc_exceptions:
books_custom_column_links[row.id] = Table('books_custom_column_' + str(row.id) + '_link', Base.metadata,
Column('book', Integer, ForeignKey('books.id'),
primary_key=True),
Column('value', Integer,
ForeignKey('custom_column_' + str(row.id) + '.id'),
primary_key=True)
)
cc_ids.append([row.id, row.datatype])
if row.datatype == 'bool':
-                ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                          'id': Column(Integer, primary_key=True),
-                          'book': Column(Integer, ForeignKey('books.id')),
-                          'value': Column(Boolean)}
-            elif row.datatype == 'int':
-                ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                          'id': Column(Integer, primary_key=True),
-                          'book': Column(Integer, ForeignKey('books.id')),
-                          'value': Column(Integer)}
+                    ccdict = {'__tablename__': 'custom_column_' + str(row.id),
+                              'id': Column(Integer, primary_key=True)}
+                    if row.datatype == 'float':
+                        ccdict['value'] = Column(Float)
+                    elif row.datatype == 'int':
+                        ccdict['value'] = Column(Integer)
+                    elif row.datatype == 'bool':
+                        ccdict['value'] = Column(Boolean)
+                    else:
+                        ccdict['value'] = Column(String)
+                    if row.datatype in ['float', 'int', 'bool']:
+                        ccdict['book'] = Column(Integer, ForeignKey('books.id'))
+                    cc_classes[row.id] = type(str('custom_column_' + str(row.id)), (Base,), ccdict)
for cc_id in cc_ids:
if (cc_id[1] == 'bool') or (cc_id[1] == 'int') or (cc_id[1] == 'float'):
setattr(Books,
'custom_column_' + str(cc_id[0]),
relationship(cc_classes[cc_id[0]],
primaryjoin=(
Books.id == cc_classes[cc_id[0]].book),
backref='books'))
elif (cc_id[1] == 'series'):
setattr(Books,
'custom_column_' + str(cc_id[0]),
relationship(books_custom_column_links[cc_id[0]],
backref='books'))
+                else:
+                    setattr(Books,
+                            'custom_column_' + str(cc_id[0]),
+                            relationship(cc_classes[cc_id[0]],
+                                         secondary=books_custom_column_links[cc_id[0]],
+                                         backref='books'))
-            else:
-                ccdict = {'__tablename__': 'custom_column_' + str(row.id),
-                          'id': Column(Integer, primary_key=True),
-                          'value': Column(String)}
-            cc_classes[row.id] = type(str('Custom_Column_' + str(row.id)), (Base,), ccdict)
-    for cc_id in cc_ids:
-        if (cc_id[1] == 'bool') or (cc_id[1] == 'int'):
-            setattr(Books, 'custom_column_' + str(cc_id[0]), relationship(cc_classes[cc_id[0]],
-                                                                          primaryjoin=(
-                                                                              Books.id == cc_classes[cc_id[0]].book),
-                                                                          backref='books'))
-        else:
-            setattr(Books, 'custom_column_' + str(cc_id[0]), relationship(cc_classes[cc_id[0]],
-                                                                          secondary=books_custom_column_links[cc_id[0]],
-                                                                          backref='books'))

+        cls.session_factory = scoped_session(sessionmaker(autocommit=False,
+                                                          autoflush=True,
+                                                          bind=cls.engine))
+        for inst in cls.instances:
+            inst.initSession()
+
+        cls._init = True
+        return True
+
+    def get_book(self, book_id):
return self.session.query(Books).filter(Books.id == book_id).first()
def get_filtered_book(self, book_id, allow_show_archived=False):
return self.session.query(Books).filter(Books.id == book_id). \
filter(self.common_filters(allow_show_archived)).first()
def get_book_by_uuid(self, book_uuid):
return self.session.query(Books).filter(Books.uuid == book_uuid).first()
def get_book_format(self, book_id, format):
return self.session.query(Data).filter(Data.book == book_id).filter(Data.format == format).first()
# Language and content filters for displaying in the UI
def common_filters(self, allow_show_archived=False):
if not allow_show_archived:
archived_books = (
ub.session.query(ub.ArchivedBook)
.filter(ub.ArchivedBook.user_id == int(current_user.id))
.filter(ub.ArchivedBook.is_archived == True)
.all()
)
archived_book_ids = [archived_book.book_id for archived_book in archived_books]
archived_filter = Books.id.notin_(archived_book_ids)
else:
archived_filter = true()
if current_user.filter_language() != "all":
lang_filter = Books.languages.any(Languages.lang_code == current_user.filter_language())
else:
lang_filter = true()
negtags_list = current_user.list_denied_tags()
postags_list = current_user.list_allowed_tags()
neg_content_tags_filter = false() if negtags_list == [''] else Books.tags.any(Tags.name.in_(negtags_list))
pos_content_tags_filter = true() if postags_list == [''] else Books.tags.any(Tags.name.in_(postags_list))
if self.config.config_restricted_column:
pos_cc_list = current_user.allowed_column_value.split(',')
pos_content_cc_filter = true() if pos_cc_list == [''] else \
getattr(Books, 'custom_column_' + str(self.config.config_restricted_column)). \
any(cc_classes[self.config.config_restricted_column].value.in_(pos_cc_list))
neg_cc_list = current_user.denied_column_value.split(',')
neg_content_cc_filter = false() if neg_cc_list == [''] else \
getattr(Books, 'custom_column_' + str(self.config.config_restricted_column)). \
any(cc_classes[self.config.config_restricted_column].value.in_(neg_cc_list))
else:
pos_content_cc_filter = true()
neg_content_cc_filter = false()
return and_(lang_filter, pos_content_tags_filter, ~neg_content_tags_filter,
pos_content_cc_filter, ~neg_content_cc_filter, archived_filter)
# Fill indexpage with all requested data from database
def fill_indexpage(self, page, pagesize, database, db_filter, order, *join):
return self.fill_indexpage_with_archived_books(page, pagesize, database, db_filter, order, False, *join)
def fill_indexpage_with_archived_books(self, page, pagesize, database, db_filter, order, allow_show_archived,
*join):
pagesize = pagesize or self.config.config_books_per_page
if current_user.show_detail_random():
randm = self.session.query(Books) \
.filter(self.common_filters(allow_show_archived)) \
.order_by(func.random()) \
.limit(self.config.config_random_books)
else:
randm = false()
off = int(int(pagesize) * (page - 1))
query = self.session.query(database) \
.join(*join, isouter=True) \
.filter(db_filter) \
.filter(self.common_filters(allow_show_archived))
pagination = Pagination(page, pagesize,
len(query.all()))
entries = query.order_by(*order).offset(off).limit(pagesize).all()
for book in entries:
book = self.order_authors(book)
return entries, randm, pagination
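Aside: a typical call into this helper, assuming CalibreDB.setup_db() has run, calibre_db is a CalibreDB() instance, and the code runs inside a request context with a logged-in user (values are illustrative):

# First page, 60 books per page, no extra filter, newest first.
entries, random_books, pagination = calibre_db.fill_indexpage(
    1, 60, Books, True, [Books.timestamp.desc()])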
# Orders all Authors in the list according to authors sort
def order_authors(self, entry):
sort_authors = entry.author_sort.split('&')
authors_ordered = list()
error = False
ids = [a.id for a in entry.authors]
for auth in sort_authors:
results = self.session.query(Authors).filter(Authors.sort == auth.lstrip().strip()).all()
# ToDo: How to handle not found authorname
if not len(results):
error = True
break
for r in results:
if r.id in ids:
authors_ordered.append(r)
if not error:
entry.authors = authors_ordered
return entry
def get_typeahead(self, database, query, replace=('', ''), tag_filter=true()):
query = query or ''
self.session.connection().connection.connection.create_function("lower", 1, lcase)
entries = self.session.query(database).filter(tag_filter). \
filter(func.lower(database.name).ilike("%" + query + "%")).all()
json_dumps = json.dumps([dict(name=r.name.replace(*replace)) for r in entries])
return json_dumps
def check_exists_book(self, authr, title):
self.session.connection().connection.connection.create_function("lower", 1, lcase)
q = list()
authorterms = re.split(r'\s*&\s*', authr)
for authorterm in authorterms:
q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + authorterm + "%")))
return self.session.query(Books) \
.filter(and_(Books.authors.any(and_(*q)), func.lower(Books.title).ilike("%" + title + "%"))).first()
# read search results from calibre-database and return it (function is used for feed and simple search
def get_search_results(self, term, offset=None, order=None, limit=None):
order = order or [Books.sort]
pagination = None
term.strip().lower()
self.session.connection().connection.connection.create_function("lower", 1, lcase)
q = list()
authorterms = re.split("[, ]+", term)
for authorterm in authorterms:
q.append(Books.authors.any(func.lower(Authors.name).ilike("%" + authorterm + "%")))
result = self.session.query(Books).filter(self.common_filters(True)).filter(
or_(Books.tags.any(func.lower(Tags.name).ilike("%" + term + "%")),
Books.series.any(func.lower(Series.name).ilike("%" + term + "%")),
Books.authors.any(and_(*q)),
Books.publishers.any(func.lower(Publishers.name).ilike("%" + term + "%")),
func.lower(Books.title).ilike("%" + term + "%")
)).order_by(*order).all()
result_count = len(result)
if offset != None and limit != None:
offset = int(offset)
limit_all = offset + int(limit)
pagination = Pagination((offset / (int(limit)) + 1), limit, result_count)
else:
offset = 0
limit_all = result_count
ub.store_ids(result)
return result[offset:limit_all], result_count, pagination
# Creates for all stored languages a translated speaking name in the array for the UI
def speaking_language(self, languages=None):
from . import get_locale
if not languages:
languages = self.session.query(Languages) \
.join(books_languages_link) \
.join(Books) \
.filter(self.common_filters()) \
.group_by(text('books_languages_link.lang_code')).all()
for lang in languages:
try:
cur_l = LC.parse(lang.lang_code)
lang.name = cur_l.get_language_name(get_locale())
except UnknownLocaleError:
lang.name = _(isoLanguages.get(part3=lang.lang_code).name)
return languages
def update_title_sort(self, config, conn=None):
# user defined sort function for calibre databases (Series, etc.)
def _title_sort(title):
# calibre sort stuff
title_pat = re.compile(config.config_title_regex, re.IGNORECASE)
match = title_pat.search(title)
if match:
prep = match.group(1)
title = title[len(prep):] + ', ' + prep
return title.strip()
conn = conn or self.session.connection().connection.connection
conn.create_function("title_sort", 1, _title_sort)
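Aside: the registered SQLite function can be exercised on its own; a rough stand-alone sketch, using a simplified English-only prefix regex in place of config.config_title_regex:

import re
import sqlite3

title_pat = re.compile(r'^(A|The|An)\s+', re.IGNORECASE)  # simplified assumption

def _title_sort(title):
    match = title_pat.search(title)
    if match:
        prep = match.group(1)
        title = title[len(prep):] + ', ' + prep
    return title.strip()

conn = sqlite3.connect(':memory:')
conn.create_function("title_sort", 1, _title_sort)
print(conn.execute("SELECT title_sort('The Time Machine')").fetchone()[0])  # Time Machine, The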
@classmethod
def dispose(cls):
# global session
for inst in cls.instances:
old_session = inst.session
inst.session = None
if old_session:
try:
old_session.close()
except:
pass
if old_session.bind:
try:
old_session.bind.dispose()
except Exception:
pass
for attr in list(Books.__dict__.keys()):
if attr.startswith("custom_column_"):
setattr(Books, attr, None)
for db_class in cc_classes.values():
Base.metadata.remove(db_class.__table__)
cc_classes.clear()
for table in reversed(Base.metadata.sorted_tables):
name = table.key
if name.startswith("custom_column_") or name.startswith("books_custom_column_"):
if table is not None:
Base.metadata.remove(table)
def reconnect_db(self, config, app_db_path):
self.dispose()
self.engine.dispose()
self.setup_db(config, app_db_path)
-    global session
-    Session = scoped_session(sessionmaker(autocommit=False,
-                                          autoflush=False,
-                                          bind=engine))
-    session = Session()
-    return True
-
-
-def dispose():
-    global session
-
-    old_session = session
-    session = None
-    if old_session:
-        try: old_session.close()
-        except: pass
-        if old_session.bind:
-            try: old_session.bind.dispose()
-            except: pass
-
-    for attr in list(Books.__dict__.keys()):
-        if attr.startswith("custom_column_"):
-            setattr(Books, attr, None)
-
-    for db_class in cc_classes.values():
-        Base.metadata.remove(db_class.__table__)
-    cc_classes.clear()
-    for table in reversed(Base.metadata.sorted_tables):
-        name = table.key
-        if name.startswith("custom_column_") or name.startswith("books_custom_column_"):
-            if table is not None:
-                Base.metadata.remove(table)


+def lcase(s):
+    try:
+        return unidecode.unidecode(s.lower())
+    except Exception as e:
+        log = logger.create()
+        log.exception(e)
+        return s.lower()

cps/debug_info.py Normal file

@ -0,0 +1,55 @@
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
# Copyright (C) 2012-2019 cervinko, idalin, SiphonSquirrel, ouzklcn, akushsky,
# OzzieIsaacs, bodybybuddha, jkrehm, matthazinski, janeczku
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import shutil
import glob
import zipfile
import json
import io
import os
from flask import send_file
from . import logger, config
from .about import collect_stats
log = logger.create()
def assemble_logfiles(file_name):
log_list = glob.glob(file_name + '*')
wfd = io.StringIO()
for f in log_list:
with open(f, 'r') as fd:
shutil.copyfileobj(fd, wfd)
return send_file(wfd,
as_attachment=True,
attachment_filename=os.path.basename(file_name))
def send_debug():
file_list = glob.glob(logger.get_logfile(config.config_logfile) + '*')
file_list.extend(glob.glob(logger.get_accesslogfile(config.config_access_logfile) + '*'))
memory_zip = io.BytesIO()
with zipfile.ZipFile(memory_zip, 'w', compression=zipfile.ZIP_DEFLATED) as zf:
zf.writestr('libs.txt', json.dumps(collect_stats()))
for fp in file_list:
zf.write(fp, os.path.basename(fp))
memory_zip.seek(0)
return send_file(memory_zip,
as_attachment=True,
attachment_filename="Calibre-Web-debug-pack.zip")
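Aside: these helpers are wired to admin routes elsewhere in the code base; purely as an illustration (route and app object are hypothetical, not taken from this diff), they could be exposed like this:

from flask import Flask
from cps import debug_info

app = Flask(__name__)

@app.route("/admin/debug")
def download_debug_pack():
    # streams a zip containing the collected log files plus libs.txt
    return debug_info.send_debug()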

File diff suppressed because it is too large

@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)

@@ -23,9 +22,11 @@ import zipfile
from lxml import etree

from . import isoLanguages
+from .helper import split_authors
from .constants import BookMeta


def extractCover(zipFile, coverFile, coverpath, tmp_file_name):
    if coverFile is None:
        return None
@@ -65,34 +66,26 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
        tmp = p.xpath('dc:%s/text()' % s, namespaces=ns)
        if len(tmp) > 0:
            if s == 'creator':
-               epub_metadata[s] = ' & '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
+               epub_metadata[s] = ' & '.join(split_authors(tmp))
            elif s == 'subject':
-               epub_metadata[s] = ', '.join(p.xpath('dc:%s/text()' % s, namespaces=ns))
+               epub_metadata[s] = ', '.join(tmp)
            else:
-               epub_metadata[s] = p.xpath('dc:%s/text()' % s, namespaces=ns)[0]
+               epub_metadata[s] = tmp[0]
        else:
-           epub_metadata[s] = "Unknown"
+           epub_metadata[s] = u'Unknown'

-   if epub_metadata['subject'] == "Unknown":
+   if epub_metadata['subject'] == u'Unknown':
        epub_metadata['subject'] = ''

-   if epub_metadata['description'] == "Unknown":
+   if epub_metadata['description'] == u'Unknown':
        description = tree.xpath("//*[local-name() = 'description']/text()")
        if len(description) > 0:
            epub_metadata['description'] = description
        else:
            epub_metadata['description'] = ""

-   if epub_metadata['language'] == "Unknown":
-       epub_metadata['language'] = ""
-   else:
-       lang = epub_metadata['language'].split('-', 1)[0].lower()
-       if len(lang) == 2:
-           epub_metadata['language'] = isoLanguages.get(part1=lang).name
-       elif len(lang) == 3:
-           epub_metadata['language'] = isoLanguages.get(part3=lang).name
-       else:
-           epub_metadata['language'] = ""
+   lang = epub_metadata['language'].split('-', 1)[0].lower()
+   epub_metadata['language'] = isoLanguages.get_lang3(lang)

    series = tree.xpath("/pkg:package/pkg:metadata/pkg:meta[@name='calibre:series']/@content", namespaces=ns)
    if len(series) > 0:

@@ -123,9 +116,14 @@ def get_epub_info(tmp_file_path, original_file_name, original_file_extension):
            markupTree = etree.fromstring(markup)
            # no matter xhtml or html with no namespace
            imgsrc = markupTree.xpath("//*[local-name() = 'img']/@src")
-           # imgsrc maybe startwith "../"" so fullpath join then relpath to cwd
-           filename = os.path.relpath(os.path.join(os.path.dirname(os.path.join(coverpath, coversection[0])), imgsrc[0]))
-           coverfile = extractCover(epubZip, filename, "", tmp_file_path)
+           # Alternative image source
+           if not len(imgsrc):
+               imgsrc = markupTree.xpath("//attribute::*[contains(local-name(), 'href')]")
+           if len(imgsrc):
+               # imgsrc maybe startwith "../"" so fullpath join then relpath to cwd
+               filename = os.path.relpath(os.path.join(os.path.dirname(os.path.join(coverpath, coversection[0])),
+                                                       imgsrc[0]))
+               coverfile = extractCover(epubZip, filename, "", tmp_file_path)
        else:
            coverfile = extractCover(epubZip, coversection[0], coverpath, tmp_file_path)


@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)


@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@@ -23,6 +22,7 @@
from __future__ import division, print_function, unicode_literals
import os
+import sys
import hashlib
import json
import tempfile

@@ -34,18 +34,17 @@ from flask import Blueprint, flash, request, redirect, url_for, abort
from flask_babel import gettext as _
from flask_login import login_required

-try:
-    from googleapiclient.errors import HttpError
-except ImportError:
-    pass
-
-from . import logger, gdriveutils, config, db
+from . import logger, gdriveutils, config, ub, calibre_db
from .web import admin_required

gdrive = Blueprint('gdrive', __name__)
log = logger.create()

+try:
+    from googleapiclient.errors import HttpError
+except ImportError as err:
+    log.debug("Cannot import googleapiclient, using GDrive will not work: %s", err)

current_milli_time = lambda: int(round(time() * 1000))
gdrive_watch_callback_token = 'target=calibreweb-watch_files'
@@ -73,7 +72,7 @@ def google_drive_callback():
        credentials = gdriveutils.Gauth.Instance().auth.flow.step2_exchange(auth_code)
        with open(gdriveutils.CREDENTIALS, 'w') as f:
            f.write(credentials.to_json())
-   except ValueError as error:
+   except (ValueError, AttributeError) as error:
        log.error(error)
    return redirect(url_for('admin.configuration'))
@@ -94,8 +93,7 @@ def watch_gdrive():
    try:
        result = gdriveutils.watchChange(gdriveutils.Gdrive.Instance().drive, notification_id,
                                         'web_hook', address, gdrive_watch_callback_token, current_milli_time() + 604800*1000)
-       config.config_google_drive_watch_changes_response = json.dumps(result)
-       # after save(), config_google_drive_watch_changes_response will be a json object, not string
+       config.config_google_drive_watch_changes_response = result
        config.save()
    except HttpError as e:
        reason=json.loads(e.content)['error']['errors'][0]
@@ -118,41 +116,45 @@ def revoke_watch_gdrive():
                                  last_watch_response['resourceId'])
        except HttpError:
            pass
-       config.config_google_drive_watch_changes_response = None
+       config.config_google_drive_watch_changes_response = {}
        config.save()
    return redirect(url_for('admin.configuration'))


@gdrive.route("/gdrive/watch/callback", methods=['GET', 'POST'])
def on_received_watch_confirmation():
+   if not config.config_google_drive_watch_changes_response:
+       return ''
+   if request.headers.get('X-Goog-Channel-Token') != gdrive_watch_callback_token \
+           or request.headers.get('X-Goog-Resource-State') != 'change' \
+           or not request.data:
+       return ''  # redirect(url_for('admin.configuration'))
    log.debug('%r', request.headers)
-   if request.headers.get('X-Goog-Channel-Token') == gdrive_watch_callback_token \
-           and request.headers.get('X-Goog-Resource-State') == 'change' \
-           and request.data:
-       data = request.data
-
-       def updateMetaData():
-           log.info('Change received from gdrive')
-           log.debug('%r', data)
-           try:
-               j = json.loads(data)
-               log.info('Getting change details')
-               response = gdriveutils.getChangeById(gdriveutils.Gdrive.Instance().drive, j['id'])
-               log.debug('%r', response)
-               if response:
-                   dbpath = os.path.join(config.config_calibre_dir, "metadata.db")
-                   if not response['deleted'] and response['file']['title'] == 'metadata.db' and response['file']['md5Checksum'] != hashlib.md5(dbpath):
-                       tmpDir = tempfile.gettempdir()
-                       log.info('Database file updated')
-                       copyfile(dbpath, os.path.join(tmpDir, "metadata.db_" + str(current_milli_time())))
-                       log.info('Backing up existing and downloading updated metadata.db')
-                       gdriveutils.downloadFile(None, "metadata.db", os.path.join(tmpDir, "tmp_metadata.db"))
-                       log.info('Setting up new DB')
-                       # prevent error on windows, as os.rename does on exisiting files
-                       move(os.path.join(tmpDir, "tmp_metadata.db"), dbpath)
-                       db.setup_db(config)
-           except Exception as e:
-               log.exception(e)
-       updateMetaData()
+   log.debug('%r', request.data)
+   log.info('Change received from gdrive')
+   try:
+       j = json.loads(request.data)
+       log.info('Getting change details')
+       response = gdriveutils.getChangeById(gdriveutils.Gdrive.Instance().drive, j['id'])
+       log.debug('%r', response)
+       if response:
+           if sys.version_info < (3, 0):
+               dbpath = os.path.join(config.config_calibre_dir, "metadata.db")
+           else:
+               dbpath = os.path.join(config.config_calibre_dir, "metadata.db").encode()
+           if not response['deleted'] and response['file']['title'] == 'metadata.db' \
+                   and response['file']['md5Checksum'] != hashlib.md5(dbpath):
+               tmpDir = tempfile.gettempdir()
+               log.info('Database file updated')
+               copyfile(dbpath, os.path.join(tmpDir, "metadata.db_" + str(current_milli_time())))
+               log.info('Backing up existing and downloading updated metadata.db')
+               gdriveutils.downloadFile(None, "metadata.db", os.path.join(tmpDir, "tmp_metadata.db"))
+               log.info('Setting up new DB')
+               # prevent error on windows, as os.rename does on exisiting files
+               move(os.path.join(tmpDir, "tmp_metadata.db"), dbpath)
+               calibre_db.reconnect_db(config, ub.app_DB_path)
+   except Exception as e:
+       log.exception(e)
    return ''


@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@@ -21,6 +20,8 @@ from __future__ import division, print_function, unicode_literals
import os
import json
import shutil
+import chardet
+import ssl

from flask import Response, stream_with_context
from sqlalchemy import create_engine

@@ -28,14 +29,18 @@ from sqlalchemy import Column, UniqueConstraint
from sqlalchemy import String, Integer
from sqlalchemy.orm import sessionmaker, scoped_session
from sqlalchemy.ext.declarative import declarative_base
+from sqlalchemy.exc import OperationalError, InvalidRequestError

try:
    from pydrive.auth import GoogleAuth
    from pydrive.drive import GoogleDrive
    from pydrive.auth import RefreshError
    from apiclient import errors
+   from httplib2 import ServerNotFoundError
    gdrive_support = True
-except ImportError:
+   importError = None
+except ImportError as err:
+   importError = err
    gdrive_support = False

from . import logger, cli, config

@@ -47,6 +52,12 @@ CREDENTIALS = os.path.join(_CONFIG_DIR, 'gdrive_credentials')
CLIENT_SECRETS = os.path.join(_CONFIG_DIR, 'client_secrets.json')

log = logger.create()

+if gdrive_support:
+    logger.get('googleapiclient.discovery_cache').setLevel(logger.logging.ERROR)
+    if not logger.is_debug_enabled():
+        logger.get('googleapiclient.discovery').setLevel(logger.logging.ERROR)
+else:
+    log.debug("Cannot import pydrive,httplib2, using gdrive will not work: %s", importError)
class Singleton: class Singleton:
@ -94,7 +105,11 @@ class Singleton:
@Singleton @Singleton
class Gauth: class Gauth:
def __init__(self): def __init__(self):
-       self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
+       try:
+           self.auth = GoogleAuth(settings_file=SETTINGS_YAML)
+       except NameError as error:
+           log.error(error)
+           self.auth = None
@Singleton @Singleton
@@ -189,14 +204,18 @@ def getDrive(drive=None, gauth=None):
    return drive


def listRootFolders():
-   drive = getDrive(Gdrive.Instance().drive)
-   folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
-   fileList = drive.ListFile({'q': folder}).GetList()
+   try:
+       drive = getDrive(Gdrive.Instance().drive)
+       folder = "'root' in parents and mimeType = 'application/vnd.google-apps.folder' and trashed = false"
+       fileList = drive.ListFile({'q': folder}).GetList()
+   except ServerNotFoundError as e:
+       log.info("GDrive Error %s" % e)
+       fileList = []
    return fileList


def getEbooksFolder(drive):
-   return getFolderInFolder('root',config.config_google_drive_folder,drive)
+   return getFolderInFolder('root', config.config_google_drive_folder, drive)


def getFolderInFolder(parentId, folderName, drive):
@@ -226,7 +245,7 @@ def getEbooksFolderId(drive=None):
        gDriveId.path = '/'
        session.merge(gDriveId)
        session.commit()
-   return
+   return gDriveId.gdrive_id


def getFile(pathId, fileName, drive):
@@ -471,8 +490,13 @@ def getChangeById (drive, change_id):

# Deletes the local hashes database to force search for new folder names
def deleteDatabaseOnChange():
-   session.query(GdriveId).delete()
-   session.commit()
+   try:
+       session.query(GdriveId).delete()
+       session.commit()
+   except (OperationalError, InvalidRequestError):
+       session.rollback()
+       log.info(u"GDrive DB is not Writeable")


def updateGdriveCalibreFromLocal():
    copyToDrive(Gdrive.Instance().drive, config.config_calibre_dir, False, True)
@@ -580,8 +604,12 @@ def get_error_text(client_secrets=None):
    if not os.path.isfile(CLIENT_SECRETS):
        return 'client_secrets.json is missing or not readable'

-   with open(CLIENT_SECRETS, 'r') as settings:
-       filedata = json.load(settings)
+   try:
+       with open(CLIENT_SECRETS, 'r') as settings:
+           filedata = json.load(settings)
+   except PermissionError:
+       return 'client_secrets.json is missing or not readable'
    if 'web' not in filedata:
        return 'client_secrets.json is not configured for web application'
    if 'redirect_uris' not in filedata['web']:

File diff suppressed because it is too large

@@ -57,12 +57,36 @@ def get_language_name(locale, lang_code):

def get_language_codes(locale, language_names, remainder=None):
    language_names = set(x.strip().lower() for x in language_names if x)
-   languages = list()
    for k, v in get_language_names(locale).items():
        v = v.lower()
        if v in language_names:
-           languages.append(k)
            language_names.remove(v)
+           yield k
    if remainder is not None:
        remainder.extend(language_names)
-   return languages
def get_valid_language_codes(locale, language_names, remainder=None):
languages = list()
if "" in language_names:
language_names.remove("")
for k, v in get_language_names(locale).items():
if k in language_names:
languages.append(k)
language_names.remove(k)
if remainder is not None and len(language_names):
remainder.extend(language_names)
return languages
def get_lang3(lang):
try:
if len(lang) == 2:
ret_value = get(part1=lang).part3
elif len(lang) == 3:
ret_value = lang
else:
ret_value = ""
except KeyError:
ret_value = lang
return ret_value
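Aside: expected behaviour of the helper above, assuming the ISO-639 data backing get():

print(get_lang3('en'))     # 'eng'  (2-letter code mapped to its part3 code)
print(get_lang3('eng'))    # 'eng'  (3-letter codes pass through)
print(get_lang3('en-US'))  # ''     (any other length falls back to an empty string)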

File diff suppressed because it is too large

@@ -1,4 +1,3 @@
-#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)

@@ -26,7 +25,7 @@
from __future__ import division, print_function, unicode_literals
import datetime
import mimetypes
-import re
+from uuid import uuid4

from babel.dates import format_date
from flask import Blueprint, request, url_for
@@ -45,6 +44,8 @@ log = logger.create()
def url_for_other_page(page):
    args = request.view_args.copy()
    args['page'] = page
+   for get, val in request.args.items():
+       args[get] = val
    return url_for(request.endpoint, **args)
@@ -77,18 +78,18 @@ def mimetype_filter(val):
@jinjia.app_template_filter('formatdate')
def formatdate_filter(val):
    try:
-       conformed_timestamp = re.sub(r"[:]|([-](?!((\d{2}[:]\d{2})|(\d{4}))$))", '', val)
-       formatdate = datetime.datetime.strptime(conformed_timestamp[:15], "%Y%m%d %H%M%S")
-       return format_date(formatdate, format='medium', locale=get_locale())
+       return format_date(val, format='medium', locale=get_locale())
    except AttributeError as e:
-       log.error('Babel error: %s, Current user locale: %s, Current User: %s', e, current_user.locale, current_user.nickname)
-       return formatdate
+       log.error('Babel error: %s, Current user locale: %s, Current User: %s', e,
+                 current_user.locale,
+                 current_user.nickname
+                 )
+       return val


@jinjia.app_template_filter('formatdateinput')
def format_date_input(val):
-   conformed_timestamp = re.sub(r"[:]|([-](?!((\d{2}[:]\d{2})|(\d{4}))$))", '', val)
-   date_obj = datetime.datetime.strptime(conformed_timestamp[:15], "%Y%m%d %H%M%S")
-   input_date = date_obj.isoformat().split('T', 1)[0]  # Hack to support dates <1900
+   input_date = val.isoformat().split('T', 1)[0]  # Hack to support dates <1900
    return '' if input_date == "0101-01-01" else input_date
@@ -110,8 +111,25 @@ def yesno(value, yes, no):
    return yes if value else no


-'''@jinjia.app_template_filter('canread')
-def canread(ext):
-    if isinstance(ext, db.Data):
-        ext = ext.format
-    return ext.lower() in EXTENSIONS_READER'''
+@jinjia.app_template_filter('formatfloat')
+def formatfloat(value, decimals=1):
+    formatedstring = '%d' % value
+    if (value % 1) != 0:
+        formatedstring = ('%s.%d' % (formatedstring, (value % 1) * 10**decimals)).rstrip('0')
+    return formatedstring
+
+
+@jinjia.app_template_filter('formatseriesindex')
+def formatseriesindex_filter(series_index):
+    if series_index:
+        if int(series_index) - series_index == 0:
+            return int(series_index)
+        else:
+            return series_index
+    return 0
+
+
+@jinjia.app_template_filter('uuidfilter')
+def uuidfilter(var):
+    return uuid4()
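Aside: the new filters behave roughly as follows (illustrative values):

print(formatfloat(2.0))               # '2'
print(formatfloat(2.5))               # '2.5'
print(formatfloat(1.25, decimals=2))  # '1.25'
print(formatseriesindex_filter(3.0))  # 3
print(formatseriesindex_filter(1.5))  # 1.5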

cps/kobo.py Normal file
File diff suppressed because it is too large

cps/kobo_auth.py Normal file

@ -0,0 +1,168 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
# Copyright (C) 2018-2019 shavitmichael, OzzieIsaacs
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
"""This module is used to control authentication/authorization of Kobo sync requests.
This module also includes research notes into the auth protocol used by Kobo devices.
Log-in:
When first booting a Kobo device the user must sign into a Kobo (or affiliate) account.
Upon successful sign-in, the user is redirected to
https://auth.kobobooks.com/CrossDomainSignIn?id=<some id>
which serves the following response:
<script type='text/javascript'>
location.href='kobo://UserAuthenticated?userId=<redacted>&userKey<redacted>&email=<redacted>&returnUrl=https%3a%2f%2fwww.kobo.com';
</script>
And triggers the insertion of a userKey into the device's User table.
Together, the device's DeviceId and UserKey act as an *irrevocable* authentication
token to most (if not all) Kobo APIs. In fact, in most cases only the UserKey is
required to authorize the API call.
Changing Kobo password *does not* invalidate user keys! This is apparently a known
issue for a few years now https://www.mobileread.com/forums/showpost.php?p=3476851&postcount=13
(although this poster hypothesised that Kobo could blacklist a DeviceId, many endpoints
will still grant access given the userkey.)
Official Kobo Store Api authorization:
* For most of the endpoints we care about (sync, metadata, tags, etc), the userKey is
passed in the x-kobo-userkey header, and is sufficient to authorize the API call.
* Some endpoints (e.g: AnnotationService) instead make use of Bearer tokens passed through
an authorization header. To get a BearerToken, the device makes a POST request to the
v1/auth/device endpoint with the secret UserKey and the device's DeviceId.
* The book download endpoint passes an auth token as a URL param instead of a header.
Our implementation:
We pretty much ignore all of the above. To authenticate the user, we generate a random
and unique token that they append to the CalibreWeb Url when setting up the api_store
setting on the device.
Thus, every request from the device to the api_store will hit CalibreWeb with the
auth_token in the url (e.g: https://mylibrary.com/<auth_token>/v1/library/sync).
In addition, once authenticated we also set the login cookie on the response that will
be sent back for the duration of the session to authorize subsequent API calls (in
particular calls to non-Kobo specific endpoints such as the CalibreWeb book download).
"""
from binascii import hexlify
from datetime import datetime
from os import urandom
from flask import g, Blueprint, url_for, abort, request
from flask_login import login_user, login_required
from flask_babel import gettext as _
from . import logger, ub, lm
from .web import render_title_template
try:
from functools import wraps
except ImportError:
pass # We're not using Python 3
log = logger.create()
def register_url_value_preprocessor(kobo):
@kobo.url_value_preprocessor
def pop_auth_token(__, values):
g.auth_token = values.pop("auth_token")
def disable_failed_auth_redirect_for_blueprint(bp):
lm.blueprint_login_views[bp.name] = None
def get_auth_token():
if "auth_token" in g:
return g.get("auth_token")
else:
return None
def requires_kobo_auth(f):
@wraps(f)
def inner(*args, **kwargs):
auth_token = get_auth_token()
if auth_token is not None:
user = (
ub.session.query(ub.User)
.join(ub.RemoteAuthToken)
.filter(ub.RemoteAuthToken.auth_token == auth_token).filter(ub.RemoteAuthToken.token_type==1)
.first()
)
if user is not None:
login_user(user)
return f(*args, **kwargs)
log.debug("Received Kobo request without a recognizable auth token.")
return abort(401)
return inner
kobo_auth = Blueprint("kobo_auth", __name__, url_prefix="/kobo_auth")
@kobo_auth.route("/generate_auth_token/<int:user_id>")
@login_required
def generate_auth_token(user_id):
host_list = request.host.rsplit(':')
if len(host_list) == 1:
host = ':'.join(host_list)
else:
host = ':'.join(host_list[0:-1])
if host.startswith('127.') or host.lower() == 'localhost' or host.startswith('[::ffff:7f'):
warning = _('Please access calibre-web from non localhost to get valid api_endpoint for kobo device')
return render_title_template(
"generate_kobo_auth_url.html",
title=_(u"Kobo Setup"),
warning = warning
)
else:
# Invalidate any previously generated Kobo Auth token for this user.
auth_token = ub.session.query(ub.RemoteAuthToken).filter(
ub.RemoteAuthToken.user_id == user_id
).filter(ub.RemoteAuthToken.token_type==1).first()
if not auth_token:
auth_token = ub.RemoteAuthToken()
auth_token.user_id = user_id
auth_token.expiration = datetime.max
auth_token.auth_token = (hexlify(urandom(16))).decode("utf-8")
auth_token.token_type = 1
ub.session.add(auth_token)
ub.session.commit()
return render_title_template(
"generate_kobo_auth_url.html",
title=_(u"Kobo Setup"),
kobo_auth_url=url_for(
"kobo.TopLevelEndpoint", auth_token=auth_token.auth_token, _external=True
),
warning = False
)
@kobo_auth.route("/deleteauthtoken/<int:user_id>")
@login_required
def delete_auth_token(user_id):
# Invalidate any previously generated Kobo Auth token for this user.
ub.session.query(ub.RemoteAuthToken).filter(ub.RemoteAuthToken.user_id == user_id)\
.filter(ub.RemoteAuthToken.token_type==1).delete()
ub.session.commit()
return ""
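# --- Illustrative sketch (not part of this module): how the helpers above are typically
# --- wired into a Kobo API blueprint. The blueprint name, url_prefix and route are
# --- assumptions for demonstration; only the three helpers come from the code above.
from flask import Blueprint, jsonify

kobo = Blueprint("kobo", __name__, url_prefix="/kobo/<auth_token>")
register_url_value_preprocessor(kobo)             # pops auth_token from the URL into flask.g
disable_failed_auth_redirect_for_blueprint(kobo)  # return 401 instead of redirecting to the login page

@kobo.route("/v1/library/sync")
@requires_kobo_auth                               # resolves g.auth_token to a user or aborts with 401
def library_sync():
    return jsonify({"Status": "OK"})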

View File

@ -67,6 +67,8 @@ def get_level_name(level):
 def is_valid_logfile(file_path):
+    if file_path == LOG_TO_STDERR or file_path == LOG_TO_STDOUT:
+        return True
     if not file_path:
         return True
     if os.path.isdir(file_path):
@ -80,7 +82,6 @@ def _absolute_log_file(log_file, default_log_file):
     if not os.path.dirname(log_file):
         log_file = os.path.join(_CONFIG_DIR, log_file)
         return os.path.abspath(log_file)
     return default_log_file
@ -105,13 +106,15 @@ def setup(log_file, log_level=None):
         # avoid spamming the log with debug messages from libraries
         r.setLevel(log_level)
-    log_file = _absolute_log_file(log_file, DEFAULT_LOG_FILE)
+    # Otherwise name get's destroyed on windows
+    if log_file != LOG_TO_STDERR and log_file != LOG_TO_STDOUT:
+        log_file = _absolute_log_file(log_file, DEFAULT_LOG_FILE)
     previous_handler = r.handlers[0] if r.handlers else None
     if previous_handler:
         # if the log_file has not changed, don't create a new handler
         if getattr(previous_handler, 'baseFilename', None) == log_file:
-            return
+            return "" if log_file == DEFAULT_LOG_FILE else log_file
     logging.debug("logging to %s level %s", log_file, r.level)
     if log_file == LOG_TO_STDERR or log_file == LOG_TO_STDOUT:
@ -119,21 +122,23 @@ def setup(log_file, log_level=None):
             file_handler = StreamHandler(sys.stdout)
             file_handler.baseFilename = log_file
         else:
-            file_handler = StreamHandler()
+            file_handler = StreamHandler(sys.stderr)
             file_handler.baseFilename = log_file
     else:
         try:
-            file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
+            file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
         except IOError:
             if log_file == DEFAULT_LOG_FILE:
                 raise
-            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2)
+            file_handler = RotatingFileHandler(DEFAULT_LOG_FILE, maxBytes=50000, backupCount=2, encoding='utf-8')
+            log_file = ""
     file_handler.setFormatter(FORMATTER)
     for h in r.handlers:
         r.removeHandler(h)
         h.close()
     r.addHandler(file_handler)
+    return "" if log_file == DEFAULT_LOG_FILE else log_file
 def create_access_log(log_file, log_name, formatter):
@ -146,11 +151,18 @@ def create_access_log(log_file, log_name, formatter):
     access_log = logging.getLogger(log_name)
     access_log.propagate = False
     access_log.setLevel(logging.INFO)
-    file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2)
+    try:
+        file_handler = RotatingFileHandler(log_file, maxBytes=50000, backupCount=2, encoding='utf-8')
+    except IOError:
+        if log_file == DEFAULT_ACCESS_LOG:
+            raise
+        file_handler = RotatingFileHandler(DEFAULT_ACCESS_LOG, maxBytes=50000, backupCount=2, encoding='utf-8')
+        log_file = ""
     file_handler.setFormatter(formatter)
     access_log.addHandler(file_handler)
-    return access_log
+    return access_log, \
+        "" if _absolute_log_file(log_file, DEFAULT_ACCESS_LOG) == DEFAULT_ACCESS_LOG else log_file
 # Enable logging of smtp lib debug output
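The hunks above make logger.setup() and create_access_log() report back which log path ended up being used (an empty string stands for the built-in default), so callers can detect an unusable configured path. A minimal sketch of that pattern, assuming the cps config object and attribute names used elsewhere in this commit:

from cps import config, logger

log = logger.create()
logfile = logger.setup(config.config_logfile, config.config_log_level)
if logfile != config.config_logfile:
    # the configured path was unusable and the rotating handler fell back to the default
    log.warning("Logfile path %s not valid, falling back to default", config.config_logfile)
    config.config_logfile = logfile
    config.save()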

View File

@ -1,4 +1,3 @@
-#!/usr/bin/env python
 # -*- coding: utf-8 -*-
 # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@ -24,13 +23,24 @@ from flask import session
 try:
     from flask_dance.consumer.backend.sqla import SQLAlchemyBackend, first, _get_real_user
     from sqlalchemy.orm.exc import NoResultFound
+    backend_resultcode = False  # prevent storing values with this resultcode
+except ImportError:
+    # fails on flask-dance >1.3, due to renaming
+    try:
+        from flask_dance.consumer.storage.sqla import SQLAlchemyStorage as SQLAlchemyBackend
+        from flask_dance.consumer.storage.sqla import first, _get_real_user
+        from sqlalchemy.orm.exc import NoResultFound
+        backend_resultcode = True  # prevent storing values with this resultcode
+    except ImportError:
+        pass
+try:
     class OAuthBackend(SQLAlchemyBackend):
         """
         Stores and retrieves OAuth tokens using a relational database through
         the `SQLAlchemy`_ ORM.
-        .. _SQLAlchemy: http://www.sqlalchemy.org/
+        .. _SQLAlchemy: https://www.sqlalchemy.org/
         """
         def __init__(self, model, session, provider_id,
                      user=None, user_id=None, user_required=None, anon_user=None,
@ -40,7 +50,7 @@ try:
         def get(self, blueprint, user=None, user_id=None):
             if self.provider_id + '_oauth_token' in session and session[self.provider_id + '_oauth_token'] != '':
-                return session[blueprint.name + '_oauth_token']
+                return session[self.provider_id + '_oauth_token']
             # check cache
             cache_key = self.make_cache_key(blueprint=blueprint, user=user, user_id=user_id)
             token = self.cache.get(cache_key)
@ -62,7 +72,7 @@ try:
                 use_provider_user_id = True
             if self.user_required and not u and not uid and not use_provider_user_id:
-                #raise ValueError("Cannot get OAuth token without an associated user")
+                # raise ValueError("Cannot get OAuth token without an associated user")
                 return None
             # check for user ID
             if hasattr(self.model, "user_id") and uid:
@ -87,7 +97,7 @@ try:
         def set(self, blueprint, token, user=None, user_id=None):
             uid = first([user_id, self.user_id, blueprint.config.get("user_id")])
             u = first(_get_real_user(ref, self.anon_user)
                       for ref in (user, self.user, blueprint.config.get("user")))
             if self.user_required and not u and not uid:
                 raise ValueError("Cannot set OAuth token without an associated user")
@ -153,5 +163,5 @@ try:
                 blueprint=blueprint, user=user, user_id=user_id,
             ))
-except ImportError:
+except Exception:
     pass

View File

@ -1,4 +1,3 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web) # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@ -36,8 +35,7 @@ from sqlalchemy.orm.exc import NoResultFound
from . import constants, logger, config, app, ub from . import constants, logger, config, app, ub
from .web import login_required from .web import login_required
from .oauth import OAuthBackend from .oauth import OAuthBackend, backend_resultcode
# from .web import github_oauth_required
oauth_check = {} oauth_check = {}
@ -50,7 +48,7 @@ def oauth_required(f):
def inner(*args, **kwargs): def inner(*args, **kwargs):
if config.config_login_type == constants.LOGIN_OAUTH: if config.config_login_type == constants.LOGIN_OAUTH:
return f(*args, **kwargs) return f(*args, **kwargs)
if request.is_xhr: if request.headers.get('X-Requested-With') == 'XMLHttpRequest':
data = {'status': 'error', 'message': 'Not Found'} data = {'status': 'error', 'message': 'Not Found'}
response = make_response(json.dumps(data, ensure_ascii=False)) response = make_response(json.dumps(data, ensure_ascii=False))
response.headers["Content-Type"] = "application/json; charset=utf-8" response.headers["Content-Type"] = "application/json; charset=utf-8"
@ -60,29 +58,29 @@ def oauth_required(f):
return inner return inner
def register_oauth_blueprint(id, show_name): def register_oauth_blueprint(cid, show_name):
oauth_check[id] = show_name oauth_check[cid] = show_name
def register_user_with_oauth(user=None): def register_user_with_oauth(user=None):
all_oauth = {} all_oauth = {}
for oauth in oauth_check.keys(): for oauth_key in oauth_check.keys():
if str(oauth) + '_oauth_user_id' in session and session[str(oauth) + '_oauth_user_id'] != '': if str(oauth_key) + '_oauth_user_id' in session and session[str(oauth_key) + '_oauth_user_id'] != '':
all_oauth[oauth] = oauth_check[oauth] all_oauth[oauth_key] = oauth_check[oauth_key]
if len(all_oauth.keys()) == 0: if len(all_oauth.keys()) == 0:
return return
if user is None: if user is None:
flash(_(u"Register with %(provider)s", provider=", ".join(list(all_oauth.values()))), category="success") flash(_(u"Register with %(provider)s", provider=", ".join(list(all_oauth.values()))), category="success")
else: else:
for oauth in all_oauth.keys(): for oauth_key in all_oauth.keys():
# Find this OAuth token in the database, or create it # Find this OAuth token in the database, or create it
query = ub.session.query(ub.OAuth).filter_by( query = ub.session.query(ub.OAuth).filter_by(
provider=oauth, provider=oauth_key,
provider_user_id=session[str(oauth) + "_oauth_user_id"], provider_user_id=session[str(oauth_key) + "_oauth_user_id"],
) )
try: try:
oauth = query.one() oauth_key = query.one()
oauth.user_id = user.id oauth_key.user_id = user.id
except NoResultFound: except NoResultFound:
# no found, return error # no found, return error
return return
@ -94,39 +92,40 @@ def register_user_with_oauth(user=None):
def logout_oauth_user(): def logout_oauth_user():
for oauth in oauth_check.keys(): for oauth_key in oauth_check.keys():
if str(oauth) + '_oauth_user_id' in session: if str(oauth_key) + '_oauth_user_id' in session:
session.pop(str(oauth) + '_oauth_user_id') session.pop(str(oauth_key) + '_oauth_user_id')
if ub.oauth_support: if ub.oauth_support:
oauthblueprints =[] oauthblueprints = []
if not ub.session.query(ub.OAuthProvider).count(): if not ub.session.query(ub.OAuthProvider).count():
oauth = ub.OAuthProvider() oauthProvider = ub.OAuthProvider()
oauth.provider_name = "github" oauthProvider.provider_name = "github"
oauth.active = False oauthProvider.active = False
ub.session.add(oauth) ub.session.add(oauthProvider)
ub.session.commit() ub.session.commit()
oauth = ub.OAuthProvider() oauthProvider = ub.OAuthProvider()
oauth.provider_name = "google" oauthProvider.provider_name = "google"
oauth.active = False oauthProvider.active = False
ub.session.add(oauth) ub.session.add(oauthProvider)
ub.session.commit() ub.session.commit()
oauth_ids = ub.session.query(ub.OAuthProvider).all() oauth_ids = ub.session.query(ub.OAuthProvider).all()
ele1=dict(provider_name='github', ele1 = dict(provider_name='github',
id=oauth_ids[0].id, id=oauth_ids[0].id,
active=oauth_ids[0].active, active=oauth_ids[0].active,
oauth_client_id=oauth_ids[0].oauth_client_id, oauth_client_id=oauth_ids[0].oauth_client_id,
scope=None, scope=None,
oauth_client_secret=oauth_ids[0].oauth_client_secret, oauth_client_secret=oauth_ids[0].oauth_client_secret,
obtain_link='https://github.com/settings/developers') obtain_link='https://github.com/settings/developers')
ele2=dict(provider_name='google', ele2 = dict(provider_name='google',
id=oauth_ids[1].id, id=oauth_ids[1].id,
active=oauth_ids[1].active, active=oauth_ids[1].active,
scope=["https://www.googleapis.com/auth/plus.me", "https://www.googleapis.com/auth/userinfo.email"], scope=["https://www.googleapis.com/auth/userinfo.email"],
oauth_client_id=oauth_ids[1].oauth_client_id, oauth_client_id=oauth_ids[1].oauth_client_id,
oauth_client_secret=oauth_ids[1].oauth_client_secret, oauth_client_secret=oauth_ids[1].oauth_client_secret,
obtain_link='https://github.com/settings/developers') obtain_link='https://console.developers.google.com/apis/credentials')
oauthblueprints.append(ele1) oauthblueprints.append(ele1)
oauthblueprints.append(ele2) oauthblueprints.append(ele2)
@ -139,12 +138,12 @@ if ub.oauth_support:
client_id=element['oauth_client_id'], client_id=element['oauth_client_id'],
client_secret=element['oauth_client_secret'], client_secret=element['oauth_client_secret'],
redirect_to="oauth."+element['provider_name']+"_login", redirect_to="oauth."+element['provider_name']+"_login",
scope = element['scope'] scope=element['scope']
) )
element['blueprint']=blueprint element['blueprint'] = blueprint
app.register_blueprint(blueprint, url_prefix="/login")
element['blueprint'].backend = OAuthBackend(ub.OAuth, ub.session, str(element['id']), element['blueprint'].backend = OAuthBackend(ub.OAuth, ub.session, str(element['id']),
user=current_user, user_required=True) user=current_user, user_required=True)
app.register_blueprint(blueprint, url_prefix="/login")
if element['active']: if element['active']:
register_oauth_blueprint(element['id'], element['provider_name']) register_oauth_blueprint(element['id'], element['provider_name'])
@ -191,54 +190,64 @@ if ub.oauth_support:
provider_user_id=provider_user_id, provider_user_id=provider_user_id,
) )
try: try:
oauth = query.one() oauth_entry = query.one()
# update token # update token
oauth.token = token oauth_entry.token = token
except NoResultFound: except NoResultFound:
oauth = ub.OAuth( oauth_entry = ub.OAuth(
provider=provider_id, provider=provider_id,
provider_user_id=provider_user_id, provider_user_id=provider_user_id,
token=token, token=token,
) )
try: try:
ub.session.add(oauth) ub.session.add(oauth_entry)
ub.session.commit() ub.session.commit()
except Exception as e: except Exception as e:
log.exception(e) log.exception(e)
ub.session.rollback() ub.session.rollback()
# Disable Flask-Dance's default behavior for saving the OAuth token # Disable Flask-Dance's default behavior for saving the OAuth token
return False # Value differrs depending on flask-dance version
return backend_resultcode
def bind_oauth_or_register(provider_id, provider_user_id, redirect_url): def bind_oauth_or_register(provider_id, provider_user_id, redirect_url, provider_name):
query = ub.session.query(ub.OAuth).filter_by( query = ub.session.query(ub.OAuth).filter_by(
provider=provider_id, provider=provider_id,
provider_user_id=provider_user_id, provider_user_id=provider_user_id,
) )
try: try:
oauth = query.one() oauth_entry = query.first()
# already bind with user, just login # already bind with user, just login
if oauth.user: if oauth_entry.user:
login_user(oauth.user) login_user(oauth_entry.user)
log.debug(u"You are now logged in as: '%s'", oauth_entry.user.nickname)
flash(_(u"you are now logged in as: '%(nickname)s'", nickname= oauth_entry.user.nickname),
category="success")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
else: else:
# bind to current user # bind to current user
if current_user and current_user.is_authenticated: if current_user and current_user.is_authenticated:
oauth.user = current_user oauth_entry.user = current_user
try: try:
ub.session.add(oauth) ub.session.add(oauth_entry)
ub.session.commit() ub.session.commit()
flash(_(u"Link to %(oauth)s Succeeded", oauth=provider_name), category="success")
return redirect(url_for('web.profile'))
except Exception as e: except Exception as e:
log.exception(e) log.exception(e)
ub.session.rollback() ub.session.rollback()
return redirect(url_for('web.login')) else:
#if config.config_public_reg: flash(_(u"Login failed, No User Linked With OAuth Account"), category="error")
log.info('Login failed, No User Linked With OAuth Account')
return redirect(url_for('web.login'))
# return redirect(url_for('web.login'))
# if config.config_public_reg:
# return redirect(url_for('web.register')) # return redirect(url_for('web.register'))
#else: # else:
# flash(_(u"Public registration is not enabled"), category="error") # flash(_(u"Public registration is not enabled"), category="error")
# return redirect(url_for(redirect_url)) # return redirect(url_for(redirect_url))
except NoResultFound: except (NoResultFound, AttributeError):
return redirect(url_for(redirect_url)) return redirect(url_for(redirect_url))
@ -249,8 +258,8 @@ if ub.oauth_support:
) )
try: try:
oauths = query.all() oauths = query.all()
for oauth in oauths: for oauth_entry in oauths:
status.append(int(oauth.provider)) status.append(int(oauth_entry.provider))
return status return status
except NoResultFound: except NoResultFound:
return None return None
@ -264,21 +273,21 @@ if ub.oauth_support:
user_id=current_user.id, user_id=current_user.id,
) )
try: try:
oauth = query.one() oauth_entry = query.one()
if current_user and current_user.is_authenticated: if current_user and current_user.is_authenticated:
oauth.user = current_user oauth_entry.user = current_user
try: try:
ub.session.delete(oauth) ub.session.delete(oauth_entry)
ub.session.commit() ub.session.commit()
logout_oauth_user() logout_oauth_user()
flash(_(u"Unlink to %(oauth)s success.", oauth=oauth_check[provider]), category="success") flash(_(u"Unlink to %(oauth)s Succeeded", oauth=oauth_check[provider]), category="success")
except Exception as e: except Exception as e:
log.exception(e) log.exception(e)
ub.session.rollback() ub.session.rollback()
flash(_(u"Unlink to %(oauth)s failed.", oauth=oauth_check[provider]), category="error") flash(_(u"Unlink to %(oauth)s Failed", oauth=oauth_check[provider]), category="error")
except NoResultFound: except NoResultFound:
log.warning("oauth %s for user %d not fount", provider, current_user.id) log.warning("oauth %s for user %d not found", provider, current_user.id)
flash(_(u"Not linked to %(oauth)s.", oauth=oauth_check[provider]), category="error") flash(_(u"Not Linked to %(oauth)s", oauth=provider), category="error")
return redirect(url_for('web.profile')) return redirect(url_for('web.profile'))
@ -293,11 +302,11 @@ if ub.oauth_support:
error=error, error=error,
description=error_description, description=error_description,
uri=error_uri, uri=error_uri,
) # ToDo: Translate ) # ToDo: Translate
flash(msg, category="error") flash(msg, category="error")
@oauth.route('/github') @oauth.route('/link/github')
@oauth_required @oauth_required
def github_login(): def github_login():
if not github.authorized: if not github.authorized:
@ -305,7 +314,7 @@ if ub.oauth_support:
account_info = github.get('/user') account_info = github.get('/user')
if account_info.ok: if account_info.ok:
account_info_json = account_info.json() account_info_json = account_info.json()
return bind_oauth_or_register(oauthblueprints[0]['id'], account_info_json['id'], 'github.login') return bind_oauth_or_register(oauthblueprints[0]['id'], account_info_json['id'], 'github.login', 'github')
flash(_(u"GitHub Oauth error, please retry later."), category="error") flash(_(u"GitHub Oauth error, please retry later."), category="error")
return redirect(url_for('web.login')) return redirect(url_for('web.login'))
@ -316,7 +325,7 @@ if ub.oauth_support:
return unlink_oauth(oauthblueprints[0]['id']) return unlink_oauth(oauthblueprints[0]['id'])
@oauth.route('/login/google') @oauth.route('/link/google')
@oauth_required @oauth_required
def google_login(): def google_login():
if not google.authorized: if not google.authorized:
@ -324,7 +333,7 @@ if ub.oauth_support:
resp = google.get("/oauth2/v2/userinfo") resp = google.get("/oauth2/v2/userinfo")
if resp.ok: if resp.ok:
account_info_json = resp.json() account_info_json = resp.json()
return bind_oauth_or_register(oauthblueprints[1]['id'], account_info_json['id'], 'google.login') return bind_oauth_or_register(oauthblueprints[1]['id'], account_info_json['id'], 'google.login', 'google')
flash(_(u"Google Oauth error, please retry later."), category="error") flash(_(u"Google Oauth error, please retry later."), category="error")
return redirect(url_for('web.login')) return redirect(url_for('web.login'))
@ -339,11 +348,11 @@ if ub.oauth_support:
error=error, error=error,
description=error_description, description=error_description,
uri=error_uri, uri=error_uri,
) # ToDo: Translate ) # ToDo: Translate
flash(msg, category="error") flash(msg, category="error")
@oauth.route('/unlink/google', methods=["GET"]) @oauth.route('/unlink/google', methods=["GET"])
@login_required @login_required
def google_login_unlink(): def google_login_unlink():
return unlink_oauth(oauthblueprints[1]['blueprint'].name) return unlink_oauth(oauthblueprints[1]['id'])

View File

@ -1,4 +1,3 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web) # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@ -26,16 +25,18 @@ import sys
import datetime import datetime
from functools import wraps from functools import wraps
from flask import Blueprint, request, render_template, Response, g, make_response from flask import Blueprint, request, render_template, Response, g, make_response, abort
from flask_login import current_user from flask_login import current_user
from sqlalchemy.sql.expression import func, text, or_, and_ from sqlalchemy.sql.expression import func, text, or_, and_
from werkzeug.security import check_password_hash from werkzeug.security import check_password_hash
from . import constants, logger, config, db, ub, services from . import constants, logger, config, db, calibre_db, ub, services, get_locale, isoLanguages
from .helper import fill_indexpage, get_download_link, get_book_cover from .helper import get_download_link, get_book_cover
from .pagination import Pagination from .pagination import Pagination
from .web import common_filters, get_search_results, render_read_books, download_required from .web import render_read_books, download_required, load_user_from_request
from flask_babel import gettext as _
from babel import Locale as LC
from babel.core import UnknownLocaleError
opds = Blueprint('opds', __name__) opds = Blueprint('opds', __name__)
@ -50,11 +51,25 @@ def requires_basic_auth_if_no_ano(f):
if not auth or auth.type != 'basic' or not check_auth(auth.username, auth.password): if not auth or auth.type != 'basic' or not check_auth(auth.username, auth.password):
return authenticate() return authenticate()
return f(*args, **kwargs) return f(*args, **kwargs)
if config.config_login_type == constants.LOGIN_LDAP and services.ldap: if config.config_login_type == constants.LOGIN_LDAP and services.ldap and config.config_anonbrowse != 1:
return services.ldap.basic_auth_required(f) return services.ldap.basic_auth_required(f)
return decorated return decorated
class FeedObject:
def __init__(self, rating_id, rating_name):
self.rating_id = rating_id
self.rating_name = rating_name
@property
def id(self):
return self.rating_id
@property
def name(self):
return self.rating_name
@opds.route("/opds/") @opds.route("/opds/")
@opds.route("/opds") @opds.route("/opds")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
@ -85,15 +100,15 @@ def feed_normal_search():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_new(): def feed_new():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, True, [db.Books.timestamp.desc()]) db.Books, True, [db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@opds.route("/opds/discover") @opds.route("/opds/discover")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_discover(): def feed_discover():
entries = db.session.query(db.Books).filter(common_filters()).order_by(func.random())\ entries = calibre_db.session.query(db.Books).filter(calibre_db.common_filters()).order_by(func.random())\
.limit(config.config_books_per_page) .limit(config.config_books_per_page)
pagination = Pagination(1, config.config_books_per_page, int(config.config_books_per_page)) pagination = Pagination(1, config.config_books_per_page, int(config.config_books_per_page))
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@ -103,8 +118,9 @@ def feed_discover():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_best_rated(): def feed_best_rated():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, db.Books.ratings.any(db.Ratings.rating > 9), [db.Books.timestamp.desc()]) db.Books, db.Books.ratings.any(db.Ratings.rating > 9),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@ -117,16 +133,13 @@ def feed_hot():
hot_books = all_books.offset(off).limit(config.config_books_per_page) hot_books = all_books.offset(off).limit(config.config_books_per_page)
entries = list() entries = list()
for book in hot_books: for book in hot_books:
downloadBook = db.session.query(db.Books).filter(db.Books.id == book.Downloads.book_id).first() downloadBook = calibre_db.get_book(book.Downloads.book_id)
if downloadBook: if downloadBook:
entries.append( entries.append(
db.session.query(db.Books).filter(common_filters()) calibre_db.get_filtered_book(book.Downloads.book_id)
.filter(db.Books.id == book.Downloads.book_id).first()
) )
else: else:
ub.delete_download(book.Downloads.book_id) ub.delete_download(book.Downloads.book_id)
# ub.session.query(ub.Downloads).filter(book.Downloads.book_id == ub.Downloads.book_id).delete()
# ub.session.commit()
numBooks = entries.__len__() numBooks = entries.__len__()
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1),
config.config_books_per_page, numBooks) config.config_books_per_page, numBooks)
@ -137,10 +150,13 @@ def feed_hot():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_authorindex(): def feed_authorindex():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries = db.session.query(db.Authors).join(db.books_authors_link).join(db.Books).filter(common_filters())\ entries = calibre_db.session.query(db.Authors).join(db.books_authors_link).join(db.Books)\
.group_by(text('books_authors_link.author')).order_by(db.Authors.sort).limit(config.config_books_per_page).offset(off) .filter(calibre_db.common_filters())\
.group_by(text('books_authors_link.author'))\
.order_by(db.Authors.sort).limit(config.config_books_per_page)\
.offset(off)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(db.session.query(db.Authors).all())) len(calibre_db.session.query(db.Authors).all()))
return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_author', pagination=pagination) return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_author', pagination=pagination)
@ -148,8 +164,10 @@ def feed_authorindex():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_author(book_id): def feed_author(book_id):
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, db.Books.authors.any(db.Authors.id == book_id), [db.Books.timestamp.desc()]) db.Books,
db.Books.authors.any(db.Authors.id == book_id),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@ -157,10 +175,14 @@ def feed_author(book_id):
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_publisherindex(): def feed_publisherindex():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries = db.session.query(db.Publishers).join(db.books_publishers_link).join(db.Books).filter(common_filters())\ entries = calibre_db.session.query(db.Publishers)\
.group_by(text('books_publishers_link.publisher')).order_by(db.Publishers.sort).limit(config.config_books_per_page).offset(off) .join(db.books_publishers_link)\
.join(db.Books).filter(calibre_db.common_filters())\
.group_by(text('books_publishers_link.publisher'))\
.order_by(db.Publishers.sort)\
.limit(config.config_books_per_page).offset(off)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(db.session.query(db.Publishers).all())) len(calibre_db.session.query(db.Publishers).all()))
return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_publisher', pagination=pagination) return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_publisher', pagination=pagination)
@ -168,9 +190,10 @@ def feed_publisherindex():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_publisher(book_id): def feed_publisher(book_id):
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, db.Books.publishers.any(db.Publishers.id == book_id), db.Books,
[db.Books.timestamp.desc()]) db.Books.publishers.any(db.Publishers.id == book_id),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@ -178,10 +201,16 @@ def feed_publisher(book_id):
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_categoryindex(): def feed_categoryindex():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries = db.session.query(db.Tags).join(db.books_tags_link).join(db.Books).filter(common_filters())\ entries = calibre_db.session.query(db.Tags)\
.group_by(text('books_tags_link.tag')).order_by(db.Tags.name).offset(off).limit(config.config_books_per_page) .join(db.books_tags_link)\
.join(db.Books)\
.filter(calibre_db.common_filters())\
.group_by(text('books_tags_link.tag'))\
.order_by(db.Tags.name)\
.offset(off)\
.limit(config.config_books_per_page)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(db.session.query(db.Tags).all())) len(calibre_db.session.query(db.Tags).all()))
return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_category', pagination=pagination) return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_category', pagination=pagination)
@ -189,8 +218,10 @@ def feed_categoryindex():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_category(book_id): def feed_category(book_id):
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, db.Books.tags.any(db.Tags.id == book_id), [db.Books.timestamp.desc()]) db.Books,
db.Books.tags.any(db.Tags.id == book_id),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@ -198,10 +229,15 @@ def feed_category(book_id):
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_seriesindex(): def feed_seriesindex():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries = db.session.query(db.Series).join(db.books_series_link).join(db.Books).filter(common_filters())\ entries = calibre_db.session.query(db.Series)\
.group_by(text('books_series_link.series')).order_by(db.Series.sort).offset(off).all() .join(db.books_series_link)\
.join(db.Books)\
.filter(calibre_db.common_filters())\
.group_by(text('books_series_link.series'))\
.order_by(db.Series.sort)\
.offset(off).all()
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(db.session.query(db.Series).all())) len(calibre_db.session.query(db.Series).all()))
return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_series', pagination=pagination) return render_xml_template('feed.xml', listelements=entries, folder='opds.feed_series', pagination=pagination)
@ -209,22 +245,112 @@ def feed_seriesindex():
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_series(book_id): def feed_series(book_id):
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
entries, __, pagination = fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books, db.Books.series.any(db.Series.id == book_id), [db.Books.series_index]) db.Books,
db.Books.series.any(db.Series.id == book_id),
[db.Books.series_index])
return render_xml_template('feed.xml', entries=entries, pagination=pagination) return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@opds.route("/opds/shelfindex/", defaults={'public': 0}) @opds.route("/opds/ratings")
@opds.route("/opds/shelfindex/<string:public>")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_shelfindex(public): def feed_ratingindex():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
if public != 0: entries = calibre_db.session.query(db.Ratings, func.count('books_ratings_link.book').label('count'),
shelf = g.public_shelfes (db.Ratings.rating / 2).label('name')) \
number = len(shelf) .join(db.books_ratings_link)\
.join(db.Books)\
.filter(calibre_db.common_filters()) \
.group_by(text('books_ratings_link.rating'))\
.order_by(db.Ratings.rating).all()
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(entries))
element = list()
for entry in entries:
element.append(FeedObject(entry[0].id, "{} Stars".format(entry.name)))
return render_xml_template('feed.xml', listelements=element, folder='opds.feed_ratings', pagination=pagination)
@opds.route("/opds/ratings/<book_id>")
@requires_basic_auth_if_no_ano
def feed_ratings(book_id):
off = request.args.get("offset") or 0
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books,
db.Books.ratings.any(db.Ratings.id == book_id),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@opds.route("/opds/formats")
@requires_basic_auth_if_no_ano
def feed_formatindex():
off = request.args.get("offset") or 0
entries = calibre_db.session.query(db.Data).join(db.Books)\
.filter(calibre_db.common_filters()) \
.group_by(db.Data.format)\
.order_by(db.Data.format).all()
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(entries))
element = list()
for entry in entries:
element.append(FeedObject(entry.format, entry.format))
return render_xml_template('feed.xml', listelements=element, folder='opds.feed_format', pagination=pagination)
@opds.route("/opds/formats/<book_id>")
@requires_basic_auth_if_no_ano
def feed_format(book_id):
off = request.args.get("offset") or 0
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books,
db.Books.data.any(db.Data.format == book_id.upper()),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@opds.route("/opds/language")
@opds.route("/opds/language/")
@requires_basic_auth_if_no_ano
def feed_languagesindex():
off = request.args.get("offset") or 0
if current_user.filter_language() == u"all":
languages = calibre_db.speaking_language()
else: else:
shelf = g.user.shelf try:
number = shelf.count() cur_l = LC.parse(current_user.filter_language())
except UnknownLocaleError:
cur_l = None
languages = calibre_db.session.query(db.Languages).filter(
db.Languages.lang_code == current_user.filter_language()).all()
if cur_l:
languages[0].name = cur_l.get_language_name(get_locale())
else:
languages[0].name = _(isoLanguages.get(part3=languages[0].lang_code).name)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(languages))
return render_xml_template('feed.xml', listelements=languages, folder='opds.feed_languages', pagination=pagination)
@opds.route("/opds/language/<int:book_id>")
@requires_basic_auth_if_no_ano
def feed_languages(book_id):
off = request.args.get("offset") or 0
entries, __, pagination = calibre_db.fill_indexpage((int(off) / (int(config.config_books_per_page)) + 1), 0,
db.Books,
db.Books.languages.any(db.Languages.id == book_id),
[db.Books.timestamp.desc()])
return render_xml_template('feed.xml', entries=entries, pagination=pagination)
@opds.route("/opds/shelfindex")
@requires_basic_auth_if_no_ano
def feed_shelfindex():
off = request.args.get("offset") or 0
shelf = g.shelves_access
number = len(shelf)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
number) number)
return render_xml_template('feed.xml', listelements=shelf, folder='opds.feed_shelf', pagination=pagination) return render_xml_template('feed.xml', listelements=shelf, folder='opds.feed_shelf', pagination=pagination)
@ -235,7 +361,8 @@ def feed_shelfindex(public):
def feed_shelf(book_id): def feed_shelf(book_id):
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
if current_user.is_anonymous: if current_user.is_anonymous:
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.is_public == 1, ub.Shelf.id == book_id).first() shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.is_public == 1,
ub.Shelf.id == book_id).first()
else: else:
shelf = ub.session.query(ub.Shelf).filter(or_(and_(ub.Shelf.user_id == int(current_user.id), shelf = ub.session.query(ub.Shelf).filter(or_(and_(ub.Shelf.user_id == int(current_user.id),
ub.Shelf.id == book_id), ub.Shelf.id == book_id),
@ -247,25 +374,34 @@ def feed_shelf(book_id):
books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == book_id).order_by( books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == book_id).order_by(
ub.BookShelf.order.asc()).all() ub.BookShelf.order.asc()).all()
for book in books_in_shelf: for book in books_in_shelf:
cur_book = db.session.query(db.Books).filter(db.Books.id == book.book_id).first() cur_book = calibre_db.get_book(book.book_id)
result.append(cur_book) result.append(cur_book)
pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page, pagination = Pagination((int(off) / (int(config.config_books_per_page)) + 1), config.config_books_per_page,
len(result)) len(result))
return render_xml_template('feed.xml', entries=result, pagination=pagination) return render_xml_template('feed.xml', entries=result, pagination=pagination)
@opds.route("/opds/download/<book_id>/<book_format>/") @opds.route("/opds/download/<book_id>/<book_format>/")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
@download_required
def opds_download_link(book_id, book_format): def opds_download_link(book_id, book_format):
return get_download_link(book_id,book_format) # I gave up with this: With enabled ldap login, the user doesn't get logged in, therefore it's always guest
# workaround, loading the user from the request and checking it's download rights here
# in case of anonymous browsing user is None
user = load_user_from_request(request) or current_user
if not user.role_download():
return abort(403)
if "Kobo" in request.headers.get('User-Agent'):
client = "kobo"
else:
client = ""
return get_download_link(book_id, book_format.lower(), client)
@opds.route("/ajax/book/<string:uuid>/<library>") @opds.route("/ajax/book/<string:uuid>/<library>")
@opds.route("/ajax/book/<string:uuid>") @opds.route("/ajax/book/<string:uuid>", defaults={'library': ""})
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def get_metadata_calibre_companion(uuid, library): def get_metadata_calibre_companion(uuid, library):
entry = db.session.query(db.Books).filter(db.Books.uuid.like("%" + uuid + "%")).first() entry = calibre_db.session.query(db.Books).filter(db.Books.uuid.like("%" + uuid + "%")).first()
if entry is not None: if entry is not None:
js = render_template('json.txt', entry=entry) js = render_template('json.txt', entry=entry)
response = make_response(js) response = make_response(js)
@ -277,17 +413,20 @@ def get_metadata_calibre_companion(uuid, library):
def feed_search(term): def feed_search(term):
if term: if term:
term = term.strip().lower() entries, __, ___ = calibre_db.get_search_results(term)
entries = get_search_results( term)
entriescount = len(entries) if len(entries) > 0 else 1 entriescount = len(entries) if len(entries) > 0 else 1
pagination = Pagination(1, entriescount, entriescount) pagination = Pagination(1, entriescount, entriescount)
return render_xml_template('feed.xml', searchterm=term, entries=entries, pagination=pagination) return render_xml_template('feed.xml', searchterm=term, entries=entries, pagination=pagination)
else: else:
return render_xml_template('feed.xml', searchterm="") return render_xml_template('feed.xml', searchterm="")
def check_auth(username, password): def check_auth(username, password):
if sys.version_info.major == 3: if sys.version_info.major == 3:
username=username.encode('windows-1252') try:
username = username.encode('windows-1252')
except UnicodeEncodeError:
username = username.encode('utf-8')
user = ub.session.query(ub.User).filter(func.lower(ub.User.nickname) == user = ub.session.query(ub.User).filter(func.lower(ub.User.nickname) ==
username.decode('utf-8').lower()).first() username.decode('utf-8').lower()).first()
return bool(user and check_password_hash(str(user.password), password)) return bool(user and check_password_hash(str(user.password), password))
@ -301,13 +440,14 @@ def authenticate():
def render_xml_template(*args, **kwargs): def render_xml_template(*args, **kwargs):
#ToDo: return time in current timezone similar to %z # ToDo: return time in current timezone similar to %z
currtime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S+00:00") currtime = datetime.datetime.now().strftime("%Y-%m-%dT%H:%M:%S+00:00")
xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, *args, **kwargs) xml = render_template(current_time=currtime, instance=config.config_calibre_web_title, *args, **kwargs)
response = make_response(xml) response = make_response(xml)
response.headers["Content-Type"] = "application/atom+xml; charset=utf-8" response.headers["Content-Type"] = "application/atom+xml; charset=utf-8"
return response return response
@opds.route("/opds/thumb_240_240/<book_id>") @opds.route("/opds/thumb_240_240/<book_id>")
@opds.route("/opds/cover_240_240/<book_id>") @opds.route("/opds/cover_240_240/<book_id>")
@opds.route("/opds/cover_90_90/<book_id>") @opds.route("/opds/cover_90_90/<book_id>")
@ -316,15 +456,18 @@ def render_xml_template(*args, **kwargs):
def feed_get_cover(book_id): def feed_get_cover(book_id):
return get_book_cover(book_id) return get_book_cover(book_id)
@opds.route("/opds/readbooks/")
@opds.route("/opds/readbooks")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_read_books(): def feed_read_books():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
return render_read_books(int(off) / (int(config.config_books_per_page)) + 1, True, True) result, pagination = render_read_books(int(off) / (int(config.config_books_per_page)) + 1, True, True)
return render_xml_template('feed.xml', entries=result, pagination=pagination)
@opds.route("/opds/unreadbooks/") @opds.route("/opds/unreadbooks")
@requires_basic_auth_if_no_ano @requires_basic_auth_if_no_ano
def feed_unread_books(): def feed_unread_books():
off = request.args.get("offset") or 0 off = request.args.get("offset") or 0
return render_read_books(int(off) / (int(config.config_books_per_page)) + 1, False, True) result, pagination = render_read_books(int(off) / (int(config.config_books_per_page)) + 1, False, True)
return render_xml_template('feed.xml', entries=result, pagination=pagination)
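Taken together, the OPDS changes above keep the same Atom feed endpoints but route all queries through calibre_db. A quick way to exercise a feed with HTTP Basic auth (host, port and credentials below are placeholders; anonymous browsing skips the auth check entirely):

import requests

resp = requests.get("http://localhost:8083/opds/", auth=("user", "password"))
print(resp.status_code, resp.headers.get("Content-Type"))  # expect application/atom+xml; charset=utf-8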

View File

@ -1,4 +1,3 @@
-#!/usr/bin/env python
 # -*- coding: utf-8 -*-
 # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)

View File

@ -1,4 +1,3 @@
-#!/usr/bin/env python
 # -*- coding: utf-8 -*-
 # Flask License

View File

@ -1,4 +1,3 @@
-#!/usr/bin/env python
 # -*- coding: utf-8 -*-
 # Flask License
@ -60,10 +59,13 @@ class ReverseProxied(object):
     def __init__(self, application):
         self.app = application
+        self.proxied = False
     def __call__(self, environ, start_response):
+        self.proxied = False
         script_name = environ.get('HTTP_X_SCRIPT_NAME', '')
         if script_name:
+            self.proxied = True
            environ['SCRIPT_NAME'] = script_name
            path_info = environ.get('PATH_INFO', '')
            if path_info and path_info.startswith(script_name):
@ -75,4 +77,9 @@ class ReverseProxied(object):
         servr = environ.get('HTTP_X_FORWARDED_HOST', '')
         if servr:
             environ['HTTP_HOST'] = servr
+            self.proxied = True
         return self.app(environ, start_response)
+
+    @property
+    def is_proxied(self):
+        return self.proxied
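For reference, the middleware is meant to wrap the WSGI callable; a minimal sketch of attaching it and reading the new flag (the Flask app below is an assumption, not part of the hunk):

from flask import Flask

app = Flask(__name__)
app.wsgi_app = ReverseProxied(app.wsgi_app)

# After a request that came through a proxy setting X-Script-Name or X-Forwarded-Host,
# the middleware reports it via the new property:
# app.wsgi_app.is_proxied  -> True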

View File

@ -27,6 +27,8 @@ try:
     from gevent.pywsgi import WSGIServer
     from gevent.pool import Pool
     from gevent import __version__ as _version
+    from greenlet import GreenletExit
+    import ssl
     VERSION = 'Gevent ' + _version
     _GEVENT = True
 except ImportError:
@ -43,7 +45,6 @@ from . import logger
 log = logger.create()
 def _readable_listen_address(address, port):
     if ':' in address:
         address = "[" + address + "]"
@ -73,7 +74,11 @@ class WebServer(object):
         if config.config_access_log:
             log_name = "gevent.access" if _GEVENT else "tornado.access"
             formatter = logger.ACCESS_FORMATTER_GEVENT if _GEVENT else logger.ACCESS_FORMATTER_TORNADO
-            self.access_logger = logger.create_access_log(config.config_access_logfile, log_name, formatter)
+            self.access_logger, logfile = logger.create_access_log(config.config_access_logfile, log_name, formatter)
+            if logfile != config.config_access_logfile:
+                log.warning("Accesslog path %s not valid, falling back to default", config.config_access_logfile)
+                config.config_access_logfile = logfile
+                config.save()
         else:
             if not _GEVENT:
                 logger.get('tornado.access').disabled = True
@ -84,7 +89,8 @@ class WebServer(object):
         if os.path.isfile(certfile_path) and os.path.isfile(keyfile_path):
             self.ssl_args = dict(certfile=certfile_path, keyfile=keyfile_path)
         else:
-            log.warning('The specified paths for the ssl certificate file and/or key file seem to be broken. Ignoring ssl.')
+            log.warning('The specified paths for the ssl certificate file and/or key file seem to be broken. '
+                        'Ignoring ssl.')
             log.warning('Cert path: %s', certfile_path)
             log.warning('Key path: %s', keyfile_path)
@ -139,6 +145,16 @@ class WebServer(object):
            output = _readable_listen_address(self.listen_address, self.listen_port)
            log.info('Starting Gevent server on %s', output)
            self.wsgiserver = WSGIServer(sock, self.app, log=self.access_logger, spawn=Pool(), **ssl_args)
+            if ssl_args:
+                wrap_socket = self.wsgiserver.wrap_socket
+                def my_wrap_socket(*args, **kwargs):
+                    try:
+                        return wrap_socket(*args, **kwargs)
+                    except (ssl.SSLError, OSError) as ex:
+                        log.warning('Gevent SSL Error: %s', ex)
+                        raise GreenletExit
+                self.wsgiserver.wrap_socket = my_wrap_socket
            self.wsgiserver.serve_forever()
         finally:
             if self.unix_socket_file:
@ -146,7 +162,7 @@ class WebServer(object):
                 self.unix_socket_file = None
     def _start_tornado(self):
-        if os.name == 'nt':
+        if os.name == 'nt' and sys.version_info > (3, 7):
            import asyncio
            asyncio.set_event_loop_policy(asyncio.WindowsSelectorEventLoopPolicy())
         log.info('Starting Tornado server on %s', _readable_listen_address(self.listen_address, self.listen_port))
@ -156,7 +172,7 @@ class WebServer(object):
                                             max_buffer_size=209700000,
                                             ssl_options=self.ssl_args)
         http_server.listen(self.listen_port, self.listen_address)
-        self.wsgiserver = IOLoop.instance()
+        self.wsgiserver = IOLoop.current()
         self.wsgiserver.start()
         # wait for stop signal
         self.wsgiserver.close(True)
@ -171,12 +187,15 @@ class WebServer(object):
         except Exception as ex:
             log.error("Error starting server: %s", ex)
             print("Error starting server: %s" % ex)
+            self.stop()
             return False
         finally:
             self.wsgiserver = None
         if not self.restart:
             log.info("Performing shutdown of Calibre-Web")
+            # prevent irritiating log of pending tasks message from asyncio
+            logger.get('asyncio').setLevel(logger.logging.CRITICAL)
             return True
         log.info("Performing restart of Calibre-Web")
@ -187,14 +206,17 @@ class WebServer(object):
             os.execv(sys.executable, arguments)
         return True
-    def _killServer(self, ignored_signum, ignored_frame):
+    def _killServer(self, __, ___):
         self.stop()
     def stop(self, restart=False):
+        from . import updater_thread
+        updater_thread.stop()
         log.info("webserver stop (restart=%s)", restart)
         self.restart = restart
         if self.wsgiserver:
             if _GEVENT:
                 self.wsgiserver.close()
             else:
-                self.wsgiserver.add_callback(self.wsgiserver.stop)
+                self.wsgiserver.add_callback_from_signal(self.wsgiserver.stop)

176
cps/services/SyncToken.py Normal file
View File

@ -0,0 +1,176 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
# Copyright (C) 2018-2019 shavitmichael, OzzieIsaacs
#
# This program is free software: you can redistribute it and/or modify
# it under the terms of the GNU General Public License as published by
# the Free Software Foundation, either version 3 of the License, or
# (at your option) any later version.
#
# This program is distributed in the hope that it will be useful,
# but WITHOUT ANY WARRANTY; without even the implied warranty of
# MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
# GNU General Public License for more details.
#
# You should have received a copy of the GNU General Public License
# along with this program. If not, see <http://www.gnu.org/licenses/>.
import sys
from base64 import b64decode, b64encode
from jsonschema import validate, exceptions, __version__
from datetime import datetime
try:
from urllib import unquote
except ImportError:
from urllib.parse import unquote
from flask import json
from .. import logger
log = logger.create()
def b64encode_json(json_data):
if sys.version_info < (3, 0):
return b64encode(json.dumps(json_data))
else:
return b64encode(json.dumps(json_data).encode())
# Python3 has a timestamp() method we could be calling, however it's not available in python2.
def to_epoch_timestamp(datetime_object):
return (datetime_object - datetime(1970, 1, 1)).total_seconds()
def get_datetime_from_json(json_object, field_name):
try:
return datetime.utcfromtimestamp(json_object[field_name])
except (KeyError, OSError, OverflowError):
# OSError is thrown on Windows if timestamp is <1970 or >2038
return datetime.min
class SyncToken:
""" The SyncToken is used to persist state across requests.
When serialized over the response headers, the Kobo device will propagate the token onto following
requests to the service. As an example use-case, the SyncToken is used to detect books that have been added
to the library since the last time the device synced to the server.
Attributes:
books_last_created: Datetime representing the newest book that the device knows about.
books_last_modified: Datetime representing the last modified book that the device knows about.
"""
SYNC_TOKEN_HEADER = "x-kobo-synctoken"
VERSION = "1-1-0"
LAST_MODIFIED_ADDED_VERSION = "1-1-0"
MIN_VERSION = "1-0-0"
token_schema = {
"type": "object",
"properties": {"version": {"type": "string"}, "data": {"type": "object"}, },
}
# This Schema doesn't contain enough information to detect and propagate book deletions from Calibre to the device.
# A potential solution might be to keep a list of all known book uuids in the token, and look for any missing
# from the db.
data_schema_v1 = {
"type": "object",
"properties": {
"raw_kobo_store_token": {"type": "string"},
"books_last_modified": {"type": "string"},
"books_last_created": {"type": "string"},
"archive_last_modified": {"type": "string"},
"reading_state_last_modified": {"type": "string"},
"tags_last_modified": {"type": "string"},
},
}
def __init__(
self,
raw_kobo_store_token="",
books_last_created=datetime.min,
books_last_modified=datetime.min,
archive_last_modified=datetime.min,
reading_state_last_modified=datetime.min,
tags_last_modified=datetime.min,
):
self.raw_kobo_store_token = raw_kobo_store_token
self.books_last_created = books_last_created
self.books_last_modified = books_last_modified
self.archive_last_modified = archive_last_modified
self.reading_state_last_modified = reading_state_last_modified
self.tags_last_modified = tags_last_modified
@staticmethod
def from_headers(headers):
sync_token_header = headers.get(SyncToken.SYNC_TOKEN_HEADER, "")
if sync_token_header == "":
return SyncToken()
# On the first sync from a Kobo device, we may receive the SyncToken
# from the official Kobo store. Without digging too deep into it, that
# token is of the form [b64encoded blob].[b64encoded blob 2]
if "." in sync_token_header:
return SyncToken(raw_kobo_store_token=sync_token_header)
try:
sync_token_json = json.loads(
b64decode(sync_token_header + "=" * (-len(sync_token_header) % 4))
)
validate(sync_token_json, SyncToken.token_schema)
if sync_token_json["version"] < SyncToken.MIN_VERSION:
raise ValueError
data_json = sync_token_json["data"]
            validate(data_json, SyncToken.data_schema_v1)
except (exceptions.ValidationError, ValueError):
log.error("Sync token contents do not follow the expected json schema.")
return SyncToken()
raw_kobo_store_token = data_json["raw_kobo_store_token"]
try:
books_last_modified = get_datetime_from_json(data_json, "books_last_modified")
books_last_created = get_datetime_from_json(data_json, "books_last_created")
archive_last_modified = get_datetime_from_json(data_json, "archive_last_modified")
reading_state_last_modified = get_datetime_from_json(data_json, "reading_state_last_modified")
tags_last_modified = get_datetime_from_json(data_json, "tags_last_modified")
except TypeError:
log.error("SyncToken timestamps don't parse to a datetime.")
return SyncToken(raw_kobo_store_token=raw_kobo_store_token)
return SyncToken(
raw_kobo_store_token=raw_kobo_store_token,
books_last_created=books_last_created,
books_last_modified=books_last_modified,
archive_last_modified=archive_last_modified,
reading_state_last_modified=reading_state_last_modified,
tags_last_modified=tags_last_modified
)
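The b64decode call in from_headers re-pads the incoming header to a multiple of four characters in case the client sends it without trailing '=' padding; "=" * (-len(s) % 4) adds exactly the number of characters needed. A small standalone illustration of that normalisation:

# Illustration of the padding trick used in from_headers above.
from base64 import b64encode, b64decode

raw = b'{"version": "1-1-0"}'
unpadded = b64encode(raw).decode().rstrip("=")      # as a device might send it
padded = unpadded + "=" * (-len(unpadded) % 4)       # restore '=' padding
assert b64decode(padded) == raw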
def set_kobo_store_header(self, store_headers):
store_headers.set(SyncToken.SYNC_TOKEN_HEADER, self.raw_kobo_store_token)
def merge_from_store_response(self, store_response):
self.raw_kobo_store_token = store_response.headers.get(
SyncToken.SYNC_TOKEN_HEADER, ""
)
def to_headers(self, headers):
headers[SyncToken.SYNC_TOKEN_HEADER] = self.build_sync_token()
def build_sync_token(self):
token = {
"version": SyncToken.VERSION,
"data": {
"raw_kobo_store_token": self.raw_kobo_store_token,
"books_last_modified": to_epoch_timestamp(self.books_last_modified),
"books_last_created": to_epoch_timestamp(self.books_last_created),
"archive_last_modified": to_epoch_timestamp(self.archive_last_modified),
"reading_state_last_modified": to_epoch_timestamp(self.reading_state_last_modified),
"tags_last_modified": to_epoch_timestamp(self.tags_last_modified)
},
}
return b64encode_json(token)
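Taken together, build_sync_token and from_headers give a lossless round trip through the x-kobo-synctoken header. A hedged usage sketch (the import path is assumed from the file above, and flask.json may require an application context depending on the Flask version):

# Hypothetical round trip; import path assumed, not taken from the source.
from datetime import datetime
from cps.services.SyncToken import SyncToken

token = SyncToken(books_last_created=datetime(2020, 12, 29))
header_value = token.build_sync_token().decode()     # base64-encoded JSON blob

# Simulate the next request: the device echoes the header back unchanged.
restored = SyncToken.from_headers({SyncToken.SYNC_TOKEN_HEADER: header_value})
assert restored.books_last_created == datetime(2020, 12, 29)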

View File

@ -26,13 +26,22 @@ log = logger.create()
try: from . import goodreads_support try: from . import goodreads_support
except ImportError as err: except ImportError as err:
log.debug("cannot import goodreads, showing authors-metadata will not work: %s", err) log.debug("Cannot import goodreads, showing authors-metadata will not work: %s", err)
goodreads_support = None goodreads_support = None
try: from . import simpleldap as ldap try:
from . import simpleldap as ldap
from .simpleldap import ldapVersion
except ImportError as err: except ImportError as err:
log.debug("cannot import simpleldap, logging in with ldap will not work: %s", err) log.debug("Cannot import simpleldap, logging in with ldap will not work: %s", err)
ldap = None ldap = None
ldapVersion = None
try:
from . import SyncToken as SyncToken
kobo = True
except ImportError as err:
log.debug("Cannot import SyncToken, syncing books with Kobo Devices will not work: %s", err)
kobo = None
SyncToken = None

View File

@ -20,7 +20,10 @@ from __future__ import division, print_function, unicode_literals
import time import time
from functools import reduce from functools import reduce
from goodreads.client import GoodreadsClient try:
from goodreads.client import GoodreadsClient
except ImportError:
from betterreads.client import GoodreadsClient
try: import Levenshtein try: import Levenshtein
except ImportError: Levenshtein = False except ImportError: Levenshtein = False
@ -69,7 +72,7 @@ def get_author_info(author_name):
author_info = _client.find_author(author_name=author_name) author_info = _client.find_author(author_name=author_name)
except Exception as ex: except Exception as ex:
# Skip goodreads, if site is down/inaccessible # Skip goodreads, if site is down/inaccessible
log.warning('Goodreads website is down/inaccessible? %s', ex) log.warning('Goodreads website is down/inaccessible? %s', ex.__str__())
return return
if author_info: if author_info:
@ -95,8 +98,12 @@ def get_other_books(author_info, library_books=None):
for book in author_info.books: for book in author_info.books:
if book.isbn in identifiers: if book.isbn in identifiers:
continue continue
if book.gid["#text"] in identifiers: if isinstance(book.gid, int):
continue if book.gid in identifiers:
continue
else:
if book.gid["#text"] in identifiers:
continue
if Levenshtein and library_titles: if Levenshtein and library_titles:
goodreads_title = book._book_dict['title_without_series'] goodreads_title = book._book_dict['title_without_series']
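The isinstance check above exists because the two client libraries appear to expose the Goodreads book id differently: the XML-parsing client keeps an attribute dict (gid["#text"]), while a drop-in replacement such as betterreads may hand back a plain integer. That split is an assumption read off this diff, not taken from either library's documentation. A defensive accessor could look like:

# Hypothetical helper normalising the two gid shapes seen in the diff above.
def normalize_gid(gid):
    if isinstance(gid, int):
        return gid
    # fall back to the XML-style attribute dict, e.g. {"#text": "12345"}
    return int(gid["#text"])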

View File

@ -20,9 +20,13 @@ from __future__ import division, print_function, unicode_literals
import base64 import base64
from flask_simpleldap import LDAP, LDAPException from flask_simpleldap import LDAP, LDAPException
from flask_simpleldap import ldap as pyLDAP
from .. import constants, logger from .. import constants, logger
try:
from ldap.pkginfo import __version__ as ldapVersion
except ImportError:
pass
log = logger.create() log = logger.create()
_ldap = LDAP() _ldap = LDAP()
@ -34,44 +38,91 @@ def init_app(app, config):
app.config['LDAP_HOST'] = config.config_ldap_provider_url app.config['LDAP_HOST'] = config.config_ldap_provider_url
app.config['LDAP_PORT'] = config.config_ldap_port app.config['LDAP_PORT'] = config.config_ldap_port
app.config['LDAP_SCHEMA'] = config.config_ldap_schema app.config['LDAP_CUSTOM_OPTIONS'] = {pyLDAP.OPT_REFERRALS: 0}
app.config['LDAP_USERNAME'] = config.config_ldap_user_object.replace('%s', config.config_ldap_serv_username)\ if config.config_ldap_encryption == 2:
+ ',' + config.config_ldap_dn app.config['LDAP_SCHEMA'] = 'ldaps'
app.config['LDAP_PASSWORD'] = base64.b64decode(config.config_ldap_serv_password) else:
app.config['LDAP_REQUIRE_CERT'] = bool(config.config_ldap_require_cert) app.config['LDAP_SCHEMA'] = 'ldap'
if config.config_ldap_require_cert: if config.config_ldap_authentication > constants.LDAP_AUTH_ANONYMOUS:
app.config['LDAP_CERT_PATH'] = config.config_ldap_cert_path if config.config_ldap_authentication > constants.LDAP_AUTH_UNAUTHENTICATE:
if config.config_ldap_serv_password is None:
config.config_ldap_serv_password = ''
app.config['LDAP_PASSWORD'] = base64.b64decode(config.config_ldap_serv_password)
else:
app.config['LDAP_PASSWORD'] = base64.b64decode("")
app.config['LDAP_USERNAME'] = config.config_ldap_serv_username
else:
app.config['LDAP_USERNAME'] = ""
app.config['LDAP_PASSWORD'] = base64.b64decode("")
if bool(config.config_ldap_cert_path):
app.config['LDAP_CUSTOM_OPTIONS'].update({
pyLDAP.OPT_X_TLS_REQUIRE_CERT: pyLDAP.OPT_X_TLS_DEMAND,
pyLDAP.OPT_X_TLS_CACERTFILE: config.config_ldap_cacert_path,
pyLDAP.OPT_X_TLS_CERTFILE: config.config_ldap_cert_path,
pyLDAP.OPT_X_TLS_KEYFILE: config.config_ldap_key_path,
pyLDAP.OPT_X_TLS_NEWCTX: 0
})
app.config['LDAP_BASE_DN'] = config.config_ldap_dn app.config['LDAP_BASE_DN'] = config.config_ldap_dn
app.config['LDAP_USER_OBJECT_FILTER'] = config.config_ldap_user_object app.config['LDAP_USER_OBJECT_FILTER'] = config.config_ldap_user_object
app.config['LDAP_USE_SSL'] = bool(config.config_ldap_use_ssl)
app.config['LDAP_USE_TLS'] = bool(config.config_ldap_use_tls) app.config['LDAP_USE_TLS'] = bool(config.config_ldap_encryption == 1)
app.config['LDAP_USE_SSL'] = bool(config.config_ldap_encryption == 2)
app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap) app.config['LDAP_OPENLDAP'] = bool(config.config_ldap_openldap)
app.config['LDAP_GROUP_OBJECT_FILTER'] = config.config_ldap_group_object_filter
app.config['LDAP_GROUP_MEMBERS_FIELD'] = config.config_ldap_group_members_field
_ldap.init_app(app) try:
_ldap.init_app(app)
except ValueError:
if bool(config.config_ldap_cert_path):
app.config['LDAP_CUSTOM_OPTIONS'].pop(pyLDAP.OPT_X_TLS_NEWCTX)
try:
_ldap.init_app(app)
except RuntimeError as e:
log.error(e)
except RuntimeError as e:
log.error(e)
def get_object_details(user=None,query_filter=None):
return _ldap.get_object_details(user, query_filter=query_filter)
def bind():
return _ldap.bind()
def get_group_members(group):
return _ldap.get_group_members(group)
def basic_auth_required(func): def basic_auth_required(func):
return _ldap.basic_auth_required(func) return _ldap.basic_auth_required(func)
def bind_user(username, password): def bind_user(username, password):
# ulf= _ldap.get_object_details('admin')
    '''Attempts an LDAP login. '''Attempts an LDAP login.
:returns: True if login succeeded, False if login failed, None if server unavailable. :returns: True if login succeeded, False if login failed, None if server unavailable.
''' '''
try: try:
result = _ldap.bind_user(username, password) if _ldap.get_object_details(username):
log.debug("LDAP login '%s': %r", username, result) result = _ldap.bind_user(username, password)
return result is not None log.debug("LDAP login '%s': %r", username, result)
return result is not None, None
return None, None # User not found
except (TypeError, AttributeError, KeyError) as ex:
error = ("LDAP bind_user: %s" % ex)
return None, error
except LDAPException as ex: except LDAPException as ex:
if ex.message == 'Invalid credentials': if ex.message == 'Invalid credentials':
log.info("LDAP login '%s' failed: %s", username, ex) error = "LDAP admin login failed"
return False return None, error
if ex.message == "Can't contact LDAP server": if ex.message == "Can't contact LDAP server":
log.warning('LDAP Server down: %s', ex) # log.warning('LDAP Server down: %s', ex)
return None error = ('LDAP Server down: %s' % ex)
return None, error
else: else:
log.warning('LDAP Server error: %s', ex.message) error = ('LDAP Server error: %s' % ex.message)
return None return None, error
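With this change bind_user no longer returns a bare True/False/None; it returns a (result, error) pair, where result is True or False for a credential decision and None (with a populated error string) when the lookup or the server itself failed. A hedged sketch of how a caller might unpack it (login_ldap_user is invented; bind_user and log are the names defined in this module above):

# Illustrative caller for the (result, error) contract of bind_user.
def login_ldap_user(username, password):
    result, error = bind_user(username, password)
    if result is None:
        # server unavailable, user not found, or configuration problem
        log.error("LDAP login unavailable: %s", error)
        return False
    return result  # True on success, False on bad credentials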

220
cps/services/worker.py Normal file
View File

@ -0,0 +1,220 @@
from __future__ import division, print_function, unicode_literals
import threading
import abc
import uuid
import time
try:
import queue
except ImportError:
import Queue as queue
from datetime import datetime
from collections import namedtuple
from cps import logger
log = logger.create()
# task 'status' consts
STAT_WAITING = 0
STAT_FAIL = 1
STAT_STARTED = 2
STAT_FINISH_SUCCESS = 3
# Only retain this many tasks in dequeued list
TASK_CLEANUP_TRIGGER = 20
QueuedTask = namedtuple('QueuedTask', 'num, user, added, task')
def _get_main_thread():
for t in threading.enumerate():
if t.__class__.__name__ == '_MainThread':
return t
raise Exception("main thread not found?!")
class ImprovedQueue(queue.Queue):
def to_list(self):
"""
Returns a copy of all items in the queue without removing them.
"""
with self.mutex:
return list(self.queue)
# Class for all worker tasks in the background
class WorkerThread(threading.Thread):
_instance = None
@classmethod
def getInstance(cls):
if cls._instance is None:
cls._instance = WorkerThread()
return cls._instance
def __init__(self):
threading.Thread.__init__(self)
self.dequeued = list()
self.doLock = threading.Lock()
self.queue = ImprovedQueue()
self.num = 0
self.start()
@classmethod
def add(cls, user, task):
ins = cls.getInstance()
ins.num += 1
ins.queue.put(QueuedTask(
num=ins.num,
user=user,
added=datetime.now(),
task=task,
))
@property
def tasks(self):
with self.doLock:
tasks = self.queue.to_list() + self.dequeued
return sorted(tasks, key=lambda x: x.num)
def cleanup_tasks(self):
with self.doLock:
dead = []
alive = []
for x in self.dequeued:
(dead if x.task.dead else alive).append(x)
# if the ones that we need to keep are within the trigger, do nothing else
delta = len(self.dequeued) - len(dead)
if delta > TASK_CLEANUP_TRIGGER:
ret = alive
else:
# otherwise, lop off the oldest dead tasks until we hit the target trigger
ret = sorted(dead, key=lambda x: x.task.end_time)[-TASK_CLEANUP_TRIGGER:] + alive
self.dequeued = sorted(ret, key=lambda x: x.num)
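cleanup_tasks only runs once the dequeued list crosses TASK_CLEANUP_TRIGGER (20). It always keeps every still-alive task; if the alive tasks alone exceed the trigger it drops all dead ones, otherwise it keeps only the most recently finished dead tasks, so the task list stays bounded. A tiny standalone sketch of that trimming rule (plain lists instead of QueuedTask; dead assumed already ordered oldest to newest):

# Standalone illustration of the trimming rule in cleanup_tasks above.
TASK_CLEANUP_TRIGGER = 20

def trim(dead, alive):
    if len(alive) > TASK_CLEANUP_TRIGGER:
        return alive                              # too many live tasks: drop every dead one
    return dead[-TASK_CLEANUP_TRIGGER:] + alive   # else keep only the newest 20 dead

dead = list(range(50))      # 50 finished task ids, oldest first
alive = ["running"]
assert len(trim(dead, alive)) == 21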
# Main thread loop starting the different tasks
def run(self):
main_thread = _get_main_thread()
while main_thread.is_alive():
try:
# this blocks until something is available. This can cause issues when the main thread dies - this
# thread will remain alive. We implement a timeout to unblock every second which allows us to check if
# the main thread is still alive.
# We don't use a daemon here because we don't want the tasks to just be abruptly halted, leading to
# possible file / database corruption
item = self.queue.get(timeout=1)
except queue.Empty as ex:
time.sleep(1)
continue
with self.doLock:
# add to list so that in-progress tasks show up
self.dequeued.append(item)
# once we hit our trigger, start cleaning up dead tasks
if len(self.dequeued) > TASK_CLEANUP_TRIGGER:
self.cleanup_tasks()
# sometimes tasks (like Upload) don't actually have work to do and are created as already finished
if item.task.stat == STAT_WAITING:
# CalibreTask.start() should wrap all exceptions in its own error handling
item.task.start(self)
self.queue.task_done()
class CalibreTask:
__metaclass__ = abc.ABCMeta
def __init__(self, message):
self._progress = 0
self.stat = STAT_WAITING
self.error = None
self.start_time = None
self.end_time = None
self.message = message
self.id = uuid.uuid4()
@abc.abstractmethod
def run(self, worker_thread):
"""Provides the caller some human-readable name for this class"""
raise NotImplementedError
@abc.abstractmethod
def name(self):
"""Provides the caller some human-readable name for this class"""
raise NotImplementedError
def start(self, *args):
self.start_time = datetime.now()
self.stat = STAT_STARTED
# catch any unhandled exceptions in a task and automatically fail it
try:
self.run(*args)
except Exception as e:
self._handleError(str(e))
log.exception(e)
self.end_time = datetime.now()
@property
def stat(self):
return self._stat
@stat.setter
def stat(self, x):
self._stat = x
@property
def progress(self):
return self._progress
@progress.setter
def progress(self, x):
if not 0 <= x <= 1:
raise ValueError("Task progress should within [0, 1] range")
self._progress = x
@property
def error(self):
return self._error
@error.setter
def error(self, x):
self._error = x
@property
def runtime(self):
return (self.end_time or datetime.now()) - self.start_time
@property
def dead(self):
"""Determines whether or not this task can be garbage collected
We have a separate property dictating this because there may be certain tasks that want to override this
"""
# By default, we're good to clean a task if it's "Done"
return self.stat in (STAT_FINISH_SUCCESS, STAT_FAIL)
def _handleError(self, error_message):
log.exception(error_message)
self.stat = STAT_FAIL
self.progress = 1
self.error = error_message
def _handleSuccess(self):
self.stat = STAT_FINISH_SUCCESS
self.progress = 1
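A concrete task only needs to implement run() and name(), call _handleSuccess() (or let an exception trigger _handleError()), and be handed to the singleton via WorkerThread.add(). A hedged sketch of a minimal subclass, with the class name, message, and user invented for illustration:

# Hypothetical minimal task built on the CalibreTask/WorkerThread API above.
class SleepTask(CalibreTask):
    def run(self, worker_thread):
        for step in range(5):
            time.sleep(1)
            self.progress = (step + 1) / 5.0   # surfaces in the task list
        self._handleSuccess()

    @property
    def name(self):
        return "Sleep"

WorkerThread.add(user="admin", task=SleepTask(message="Demonstration task"))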

View File

@ -1,4 +1,3 @@
#!/usr/bin/env python
# -*- coding: utf-8 -*- # -*- coding: utf-8 -*-
# This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web) # This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
@ -22,51 +21,65 @@
# along with this program. If not, see <http://www.gnu.org/licenses/>. # along with this program. If not, see <http://www.gnu.org/licenses/>.
from __future__ import division, print_function, unicode_literals from __future__ import division, print_function, unicode_literals
from datetime import datetime
from flask import Blueprint, request, flash, redirect, url_for from flask import Blueprint, request, flash, redirect, url_for
from flask_babel import gettext as _ from flask_babel import gettext as _
from flask_login import login_required, current_user from flask_login import login_required, current_user
from sqlalchemy.sql.expression import func, or_, and_ from sqlalchemy.sql.expression import func
from sqlalchemy.exc import OperationalError, InvalidRequestError
from . import logger, ub, searched_ids, db from . import logger, ub, calibre_db
from .web import render_title_template from .web import login_required_if_no_ano, render_title_template
shelf = Blueprint('shelf', __name__) shelf = Blueprint('shelf', __name__)
log = logger.create() log = logger.create()
def check_shelf_edit_permissions(cur_shelf):
if not cur_shelf.is_public and not cur_shelf.user_id == int(current_user.id):
log.error("User %s not allowed to edit shelf %s", current_user, cur_shelf)
return False
if cur_shelf.is_public and not current_user.role_edit_shelfs():
log.info("User %s not allowed to edit public shelves", current_user)
return False
return True
def check_shelf_view_permissions(cur_shelf):
if cur_shelf.is_public:
return True
if current_user.is_anonymous or cur_shelf.user_id != current_user.id:
log.error("User is unauthorized to view non-public shelf: %s", cur_shelf)
return False
return True
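These two helpers centralise the permission rules that the routes below previously repeated inline: a private shelf may only be edited or viewed by its owner, while a public shelf is visible to everyone but editable only by users holding the edit-public-shelves role. A condensed, hedged restatement of the edit rule as a pure function (field names mirror ub.Shelf but no ORM objects are involved):

# Standalone restatement of check_shelf_edit_permissions above.
def may_edit_shelf(is_public, shelf_owner_id, user_id, can_edit_public_shelves):
    if not is_public:
        return shelf_owner_id == user_id   # private shelf: owner only
    return can_edit_public_shelves         # public shelf: needs the role

assert may_edit_shelf(False, 1, 1, False) is True
assert may_edit_shelf(True, 1, 2, False) is False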
@shelf.route("/shelf/add/<int:shelf_id>/<int:book_id>") @shelf.route("/shelf/add/<int:shelf_id>/<int:book_id>")
@login_required @login_required
def add_to_shelf(shelf_id, book_id): def add_to_shelf(shelf_id, book_id):
xhr = request.headers.get('X-Requested-With') == 'XMLHttpRequest'
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first() shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
if shelf is None: if shelf is None:
log.error("Invalid shelf specified: %s", shelf_id) log.error("Invalid shelf specified: %s", shelf_id)
if not request.is_xhr: if not xhr:
flash(_(u"Invalid shelf specified"), category="error") flash(_(u"Invalid shelf specified"), category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Invalid shelf specified", 400 return "Invalid shelf specified", 400
if not shelf.is_public and not shelf.user_id == int(current_user.id): if not check_shelf_edit_permissions(shelf):
log.error("User %s not allowed to add a book to %s", current_user, shelf) if not xhr:
if not request.is_xhr:
flash(_(u"Sorry you are not allowed to add a book to the the shelf: %(shelfname)s", shelfname=shelf.name), flash(_(u"Sorry you are not allowed to add a book to the the shelf: %(shelfname)s", shelfname=shelf.name),
category="error") category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Sorry you are not allowed to add a book to the the shelf: %s" % shelf.name, 403 return "Sorry you are not allowed to add a book to the the shelf: %s" % shelf.name, 403
if shelf.is_public and not current_user.role_edit_shelfs():
log.info("User %s not allowed to edit public shelves", current_user)
if not request.is_xhr:
flash(_(u"You are not allowed to edit public shelves"), category="error")
return redirect(url_for('web.index'))
return "User is not allowed to edit public shelves", 403
book_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id, book_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id,
ub.BookShelf.book_id == book_id).first() ub.BookShelf.book_id == book_id).first()
if book_in_shelf: if book_in_shelf:
log.error("Book %s is already part of %s", book_id, shelf) log.error("Book %s is already part of %s", book_id, shelf)
if not request.is_xhr: if not xhr:
flash(_(u"Book is already part of the shelf: %(shelfname)s", shelfname=shelf.name), category="error") flash(_(u"Book is already part of the shelf: %(shelfname)s", shelfname=shelf.name), category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Book is already part of the shelf: %s" % shelf.name, 400 return "Book is already part of the shelf: %s" % shelf.name, 400
@ -77,10 +90,19 @@ def add_to_shelf(shelf_id, book_id):
else: else:
maxOrder = maxOrder[0] maxOrder = maxOrder[0]
ins = ub.BookShelf(shelf=shelf.id, book_id=book_id, order=maxOrder + 1) shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book_id, order=maxOrder + 1))
ub.session.add(ins) shelf.last_modified = datetime.utcnow()
ub.session.commit() try:
if not request.is_xhr: ub.session.merge(shelf)
ub.session.commit()
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
if "HTTP_REFERER" in request.environ:
return redirect(request.environ["HTTP_REFERER"])
else:
return redirect(url_for('web.index'))
if not xhr:
flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success") flash(_(u"Book has been added to shelf: %(sname)s", sname=shelf.name), category="success")
if "HTTP_REFERER" in request.environ: if "HTTP_REFERER" in request.environ:
return redirect(request.environ["HTTP_REFERER"]) return redirect(request.environ["HTTP_REFERER"])
@ -98,28 +120,22 @@ def search_to_shelf(shelf_id):
flash(_(u"Invalid shelf specified"), category="error") flash(_(u"Invalid shelf specified"), category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
if not shelf.is_public and not shelf.user_id == int(current_user.id): if not check_shelf_edit_permissions(shelf):
log.error("User %s not allowed to add a book to %s", current_user, shelf)
flash(_(u"You are not allowed to add a book to the the shelf: %(name)s", name=shelf.name), category="error") flash(_(u"You are not allowed to add a book to the the shelf: %(name)s", name=shelf.name), category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
if shelf.is_public and not current_user.role_edit_shelfs(): if current_user.id in ub.searched_ids and ub.searched_ids[current_user.id]:
log.error("User %s not allowed to edit public shelves", current_user)
flash(_(u"User is not allowed to edit public shelves"), category="error")
return redirect(url_for('web.index'))
if current_user.id in searched_ids and searched_ids[current_user.id]:
books_for_shelf = list() books_for_shelf = list()
books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).all() books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).all()
if books_in_shelf: if books_in_shelf:
book_ids = list() book_ids = list()
for book_id in books_in_shelf: for book_id in books_in_shelf:
book_ids.append(book_id.book_id) book_ids.append(book_id.book_id)
for searchid in searched_ids[current_user.id]: for searchid in ub.searched_ids[current_user.id]:
if searchid not in book_ids: if searchid not in book_ids:
books_for_shelf.append(searchid) books_for_shelf.append(searchid)
else: else:
books_for_shelf = searched_ids[current_user.id] books_for_shelf = ub.searched_ids[current_user.id]
if not books_for_shelf: if not books_for_shelf:
log.error("Books are already part of %s", shelf) log.error("Books are already part of %s", shelf)
@ -134,10 +150,15 @@ def search_to_shelf(shelf_id):
for book in books_for_shelf: for book in books_for_shelf:
maxOrder = maxOrder + 1 maxOrder = maxOrder + 1
ins = ub.BookShelf(shelf=shelf.id, book_id=book, order=maxOrder) shelf.books.append(ub.BookShelf(shelf=shelf.id, book_id=book, order=maxOrder))
ub.session.add(ins) shelf.last_modified = datetime.utcnow()
ub.session.commit() try:
flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success") ub.session.merge(shelf)
ub.session.commit()
flash(_(u"Books have been added to shelf: %(sname)s", sname=shelf.name), category="success")
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
else: else:
flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error") flash(_(u"Could not add books to shelf: %(sname)s", sname=shelf.name), category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
@ -146,10 +167,11 @@ def search_to_shelf(shelf_id):
@shelf.route("/shelf/remove/<int:shelf_id>/<int:book_id>") @shelf.route("/shelf/remove/<int:shelf_id>/<int:book_id>")
@login_required @login_required
def remove_from_shelf(shelf_id, book_id): def remove_from_shelf(shelf_id, book_id):
xhr = request.headers.get('X-Requested-With') == 'XMLHttpRequest'
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first() shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
if shelf is None: if shelf is None:
log.error("Invalid shelf specified: %s", shelf_id) log.error("Invalid shelf specified: %s", shelf_id)
if not request.is_xhr: if not xhr:
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Invalid shelf specified", 400 return "Invalid shelf specified", 400
@ -161,34 +183,42 @@ def remove_from_shelf(shelf_id, book_id):
# true 0 x 1 # true 0 x 1
# false 0 x 0 # false 0 x 0
if (not shelf.is_public and shelf.user_id == int(current_user.id)) \ if check_shelf_edit_permissions(shelf):
or (shelf.is_public and current_user.role_edit_shelfs()):
book_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id, book_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id,
ub.BookShelf.book_id == book_id).first() ub.BookShelf.book_id == book_id).first()
if book_shelf is None: if book_shelf is None:
log.error("Book %s already removed from %s", book_id, shelf) log.error("Book %s already removed from %s", book_id, shelf)
if not request.is_xhr: if not xhr:
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Book already removed from shelf", 410 return "Book already removed from shelf", 410
ub.session.delete(book_shelf) try:
ub.session.commit() ub.session.delete(book_shelf)
shelf.last_modified = datetime.utcnow()
if not request.is_xhr: ub.session.commit()
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
if "HTTP_REFERER" in request.environ:
return redirect(request.environ["HTTP_REFERER"])
else:
return redirect(url_for('web.index'))
if not xhr:
flash(_(u"Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success") flash(_(u"Book has been removed from shelf: %(sname)s", sname=shelf.name), category="success")
return redirect(request.environ["HTTP_REFERER"]) if "HTTP_REFERER" in request.environ:
return redirect(request.environ["HTTP_REFERER"])
else:
return redirect(url_for('web.index'))
return "", 204 return "", 204
else: else:
log.error("User %s not allowed to remove a book from %s", current_user, shelf) if not xhr:
if not request.is_xhr:
flash(_(u"Sorry you are not allowed to remove a book from this shelf: %(sname)s", sname=shelf.name), flash(_(u"Sorry you are not allowed to remove a book from this shelf: %(sname)s", sname=shelf.name),
category="error") category="error")
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
return "Sorry you are not allowed to remove a book from this shelf: %s" % shelf.name, 403 return "Sorry you are not allowed to remove a book from this shelf: %s" % shelf.name, 403
@shelf.route("/shelf/create", methods=["GET", "POST"]) @shelf.route("/shelf/create", methods=["GET", "POST"])
@login_required @login_required
def create_shelf(): def create_shelf():
@ -199,21 +229,41 @@ def create_shelf():
shelf.is_public = 1 shelf.is_public = 1
shelf.name = to_save["title"] shelf.name = to_save["title"]
shelf.user_id = int(current_user.id) shelf.user_id = int(current_user.id)
existing_shelf = ub.session.query(ub.Shelf).filter(
or_((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1), is_shelf_name_unique = False
(ub.Shelf.name == to_save["title"]) & (ub.Shelf.user_id == int(current_user.id)))).first() if shelf.is_public == 1:
if existing_shelf: is_shelf_name_unique = ub.session.query(ub.Shelf) \
flash(_(u"A shelf with the name '%(title)s' already exists.", title=to_save["title"]), category="error") .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1)) \
.first() is None
if not is_shelf_name_unique:
flash(_(u"A public shelf with the name '%(title)s' already exists.", title=to_save["title"]),
category="error")
else: else:
is_shelf_name_unique = ub.session.query(ub.Shelf) \
.filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 0) &
(ub.Shelf.user_id == int(current_user.id)))\
.first() is None
if not is_shelf_name_unique:
flash(_(u"A private shelf with the name '%(title)s' already exists.", title=to_save["title"]),
category="error")
if is_shelf_name_unique:
try: try:
ub.session.add(shelf) ub.session.add(shelf)
ub.session.commit() ub.session.commit()
flash(_(u"Shelf %(title)s created", title=to_save["title"]), category="success") flash(_(u"Shelf %(title)s created", title=to_save["title"]), category="success")
return redirect(url_for('shelf.show_shelf', shelf_id=shelf.id))
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
except Exception: except Exception:
ub.session.rollback()
flash(_(u"There was an error"), category="error") flash(_(u"There was an error"), category="error")
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"create a shelf"), page="shelfcreate") return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
else: else:
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"create a shelf"), page="shelfcreate") return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Create a Shelf"), page="shelfcreate")
@shelf.route("/shelf/edit/<int:shelf_id>", methods=["GET", "POST"]) @shelf.route("/shelf/edit/<int:shelf_id>", methods=["GET", "POST"])
@ -222,14 +272,31 @@ def edit_shelf(shelf_id):
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first() shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
if request.method == "POST": if request.method == "POST":
to_save = request.form.to_dict() to_save = request.form.to_dict()
existing_shelf = ub.session.query(ub.Shelf).filter(
or_((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1), is_shelf_name_unique = False
(ub.Shelf.name == to_save["title"]) & (ub.Shelf.user_id == int(current_user.id)))).filter( if shelf.is_public == 1:
ub.Shelf.id != shelf_id).first() is_shelf_name_unique = ub.session.query(ub.Shelf) \
if existing_shelf: .filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 1)) \
flash(_(u"A shelf with the name '%(title)s' already exists.", title=to_save["title"]), category="error") .filter(ub.Shelf.id != shelf_id) \
.first() is None
if not is_shelf_name_unique:
flash(_(u"A public shelf with the name '%(title)s' already exists.", title=to_save["title"]),
category="error")
else: else:
is_shelf_name_unique = ub.session.query(ub.Shelf) \
.filter((ub.Shelf.name == to_save["title"]) & (ub.Shelf.is_public == 0) &
(ub.Shelf.user_id == int(current_user.id)))\
.filter(ub.Shelf.id != shelf_id)\
.first() is None
if not is_shelf_name_unique:
flash(_(u"A private shelf with the name '%(title)s' already exists.", title=to_save["title"]),
category="error")
if is_shelf_name_unique:
shelf.name = to_save["title"] shelf.name = to_save["title"]
shelf.last_modified = datetime.utcnow()
if "is_public" in to_save: if "is_public" in to_save:
shelf.is_public = 1 shelf.is_public = 1
else: else:
@ -237,68 +304,75 @@ def edit_shelf(shelf_id):
try: try:
ub.session.commit() ub.session.commit()
flash(_(u"Shelf %(title)s changed", title=to_save["title"]), category="success") flash(_(u"Shelf %(title)s changed", title=to_save["title"]), category="success")
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
except Exception: except Exception:
ub.session.rollback()
flash(_(u"There was an error"), category="error") flash(_(u"There was an error"), category="error")
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit") return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
else: else:
return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit") return render_title_template('shelf_edit.html', shelf=shelf, title=_(u"Edit a shelf"), page="shelfedit")
def delete_shelf_helper(cur_shelf):
if not cur_shelf or not check_shelf_edit_permissions(cur_shelf):
return
shelf_id = cur_shelf.id
ub.session.delete(cur_shelf)
ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).delete()
ub.session.add(ub.ShelfArchive(uuid=cur_shelf.uuid, user_id=cur_shelf.user_id))
ub.session.commit()
log.info("successfully deleted %s", cur_shelf)
@shelf.route("/shelf/delete/<int:shelf_id>") @shelf.route("/shelf/delete/<int:shelf_id>")
@login_required @login_required
def delete_shelf(shelf_id): def delete_shelf(shelf_id):
cur_shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first() cur_shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
deleted = None try:
if current_user.role_admin(): delete_shelf_helper(cur_shelf)
deleted = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).delete() except (OperationalError, InvalidRequestError):
else: ub.session.rollback()
if (not cur_shelf.is_public and cur_shelf.user_id == int(current_user.id)) \ flash(_(u"Settings DB is not Writeable"), category="error")
or (cur_shelf.is_public and current_user.role_edit_shelfs()):
deleted = ub.session.query(ub.Shelf).filter(or_(and_(ub.Shelf.user_id == int(current_user.id),
ub.Shelf.id == shelf_id),
and_(ub.Shelf.is_public == 1,
ub.Shelf.id == shelf_id))).delete()
if deleted:
ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).delete()
ub.session.commit()
log.info("successfully deleted %s", cur_shelf)
return redirect(url_for('web.index')) return redirect(url_for('web.index'))
# @shelf.route("/shelfdown/<int:shelf_id>")
@shelf.route("/shelf/<int:shelf_id>", defaults={'shelf_type': 1}) @shelf.route("/shelf/<int:shelf_id>", defaults={'shelf_type': 1})
@shelf.route("/shelf/<int:shelf_id>/<int:shelf_type>") @shelf.route("/shelf/<int:shelf_id>/<int:shelf_type>")
@login_required_if_no_ano
def show_shelf(shelf_type, shelf_id): def show_shelf(shelf_type, shelf_id):
if current_user.is_anonymous: shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.is_public == 1, ub.Shelf.id == shelf_id).first()
else:
shelf = ub.session.query(ub.Shelf).filter(or_(and_(ub.Shelf.user_id == int(current_user.id),
ub.Shelf.id == shelf_id),
and_(ub.Shelf.is_public == 1,
ub.Shelf.id == shelf_id))).first()
result = list() result = list()
# user is allowed to access shelf # user is allowed to access shelf
if shelf: if shelf and check_shelf_view_permissions(shelf):
page = "shelf.html" if shelf_type == 1 else 'shelfdown.html' page = "shelf.html" if shelf_type == 1 else 'shelfdown.html'
books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id).order_by( books_in_shelf = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id)\
ub.BookShelf.order.asc()).all() .order_by(ub.BookShelf.order.asc()).all()
for book in books_in_shelf: for book in books_in_shelf:
cur_book = db.session.query(db.Books).filter(db.Books.id == book.book_id).first() cur_book = calibre_db.get_filtered_book(book.book_id)
if cur_book: if cur_book:
result.append(cur_book) result.append(cur_book)
else: else:
log.info('Not existing book %s in %s deleted', book.book_id, shelf) cur_book = calibre_db.get_book(book.book_id)
ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete() if not cur_book:
ub.session.commit() log.info('Not existing book %s in %s deleted', book.book_id, shelf)
try:
ub.session.query(ub.BookShelf).filter(ub.BookShelf.book_id == book.book_id).delete()
ub.session.commit()
except (OperationalError, InvalidRequestError):
ub.session.rollback()
flash(_(u"Settings DB is not Writeable"), category="error")
return render_title_template(page, entries=result, title=_(u"Shelf: '%(name)s'", name=shelf.name), return render_title_template(page, entries=result, title=_(u"Shelf: '%(name)s'", name=shelf.name),
shelf=shelf, page="shelf") shelf=shelf, page="shelf")
else: else:
flash(_(u"Error opening shelf. Shelf does not exist or is not accessible"), category="error") flash(_(u"Error opening shelf. Shelf does not exist or is not accessible"), category="error")
return redirect(url_for("web.index")) return redirect(url_for("web.index"))
@shelf.route("/shelf/order/<int:shelf_id>", methods=["GET", "POST"]) @shelf.route("/shelf/order/<int:shelf_id>", methods=["GET", "POST"])
@login_required @login_required
def order_shelf(shelf_id): def order_shelf(shelf_id):
@ -310,21 +384,32 @@ def order_shelf(shelf_id):
for book in books_in_shelf: for book in books_in_shelf:
setattr(book, 'order', to_save[str(book.book_id)]) setattr(book, 'order', to_save[str(book.book_id)])
counter += 1 counter += 1
ub.session.commit() # if order different from before -> shelf.last_modified = datetime.utcnow()
if current_user.is_anonymous: try:
shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.is_public == 1, ub.Shelf.id == shelf_id).first() ub.session.commit()
else: except (OperationalError, InvalidRequestError):
shelf = ub.session.query(ub.Shelf).filter(or_(and_(ub.Shelf.user_id == int(current_user.id), ub.session.rollback()
ub.Shelf.id == shelf_id), flash(_(u"Settings DB is not Writeable"), category="error")
and_(ub.Shelf.is_public == 1,
ub.Shelf.id == shelf_id))).first() shelf = ub.session.query(ub.Shelf).filter(ub.Shelf.id == shelf_id).first()
result = list() result = list()
if shelf: if shelf and check_shelf_view_permissions(shelf):
books_in_shelf2 = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id) \ books_in_shelf2 = ub.session.query(ub.BookShelf).filter(ub.BookShelf.shelf == shelf_id) \
.order_by(ub.BookShelf.order.asc()).all() .order_by(ub.BookShelf.order.asc()).all()
for book in books_in_shelf2: for book in books_in_shelf2:
cur_book = db.session.query(db.Books).filter(db.Books.id == book.book_id).first() cur_book = calibre_db.get_filtered_book(book.book_id)
result.append(cur_book) if cur_book:
result.append({'title': cur_book.title,
'id': cur_book.id,
'author': cur_book.authors,
'series': cur_book.series,
'series_index': cur_book.series_index})
else:
cur_book = calibre_db.get_book(book.book_id)
result.append({'title': _('Hidden Book'),
'id': cur_book.id,
'author': [],
'series': []})
return render_title_template('shelf_order.html', entries=result, return render_title_template('shelf_order.html', entries=result,
title=_(u"Change order of Shelf: '%(name)s'", name=shelf.name), title=_(u"Change order of Shelf: '%(name)s'", name=shelf.name),
shelf=shelf, page="shelforder") shelf=shelf, page="shelforder")

7979
cps/static/css/caliBlur.css Normal file

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@ -0,0 +1,17 @@
body.serieslist.grid-view div.container-fluid>div>div.col-sm-10:before{
display: none;
}
.cover .badge{
position: absolute;
top: 0;
left: 0;
background-color: #cc7b19;
border-radius: 0;
padding: 0 8px;
box-shadow: 0 0 4px rgba(0,0,0,.6);
line-height: 24px;
}
.cover{
box-shadow: 0 0 4px rgba(0,0,0,.6);
}

View File

@ -4,7 +4,7 @@ body {
overflow-y: auto; overflow-y: auto;
color: white; color: white;
font-family: sans-serif; font-family: sans-serif;
margin: 0px; margin: 0;
} }
#main { #main {
@ -13,7 +13,7 @@ body {
} }
.view { .view {
padding-top:0px; padding-top: 0;
} }
#sidebar a, #sidebar a,
@ -34,18 +34,18 @@ body {
cursor: pointer; cursor: pointer;
padding: 4px; padding: 4px;
transition: all .2s ease; transition: all 0.2s ease;
} }
#sidebar a:hover, #sidebar a:hover,
#sidebar a:focus { #sidebar a:focus {
outline: none; outline: none;
box-shadow: 0px 2px 8px 1px black; box-shadow: 0 2px 8px 1px black;
} }
#sidebar a.active, #sidebar a.active,
#sidebar a.active img + span { #sidebar a.active img + span {
background-color: #45B29D; background-color: #45B29D;
} }
#sidebar li img { #sidebar li img {
@ -79,7 +79,6 @@ body {
font-size: 10px; font-size: 10px;
line-height: 10px; line-height: 10px;
text-align: right; text-align: right;
transition: min-height 150ms ease-in-out; transition: min-height 150ms ease-in-out;
} }
@ -92,18 +91,17 @@ body {
top: 0; top: 0;
left: 0; left: 0;
bottom: 0; bottom: 0;
transition: width 150ms ease-in-out; transition: width 150ms ease-in-out;
} }
#progress .bar-load { #progress .bar-load {
color: #000; color: #000;
background-color: #CCC; background-color: #ccc;
} }
#progress .bar-read { #progress .bar-read {
color: #FFF; color: #fff;
background-color: #45B29D; background-color: #45b29d;
} }
#progress .text { #progress .text {
@ -152,40 +150,8 @@ body {
max-width: 70%; max-width: 70%;
} }
#left { th,
left: 40px; td {
}
#right {
right: 40px;
}
.arrow {
position: absolute;
top: 50%;
margin-top: -32px;
font-size: 64px;
color: #E2E2E2;
font-family: arial, sans-serif;
font-weight: bold;
cursor: pointer;
-webkit-user-select: none;
-khtml-user-select: none;
-moz-user-select: none;
-ms-user-select: none;
user-select: none;
}
.arrow:hover {
color: #777;
}
.arrow:active,
.arrow.active {
color: #000;
}
th, td {
padding: 5px; padding: 5px;
} }
@ -239,18 +205,17 @@ th {
} }
.dark-theme #titlebar { .dark-theme #titlebar {
color: #DDD; color: #ddd;
} }
.dark-theme #titlebar a:active { .dark-theme #titlebar a:active {
color: #FFF; color: #fff;
} }
.dark-theme #progress .bar-read { .dark-theme #progress .bar-read {
background-color: red; background-color: red;
} }
.dark-theme .overlay { .dark-theme .overlay {
background-color: rgba(0,0,0,0.8); background-color: rgba(0, 0, 0, 0.8);
} }

File diff suppressed because one or more lines are too long

View File

@ -230,36 +230,46 @@
z-index: 200; z-index: 200;
max-width: 20em; max-width: 20em;
background-color: #FFFF99; background-color: #FFFF99;
box-shadow: 0px 2px 5px #333; box-shadow: 0px 2px 5px #888;
border-radius: 2px; border-radius: 2px;
padding: 0.6em; padding: 6px;
margin-left: 5px; margin-left: 5px;
cursor: pointer; cursor: pointer;
font: message-box; font: message-box;
font-size: 9px;
word-wrap: break-word; word-wrap: break-word;
} }
.annotationLayer .popup > * {
font-size: 9px;
}
.annotationLayer .popup h1 { .annotationLayer .popup h1 {
font-size: 1em; display: inline-block;
border-bottom: 1px solid #000000; }
margin: 0;
padding-bottom: 0.2em; .annotationLayer .popup span {
display: inline-block;
margin-left: 5px;
} }
.annotationLayer .popup p { .annotationLayer .popup p {
margin: 0; border-top: 1px solid #333;
padding-top: 0.2em; margin-top: 2px;
padding-top: 2px;
} }
.annotationLayer .highlightAnnotation, .annotationLayer .highlightAnnotation,
.annotationLayer .underlineAnnotation, .annotationLayer .underlineAnnotation,
.annotationLayer .squigglyAnnotation, .annotationLayer .squigglyAnnotation,
.annotationLayer .strikeoutAnnotation, .annotationLayer .strikeoutAnnotation,
.annotationLayer .freeTextAnnotation,
.annotationLayer .lineAnnotation svg line, .annotationLayer .lineAnnotation svg line,
.annotationLayer .squareAnnotation svg rect, .annotationLayer .squareAnnotation svg rect,
.annotationLayer .circleAnnotation svg ellipse, .annotationLayer .circleAnnotation svg ellipse,
.annotationLayer .polylineAnnotation svg polyline, .annotationLayer .polylineAnnotation svg polyline,
.annotationLayer .polygonAnnotation svg polygon, .annotationLayer .polygonAnnotation svg polygon,
.annotationLayer .caretAnnotation,
.annotationLayer .inkAnnotation svg polyline, .annotationLayer .inkAnnotation svg polyline,
.annotationLayer .stampAnnotation, .annotationLayer .stampAnnotation,
.annotationLayer .fileAttachmentAnnotation { .annotationLayer .fileAttachmentAnnotation {
@ -279,8 +289,9 @@
overflow: visible; overflow: visible;
border: 9px solid transparent; border: 9px solid transparent;
background-clip: content-box; background-clip: content-box;
-o-border-image: url(images/shadow.png) 9 9 repeat; -webkit-border-image: url(images/shadow.png) 9 9 repeat;
border-image: url(images/shadow.png) 9 9 repeat; -o-border-image: url(images/shadow.png) 9 9 repeat;
border-image: url(images/shadow.png) 9 9 repeat;
background-color: white; background-color: white;
} }
@ -543,15 +554,20 @@ select {
z-index: 100; z-index: 100;
border-top: 1px solid #333; border-top: 1px solid #333;
transition-duration: 200ms; -webkit-transition-duration: 200ms;
transition-timing-function: ease;
transition-duration: 200ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
html[dir='ltr'] #sidebarContainer { html[dir='ltr'] #sidebarContainer {
-webkit-transition-property: left;
transition-property: left; transition-property: left;
left: -200px; left: -200px;
left: calc(-1 * var(--sidebar-width)); left: calc(-1 * var(--sidebar-width));
} }
html[dir='rtl'] #sidebarContainer { html[dir='rtl'] #sidebarContainer {
-webkit-transition-property: right;
transition-property: right; transition-property: right;
right: -200px; right: -200px;
right: calc(-1 * var(--sidebar-width)); right: calc(-1 * var(--sidebar-width));
@ -563,7 +579,8 @@ html[dir='rtl'] #sidebarContainer {
#outerContainer.sidebarResizing #sidebarContainer { #outerContainer.sidebarResizing #sidebarContainer {
/* Improve responsiveness and avoid visual glitches when the sidebar is resized. */ /* Improve responsiveness and avoid visual glitches when the sidebar is resized. */
transition-duration: 0s; -webkit-transition-duration: 0s;
transition-duration: 0s;
/* Prevent e.g. the thumbnails being selected when the sidebar is resized. */ /* Prevent e.g. the thumbnails being selected when the sidebar is resized. */
-webkit-user-select: none; -webkit-user-select: none;
-moz-user-select: none; -moz-user-select: none;
@ -620,8 +637,10 @@ html[dir='rtl'] #sidebarContent {
outline: none; outline: none;
} }
#viewerContainer:not(.pdfPresentationMode) { #viewerContainer:not(.pdfPresentationMode) {
transition-duration: 200ms; -webkit-transition-duration: 200ms;
transition-timing-function: ease; transition-duration: 200ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
html[dir='ltr'] #viewerContainer { html[dir='ltr'] #viewerContainer {
box-shadow: inset 1px 0 0 hsla(0,0%,100%,.05); box-shadow: inset 1px 0 0 hsla(0,0%,100%,.05);
@ -632,15 +651,18 @@ html[dir='rtl'] #viewerContainer {
#outerContainer.sidebarResizing #viewerContainer { #outerContainer.sidebarResizing #viewerContainer {
/* Improve responsiveness and avoid visual glitches when the sidebar is resized. */ /* Improve responsiveness and avoid visual glitches when the sidebar is resized. */
transition-duration: 0s; -webkit-transition-duration: 0s;
transition-duration: 0s;
} }
html[dir='ltr'] #outerContainer.sidebarOpen #viewerContainer:not(.pdfPresentationMode) { html[dir='ltr'] #outerContainer.sidebarOpen #viewerContainer:not(.pdfPresentationMode) {
-webkit-transition-property: left;
transition-property: left; transition-property: left;
left: 200px; left: 200px;
left: var(--sidebar-width); left: var(--sidebar-width);
} }
html[dir='rtl'] #outerContainer.sidebarOpen #viewerContainer:not(.pdfPresentationMode) { html[dir='rtl'] #outerContainer.sidebarOpen #viewerContainer:not(.pdfPresentationMode) {
-webkit-transition-property: right;
transition-property: right; transition-property: right;
right: 200px; right: 200px;
right: var(--sidebar-width); right: var(--sidebar-width);
@ -662,6 +684,8 @@ html[dir='rtl'] #outerContainer.sidebarOpen #viewerContainer:not(.pdfPresentatio
width: 100%; width: 100%;
height: 32px; height: 32px;
background-color: #424242; /* fallback */ background-color: #424242; /* fallback */
background-image: url(images/texture.png),
-webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,30%,.99)), to(hsla(0,0%,25%,.95)));
background-image: url(images/texture.png), background-image: url(images/texture.png),
linear-gradient(hsla(0,0%,30%,.99), hsla(0,0%,25%,.95)); linear-gradient(hsla(0,0%,30%,.99), hsla(0,0%,25%,.95));
} }
@ -697,6 +721,8 @@ html[dir='rtl'] #sidebarResizer {
position: relative; position: relative;
height: 32px; height: 32px;
background-color: #474747; /* fallback */ background-color: #474747; /* fallback */
background-image: url(images/texture.png),
-webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,32%,.99)), to(hsla(0,0%,27%,.95)));
background-image: url(images/texture.png), background-image: url(images/texture.png),
linear-gradient(hsla(0,0%,32%,.99), hsla(0,0%,27%,.95)); linear-gradient(hsla(0,0%,32%,.99), hsla(0,0%,27%,.95));
} }
@ -733,6 +759,7 @@ html[dir='rtl'] #toolbarContainer, .findbar, .secondaryToolbar {
height: 100%; height: 100%;
background-color: #ddd; background-color: #ddd;
overflow: hidden; overflow: hidden;
-webkit-transition: width 200ms;
transition: width 200ms; transition: width 200ms;
} }
@ -748,6 +775,7 @@ html[dir='rtl'] #toolbarContainer, .findbar, .secondaryToolbar {
#loadingBar .progress.indeterminate { #loadingBar .progress.indeterminate {
background-color: #999; background-color: #999;
-webkit-transition: none;
transition: none; transition: none;
} }
@ -815,6 +843,9 @@ html[dir='rtl'] .findbar {
#findInput::-webkit-input-placeholder { #findInput::-webkit-input-placeholder {
color: hsl(0, 0%, 75%); color: hsl(0, 0%, 75%);
} }
#findInput::-moz-placeholder {
font-style: italic;
}
#findInput:-ms-input-placeholder { #findInput:-ms-input-placeholder {
font-style: italic; font-style: italic;
} }
@ -1006,6 +1037,7 @@ html[dir='rtl'] .splitToolbarButton > .toolbarButton {
.splitToolbarButton.toggled > .toolbarButton, .splitToolbarButton.toggled > .toolbarButton,
.toolbarButton.textButton { .toolbarButton.textButton {
background-color: hsla(0,0%,0%,.12); background-color: hsla(0,0%,0%,.12);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
border: 1px solid hsla(0,0%,0%,.35); border: 1px solid hsla(0,0%,0%,.35);
@ -1013,9 +1045,12 @@ html[dir='rtl'] .splitToolbarButton > .toolbarButton {
box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset, box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset,
0 0 1px hsla(0,0%,100%,.15) inset, 0 0 1px hsla(0,0%,100%,.15) inset,
0 1px 0 hsla(0,0%,100%,.05); 0 1px 0 hsla(0,0%,100%,.05);
-webkit-transition-property: background-color, border-color, box-shadow;
transition-property: background-color, border-color, box-shadow; transition-property: background-color, border-color, box-shadow;
transition-duration: 150ms; -webkit-transition-duration: 150ms;
transition-timing-function: ease; transition-duration: 150ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
.splitToolbarButton > .toolbarButton:hover, .splitToolbarButton > .toolbarButton:hover,
@ -1072,9 +1107,12 @@ html[dir='rtl'] .splitToolbarButtonSeparator {
padding: 12px 0; padding: 12px 0;
margin: 1px 0; margin: 1px 0;
box-shadow: 0 0 0 1px hsla(0,0%,100%,.03); box-shadow: 0 0 0 1px hsla(0,0%,100%,.03);
-webkit-transition-property: padding;
transition-property: padding; transition-property: padding;
transition-duration: 10ms; -webkit-transition-duration: 10ms;
transition-timing-function: ease; transition-duration: 10ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
.toolbarButton, .toolbarButton,
@ -1094,9 +1132,12 @@ html[dir='rtl'] .splitToolbarButtonSeparator {
user-select: none; user-select: none;
/* Opera does not support user-select, use <... unselectable="on"> instead */ /* Opera does not support user-select, use <... unselectable="on"> instead */
cursor: default; cursor: default;
-webkit-transition-property: background-color, border-color, box-shadow;
transition-property: background-color, border-color, box-shadow; transition-property: background-color, border-color, box-shadow;
transition-duration: 150ms; -webkit-transition-duration: 150ms;
transition-timing-function: ease; transition-duration: 150ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
html[dir='ltr'] .toolbarButton, html[dir='ltr'] .toolbarButton,
@ -1117,6 +1158,7 @@ html[dir='rtl'] .dropdownToolbarButton {
.secondaryToolbarButton:hover, .secondaryToolbarButton:hover,
.secondaryToolbarButton:focus { .secondaryToolbarButton:focus {
background-color: hsla(0,0%,0%,.12); background-color: hsla(0,0%,0%,.12);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
border: 1px solid hsla(0,0%,0%,.35); border: 1px solid hsla(0,0%,0%,.35);
@ -1131,28 +1173,36 @@ html[dir='rtl'] .dropdownToolbarButton {
.dropdownToolbarButton:hover:active, .dropdownToolbarButton:hover:active,
.secondaryToolbarButton:hover:active { .secondaryToolbarButton:hover:active {
background-color: hsla(0,0%,0%,.2); background-color: hsla(0,0%,0%,.2);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
border-color: hsla(0,0%,0%,.35) hsla(0,0%,0%,.4) hsla(0,0%,0%,.45); border-color: hsla(0,0%,0%,.35) hsla(0,0%,0%,.4) hsla(0,0%,0%,.45);
box-shadow: 0 1px 1px hsla(0,0%,0%,.1) inset, box-shadow: 0 1px 1px hsla(0,0%,0%,.1) inset,
0 0 1px hsla(0,0%,0%,.2) inset, 0 0 1px hsla(0,0%,0%,.2) inset,
0 1px 0 hsla(0,0%,100%,.05); 0 1px 0 hsla(0,0%,100%,.05);
-webkit-transition-property: background-color, border-color, box-shadow;
transition-property: background-color, border-color, box-shadow; transition-property: background-color, border-color, box-shadow;
transition-duration: 10ms; -webkit-transition-duration: 10ms;
transition-timing-function: linear; transition-duration: 10ms;
-webkit-transition-timing-function: linear;
transition-timing-function: linear;
} }
.toolbarButton.toggled, .toolbarButton.toggled,
.splitToolbarButton.toggled > .toolbarButton.toggled, .splitToolbarButton.toggled > .toolbarButton.toggled,
.secondaryToolbarButton.toggled { .secondaryToolbarButton.toggled {
background-color: hsla(0,0%,0%,.3); background-color: hsla(0,0%,0%,.3);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
border-color: hsla(0,0%,0%,.4) hsla(0,0%,0%,.45) hsla(0,0%,0%,.5); border-color: hsla(0,0%,0%,.4) hsla(0,0%,0%,.45) hsla(0,0%,0%,.5);
box-shadow: 0 1px 1px hsla(0,0%,0%,.1) inset, box-shadow: 0 1px 1px hsla(0,0%,0%,.1) inset,
0 0 1px hsla(0,0%,0%,.2) inset, 0 0 1px hsla(0,0%,0%,.2) inset,
0 1px 0 hsla(0,0%,100%,.05); 0 1px 0 hsla(0,0%,100%,.05);
-webkit-transition-property: background-color, border-color, box-shadow;
transition-property: background-color, border-color, box-shadow; transition-property: background-color, border-color, box-shadow;
transition-duration: 10ms; -webkit-transition-duration: 10ms;
transition-timing-function: linear; transition-duration: 10ms;
-webkit-transition-timing-function: linear;
transition-timing-function: linear;
} }
.toolbarButton.toggled:hover:active, .toolbarButton.toggled:hover:active,
@ -1493,6 +1543,7 @@ html[dir='rtl'] .verticalToolbarSeparator {
border: 1px solid transparent; border: 1px solid transparent;
border-radius: 2px; border-radius: 2px;
background-color: hsla(0,0%,100%,.09); background-color: hsla(0,0%,100%,.09);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
border: 1px solid hsla(0,0%,0%,.35); border: 1px solid hsla(0,0%,0%,.35);
@ -1503,9 +1554,12 @@ html[dir='rtl'] .verticalToolbarSeparator {
font-size: 12px; font-size: 12px;
line-height: 14px; line-height: 14px;
outline-style: none; outline-style: none;
-webkit-transition-property: background-color, border-color, box-shadow;
transition-property: background-color, border-color, box-shadow; transition-property: background-color, border-color, box-shadow;
transition-duration: 150ms; -webkit-transition-duration: 150ms;
transition-timing-function: ease; transition-duration: 150ms;
-webkit-transition-timing-function: ease;
transition-timing-function: ease;
} }
.toolbarField[type=checkbox] { .toolbarField[type=checkbox] {
@ -1619,6 +1673,7 @@ a:focus > .thumbnail > .thumbnailSelectionRing > .thumbnailImage,
a:focus > .thumbnail > .thumbnailSelectionRing, a:focus > .thumbnail > .thumbnailSelectionRing,
.thumbnail:hover > .thumbnailSelectionRing { .thumbnail:hover > .thumbnailSelectionRing {
background-color: hsla(0,0%,100%,.15); background-color: hsla(0,0%,100%,.15);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset, box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset,
@ -1634,6 +1689,7 @@ a:focus > .thumbnail > .thumbnailSelectionRing,
.thumbnail.selected > .thumbnailSelectionRing { .thumbnail.selected > .thumbnailSelectionRing {
background-color: hsla(0,0%,100%,.3); background-color: hsla(0,0%,100%,.3);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset, box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset,
@ -1755,6 +1811,7 @@ html[dir='rtl'] .outlineItemToggler::before {
.outlineItem > a:hover, .outlineItem > a:hover,
.attachmentsItem > button:hover { .attachmentsItem > button:hover {
background-color: hsla(0,0%,100%,.02); background-color: hsla(0,0%,100%,.02);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset, box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset,
@ -1766,6 +1823,7 @@ html[dir='rtl'] .outlineItemToggler::before {
.outlineItem.selected { .outlineItem.selected {
background-color: hsla(0,0%,100%,.08); background-color: hsla(0,0%,100%,.08);
background-image: -webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,100%,.05)), to(hsla(0,0%,100%,0)));
background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0)); background-image: linear-gradient(hsla(0,0%,100%,.05), hsla(0,0%,100%,0));
background-clip: padding-box; background-clip: padding-box;
box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset, box-shadow: 0 1px 0 hsla(0,0%,100%,.05) inset,
@ -1850,6 +1908,8 @@ html[dir='rtl'] .outlineItemToggler::before {
font-size: 12px; font-size: 12px;
line-height: 14px; line-height: 14px;
background-color: #474747; /* fallback */ background-color: #474747; /* fallback */
background-image: url(images/texture.png),
-webkit-gradient(linear, left top, left bottom, from(hsla(0,0%,32%,.99)), to(hsla(0,0%,27%,.95)));
background-image: url(images/texture.png), background-image: url(images/texture.png),
linear-gradient(hsla(0,0%,32%,.99), hsla(0,0%,27%,.95)); linear-gradient(hsla(0,0%,32%,.99), hsla(0,0%,27%,.95));
box-shadow: inset 1px 0 0 hsla(0,0%,100%,.08), box-shadow: inset 1px 0 0 hsla(0,0%,100%,.08),

View File

@ -1,114 +1,89 @@
.sm2-bar-ui { .sm2-bar-ui {
font-size: 20px; font-size: 20px;
} }
.sm2-bar-ui.compact { .sm2-bar-ui.compact {
max-width: 90%; max-width: 90%;
} }
.sm2-progress .sm2-progress-ball { .sm2-progress .sm2-progress-ball {
width: .5333em; width: 0.5333em;
height: 1.9333em; height: 1.9333em;
border-radius: 0em; border-radius: 0;
} }
.sm2-progress .sm2-progress-track { .sm2-progress .sm2-progress-track {
height: 0.15em; height: 0.15em;
background: white; background: white;
} }
.sm2-bar-ui .sm2-main-controls, .sm2-bar-ui .sm2-main-controls,
.sm2-bar-ui .sm2-playlist-drawer { .sm2-bar-ui .sm2-playlist-drawer {
background-color: transparent; background-color: transparent;
} }
.sm2-bar-ui .sm2-inline-texture { .sm2-bar-ui .sm2-inline-texture {
background: transparent; background: transparent;
} }
.rating .glyphicon-star { .rating .glyphicon-star {
color: gray; color: gray;
} }
.rating .glyphicon-star.good { .rating .glyphicon-star.good {
color: white; color: white;
} }
body { body {
overflow: hidden; overflow: hidden;
background: #272B30; background: #272b30;
color: #aaa; color: #aaa;
} }
#main { #main {
position: absolute; position: absolute;
width: 100%; width: 100%;
height: 100%; height: 100%;
} }
#area { #area {
width: 80%; width: 80%;
height: 80%; height: 80%;
margin: 5% auto; margin: 5% auto;
max-width: 1250px; max-width: 1250px;
} overflow: hidden;
}
#area iframe { #area iframe {
border: none; border: none;
} }
#prev { #prev {
left: 40px; left: 40px;
} }
#next { #next {
right: 40px; right: 40px;
} }
.arrow { xmp,
position: absolute; pre,
top: 50%; plaintext {
margin-top: -32px; display: block;
font-size: 64px; font-family: -moz-fixed;
color: #E2E2E2; white-space: pre;
font-family: arial, sans-serif; margin: 1em 0;
font-weight: bold; }
cursor: pointer;
-webkit-user-select: none;
-moz-user-select: none;
user-select: none;
}
.arrow:hover { pre {
color: #777; white-space: pre-wrap;
} word-wrap: break-word;
font-family: -moz-fixed;
.arrow:active { column-count: 2;
color: #000; -webkit-columns: 2;
} -moz-columns: 2;
column-gap: 20px;
xmp, -moz-column-gap: 20px;
pre, -webkit-column-gap: 20px;
plaintext { position: relative;
display: block; }
font-family: -moz-fixed;
white-space: pre;
margin: 1em 0;
}
#area {
overflow: hidden;
}
pre {
white-space: pre-wrap;
word-wrap: break-word;
font-family: -moz-fixed;
column-count: 2;
-webkit-columns: 2;
-moz-columns: 2;
column-gap: 20px;
-moz-column-gap: 20px;
-webkit-column-gap: 20px;
position: relative;
}

View File

@ -1,10 +1,11 @@
@font-face { @font-face {
font-family: 'fontello'; font-family: 'fontello';
src: url('fonts/fontello.eot?60518104'); src: url('fonts/fontello.eot?60518104');
src: url('fonts/fontello.eot?60518104#iefix') format('embedded-opentype'), src:
url('fonts/fontello.woff?60518104') format('woff'), url('fonts/fontello.eot?60518104#iefix') format('embedded-opentype'),
url('fonts/fontello.ttf?60518104') format('truetype'), url('fonts/fontello.woff?60518104') format('woff'),
url('fonts/fontello.svg?60518104#fontello') format('svg'); url('fonts/fontello.ttf?60518104') format('truetype'),
url('fonts/fontello.svg?60518104#fontello') format('svg');
font-weight: normal; font-weight: normal;
font-style: normal; font-style: normal;
} }
@ -15,30 +16,22 @@ body {
} }
#main { #main {
/* height: 500px; */
position: absolute; position: absolute;
width: 100%; width: 100%;
height: 100%; height: 100%;
right: 0; right: 0;
/* left: 40px; */
/* -webkit-transform: translate(40px, 0);
-moz-transform: translate(40px, 0); */
/* border-radius: 5px 0px 0px 5px; */
border-radius: 5px; border-radius: 5px;
background: #fff; background: #fff;
overflow: hidden; overflow: hidden;
-webkit-transition: -webkit-transform .4s, width .2s; -webkit-transition: -webkit-transform 0.4s, width 0.2s;
-moz-transition: -webkit-transform .4s, width .2s; -moz-transition: -webkit-transform 0.4s, width 0.2s;
-ms-transition: -webkit-transform .4s, width .2s; -ms-transition: -webkit-transform 0.4s, width 0.2s;
-moz-box-shadow: inset 0 0 50px rgba(0, 0, 0, 0.1);
-moz-box-shadow: inset 0 0 50px rgba(0,0,0,.1); -webkit-box-shadow: inset 0 0 50px rgba(0, 0, 0, 0.1);
-webkit-box-shadow: inset 0 0 50px rgba(0,0,0,.1); -ms-box-shadow: inset 0 0 50px rgba(0, 0, 0, 0.1);
-ms-box-shadow: inset 0 0 50px rgba(0,0,0,.1); box-shadow: inset 0 0 50px rgba(0, 0, 0, 0.1);
box-shadow: inset 0 0 50px rgba(0,0,0,.1);
} }
#titlebar { #titlebar {
height: 8%; height: 8%;
min-height: 20px; min-height: 20px;
@ -48,11 +41,11 @@ body {
color: #4f4f4f; color: #4f4f4f;
font-weight: 100; font-weight: 100;
font-family: Georgia, "Times New Roman", Times, serif; font-family: Georgia, "Times New Roman", Times, serif;
opacity: .5; opacity: 0.5;
text-align: center; text-align: center;
-webkit-transition: opacity .5s; -webkit-transition: opacity 0.5s;
-moz-transition: opacity .5s; -moz-transition: opacity 0.5s;
-ms-transition: opacity .5s; -ms-transition: opacity 0.5s;
z-index: 10; z-index: 10;
} }
@ -66,7 +59,7 @@ body {
line-height: 20px; line-height: 20px;
overflow: hidden; overflow: hidden;
display: inline-block; display: inline-block;
opacity: .5; opacity: 0.5;
padding: 4px; padding: 4px;
border-radius: 4px; border-radius: 4px;
} }
@ -76,35 +69,27 @@ body {
} }
#titlebar a:hover { #titlebar a:hover {
opacity: .8; opacity: 0.8;
border: 1px rgba(0,0,0,.2) solid; border: 1px rgba(0, 0, 0, 0.2) solid;
padding: 3px; padding: 3px;
} }
#titlebar a:active { #titlebar a:active {
opacity: 1; opacity: 1;
color: rgba(0,0,0,.6); color: rgba(0, 0, 0, 0.6);
/* margin: 1px -1px -1px 1px; */ -moz-box-shadow: inset 0 0 6px rgba(155, 155, 155, 0.8);
-moz-box-shadow: inset 0 0 6px rgba(155,155,155,.8); -webkit-box-shadow: inset 0 0 6px rgba(155, 155, 155, 0.8);
-webkit-box-shadow: inset 0 0 6px rgba(155,155,155,.8); -ms-box-shadow: inset 0 0 6px rgba(155, 155, 155, 0.8);
-ms-box-shadow: inset 0 0 6px rgba(155,155,155,.8); box-shadow: inset 0 0 6px rgba(155, 155, 155, 0.8);
box-shadow: inset 0 0 6px rgba(155,155,155,.8);
} }
#book-title { #book-title { font-weight: 600; }
font-weight: 600; #title-seperator { display: none; }
}
#title-seperator {
display: none;
}
#viewer { #viewer {
width: 80%; width: 80%;
height: 80%; height: 80%;
/* margin-left: 10%; */
margin: 0 auto; margin: 0 auto;
/* max-width: 1250px; */
z-index: 2; z-index: 2;
position: relative; position: relative;
overflow: hidden; overflow: hidden;
@ -114,18 +99,22 @@ body {
border: none; border: none;
} }
#left,
#prev { #prev {
left: 40px; left: 40px;
padding-right: 80px;
} }
#right,
#next { #next {
right: 40px; right: 40px;
padding-left: 80px;
} }
.arrow { .arrow {
position: absolute; position: absolute;
top: 50%; top: 50%;
margin-top: -32px; margin-top: -192px;
font-size: 64px; font-size: 64px;
color: #E2E2E2; color: #E2E2E2;
font-family: arial, sans-serif; font-family: arial, sans-serif;
@ -136,6 +125,8 @@ body {
-moz-user-select: none; -moz-user-select: none;
-ms-user-select: none; -ms-user-select: none;
user-select: none; user-select: none;
padding-top: 160px;
padding-bottom: 160px;
} }
.arrow:hover { .arrow:hover {
@ -150,24 +141,20 @@ body {
#sidebar { #sidebar {
background: #6b6b6b; background: #6b6b6b;
position: absolute; position: absolute;
/* left: -260px; */
/* -webkit-transform: translate(-260px, 0);
-moz-transform: translate(-260px, 0); */
top: 0; top: 0;
min-width: 300px; min-width: 300px;
width: 25%; width: 25%;
height: 100%; height: 100%;
-webkit-transition: -webkit-transform .5s; -webkit-transition: -webkit-transform 0.5s;
-moz-transition: -moz-transform .5s; -moz-transition: -moz-transform 0.5s;
-ms-transition: -moz-transform .5s; -ms-transition: -moz-transform 0.5s;
overflow: hidden; overflow: hidden;
} }
#sidebar.open { #sidebar.open {
/* left: 0; */ /* left: 0; */
/* -webkit-transform: translate(0, 0); /* -webkit-transform: translate(0, 0);
-moz-transform: translate(0, 0); */ -moz-transform: translate(0, 0); */
} }
#main.closed { #main.closed {
@ -194,10 +181,10 @@ body {
width: 100%; width: 100%;
padding: 13px 0; padding: 13px 0;
height: 14px; height: 14px;
-moz-box-shadow: 0px 1px 3px rgba(0,0,0,.6); -moz-box-shadow: 0 1px 3px rgba(0, 0, 0, 0.6);
-webkit-box-shadow: 0px 1px 3px rgba(0,0,0,.6); -webkit-box-shadow: 0 1px 3px rgba(0, 0, 0, 0.6);
-ms-box-shadow: 0px 1px 3px rgba(0,0,0,.6); -ms-box-shadow: 0 1px 3px rgba(0, 0, 0, 0.6);
box-shadow: 0px 1px 3px rgba(0,0,0,.6); box-shadow: 0 1px 3px rgba(0, 0, 0, 0.6);
} }
#opener { #opener {
@ -205,19 +192,13 @@ body {
float: left; float: left;
} }
/* #opener #slider {
width: 25px;
} */
#metainfo { #metainfo {
display: inline-block; display: inline-block;
text-align: center; text-align: center;
max-width: 80%; max-width: 80%;
} }
#title-controls { #title-controls { float: right; }
float: right;
}
#panels a { #panels a {
visibility: hidden; visibility: hidden;
@ -229,22 +210,17 @@ body {
margin-left: 6px; margin-left: 6px;
} }
#panels a::before { #panels a::before { visibility: visible; }
visibility: visible; #panels a:hover { color: #aaa; }
}
#panels a:hover {
color: #AAA;
}
#panels a:active { #panels a:active {
color: #AAA; color: #aaa;
margin: 1px 0 -1px 6px; margin: 1px 0 -1px 6px;
} }
#panels a.active, #panels a.active,
#panels a.active:hover { #panels a.active:hover {
color: #AAA; color: #aaa;
} }
#searchBox { #searchBox {
@ -252,28 +228,11 @@ body {
float: left; float: left;
margin-left: 10px; margin-left: 10px;
margin-top: -1px; margin-top: -1px;
/*
border-radius: 5px;
background: #9b9b9b;
float: left;
margin-left: 5px;
margin-top: -5px;
padding: 3px 10px;
color: #000;
border: none;
outline: none; */
} }
input::-webkit-input-placeholder { input::-webkit-input-placeholder { color: #454545; }
color: #454545; input:-moz-placeholder { color: #454545; }
} input:-ms-placeholder { color: #454545; }
input:-moz-placeholder {
color: #454545;
}
input:-ms-placeholder {
color: #454545;
}
#divider { #divider {
position: absolute; position: absolute;
@ -309,13 +268,11 @@ input:-ms-placeholder {
width: 25%; width: 25%;
height: 100%; height: 100%;
visibility: hidden; visibility: hidden;
-webkit-transition: visibility 0 ease .5s; -webkit-transition: visibility 0 ease 0.5s;
-moz-transition: visibility 0 ease .5s; -moz-transition: visibility 0 ease 0.5s;
-ms-transition: visibility 0 ease .5s; -ms-transition: visibility 0 ease 0.5s;
} }
#sidebar.open #tocView, #sidebar.open #tocView,
#sidebar.open #bookmarksView { #sidebar.open #bookmarksView {
overflow-y: auto; overflow-y: auto;
@ -353,7 +310,7 @@ input:-ms-placeholder {
} }
.list_item a { .list_item a {
color: #AAA; color: #aaa;
text-decoration: none; text-decoration: none;
} }
@ -362,7 +319,7 @@ input:-ms-placeholder {
} }
.list_item a.section { .list_item a.section {
font-size: .8em; font-size: 0.8em;
} }
.list_item.currentChapter > a, .list_item.currentChapter > a,
@ -372,7 +329,7 @@ input:-ms-placeholder {
/* #tocView li.openChapter > a, */ /* #tocView li.openChapter > a, */
.list_item a:hover { .list_item a:hover {
color: #E2E2E2; color: #e2e2e2;
} }
.list_item ul { .list_item ul {
@ -433,7 +390,7 @@ input:-ms-placeholder {
} }
#searchResults a { #searchResults a {
color: #AAA; color: #aaa;
text-decoration: none; text-decoration: none;
} }
@ -449,11 +406,11 @@ input:-ms-placeholder {
} }
#searchResults li > p { #searchResults li > p {
color: #AAA; color: #aaa;
} }
#searchResults li a:hover { #searchResults li a:hover {
color: #E2E2E2; color: #e2e2e2;
} }
#searchView.shown { #searchView.shown {
@ -498,7 +455,7 @@ input:-ms-placeholder {
} }
#note-text[disabled], #note-text[disabled="disabled"]{ #note-text[disabled], #note-text[disabled="disabled"]{
opacity: .5; opacity: 0.5;
} }
#note-anchor { #note-anchor {
@ -507,30 +464,30 @@ input:-ms-placeholder {
} }
#settingsPanel { #settingsPanel {
display:none; display: none;
} }
#settingsPanel h3 { #settingsPanel h3 {
color:#f1f1f1; color: #f1f1f1;
font-family:Georgia, "Times New Roman", Times, serif; font-family: Georgia, "Times New Roman", Times, serif;
margin-bottom:10px; margin-bottom: 10px;
} }
#settingsPanel ul { #settingsPanel ul {
margin-top:60px; margin-top: 60px;
list-style-type:none; list-style-type: none;
} }
#settingsPanel li { #settingsPanel li {
font-size:1em; font-size: 1em;
color:#f1f1f1; color: #f1f1f1;
} }
#settingsPanel .xsmall { font-size:x-small; } #settingsPanel .xsmall { font-size: x-small; }
#settingsPanel .small { font-size:small; } #settingsPanel .small { font-size: small; }
#settingsPanel .medium { font-size:medium; } #settingsPanel .medium { font-size: medium; }
#settingsPanel .large { font-size:large; } #settingsPanel .large { font-size: large; }
#settingsPanel .xlarge { font-size:x-large; } #settingsPanel .xlarge { font-size: x-large; }
.highlight { background-color: yellow } .highlight { background-color: yellow }
@ -558,7 +515,7 @@ input:-ms-placeholder {
left: 0; left: 0;
z-index: 1000; z-index: 1000;
opacity: 0; opacity: 0;
background: rgba(255,255,255,0.8); background: rgba(255, 255, 255, 0.8);
-webkit-transition: all 0.3s; -webkit-transition: all 0.3s;
-moz-transition: all 0.3s; -moz-transition: all 0.3s;
-ms-transition: all 0.3s; -ms-transition: all 0.3s;
@ -591,7 +548,7 @@ input:-ms-placeholder {
font-size: 22px; font-size: 22px;
font-weight: 300; font-weight: 300;
opacity: 0.8; opacity: 0.8;
background: rgba(0,0,0,0.1); background: rgba(0, 0, 0, 0.1);
border-radius: 3px 3px 0 0; border-radius: 3px 3px 0 0;
} }
@ -753,9 +710,9 @@ input:-ms-placeholder {
} }
}*/ }*/
@media only screen @media only screen
and (min-device-width : 768px) and (min-device-width : 768px)
and (max-device-width : 1024px) and (max-device-width : 1024px)
and (orientation : landscape) and (orientation : landscape)
/*and (-webkit-min-device-pixel-ratio: 2)*/ { /*and (-webkit-min-device-pixel-ratio: 2)*/ {
#viewer{ #viewer{
@ -825,23 +782,18 @@ and (orientation : landscape)
font-style: normal; font-style: normal;
font-weight: normal; font-weight: normal;
speak: none; speak: none;
display: inline-block; display: inline-block;
text-decoration: inherit; text-decoration: inherit;
width: 1em; width: 1em;
margin-right: .2em; margin-right: 0.2em;
text-align: center; text-align: center;
/* opacity: .8; */
/* For safety - reset parent styles, that can break glyph codes*/ /* For safety - reset parent styles, that can break glyph codes*/
font-variant: normal; font-variant: normal;
text-transform: none; text-transform: none;
/* you can be more comfortable with increased icons size */ /* you can be more comfortable with increased icons size */
font-size: 112%; font-size: 112%;
} }
.icon-search:before { content: '\e807'; } /* '' */ .icon-search:before { content: '\e807'; } /* '' */
.icon-resize-full-1:before { content: '\e804'; } /* '' */ .icon-resize-full-1:before { content: '\e804'; } /* '' */
.icon-cancel-circled2:before { content: '\e80f'; } /* '' */ .icon-cancel-circled2:before { content: '\e80f'; } /* '' */

View File

@ -1,4 +1,5 @@
/* http://davidwalsh.name/css-tooltips */ /* http://davidwalsh.name/css-tooltips */
/* base CSS element */ /* base CSS element */
.popup { .popup {
background: #eee; background: #eee;
@ -9,10 +10,8 @@
position: fixed; position: fixed;
max-width: 300px; max-width: 300px;
font-size: 12px; font-size: 12px;
display: none; display: none;
margin-left: 2px; margin-left: 2px;
margin-top: 30px; margin-top: 30px;
} }
@ -38,7 +37,7 @@
} }
/* below */ /* below */
.popup:before { .popup::before {
position: absolute; position: absolute;
display: inline-block; display: inline-block;
border-bottom: 10px solid #eee; border-bottom: 10px solid #eee;
@ -51,7 +50,7 @@
content: ''; content: '';
} }
.popup:after { .popup::after {
position: absolute; position: absolute;
display: inline-block; display: inline-block;
border-bottom: 9px solid #eee; border-bottom: 9px solid #eee;
@ -64,33 +63,31 @@
} }
/* above */ /* above */
.popup.above:before { .popup.above::before {
border-bottom: none; border-bottom: none;
border-top: 10px solid #eee; border-top: 10px solid #eee;
border-top-color: rgba(0, 0, 0, 0.2); border-top-color: rgba(0, 0, 0, 0.2);
top: 100%; top: 100%;
} }
.popup.above:after { .popup.above::after {
border-bottom: none; border-bottom: none;
border-top: 9px solid #eee; border-top: 9px solid #eee;
top: 100%; top: 100%;
} }
.popup.left:before, .popup.left::before,
.popup.left:after .popup.left::after {
{
left: 20px; left: 20px;
} }
.popup.right:before, .popup.right::before,
.popup.right:after .popup.right::after {
{
left: auto; left: auto;
right: 20px; right: 20px;
} }
.popup.show,
.popup.show, .popup.on { .popup.on {
display: block; display: block;
} }
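The .popup.show and .popup.on rules above are what actually make the annotation popup visible; the class itself has to be toggled from script, which is not part of this diff. A minimal sketch of such a toggle, assuming jQuery and using #note-anchor purely as an illustrative trigger:

$("#note-anchor").on("click", function () {
    $(".popup").toggleClass("show");   // hypothetical wiring, shown only to illustrate the rule above
});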

View File

@ -1,5 +1,20 @@
.tooltip.bottom .tooltip-inner{font-size:13px;font-family:Open Sans Semibold,Helvetica Neue,Helvetica,Arial,sans-serif;-webkit-font-smoothing:antialiased;-moz-osx-font-smoothing:grayscale;padding:3px 10px;border-radius:4px;background-color:#fff;-webkit-box-shadow:0 4px 10px 0 rgba(0,0,0,.35);box-shadow:0 4px 10px 0 rgba(0,0,0,.35);opacity:1;white-space:nowrap;margin-top:-16px!important;line-height:1.71428571;color:#ddd} .tooltip.bottom .tooltip-inner {
font-size: 13px;
font-family: Open Sans Semibold,Helvetica Neue,Helvetica,Arial,sans-serif;
-webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale;
padding: 3px 10px;
border-radius: 4px;
background-color: #fff;
-webkit-box-shadow: 0 4px 10px 0 rgba(0, 0, 0, 0.35);
box-shadow: 0 4px 10px 0 rgba(0, 0, 0, 0.35);
opacity: 1;
white-space: nowrap;
margin-top: -16px !important;
line-height: 1.71428571;
color: #ddd;
}
@font-face { @font-face {
font-family: 'Grand Hotel'; font-family: 'Grand Hotel';
@ -12,140 +27,302 @@ html.http-error {
margin: 0; margin: 0;
height: 100%; height: 100%;
} }
.http-error body { .http-error body {
margin: 0; margin: 0;
height: 100%; height: 100%;
display: table; display: table;
width: 100%; width: 100%;
} }
.http-error body > div { .http-error body > div {
display: table-cell; display: table-cell;
vertical-align: middle; vertical-align: middle;
text-align: center; text-align: center;
} }
body{background:#f2f2f2}body h2{font-weight:normal;color:#444} body {
body { margin-bottom: 40px;} background: #f2f2f2;
a{color: #45b29d}a:hover{color: #444;} margin-bottom: 40px;
.navigation .nav-head{text-transform:uppercase;color:#999;margin:20px 0}.navigation .nav-head:nth-child(1n+2){border-top:1px solid #ccc;padding-top:20px}
.navigation li a{color:#444;text-decoration:none;display:block;padding:10px}.navigation li a:hover{background:rgba(153,153,153,0.4);border-radius:5px}
.navigation li a span{margin-right:10px}
.navigation .create-shelf{margin:30px 0;font-size:12px;text-align:center}.navigation .create-shelf a{color:#fff;background:#45b29d;padding:10px 20px;border-radius:5px;text-decoration:none}
.container-fluid img{display:block;max-width:100%;height:auto}
.container-fluid .discover{margin-bottom:50px}
.container-fluid .new-books{border-top:1px solid #ccc}.container-fluid .new-books h2{margin:50px 0 0 0}
.container-fluid .book{margin-top:20px}.container-fluid .book .cover{height:225px;position:relative}.container-fluid .book .cover img{border:1px solid #fff;/*border-radius:7px;*/box-sizeing:border-box;height:100%;bottom:0;position:absolute;-webkit-box-shadow: 0 5px 8px -6px #777;-moz-box-shadow: 0 5px 8px -6px #777;box-shadow: 0 5px 8px -6px #777;}
.container-fluid .book .meta{margin-top:10px}.container-fluid .book .meta p{margin:0}
.container-fluid .book .meta .title{font-weight:bold;font-size:15px;color:#444}
.container-fluid .book .meta .author{font-size:12px;color:#999}
.container-fluid .book .meta .rating{margin-top:5px}.rating .glyphicon-star{color:#999}.rating .glyphicon-star.good{color:#45b29d}
.container-fluid .author .author-hidden, .container-fluid .author .author-hidden-divider {
display: none;
} }
.navbar-brand{font-family: 'Grand Hotel', cursive; font-size: 35px; color: #45b29d !important;} body h2 {
.more-stuff{margin-top: 20px; padding-top: 20px; border-top: 1px solid #ccc} font-weight: normal;
.more-stuff>li{margin-bottom: 10px;} color:#444;
.navbar-collapse.in .navbar-nav{margin: 0;} }
span.glyphicon.glyphicon-tags {padding-right: 5px;color: #999;vertical-align: text-top;}
.book-meta {padding-bottom: 20px;} a, .danger,.book-remove, .editable-empty, .editable-empty:hover { color: #45b29d; }
.book-meta .tags a {display: inline;}
.book-meta .identifiers a {display: inline;} .book-remove:hover { color: #23527c; }
.btn-default a { color: #444; }
.btn-default a:hover {
color: #45b29d;
text-decoration: None;
}
.btn-default:hover {
color: #45b29d;
}
.editable-click, a.editable-click, a.editable-click:hover { border-bottom: None; }
.navigation .nav-head {
text-transform: uppercase;
color: #999;
margin: 20px 0;
}
.navigation .nav-head:nth-child(1n+2) {
border-top: 1px solid #ccc;
padding-top: 20px;
}
.navigation li a {
color: #444;
text-decoration: none;
display: block;
padding: 10px;
}
.navigation li a:hover {
background: rgba(153, 153, 153, 0.4);
border-radius: 5px;
}
.navigation li a span { margin-right: 10px; }
.navigation .create-shelf {
margin: 30px 0;
font-size: 12px;
text-align: center;
}
.navigation .create-shelf a {
color: #fff;
background: #45b29d;
padding: 10px 20px;
border-radius: 5px;
text-decoration: none;
}
.row.display-flex {
display: flex;
flex-wrap: wrap;
}
.container-fluid img {
display: block;
max-width: 100%;
height: auto;
}
.container-fluid .discover{ margin-bottom: 50px; }
.container-fluid .new-books { border-top: 1px solid #ccc; }
.container-fluid .new-books h2 { margin: 50px 0 0 0; }
.container-fluid .book {
margin-top: 20px;
display: flex;
flex-direction: column;
}
.container-fluid .book .cover {
height: 225px;
position: relative;
}
.container-fluid .book .cover img {
border: 1px solid #fff;
box-sizing: border-box;
height: 100%;
bottom: 0;
position: absolute;
-webkit-box-shadow: 0 5px 8px -6px #777;
-moz-box-shadow: 0 5px 8px -6px #777;
box-shadow: 0 5px 8px -6px #777;
}
.container-fluid .book .meta { margin-top: 10px; }
.container-fluid .book .meta p { margin: 0; }
.container-fluid .book .meta .title {
font-weight: bold;
font-size: 15px;
color: #444;
}
.container-fluid .book .meta .series {
font-weight: 400;
font-size: 12px;
color: #444;
}
.container-fluid .book .meta .author {
font-size: 12px;
color: #999;
}
.container-fluid .book .meta .rating { margin-top: 5px; }
.rating .glyphicon-star-empty { color: #444; }
.rating .glyphicon-star.good { color: #444; }
.rating-clear .glyphicon-remove { color: #333 }
.container-fluid .author .author-hidden, .container-fluid .author .author-hidden-divider { display: none; }
.navbar-brand {
font-family: 'Grand Hotel', cursive;
font-size: 35px;
color: #45b29d !important;
}
.more-stuff {
margin-top: 20px;
padding-top: 20px;
border-top: 1px solid #ccc;
}
.more-stuff>li { margin-bottom: 10px; }
.navbar-collapse.in .navbar-nav { margin: 0; }
span.glyphicon.glyphicon-tags {
padding-right: 5px;
color: #999;
vertical-align: text-top;
}
.book-meta { padding-bottom: 20px; }
.book-meta .tags a { display: inline; }
.book-meta .identifiers a { display: inline; }
.container-fluid .single .cover img { .container-fluid .single .cover img {
border: 1px solid #fff; border: 1px solid #fff;
/*border-radius: 7px;*/ box-sizing: border-box;
box-sizeing: border-box; -webkit-box-shadow: 0 5px 8px -6px #777;
-webkit-box-shadow: 0 5px 8px -6px #777; -moz-box-shadow: 0 5px 8px -6px #777;
-moz-box-shadow: 0 5px 8px -6px #777; box-shadow: 0 5px 8px -6px #777;
box-shadow: 0 5px 8px -6px #777;
} }
.navbar-default .navbar-toggle .icon-bar {background-color: #000;} .navbar-default .navbar-toggle .icon-bar {background-color: #000; }
.navbar-default .navbar-toggle {border-color: #000;} .navbar-default .navbar-toggle {border-color: #000; }
.cover { margin-bottom: 10px;} .cover { margin-bottom: 10px; }
.cover .badge{
position: absolute;
top: 2px;
left: 2px;
background-color: #777;
}
.cover-height { max-height: 100px;} .cover-height { max-height: 100px;}
.col-sm-2 a .cover-small {
margin: 5px;
max-height: 200px;
}
.btn-file {position: relative; overflow: hidden;} .btn-file {position: relative; overflow: hidden;}
.btn-file input[type=file] {position: absolute; top: 0; right: 0; min-width: 100%; min-height: 100%; font-size: 100px; text-align: right; filter: alpha(opacity=0); opacity: 0; outline: none; background: white; cursor: inherit; display: block;}
.btn-file input[type=file] {
position: absolute;
top: 0;
right: 0;
min-width: 100%;
min-height: 100%;
font-size: 100px;
text-align: right;
filter: alpha(opacity=0);
opacity: 0;
outline: none;
background: white;
cursor: inherit;
display: block;
}
.btn-toolbar .btn,.discover .btn { margin-bottom: 5px; } .btn-toolbar .btn,.discover .btn { margin-bottom: 5px; }
.button-link {color:#fff;} .button-link {color: #fff; }
.btn-primary:hover, .btn-primary:focus, .btn-primary:active, .btn-primary.active, .open .dropdown-toggle.btn-primary{ background-color: #1C5484; } .btn-primary:hover, .btn-primary:focus, .btn-primary:active, .btn-primary.active, .open .dropdown-toggle.btn-primary{ background-color: #1C5484; }
.btn-primary.disabled, .btn-primary[disabled], fieldset[disabled] .btn-primary, .btn-primary.disabled:hover, .btn-primary[disabled]:hover, fieldset[disabled] .btn-primary:hover, .btn-primary.disabled:focus, .btn-primary[disabled]:focus, fieldset[disabled] .btn-primary:focus, .btn-primary.disabled:active, .btn-primary[disabled]:active, fieldset[disabled] .btn-primary:active, .btn-primary.disabled.active, .btn-primary[disabled].active, fieldset[disabled] .btn-primary.active { background-color: #89B9E2; } .btn-primary.disabled, .btn-primary[disabled], fieldset[disabled] .btn-primary, .btn-primary.disabled:hover, .btn-primary[disabled]:hover, fieldset[disabled] .btn-primary:hover, .btn-primary.disabled:focus, .btn-primary[disabled]:focus, fieldset[disabled] .btn-primary:focus, .btn-primary.disabled:active, .btn-primary[disabled]:active, fieldset[disabled] .btn-primary:active, .btn-primary.disabled.active, .btn-primary[disabled].active, fieldset[disabled] .btn-primary.active { background-color: #89B9E2; }
.btn-toolbar>.btn+.btn, .btn-toolbar>.btn-group+.btn, .btn-toolbar>.btn+.btn-group, .btn-toolbar>.btn-group+.btn-group { margin-left:0px; } .btn-toolbar>.btn+.btn, .btn-toolbar>.btn-group+.btn, .btn-toolbar>.btn+.btn-group, .btn-toolbar>.btn-group+.btn-group { margin-left: 0px; }
.panel-body {background-color: #f5f5f5;} .panel-body {background-color: #f5f5f5; }
.spinner {margin:0 41%;} .spinner {margin: 0 41%; }
.spinner2 {margin:0 41%;} .spinner2 {margin: 0 41%; }
.intend-form { margin-left:20px; }
table .bg-dark-danger {background-color: #d9534f; color: #fff; }
table .bg-dark-danger a {color: #fff; }
table .bg-dark-danger:hover {background-color: #c9302c; }
table .bg-primary:hover {background-color: #1C5484; }
table .bg-primary a {color: #fff; }
.block-label {display: block;} .block-label {display: block;}
.fake-input {position: absolute; pointer-events: none; top: 0;} .fake-input {position: absolute; pointer-events: none; top: 0; }
input.pill { position: absolute; opacity: 0; } input.pill { position: absolute; opacity: 0; }
input.pill + label { input.pill + label {
border: 2px solid #45b29d; border: 2px solid #45b29d;
border-radius: 15px; border-radius: 15px;
color: #45b29d; color: #45b29d;
cursor: pointer; cursor: pointer;
display: inline-block; display: inline-block;
padding: 3px 15px; padding: 3px 15px;
user-select: none; user-select: none;
-webkit-font-smoothing: antialiased; -webkit-font-smoothing: antialiased;
-moz-osx-font-smoothing: grayscale; -moz-osx-font-smoothing: grayscale;
} }
input.pill:checked + label { input.pill:checked + label {
background-color: #45b29d; background-color: #45b29d;
border-color: #fff; border-color: #fff;
color: #fff; color: #fff;
}
input.pill:not(:checked) + label .glyphicon {
display: none;
} }
.author-bio img {margin: 0 1em 1em 0;} input.pill:not(:checked) + label .glyphicon { display: none; }
.author-link {display: inline-block; margin-top: 10px; width: 100px;}
.author-link img {display: block; height: 100%;}
#remove-from-shelves .btn, .author-bio img { margin: 0 1em 1em 0; }
#shelf-action-errors { .author-link { display: inline-block; margin-top: 10px; width: 100px; }
margin-left: 5px; .author-link img { display: block; height: 100%; }
} #remove-from-shelves .btn, #shelf-action-errors { margin-left: 5px; }
.tags_click, .serie_click, .language_click {margin-right: 5px;} .tags_click, .serie_click, .language_click { margin-right: 5px; }
#meta-info { #meta-info {
height:600px; height: 600px;
overflow-y:scroll; overflow-y: scroll;
} }
.media-list {
padding-right:15px; .media-list { padding-right: 15px; }
.media-body p { text-align: justify; }
#meta-info img {
max-height: 150px;
max-width: 100px;
cursor: pointer;
} }
.media-body p {
text-align: justify;
}
#meta-info img { max-height: 150px; max-width: 100px; cursor: pointer; }
.padded-bottom { margin-bottom: 15px; } .padded-bottom { margin-bottom: 15px; }
.upload-format-input-text { display: initial; }
#btn-upload-format { display: none; }
.upload-cover-input-text { display: initial; }
#btn-upload-cover { display: none; }
.panel-title > a { text-decoration: none; }
.editable-buttons {
display:inline-block;
margin-left: 7px;
}
.upload-format-input-text {display: initial;} .editable-input { display:inline-block; }
#btn-upload-format {display: none;}
.upload-cover-input-text {display: initial;} .editable-cancel {
#btn-upload-cover {display: none;} margin-bottom: 0px !important;
margin-left: 7px !important;
.panel-title > a { text-decoration: none;} }
.editable-buttons { display:inline-block; margin-left: 7px ;}
.editable-input { display:inline-block;}
.editable-cancel { margin-bottom: 0px !important; margin-left: 7px !important;}
.editable-submit { margin-bottom: 0px !important;}
.editable-submit { margin-bottom: 0px !important; }
.filterheader { margin-bottom: 20px; } .filterheader { margin-bottom: 20px; }
.errorlink { margin-top: 20px; }
.emailconfig { margin-top: 10px; }
.errorlink {margin-top: 20px;}
.modal-body .comments { .modal-body .comments {
max-height:300px; max-height: 300px;
overflow-y: auto; overflow-y: auto;
} }
div.log { div.log {
@ -158,3 +335,4 @@ div.log {
white-space: nowrap; white-space: nowrap;
padding: 0.5em; padding: 0.5em;
} }

View File

@ -1,8 +1,8 @@
@media (min-device-width: 768px) { @media (min-device-width: 768px) {
.upload-modal-dialog { .upload-modal-dialog {
position: absolute; position: absolute;
top: 45%; top: 45%;
left: 50%; left: 50%;
transform: translate(-50%, -50%) !important; transform: translate(-50%, -50%) !important;
} }
} }

Binary file not shown.

Before: 178 KiB
After: 19 KiB

View File

@ -40,7 +40,8 @@ function alphanumCase(a, b) {
while (i = (j = t.charAt(x++)).charCodeAt(0)) { while (i = (j = t.charAt(x++)).charCodeAt(0)) {
var m = (i === 46 || (i >= 48 && i <= 57)); var m = (i === 46 || (i >= 48 && i <= 57));
if (m !== n) { // Compare has to be with != otherwise fails
if (m != n) {
tz[++y] = ""; tz[++y] = "";
n = m; n = m;
} }
@ -55,7 +56,8 @@ function alphanumCase(a, b) {
for (var x = 0; aa[x] && bb[x]; x++) { for (var x = 0; aa[x] && bb[x]; x++) {
if (aa[x] !== bb[x]) { if (aa[x] !== bb[x]) {
var c = Number(aa[x]), d = Number(bb[x]); var c = Number(aa[x]), d = Number(bb[x]);
if (c === aa[x] && d === bb[x]) { // Compare has to be with == otherwise fails
if (c == aa[x] && d == bb[x]) {
return c - d; return c - d;
} else { } else {
return (aa[x] > bb[x]) ? 1 : -1; return (aa[x] > bb[x]) ? 1 : -1;
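Both comments added above ("Compare has to be with != otherwise fails") come down to type coercion: aa[x] and bb[x] are string chunks, while c and d are their Number() conversions, so a strict === between them can never hold and the numeric branch would never run. A short illustration, not part of the commit and runnable in any JS console:

var chunk = "12";            // a purely numeric chunk produced by the splitter
var num = Number(chunk);     // 12
console.log(num === chunk);  // false: a number is never strictly equal to a string
console.log(num == chunk);   // true: with coercion, numeric chunks are compared as numbers,
                             // so "2" sorts before "10" instead of after it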
@ -409,6 +411,19 @@ bitjs.archive = bitjs.archive || {};
return "unrar.js"; return "unrar.js";
}; };
/**
* Unrarrer5
* @extends {bitjs.archive.Unarchiver}
* @constructor
*/
bitjs.archive.Unrarrer5 = function(arrayBuffer, optPathToBitJS) {
bitjs.base(this, arrayBuffer, optPathToBitJS);
};
bitjs.inherits(bitjs.archive.Unrarrer5, bitjs.archive.Unarchiver);
bitjs.archive.Unrarrer5.prototype.getScriptFileName = function() {
return "unrar5.js";
};
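The new Unrarrer5 differs from the existing Unrarrer only in the worker script named above, which the base Unarchiver resolves against the optPathToBitJS passed to the constructor. Intended call pattern, mirroring the kthoom.js hunk further down in this commit (variable names are illustrative):

var unarchiver = new bitjs.archive.Unrarrer5(ab, pathToBitJS);
unarchiver.start();   // extraction results arrive through the usual unarchive events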
/** /**
* Untarrer * Untarrer
* @extends {bitjs.archive.Unarchiver} * @extends {bitjs.archive.Unarchiver}

View File

@ -9,10 +9,18 @@
/** /**
* CRC Implementation. * CRC Implementation.
*/ */
/* global Uint8Array, Uint32Array, bitjs, DataView */ /* global Uint8Array, Uint32Array, bitjs, DataView, mem */
/* exported MAXWINMASK, UnpackFilter */ /* exported MAXWINMASK, UnpackFilter */
var CRCTab = new Array(256).fill(0); function emptyArr(n, v) {
var arr = [];
for (var i = 0; i < n; i += 1) {
arr[i] = v;
}
return arr;
}
var CRCTab = emptyArr(256, 0);
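emptyArr replaces the ES2015 Array.prototype.fill call used before; the likely motive is that fill is unavailable in some of the engines this code now has to run in, though the diff itself does not say so. Quick self-contained equivalence check:

function emptyArr(n, v) { var arr = []; for (var i = 0; i < n; i += 1) { arr[i] = v; } return arr; }
console.log(emptyArr(4, 0));   // [0, 0, 0, 0], same as new Array(4).fill(0)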
function initCRC() { function initCRC() {
for (var i = 0; i < 256; ++i) { for (var i = 0; i < 256; ++i) {

View File

@ -14,10 +14,10 @@
/* global VM_FIXEDGLOBALSIZE, VM_GLOBALMEMSIZE, MAXWINMASK, VM_GLOBALMEMADDR, MAXWINSIZE */ /* global VM_FIXEDGLOBALSIZE, VM_GLOBALMEMSIZE, MAXWINMASK, VM_GLOBALMEMADDR, MAXWINSIZE */
// This file expects to be invoked as a Worker (see onmessage below). // This file expects to be invoked as a Worker (see onmessage below).
importScripts("../io/bitstream.js"); /*importScripts("../io/bitstream.js");
importScripts("../io/bytebuffer.js"); importScripts("../io/bytebuffer.js");
importScripts("archive.js"); importScripts("archive.js");
importScripts("rarvm.js"); importScripts("rarvm.js");*/
// Progress variables. // Progress variables.
var currentFilename = ""; var currentFilename = "";
@ -29,19 +29,21 @@ var totalFilesInArchive = 0;
// Helper functions. // Helper functions.
var info = function(str) { var info = function(str) {
postMessage(new bitjs.archive.UnarchiveInfoEvent(str)); console.log(str);
// postMessage(new bitjs.archive.UnarchiveInfoEvent(str));
}; };
var err = function(str) { var err = function(str) {
postMessage(new bitjs.archive.UnarchiveErrorEvent(str)); console.log(str);
// postMessage(new bitjs.archive.UnarchiveErrorEvent(str));
}; };
var postProgress = function() { var postProgress = function() {
postMessage(new bitjs.archive.UnarchiveProgressEvent( /*postMessage(new bitjs.archive.UnarchiveProgressEvent(
currentFilename, currentFilename,
currentFileNumber, currentFileNumber,
currentBytesUnarchivedInFile, currentBytesUnarchivedInFile,
currentBytesUnarchived, currentBytesUnarchived,
totalUncompressedBytesInArchive, totalUncompressedBytesInArchive,
totalFilesInArchive)); totalFilesInArchive));*/
}; };
// shows a byte value as its hex representation // shows a byte value as its hex representation
@ -1298,7 +1300,7 @@ var unrar = function(arrayBuffer) {
totalUncompressedBytesInArchive = 0; totalUncompressedBytesInArchive = 0;
totalFilesInArchive = 0; totalFilesInArchive = 0;
postMessage(new bitjs.archive.UnarchiveStartEvent()); //postMessage(new bitjs.archive.UnarchiveStartEvent());
var bstream = new bitjs.io.BitStream(arrayBuffer, false /* rtl */); var bstream = new bitjs.io.BitStream(arrayBuffer, false /* rtl */);
var header = new RarVolumeHeader(bstream); var header = new RarVolumeHeader(bstream);
@ -1348,7 +1350,7 @@ var unrar = function(arrayBuffer) {
localfile.unrar(); localfile.unrar();
if (localfile.isValid) { if (localfile.isValid) {
postMessage(new bitjs.archive.UnarchiveExtractEvent(localfile)); // postMessage(new bitjs.archive.UnarchiveExtractEvent(localfile));
postProgress(); postProgress();
} }
} }
@ -1358,7 +1360,7 @@ var unrar = function(arrayBuffer) {
} else { } else {
err("Invalid RAR file"); err("Invalid RAR file");
} }
postMessage(new bitjs.archive.UnarchiveFinishEvent()); // postMessage(new bitjs.archive.UnarchiveFinishEvent());
}; };
// event.data.file has the ArrayBuffer. // event.data.file has the ArrayBuffer.
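Taken together, the changes in this file (importScripts commented out, console.log in place of postMessage, the Unarchive*Event posts disabled) read as if unrar.js is no longer run as a standalone Worker but loaded inline next to unrar5.js; that is an inference, the diff gives no stated reason. A guard like the following would let one file support both modes (illustration only, not part of the commit):

// Pick a reporting channel depending on whether we are inside a Worker.
var inWorker = (typeof importScripts === "function");
var report = inWorker
    ? function (msg) { postMessage(msg); }    // real Worker: send to the page
    : function (msg) { console.log(msg); };   // loaded inline: just log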

File diff suppressed because it is too large

File diff suppressed because it is too large

View File

@ -25,6 +25,14 @@ $("#have_read_cb").on("change", function() {
$(this).closest("form").submit(); $(this).closest("form").submit();
}); });
$(function() {
$("#archived_form").ajaxForm();
});
$("#archived_cb").on("change", function() {
$(this).closest("form").submit();
});
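The block above wires #archived_form through the jQuery Form plugin and auto-submits it whenever the #archived_cb checkbox changes, the same pattern the read/unread checkbox uses further up. ajaxForm also accepts an options object, so a feedback hook could be attached like this (sketch only; the callback is not part of the commit):

$(function() {
    $("#archived_form").ajaxForm({
        success: function() {
            console.log("archived state saved");   // hypothetical feedback
        }
    });
});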
(function() { (function() {
var templates = { var templates = {
add: _.template( add: _.template(

View File

@ -0,0 +1,70 @@
/* This file is part of the Calibre-Web (https://github.com/janeczku/calibre-web)
* Copyright (C) 2018 OzzieIsaacs
*
* This program is free software: you can redistribute it and/or modify
* it under the terms of the GNU General Public License as published by
* the Free Software Foundation, either version 3 of the License, or
* (at your option) any later version.
*
* This program is distributed in the hope that it will be useful,
* but WITHOUT ANY WARRANTY; without even the implied warranty of
* MERCHANTABILITY or FITNESS FOR A PARTICULAR PURPOSE. See the
* GNU General Public License for more details.
*
* You should have received a copy of the GNU General Public License
* along with this program. If not, see <http://www.gnu.org/licenses/>.
*/
var $list = $("#list").isotope({
itemSelector: ".book",
layoutMode: "fitRows",
getSortData: {
title: ".title",
}
});
$("#desc").click(function() {
var page = $(this).data("id");
$.ajax({
method:"post",
contentType: "application/json; charset=utf-8",
dataType: "json",
url: window.location.pathname + "/../../ajax/view",
data: "{\"" + page + "\": {\"dir\": \"desc\"}}",
});
$list.isotope({
sortBy: "name",
sortAscending: true
});
return;
});
$("#asc").click(function() {
var page = $(this).data("id");
$.ajax({
method:"post",
contentType: "application/json; charset=utf-8",
dataType: "json",
url: window.location.pathname + "/../../ajax/view",
data: "{\"" + page + "\": {\"dir\": \"asc\"}}",
});
$list.isotope({
sortBy: "name",
sortAscending: false
});
return;
});
$("#all").click(function() {
// go through all elements and make them visible
$list.isotope({ filter: function() {
return true;
} })
});
$(".char").click(function() {
var character = this.innerText;
$list.isotope({ filter: function() {
return this.attributes["data-id"].value.charAt(0).toUpperCase() == character;
} })
});
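This new file drives an Isotope grid (itemSelector ".book") and persists the chosen sort direction by POSTing to ../../ajax/view a body of the form {"<page>": {"dir": "desc"}}, where <page> comes from the clicked control's data-id attribute. The same payload built with JSON.stringify, shown only to make the wire format explicit ("series" is a made-up example value):

var page = "series";                   // in the file above: $(this).data("id")
var payload = {};
payload[page] = { dir: "desc" };
console.log(JSON.stringify(payload));  // {"series":{"dir":"desc"}}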

View File

@ -19,6 +19,17 @@ var direction = 0; // Descending order
var sort = 0; // Show sorted entries var sort = 0; // Show sorted entries
$("#sort_name").click(function() { $("#sort_name").click(function() {
var class_name = $("h1").attr('Class') + "_sort_name";
var obj = {};
obj[class_name] = sort;
/*$.ajax({
method:"post",
contentType: "application/json; charset=utf-8",
dataType: "json",
url: window.location.pathname + "/../../ajax/view",
data: JSON.stringify({obj}),
});*/
var count = 0; var count = 0;
var index = 0; var index = 0;
var store; var store;
@ -40,12 +51,10 @@ $("#sort_name").click(function() {
count++; count++;
} }
}); });
/*listItems.sort(function(a,b){
return $(a).children()[1].innerText.localeCompare($(b).children()[1].innerText)
});*/
// Find count of middle element // Find count of middle element
if (count > 20) { if (count > 20) {
var middle = parseInt(count / 2) + (count % 2); var middle = parseInt(count / 2, 10) + (count % 2);
// search for the middle of all visibe elements // search for the middle of all visibe elements
$(".row").each(function() { $(".row").each(function() {
index++; index++;
@ -66,6 +75,14 @@ $("#desc").click(function() {
if (direction === 0) { if (direction === 0) {
return; return;
} }
var page = $(this).data("id");
$.ajax({
method:"post",
contentType: "application/json; charset=utf-8",
dataType: "json",
url: window.location.pathname + "/../../ajax/view",
data: "{\"" + page + "\": {\"dir\": \"desc\"}}",
});
var index = 0; var index = 0;
var list = $("#list"); var list = $("#list");
var second = $("#second"); var second = $("#second");
@ -102,9 +119,18 @@ $("#desc").click(function() {
$("#asc").click(function() { $("#asc").click(function() {
if (direction === 1) { if (direction === 1) {
return; return;
} }
var page = $(this).data("id");
$.ajax({
method:"post",
contentType: "application/json; charset=utf-8",
dataType: "json",
url: window.location.pathname + "/../../ajax/view",
data: "{\"" + page + "\": {\"dir\": \"asc\"}}",
});
var index = 0; var index = 0;
var list = $("#list"); var list = $("#list");
var second = $("#second"); var second = $("#second");
@ -131,7 +157,6 @@ $("#asc").click(function() {
}); });
// middle = parseInt(elementLength / 2) + (elementLength % 2); // middle = parseInt(elementLength / 2) + (elementLength % 2);
list.append(reversed.slice(0, index)); list.append(reversed.slice(0, index));
second.append(reversed.slice(index, elementLength)); second.append(reversed.slice(index, elementLength));
} else { } else {
@ -146,7 +171,7 @@ $("#all").click(function() {
// Find count of middle element // Find count of middle element
var listItems = $("#list").children(".row"); var listItems = $("#list").children(".row");
var listlength = listItems.length; var listlength = listItems.length;
var middle = parseInt(listlength / 2) + (listlength % 2); var middle = parseInt(listlength / 2, 10) + (listlength % 2);
// go through all elements and make them visible // go through all elements and make them visible
listItems.each(function() { listItems.each(function() {
$(this).show(); $(this).show();
@ -178,7 +203,7 @@ $(".char").click(function() {
}); });
if (count > 20) { if (count > 20) {
// Find count of middle element // Find count of middle element
var middle = parseInt(count / 2) + (count % 2); var middle = parseInt(count / 2, 10) + (count % 2);
// search for the middle of all visibe elements // search for the middle of all visibe elements
$(".row").each(function() { $(".row").each(function() {
index++; index++;
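The recurring change in this file is the explicit radix in parseInt(x, 10) wherever the visible list is split at its middle element. Two lines showing what that guards against (runnable anywhere):

console.log(parseInt(7 / 2, 10));   // 3: the division yields 3.5 and parseInt keeps the integer part
console.log(parseInt("08", 10));    // 8: without the radix, older engines could treat "08" as octal and return 0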

View File

@ -15,23 +15,29 @@
* along with this program. If not, see <http://www.gnu.org/licenses/>. * along with this program. If not, see <http://www.gnu.org/licenses/>.
*/ */
/* /*
* Get Metadata from Douban Books api and Google Books api * Get Metadata from Douban Books api and Google Books api and ComicVine
* Google Books api document: https://developers.google.com/books/docs/v1/using * Google Books api document: https://developers.google.com/books/docs/v1/using
* Douban Books api document: https://developers.douban.com/wiki/?title=book_v2 (Chinese Only) * Douban Books api document: https://developers.douban.com/wiki/?title=book_v2 (Chinese Only)
* ComicVine api document: https://comicvine.gamespot.com/api/documentation
*/ */
/* global _, i18nMsg, tinymce */ /* global _, i18nMsg, tinymce */
// var dbResults = []; var dbResults = [];
var ggResults = []; var ggResults = [];
var cvResults = [];
$(function () { $(function () {
var msg = i18nMsg; var msg = i18nMsg;
/*var douban = "https://api.douban.com"; var douban = "https://api.douban.com";
var dbSearch = "/v2/book/search";*/ var dbSearch = "/v2/book/search";
// var dbDone = true; var dbDone = 0;
var google = "https://www.googleapis.com"; var google = "https://www.googleapis.com";
var ggSearch = "/books/v1/volumes"; var ggSearch = "/books/v1/volumes";
var ggDone = false; var ggDone = 0;
var comicvine = "https://comicvine.gamespot.com";
var cvSearch = "/api/search/";
var cvDone = 0;
var showFlag = 0; var showFlag = 0;
@ -43,12 +49,23 @@ $(function () {
function populateForm (book) { function populateForm (book) {
tinymce.get("description").setContent(book.description); tinymce.get("description").setContent(book.description);
$("#bookAuthor").val(book.authors); var uniqueTags = [];
$.each(book.tags, function(i, el) {
if ($.inArray(el, uniqueTags) === -1) uniqueTags.push(el);
});
var ampSeparatedAuthors = (book.authors || []).join(" & ");
$("#bookAuthor").val(ampSeparatedAuthors);
$("#book_title").val(book.title); $("#book_title").val(book.title);
$("#tags").val(book.tags.join(",")); $("#tags").val(uniqueTags.join(","));
$("#rating").data("rating").setValue(Math.round(book.rating)); $("#rating").data("rating").setValue(Math.round(book.rating));
$(".cover img").attr("src", book.cover); $(".cover img").attr("src", book.cover);
$("#cover_url").val(book.cover); $("#cover_url").val(book.cover);
$("#pubdate").val(book.publishedDate);
$("#publisher").val(book.publisher);
if (typeof book.series !== "undefined") {
$("#series").val(book.series);
}
} }
function showResult () { function showResult () {
@ -56,76 +73,164 @@ $(function () {
if (showFlag === 1) { if (showFlag === 1) {
$("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>"); $("#meta-info").html("<ul id=\"book-list\" class=\"media-list\"></ul>");
} }
if (!ggDone) { if ((ggDone === 3 || (ggDone === 1 && ggResults.length === 0)) &&
(dbDone === 3 || (dbDone === 1 && dbResults.length === 0)) &&
(cvDone === 3 || (cvDone === 1 && cvResults.length === 0))) {
$("#meta-info").html("<p class=\"text-danger\">" + msg.no_result + "</p>"); $("#meta-info").html("<p class=\"text-danger\">" + msg.no_result + "</p>");
return; return;
} }
if (ggDone && ggResults.length > 0) { function formatDate (date) {
ggResults.forEach(function(result) { var d = new Date(date),
var book = { month = "" + (d.getMonth() + 1),
id: result.id, day = "" + d.getDate(),
title: result.volumeInfo.title, year = d.getFullYear();
authors: result.volumeInfo.authors || [],
description: result.volumeInfo.description || "",
publisher: result.volumeInfo.publisher || "",
publishedDate: result.volumeInfo.publishedDate || "",
tags: result.volumeInfo.categories || [],
rating: result.volumeInfo.averageRating || 0,
cover: result.volumeInfo.imageLinks ?
result.volumeInfo.imageLinks.thumbnail :
"/static/generic_cover.jpg",
url: "https://books.google.com/books?id=" + result.id,
source: {
id: "google",
description: "Google Books",
url: "https://books.google.com/"
}
};
var $book = $(templates.bookResult(book)); if (month.length < 2) {
$book.find("img").on("click", function () { month = "0" + month;
populateForm(book); }
}); if (day.length < 2) {
day = "0" + day;
}
$("#book-list").append($book); return [year, month, day].join("-");
});
ggDone = false;
} }
/*if (dbDone && dbResults.length > 0) {
dbResults.forEach(function(result) {
var book = {
id: result.id,
title: result.title,
authors: result.author || [],
description: result.summary,
publisher: result.publisher || "",
publishedDate: result.pubdate || "",
tags: result.tags.map(function(tag) {
return tag.title;
}),
rating: result.rating.average || 0,
cover: result.image,
url: "https://book.douban.com/subject/" + result.id,
source: {
id: "douban",
description: "Douban Books",
url: "https://book.douban.com/"
}
};
if (book.rating > 0) { if (ggResults.length > 0) {
book.rating /= 2; if (ggDone < 2) {
} ggResults.forEach(function(result) {
var book = {
id: result.id,
title: result.volumeInfo.title,
authors: result.volumeInfo.authors || [],
description: result.volumeInfo.description || "",
publisher: result.volumeInfo.publisher || "",
publishedDate: result.volumeInfo.publishedDate || "",
tags: result.volumeInfo.categories || [],
rating: result.volumeInfo.averageRating || 0,
cover: result.volumeInfo.imageLinks ?
result.volumeInfo.imageLinks.thumbnail : location + "/../../../static/generic_cover.jpg",
url: "https://books.google.com/books?id=" + result.id,
source: {
id: "google",
description: "Google Books",
url: "https://books.google.com/"
}
};
var $book = $(templates.bookResult(book)); var $book = $(templates.bookResult(book));
$book.find("img").on("click", function () { $book.find("img").on("click", function () {
populateForm(book); populateForm(book);
});
$("#book-list").append($book);
}); });
ggDone = 2;
} else {
ggDone = 3;
}
}
$("#book-list").append($book); if (dbResults.length > 0) {
}); if (dbDone < 2) {
dbDone = false; dbResults.forEach(function(result) {
}*/ var seriesTitle = "";
if (result.series) {
seriesTitle = result.series.title;
}
var dateFomers = result.pubdate.split("-");
var publishedYear = parseInt(dateFomers[0]);
var publishedMonth = parseInt(dateFomers[1]);
var publishedDate = new Date(publishedYear, publishedMonth - 1, 1);
publishedDate = formatDate(publishedDate);
var book = {
id: result.id,
title: result.title,
authors: result.author || [],
description: result.summary,
publisher: result.publisher || "",
publishedDate: publishedDate || "",
tags: result.tags.map(function(tag) {
return tag.title.toLowerCase().replace(/,/g, "_");
}),
rating: result.rating.average || 0,
series: seriesTitle || "",
cover: result.image,
url: "https://book.douban.com/subject/" + result.id,
source: {
id: "douban",
description: "Douban Books",
url: "https://book.douban.com/"
}
};
if (book.rating > 0) {
book.rating /= 2;
}
var $book = $(templates.bookResult(book));
$book.find("img").on("click", function () {
populateForm(book);
});
$("#book-list").append($book);
});
dbDone = 2;
} else {
dbDone = 3;
}
}
if (cvResults.length > 0) {
if (cvDone < 2) {
cvResults.forEach(function(result) {
var seriesTitle = "";
if (result.volume.name) {
seriesTitle = result.volume.name;
}
var dateFomers = "";
if (result.store_date) {
dateFomers = result.store_date.split("-");
} else {
dateFomers = result.date_added.split("-");
}
var publishedYear = parseInt(dateFomers[0]);
var publishedMonth = parseInt(dateFomers[1]);
var publishedDate = new Date(publishedYear, publishedMonth - 1, 1);
publishedDate = formatDate(publishedDate);
var book = {
id: result.id,
title: seriesTitle + " #" + ("00" + result.issue_number).slice(-3) + " - " + result.name,
authors: result.author || [],
description: result.description,
publisher: "",
publishedDate: publishedDate || "",
tags: ["Comics", seriesTitle],
rating: 0,
series: seriesTitle || "",
cover: result.image.original_url,
url: result.site_detail_url,
source: {
id: "comicvine",
description: "ComicVine Books",
url: "https://comicvine.gamespot.com/"
}
};
var $book = $(templates.bookResult(book));
$book.find("img").on("click", function () {
populateForm(book);
});
$("#book-list").append($book);
});
cvDone = 2;
} else {
cvDone = 3;
}
}
} }
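formatDate above zero-pads month and day into a YYYY-MM-DD string before it is written into #pubdate; the Douban and ComicVine branches feed it a Date built from the year and month of the raw pubdate value. Worked example using the function as defined above:

formatDate(new Date(2020, 4, 1));   // "2020-05-01" (month 4 because JavaScript Date months are zero-based)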
function ggSearchBook (title) { function ggSearchBook (title) {
@ -137,20 +242,20 @@ $(function () {
success: function success(data) { success: function success(data) {
if ("items" in data) { if ("items" in data) {
ggResults = data.items; ggResults = data.items;
ggDone = true;
} }
}, },
complete: function complete() { complete: function complete() {
ggDone = true; ggDone = 1;
showResult(); showResult();
$("#show-google").trigger("change"); $("#show-google").trigger("change");
} }
}); });
} }
/*function dbSearchBook (title) { function dbSearchBook (title) {
var apikey = "054022eaeae0b00e0fc068c0c0a2102a";
$.ajax({ $.ajax({
url: douban + dbSearch + "?q=" + title + "&fields=all&count=10", url: douban + dbSearch + "?apikey=" + apikey + "&q=" + title + "&fields=all&count=10",
type: "GET", type: "GET",
dataType: "jsonp", dataType: "jsonp",
jsonp: "callback", jsonp: "callback",
@ -158,22 +263,49 @@ $(function () {
dbResults = data.books; dbResults = data.books;
}, },
error: function error() { error: function error() {
$("#meta-info").html("<p class=\"text-danger\">" + msg.search_error + "!</p>"+ $("#meta-info")[0].innerHTML) $("#meta-info").html("<p class=\"text-danger\">" + msg.search_error + "!</p>" + $("#meta-info")[0].innerHTML);
}, },
complete: function complete() { complete: function complete() {
dbDone = true; dbDone = 1;
showResult(); showResult();
$("#show-douban").trigger("change"); $("#show-douban").trigger("change");
} }
}); });
}*/ }
function cvSearchBook (title) {
var apikey = "57558043c53943d5d1e96a9ad425b0eb85532ee6";
title = encodeURIComponent(title);
$.ajax({
url: comicvine + cvSearch + "?api_key=" + apikey + "&resources=issue&query=" + title + "&sort=name:desc&format=jsonp",
type: "GET",
dataType: "jsonp",
jsonp: "json_callback",
success: function success(data) {
cvResults = data.results;
},
error: function error() {
$("#meta-info").html("<p class=\"text-danger\">" + msg.search_error + "!</p>" + $("#meta-info")[0].innerHTML);
},
complete: function complete() {
cvDone = 1;
showResult();
$("#show-comics").trigger("change");
}
});
}
function doSearch (keyword) { function doSearch (keyword) {
showFlag = 0; showFlag = 0;
dbDone = ggDone = cvDone = 0;
dbResults = [];
ggResults = [];
cvResults = [];
$("#meta-info").text(msg.loading); $("#meta-info").text(msg.loading);
if (keyword) { if (keyword) {
// dbSearchBook(keyword); dbSearchBook(keyword);
ggSearchBook(keyword); ggSearchBook(keyword);
cvSearchBook(keyword);
} }
} }
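doSearch now clears all three result buffers and fires the Google, Douban and ComicVine lookups in parallel; the latter two go out as JSONP, which is what lets the browser query those third-party APIs without CORS headers. Minimal shape of such a request, with parameter names taken from cvSearchBook above, the key elided and "dune" as a placeholder query:

$.ajax({
    url: "https://comicvine.gamespot.com/api/search/?api_key=<key>&resources=issue&format=jsonp&query=dune",
    dataType: "jsonp",
    jsonp: "json_callback",           // ComicVine's callback parameter
    success: function (data) {
        console.log(data.results);    // same field the real handler stores in cvResults
    }
});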

View File

@@ -162,10 +162,15 @@ function initProgressClick() {
function loadFromArrayBuffer(ab) {
var start = (new Date).getTime();
var h = new Uint8Array(ab, 0, 10);
+unrar5(ab);
var pathToBitJS = "../../static/js/archive/";
var lastCompletion = 0;
-if (h[0] === 0x52 && h[1] === 0x61 && h[2] === 0x72 && h[3] === 0x21) { //Rar!
-unarchiver = new bitjs.archive.Unrarrer(ab, pathToBitJS);
+/*if (h[0] === 0x52 && h[1] === 0x61 && h[2] === 0x72 && h[3] === 0x21) { //Rar!
+if (h[7] === 0x01) {
+unarchiver = new bitjs.archive.Unrarrer(ab, pathToBitJS);
+} else {
+unarchiver = new bitjs.archive.Unrarrer5(ab, pathToBitJS);
+}
} else if (h[0] === 80 && h[1] === 75) { //PK (Zip)
unarchiver = new bitjs.archive.Unzipper(ab, pathToBitJS);
} else if (h[0] === 255 && h[1] === 216) { // JPEG
@@ -229,7 +234,7 @@ loadFromArrayBuffer(ab) {
unarchiver.start();
} else {
alert("Some error");
-}
+}*/
}
function scrollTocToActive() {
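For reference, the RAR family can be told apart by its published magic numbers: a RAR 4.x archive starts with the bytes 52 61 72 21 1A 07 00, a RAR 5.x archive with 52 61 72 21 1A 07 01 00, and a ZIP with 50 4B ("PK"). A stand-alone sniffing sketch along those lines (not the commit's unrar5/bitjs code):

    // Classify an ArrayBuffer by its leading signature bytes.
    // Assumes the buffer is at least 8 bytes long.
    function sniffArchiveType(ab) {
        var h = new Uint8Array(ab, 0, 8);
        if (h[0] === 0x52 && h[1] === 0x61 && h[2] === 0x72 && h[3] === 0x21 &&
                h[4] === 0x1A && h[5] === 0x07) {
            return h[6] === 0x01 ? "rar5" : "rar4";
        }
        if (h[0] === 0x50 && h[1] === 0x4B) {
            return "zip";
        }
        return "unknown";
    }
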

File diff suppressed because one or more lines are too long

File diff suppressed because one or more lines are too long

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.cs={days:["Neděle","Pondělí","Úterý","Středa","Čtvrtek","Pátek","Sobota"],daysShort:["Ned","Pon","Úte","Stř","Čtv","Pát","Sob"],daysMin:["Ne","Po","Út","St","Čt","Pá","So"],months:["Leden","Únor","Březen","Duben","Květen","Červen","Červenec","Srpen","Září","Říjen","Listopad","Prosinec"],monthsShort:["Led","Úno","Bře","Dub","Kvě","Čer","Čnc","Srp","Zář","Říj","Lis","Pro"],today:"Dnes",clear:"Vymazat",monthsTitle:"Měsíc",weekStart:1,format:"dd.mm.yyyy"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.el={days:["Κυριακή","Δευτέρα","Τρίτη","Τετάρτη","Πέμπτη","Παρασκευή","Σάββατο"],daysShort:["Κυρ","Δευ","Τρι","Τετ","Πεμ","Παρ","Σαβ"],daysMin:["Κυ","Δε","Τρ","Τε","Πε","Πα","Σα"],months:["Ιανουάριος","Φεβρουάριος","Μάρτιος","Απρίλιος","Μάιος","Ιούνιος","Ιούλιος","Αύγουστος","Σεπτέμβριος","Οκτώβριος","Νοέμβριος","Δεκέμβριος"],monthsShort:["Ιαν","Φεβ","Μαρ","Απρ","Μάι","Ιουν","Ιουλ","Αυγ","Σεπ","Οκτ","Νοε","Δεκ"],today:"Σήμερα",clear:"Καθαρισμός",weekStart:1,format:"d/m/yyyy"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.fi={days:["sunnuntai","maanantai","tiistai","keskiviikko","torstai","perjantai","lauantai"],daysShort:["sun","maa","tii","kes","tor","per","lau"],daysMin:["su","ma","ti","ke","to","pe","la"],months:["tammikuu","helmikuu","maaliskuu","huhtikuu","toukokuu","kesäkuu","heinäkuu","elokuu","syyskuu","lokakuu","marraskuu","joulukuu"],monthsShort:["tam","hel","maa","huh","tou","kes","hei","elo","syy","lok","mar","jou"],today:"tänään",clear:"Tyhjennä",weekStart:1,format:"d.m.yyyy"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.hu={days:["vasárnap","hétfő","kedd","szerda","csütörtök","péntek","szombat"],daysShort:["vas","hét","ked","sze","csü","pén","szo"],daysMin:["V","H","K","Sze","Cs","P","Szo"],months:["január","február","március","április","május","június","július","augusztus","szeptember","október","november","december"],monthsShort:["jan","feb","már","ápr","máj","jún","júl","aug","sze","okt","nov","dec"],today:"ma",weekStart:1,clear:"töröl",titleFormat:"yyyy. MM",format:"yyyy.mm.dd"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.km={days:["អាទិត្យ","ចន្ទ","អង្គារ","ពុធ","ព្រហស្បតិ៍","សុក្រ","សៅរ៍"],daysShort:["អា.ទិ","ចន្ទ","អង្គារ","ពុធ","ព្រ.ហ","សុក្រ","សៅរ៍"],daysMin:["អា.ទិ","ចន្ទ","អង្គារ","ពុធ","ព្រ.ហ","សុក្រ","សៅរ៍"],months:["មករា","កុម្ភះ","មិនា","មេសា","ឧសភា","មិថុនា","កក្កដា","សីហា","កញ្ញា","តុលា","វិច្ឆិកា","ធ្នូ"],monthsShort:["មករា","កុម្ភះ","មិនា","មេសា","ឧសភា","មិថុនា","កក្កដា","សីហា","កញ្ញា","តុលា","វិច្ឆិកា","ធ្នូ"],today:"ថ្ងៃនេះ",clear:"សំអាត"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.sv={days:["söndag","måndag","tisdag","onsdag","torsdag","fredag","lördag"],daysShort:["sön","mån","tis","ons","tor","fre","lör"],daysMin:["sö","må","ti","on","to","fr","lö"],months:["januari","februari","mars","april","maj","juni","juli","augusti","september","oktober","november","december"],monthsShort:["jan","feb","mar","apr","maj","jun","jul","aug","sep","okt","nov","dec"],today:"Idag",format:"yyyy-mm-dd",weekStart:1,clear:"Rensa"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.tr={days:["Pazar","Pazartesi","Salı","Çarşamba","Perşembe","Cuma","Cumartesi"],daysShort:["Pz","Pzt","Sal","Çrş","Prş","Cu","Cts"],daysMin:["Pz","Pzt","Sa","Çr","Pr","Cu","Ct"],months:["Ocak","Şubat","Mart","Nisan","Mayıs","Haziran","Temmuz","Ağustos","Eylül","Ekim","Kasım","Aralık"],monthsShort:["Oca","Şub","Mar","Nis","May","Haz","Tem","Ağu","Eyl","Eki","Kas","Ara"],today:"Bugün",clear:"Temizle",weekStart:1,format:"dd.mm.yyyy"}}(jQuery);

View File

@@ -0,0 +1 @@
!function(a){a.fn.datepicker.dates.uk={days:["Неділя","Понеділок","Вівторок","Середа","Четвер","П'ятниця","Субота"],daysShort:["Нед","Пнд","Втр","Срд","Чтв","Птн","Суб"],daysMin:["Нд","Пн","Вт","Ср","Чт","Пт","Сб"],months:["Cічень","Лютий","Березень","Квітень","Травень","Червень","Липень","Серпень","Вересень","Жовтень","Листопад","Грудень"],monthsShort:["Січ","Лют","Бер","Кві","Тра","Чер","Лип","Сер","Вер","Жов","Лис","Гру"],today:"Сьогодні",clear:"Очистити",format:"dd.mm.yyyy",weekStart:1}}(jQuery);

View File

@@ -1 +1 @@
-!function(a){a.fn.datepicker.dates["zh-CN"]={days:["星期日","星期一","星期二","星期三","星期四","星期五","星期六"],daysShort:["周日","周一","周二","周三","周四","周五","周六"],daysMin:["日","一","二","三","四","五","六"],months:["一月","二月","三月","四月","五月","六月","七月","八月","九月","十月","十一月","十二月"],monthsShort:["1月","2月","3月","4月","5月","6月","7月","8月","9月","10月","11月","12月"],today:"今",clear:"清除",format:"yyyy年mm月dd日",titleFormat:"yyyy年mm月",weekStart:1}}(jQuery);
+!function(a){a.fn.datepicker.dates["zh-CN"]={days:["星期日","星期一","星期二","星期三","星期四","星期五","星期六"],daysShort:["周日","周一","周二","周三","周四","周五","周六"],daysMin:["日","一","二","三","四","五","六"],months:["一月","二月","三月","四月","五月","六月","七月","八月","九月","十月","十一月","十二月"],monthsShort:["1月","2月","3月","4月","5月","6月","7月","8月","9月","10月","11月","12月"],today:"今天",monthsTitle:"选择月份",clear:"清除",format:"yyyy-mm-dd",titleFormat:"yyyy年mm月",weekStart:1}}(jQuery);
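The locale files above register themselves on $.fn.datepicker.dates; a page then opts into one of them through bootstrap-datepicker's language option. A minimal usage sketch (the selector and options are illustrative, not part of the commit):

    // Assumes jQuery, bootstrap-datepicker and the zh-CN locale file are loaded.
    // The locale's "format" key becomes the default display format.
    $(function () {
        $("#pubdate").datepicker({
            language: "zh-CN",
            todayHighlight: true
        });
    });
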

File diff suppressed because one or more lines are too long (25 more files)

Some files were not shown because too many files have changed in this diff.