Compare commits


13 Commits
v0.7.0 ... main

Author SHA1 Message Date
Leo Vasanko
091d57dba7 Fix typing and import in the config file module. 2025-08-17 10:31:54 -06:00
Leo Vasanko
69a897cfec Startup banner with version display, and --version, using stderr/stdout properly. 2025-08-17 10:31:18 -06:00
Leo Vasanko
33db2c01b4 Cleaner server shutdowns:
- Remove a workaround for Sanic server not always terminating cleanly
- Terminate worker threads before server stop
- Silent closing of watching WebSocket attempted to open while shutting down
2025-08-17 08:15:01 -06:00
Leo Vasanko
26addb2f7b Image previews improved, all EXIF Orientations handled. 2025-08-17 07:00:52 -06:00
Leo Vasanko
ce6d60e588 Cleanup for release 1.0.0. 2025-08-15 11:11:57 -06:00
073f1a8707 Maintenance update (#7)
- Use modern tooling uv and bun
- Various changes to work with latest PyAV and PIL that have changed their API
- Improved image, video and document previews (uses AVIF, renders AVIF/HEIC/videos in HDR, faster processing)
- Fix a server hang in some cases where a folder was moved or renamed
- Log exceptions instead of only returning 500 response to client
- Log timing of preview generation functions
- Default to quality 50 in previews (previously 40)
2025-08-15 18:03:04 +01:00
Leo Vasanko
f38bb4bab9 Better transparent Cista image 2023-11-25 17:25:35 +00:00
Leo Vasanko
26f9bef087 Correct hatch build hook 2023-11-21 20:11:37 +00:00
Leo Vasanko
634dabe52d Less messy breadcrumbs on search results in gallery 2023-11-21 17:51:38 +00:00
Leo Vasanko
a383358369 Fix direct uploads and downloads, transfer bar UI 2023-11-21 16:13:46 +00:00
Leo Vasanko
369dc3ecaf Fixed New Folder, added Rename to Gallery 2023-11-21 15:49:33 +00:00
Leo Vasanko
0cf9c254e5 Various build fixes, cleanup and details (#6)
- Major memory usage reduction in video previews
- Finally builds properly on Windows too

Reviewed-on: #6
2023-11-21 15:32:49 +00:00
Leo Vasanko
58b9dd3dd4 Cleanup 2023-11-20 16:35:34 -08:00
32 changed files with 705 additions and 339 deletions

.gitignore vendored

@@ -1,4 +1,5 @@
 .*
+*.lock
 !.gitignore
 __pycache__/
 *.egg-info/

README.md

@@ -6,6 +6,8 @@ Cista takes its name from the ancient *cistae*, metal containers used by Greeks
 This is a cutting-edge **file and document server** designed for speed, efficiency, and unparalleled ease of use. Experience **lightning-fast browsing**, thanks to the file list maintained directly in your browser and updated from server filesystem events, coupled with our highly optimized code. Fully **keyboard-navigable** and with a responsive layout, Cista flawlessly adapts to your devices, providing a seamless experience wherever you are. Our powerful **instant search** means you're always just a few keystrokes away from finding exactly what you need. Press **1/2/3** to switch ordering, navigate with all four arrow keys (+Shift to select). Or click your way around on **breadcrumbs that remember where you were**.
+**Built-in document and media previews** let you quickly view files without downloading them. Cista shows PDF and other documents, video and image thumbnails, with **HDR10 support** for video previews and image formats, including HEIC and AVIF. It also has a player for music and video files.
 The Cista project started as an inevitable remake of [Droppy](https://github.com/droppyjs/droppy) which we used and loved despite its numerous bugs. Cista Storage stands out in handling even the most exotic filenames, ensuring a smooth experience where others falter.
 All of this is wrapped in an intuitive interface with automatic light and dark themes, making Cista Storage the ideal choice for anyone seeking a reliable, versatile, and quick file storage solution. Quickly set up your own Cista where your files are just a click away, safe, and always accessible.
@@ -14,39 +16,31 @@ Experience Cista by visiting [Cista Demo](https://drop.zi.fi) for a test run and
 ## Getting Started
-### Installation
-To install the cista application, use:
-```fish
-pip install cista
-```
-Note: Some Linux distributions might need `--break-system-packages` to install Python packages, which are safely installed in the user's home folder. As an alternative to avoid installation, run it with command `pipx run cista`
 ### Running the Server
-Create an account: (or run a public server without authentication)
+We recommend using [UV](https://docs.astral.sh/uv/getting-started/installation/) to directly run Cista:
+Create an account: (otherwise the server is public for all)
 ```fish
-cista --user yourname --privileged
+uvx cista --user yourname --privileged
 ```
 Serve your files at http://localhost:8000:
 ```fish
-cista -l :8000 /path/to/files
+uvx cista -l :8000 /path/to/files
 ```
+Alternatively, you can install with `pip` or `uv pip`. This enables using the `cista` command directly without `uvx` or `uv run`.
+```fish
+pip install cista --break-system-packages
+```
 The server remembers its settings in the config folder (default `~/.local/share/cista/`), including the listen port and directory, for future runs without arguments.
 ### Internet Access
-To use your own TLS certificates, place them in the config folder and run:
-```fish
-cista -l cista.example.com
-```
-Most admins instead find the [Caddy](https://caddyserver.com/) web server convenient for its auto TLS certificates and all. A proxy also allows running multiple web services or Cista instances on the same IP address but different (sub)domains.
+Most admins find the [Caddy](https://caddyserver.com/) web server convenient for its auto TLS certificates and all. A proxy also allows running multiple web services or Cista instances on the same IP address but different (sub)domains.
 `/etc/caddy/Caddyfile`:
@@ -56,33 +50,13 @@ cista.example.com {
 }
 ```
-## Development setup
-For rapid development, we use the Vite development server for the Vue frontend, while running the backend on port 8000 that Vite proxies backend requests to. Each server live reloads whenever its code or configuration are modified.
-```fish
-cd frontend
-npm install
-npm run dev
-```
-Concurrently, start the backend on another terminal:
-```fish
-hatch shell
-pip install -e '.[dev]'
-cista --dev -l :8000 /path/to/files
-```
-We use `hatch shell` for installing on a virtual environment, to avoid disturbing the rest of the system with our hacking.
-Vue is used to build files in `cista/wwwroot`, included prebuilt in the Python package. Running `hatch build` builds the frontend and creates a NodeJS-independent Python package.
+Nginx or another proxy may be used similarly, or alternatively you can place the cert and key in the cista config dir and run `cista -l cista.example.com`.
 ## System Deployment
 This setup allows easy addition of storages, each with its own domain, configuration, and files.
-Assuming a restricted user account `storage` for serving files and that cista is installed system-wide or on this account (check with `sudo -u storage -s`). Alternatively, use `pipx run cista` or `hatch run cista` as the ExecStart command.
+Assuming a restricted user account `storage` for serving files and that UV is installed system-wide or on this account. Only UV is required: this does not use git or bun/npm.
 Create `/etc/systemd/system/cista@.service`:
@@ -92,7 +66,7 @@ Description=Cista storage %i
 [Service]
 User=storage
-ExecStart=cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
+ExecStart=uvx cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
 Restart=always
 [Install]
@@ -116,3 +90,34 @@ foo.example.com, bar.example.com {
 reverse_proxy unix//srv/cista/{host}/socket
 }
 ```
+## Development setup
+For rapid development, we use the Vite development server for the Vue frontend, while running the backend on port 8000 that Vite proxies backend requests to. Each server live reloads whenever its code or configuration are modified.
+Make sure you have git, uv and bun (or npm) installed.
+Backend (Python) setup and run:
+```fish
+git clone https://git.zi.fi/Vasanko/cista-storage.git
+cd cista-storage
+uv sync --dev
+uv run cista --dev -l :8000 /path/to/files
+```
+Frontend (Vue/Vite) run the dev server in another terminal:
+```fish
+cd frontend
+bun install
+bun run dev
+```
+Building the package for release (frontend + Python wheel/sdist):
+```fish
+uv build
+```
+Vue is used to build files in `cista/wwwroot`, included prebuilt in the Python package. `uv build` runs the project build hooks to bundle the frontend and produce a NodeJS-independent Python package.

cista/__main__.py

@@ -1,3 +1,4 @@
+import os
 import sys
 from pathlib import Path
@@ -9,8 +10,24 @@ from cista.util import pwgen
 del app, server80.app  # Only import needed, for Sanic multiprocessing

-doc = f"""Cista {cista.__version__} - A file storage for the web.
+def create_banner():
+    """Create a framed banner with the Cista version."""
+    title = f"Cista {cista.__version__}"
+    subtitle = "A file storage for the web"
+    width = max(len(title), len(subtitle)) + 4
+    return f"""\
+{"─" * width}
+{title:^{width}}
+{subtitle:^{width}}
+{"─" * width}
+"""
+
+banner = create_banner()
+
+doc = """\
 Usage:
   cista [-c <confdir>] [-l <host>] [--import-droppy] [--dev] [<path>]
   cista [-c <confdir>] --user <name> [--privileged] [--password]
@@ -34,6 +51,14 @@ User management:
   --password  Reset password
 """

+first_time_help = """\
+No config file found! Get started with:
+  cista --user yourname --privileged   # If you want user accounts
+  cista -l :8000 /path/to/files        # Run the server on localhost:8000
+See cista --help for other options!
+"""

 def main():
     # Dev mode doesn't catch exceptions
@@ -43,11 +68,19 @@ def main():
     try:
         return _main()
     except Exception as e:
-        print("Error:", e)
+        sys.stderr.write(f"Error: {e}\n")
         return 1

 def _main():
+    # The banner printing differs by mode, and needs to be done before docopt() printing its messages
+    if any(arg in sys.argv for arg in ("--help", "-h")):
+        sys.stdout.write(banner)
+    elif "--version" in sys.argv:
+        sys.stdout.write(f"cista {cista.__version__}\n")
+        return 0
+    else:
+        sys.stderr.write(banner)
     args = docopt(doc)
     if args["--user"]:
         return _user(args)
@@ -65,13 +98,7 @@ def _main():
     necessary_opts = exists or import_droppy or path
     if not necessary_opts:
         # Maybe run without arguments
-        print(doc)
-        print(
-            "No config file found! Get started with one of:\n"
-            "  cista --user yourname --privileged\n"
-            "  cista --import-droppy\n"
-            "  cista -l :8000 /path/to/files\n"
-        )
+        sys.stderr.write(first_time_help)
         return 1
     settings = {}
     if import_droppy:
@@ -92,7 +119,7 @@ def _main():
         # We have no users, so make it public
         settings["public"] = True
     operation = config.update_config(settings)
-    print(f"Config {operation}: {config.conffile}")
+    sys.stderr.write(f"Config {operation}: {config.conffile}\n")
     # Prepare to serve
     unix = None
     url, _ = serve.parse_listen(config.config.listen)
@@ -102,7 +129,7 @@ def _main():
     dev = args["--dev"]
     if dev:
         extra += " (dev mode)"
-    print(f"Serving {config.config.path} at {url}{extra}")
+    sys.stderr.write(f"Serving {config.config.path} at {url}{extra}\n")
     # Run the server
     serve.run(dev=dev)
     return 0
@@ -117,7 +144,8 @@ def _confdir(args):
         raise ValueError("Config path is not a directory")
         # Accidentally pointed to the db.toml, use parent
         confdir = confdir.parent
-    config.conffile = confdir / config.conffile.name
+    os.environ["CISTA_HOME"] = confdir.as_posix()
+    config.init_confdir()  # Uses environ if available

 def _user(args):
@@ -134,7 +162,7 @@ def _user(args):
             "public": False,
         }
     )
-    print(f"Config {operation}: {config.conffile}\n")
+    sys.stderr.write(f"Config {operation}: {config.conffile}\n\n")

     name = args["--user"]
     if not name or not name.isidentifier():
@@ -152,12 +180,12 @@ def _user(args):
         changes["password"] = pw = pwgen.generate()
         info += f"\n  Password: {pw}\n"
     res = config.update_user(name, changes)
-    print(info)
+    sys.stderr.write(f"{info}\n")
     if res == "read":
-        print("  No changes")
+        sys.stderr.write("  No changes\n")
     if operation == "created":
-        print(
+        sys.stderr.write(
             "Now you can run the server:\n  cista  # defaults set: -l :8000 ~/Downloads\n"
         )
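The banner construction above centers a title and subtitle inside rules of matching width using f-string alignment. A minimal standalone sketch of the same technique (the function name is hypothetical, and a plain ASCII rule stands in for the box-drawing characters):

```python
def make_banner(title: str, subtitle: str) -> str:
    """Frame two lines inside rules sized to the longest line plus padding."""
    width = max(len(title), len(subtitle)) + 4  # 4 chars of breathing room
    rule = "-" * width
    # {text:^{width}} centers text within a field of the given width
    return f"{rule}\n{title:^{width}}\n{subtitle:^{width}}\n{rule}\n"

print(make_banner("Cista 1.0.0", "A file storage for the web"))
```

Every line of the output is exactly `width` characters, so the frame stays aligned regardless of version string length.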

cista/api.py

@@ -119,8 +119,12 @@ async def watch(req, ws):
         # Send updates
         while True:
             await ws.send(await q.get())
+    except RuntimeError as e:
+        if str(e) == "cannot schedule new futures after shutdown":
+            return  # Server shutting down, drop the WebSocket
+        raise
     finally:
-        del watching.pubsub[uuid]
+        watching.pubsub.pop(uuid, None)  # Remove whether it got added yet or not

 def subscribe(uuid, ws):
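The cleanup change above swaps `del` for `dict.pop(key, None)` so the `finally` block tolerates a subscriber that was never fully registered. A tiny standalone sketch of the pattern (names hypothetical):

```python
pubsub: dict[str, object] = {}

def cleanup(uuid: str) -> None:
    """Remove a subscriber if present; no KeyError when it was never added."""
    pubsub.pop(uuid, None)

pubsub["a"] = object()
cleanup("a")  # removes the entry
cleanup("b")  # silently does nothing, whereas `del pubsub["b"]` would raise
```

This matters here because the exception path can reach `finally` before the subscription was added.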

cista/app.py

@@ -3,6 +3,7 @@ import datetime
 import mimetypes
 import threading
 from concurrent.futures import ThreadPoolExecutor
+from multiprocessing import cpu_count
 from pathlib import Path, PurePath, PurePosixPath
 from stat import S_IFDIR, S_IFREG
 from urllib.parse import unquote
@@ -14,6 +15,7 @@ from blake3 import blake3
 from sanic import Blueprint, Sanic, empty, raw, redirect
 from sanic.exceptions import Forbidden, NotFound
 from sanic.log import logger
+from setproctitle import setproctitle
 from stream_zip import ZIP_AUTO, stream_zip

 from cista import auth, config, preview, session, watching
@@ -30,20 +32,27 @@ app.blueprint(bp)
 app.exception(Exception)(handle_sanic_exception)

+setproctitle("cista-main")

 @app.before_server_start
 async def main_start(app, loop):
     config.load_config()
+    setproctitle(f"cista {config.config.path.name}")
+    workers = max(2, min(8, cpu_count()))
     app.ctx.threadexec = ThreadPoolExecutor(
-        max_workers=8, thread_name_prefix="cista-ioworker"
+        max_workers=workers, thread_name_prefix="cista-ioworker"
     )
-    await watching.start(app, loop)
+    watching.start(app, loop)

-@app.after_server_stop
+# Sanic sometimes fails to execute after_server_stop, so we do it before instead (potentially interrupting handlers)
+@app.before_server_stop
 async def main_stop(app, loop):
     quit.set()
-    await watching.stop(app, loop)
+    watching.stop(app)
     app.ctx.threadexec.shutdown()
+    logger.debug("Cista worker threads all finished")

 @app.on_request

cista/config.py

@@ -1,13 +1,17 @@
 from __future__ import annotations

+import os
 import secrets
 import sys
+from contextlib import suppress
 from functools import wraps
 from hashlib import sha256
 from pathlib import Path, PurePath
-from time import time
+from time import sleep, time
+from typing import Callable, Concatenate, Literal, ParamSpec

 import msgspec
+import msgspec.toml

 class Config(msgspec.Struct):
@@ -20,6 +24,13 @@ class Config(msgspec.Struct):
     links: dict[str, Link] = {}

+# Typing: arguments for config-modifying functions
+P = ParamSpec("P")
+ResultStr = Literal["modified", "created", "read"]
+RawModifyFunc = Callable[Concatenate[Config, P], Config]
+ModifyPublic = Callable[P, ResultStr]

 class User(msgspec.Struct, omit_defaults=True):
     privileged: bool = False
     hash: str = ""
@@ -32,8 +43,24 @@ class Link(msgspec.Struct, omit_defaults=True):
     expires: int = 0

-config = None
-conffile = Path.home() / ".local/share/cista/db.toml"
+# Global variables - initialized during application startup
+config: Config
+conffile: Path

+def init_confdir() -> None:
+    global conffile
+    if p := os.environ.get("CISTA_HOME"):
+        home = Path(p)
+    else:
+        xdg = os.environ.get("XDG_CONFIG_HOME")
+        home = (
+            Path(xdg).expanduser() / "cista" if xdg else Path.home() / ".config/cista"
+        )
+    if not home.is_dir():
+        home.mkdir(parents=True, exist_ok=True)
+        home.chmod(0o700)
+    conffile = home / "db.toml"

 def derived_secret(*params, len=8) -> bytes:
@@ -59,10 +86,10 @@ def dec_hook(typ, obj):
     raise TypeError

-def config_update(modify):
+def config_update(
+    modify: RawModifyFunc,
+) -> ResultStr | Literal["collision"]:
     global config
-    if not conffile.exists():
-        conffile.parent.mkdir(parents=True, exist_ok=True)
     tmpname = conffile.with_suffix(".tmp")
     try:
         f = tmpname.open("xb")
@@ -76,12 +103,8 @@ def config_update(modify):
         old = conffile.read_bytes()
         c = msgspec.toml.decode(old, type=Config, dec_hook=dec_hook)
     except FileNotFoundError:
-        # No existing config file, make sure we have a folder...
-        confdir = conffile.parent
-        confdir.mkdir(parents=True, exist_ok=True)
-        confdir.chmod(0o700)
         old = b""
-        c = None
+        c = Config(path=Path(), listen="", secret=secrets.token_hex(12))
     c = modify(c)
     new = msgspec.toml.encode(c, enc_hook=enc_hook)
     if old == new:
@@ -92,7 +115,9 @@ def config_update(modify):
         f.write(new)
         f.close()
         if sys.platform == "win32":
-            conffile.unlink()  # Windows doesn't support atomic replace
+            # Windows doesn't support atomic replace
+            with suppress(FileNotFoundError):
+                conffile.unlink()
         tmpname.rename(conffile)  # Atomic replace
     except:
         f.close()
@@ -102,17 +127,23 @@ def config_update(modify):
     return "modified" if old else "created"

-def modifies_config(modify):
-    """Decorator for functions that modify the config file"""
+def modifies_config(
+    modify: Callable[Concatenate[Config, P], Config],
+) -> Callable[P, ResultStr]:
+    """Decorator for functions that modify the config file
+
+    The decorated function takes as first arg Config and returns it modified.
+    The wrapper handles atomic modification and returns a string indicating the result.
+    """
     @wraps(modify)
-    def wrapper(*args, **kwargs):
-        def m(c):
+    def wrapper(*args: P.args, **kwargs: P.kwargs) -> ResultStr:
+        def m(c: Config) -> Config:
             return modify(c, *args, **kwargs)

         # Retry modification in case of write collision
         while (c := config_update(m)) == "collision":
-            time.sleep(0.01)
+            sleep(0.01)
         return c

     return wrapper
@@ -120,6 +151,7 @@ def modifies_config(modify):
 def load_config():
     global config
+    init_confdir()
     config = msgspec.toml.decode(conffile.read_bytes(), type=Config, dec_hook=dec_hook)
@@ -127,7 +159,7 @@ def load_config():
 def update_config(conf: Config, changes: dict) -> Config:
     """Create/update the config with new values, respecting changes done by others."""
     # Encode into dict, update values with new, convert to Config
-    settings = {} if conf is None else msgspec.to_builtins(conf, enc_hook=enc_hook)
+    settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
     settings.update(changes)
     return msgspec.convert(settings, Config, dec_hook=dec_hook)
@@ -137,8 +169,13 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
     """Create/update a user with new values, respecting changes done by others."""
     # Encode into dict, update values with new, convert to Config
     try:
-        u = conf.users[name].__copy__()
-    except (KeyError, AttributeError):
+        # Copy user by converting to dict and back
+        u = msgspec.convert(
+            msgspec.to_builtins(conf.users[name], enc_hook=enc_hook),
+            User,
+            dec_hook=dec_hook,
+        )
+    except KeyError:
         u = User()
     if "password" in changes:
         from . import auth
@@ -147,7 +184,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
         del changes["password"]
     udict = msgspec.to_builtins(u, enc_hook=enc_hook)
     udict.update(changes)
-    settings = msgspec.to_builtins(conf, enc_hook=enc_hook) if conf else {"users": {}}
+    settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
     settings["users"][name] = msgspec.convert(udict, User, dec_hook=dec_hook)
     return msgspec.convert(settings, Config, dec_hook=dec_hook)
@@ -155,6 +192,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
 @modifies_config
 def del_user(conf: Config, name: str) -> Config:
     """Delete named user account."""
-    ret = conf.__copy__()
-    ret.users.pop(name)
-    return ret
+    # Create a copy by converting to dict and back
+    settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
+    settings["users"].pop(name)
+    return msgspec.convert(settings, Config, dec_hook=dec_hook)
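The update path above writes the new TOML to a `.tmp` sibling and renames it over the config file, retrying the whole modification on collision. A minimal stdlib-only sketch of the same write-then-rename pattern (not the project's code; `atomic_write` is a hypothetical name, and it uses `os.replace`, which is atomic on POSIX and also works on Windows, so the separate unlink step seen above is not needed):

```python
import os
import tempfile
from pathlib import Path

def atomic_write(conffile: Path, data: bytes) -> None:
    """Write to a temp file in the same directory, then atomically replace."""
    tmp = conffile.with_suffix(".tmp")
    tmp.write_bytes(data)          # readers never see a partially written file
    os.replace(tmp, conffile)      # atomic swap; the old content vanishes whole

cfg = Path(tempfile.mkdtemp()) / "db.toml"
atomic_write(cfg, b"listen = ':8000'\n")
atomic_write(cfg, b"listen = ':9000'\n")
print(cfg.read_bytes())  # latest write wins, never a torn file
```

The exclusive-open (`"xb"`) in the real code additionally detects two concurrent writers, which is what triggers the `"collision"` retry loop.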

cista/preview.py

@ -1,14 +1,17 @@
import asyncio import asyncio
import gc
import io import io
import mimetypes import mimetypes
import urllib.parse import urllib.parse
from pathlib import PurePosixPath from pathlib import PurePosixPath
from time import perf_counter
from urllib.parse import unquote from urllib.parse import unquote
from wsgiref.handlers import format_date_time from wsgiref.handlers import format_date_time
import av import av
import av.datasets
import fitz # PyMuPDF import fitz # PyMuPDF
import numpy as np
import pillow_heif
from PIL import Image from PIL import Image
from sanic import Blueprint, empty, raw from sanic import Blueprint, empty, raw
from sanic.exceptions import NotFound from sanic.exceptions import NotFound
@ -17,28 +20,41 @@ from sanic.log import logger
from cista import config from cista import config
from cista.util.filename import sanitize from cista.util.filename import sanitize
pillow_heif.register_heif_opener()
bp = Blueprint("preview", url_prefix="/preview") bp = Blueprint("preview", url_prefix="/preview")
# Map EXIF Orientation value to a corresponding PIL transpose
EXIF_ORI = {
2: Image.Transpose.FLIP_LEFT_RIGHT,
3: Image.Transpose.ROTATE_180,
4: Image.Transpose.FLIP_TOP_BOTTOM,
5: Image.Transpose.TRANSPOSE,
6: Image.Transpose.ROTATE_270,
7: Image.Transpose.TRANSVERSE,
8: Image.Transpose.ROTATE_90,
}
@bp.get("/<path:path>") @bp.get("/<path:path>")
async def preview(req, path): async def preview(req, path):
"""Preview a file""" """Preview a file"""
maxsize = int(req.args.get("px", 1024)) maxsize = int(req.args.get("px", 1024))
maxzoom = float(req.args.get("zoom", 2.0)) maxzoom = float(req.args.get("zoom", 2.0))
quality = int(req.args.get("q", 40)) quality = int(req.args.get("q", 60))
rel = PurePosixPath(sanitize(unquote(path))) rel = PurePosixPath(sanitize(unquote(path)))
path = config.config.path / rel path = config.config.path / rel
stat = path.lstat() stat = path.lstat()
etag = config.derived_secret( etag = config.derived_secret(
"preview", rel, stat.st_mtime_ns, quality, maxsize, maxzoom "preview", rel, stat.st_mtime_ns, quality, maxsize, maxzoom
).hex() ).hex()
savename = PurePosixPath(path.name).with_suffix(".webp") savename = PurePosixPath(path.name).with_suffix(".avif")
headers = { headers = {
"etag": etag, "etag": etag,
"last-modified": format_date_time(stat.st_mtime), "last-modified": format_date_time(stat.st_mtime),
"cache-control": "max-age=604800, immutable" "cache-control": "max-age=604800, immutable"
+ ("" if config.config.public else ", private"), + ("" if config.config.public else ", private"),
"content-type": "image/webp", "content-type": "image/avif",
"content-disposition": f"inline; filename*=UTF-8''{urllib.parse.quote(savename.as_posix())}", "content-disposition": f"inline; filename*=UTF-8''{urllib.parse.quote(savename.as_posix())}",
} }
if req.headers.if_none_match == etag: if req.headers.if_none_match == etag:
@ -57,58 +73,165 @@ async def preview(req, path):
def dispatch(path, quality, maxsize, maxzoom): def dispatch(path, quality, maxsize, maxzoom):
if path.suffix.lower() in (".pdf", ".xps", ".epub", ".mobi"): if path.suffix.lower() in (".pdf", ".xps", ".epub", ".mobi"):
return process_pdf(path, quality=quality, maxsize=maxsize, maxzoom=maxzoom) return process_pdf(path, quality=quality, maxsize=maxsize, maxzoom=maxzoom)
if mimetypes.guess_type(path.name)[0].startswith("video/"): type, _ = mimetypes.guess_type(path.name)
if type and type.startswith("video/"):
return process_video(path, quality=quality, maxsize=maxsize) return process_video(path, quality=quality, maxsize=maxsize)
return process_image(path, quality=quality, maxsize=maxsize) return process_image(path, quality=quality, maxsize=maxsize)
def process_image(path, *, maxsize, quality): def process_image(path, *, maxsize, quality):
img = Image.open(path) t_load = perf_counter()
w, h = img.size with Image.open(path) as img:
img.thumbnail((min(w, maxsize), min(h, maxsize))) # Force decode to include I/O in load timing
# Fix rotation based on EXIF data img.load()
try: t_proc = perf_counter()
rotate_values = {3: 180, 6: 270, 8: 90} # Resize
orientation = img._getexif().get(274) w, h = img.size
if orientation in rotate_values: img.thumbnail((min(w, maxsize), min(h, maxsize)))
logger.debug(f"Rotating preview {path} by {rotate_values[orientation]}") # Transpose pixels according to EXIF Orientation
img = img.rotate(rotate_values[orientation], expand=True) orientation = img.getexif().get(274, 1)
except AttributeError: if orientation in EXIF_ORI:
... img = img.transpose(EXIF_ORI[orientation])
except Exception as e: # Save as AVIF
logger.error(f"Error rotating preview image: {e}") imgdata = io.BytesIO()
# Save as webp t_save = perf_counter()
imgdata = io.BytesIO() img.save(imgdata, format="avif", quality=quality, speed=10, max_threads=1)
img.save(imgdata, format="webp", quality=quality, method=4)
return imgdata.getvalue() t_end = perf_counter()
ret = imgdata.getvalue()
load_ms = (t_proc - t_load) * 1000
proc_ms = (t_save - t_proc) * 1000
save_ms = (t_end - t_save) * 1000
logger.debug(
"Preview image %s: load=%.1fms process=%.1fms save=%.1fms",
path.name,
load_ms,
proc_ms,
save_ms,
)
return ret
def process_pdf(path, *, maxsize, maxzoom, quality, page_number=0): def process_pdf(path, *, maxsize, maxzoom, quality, page_number=0):
t_load_start = perf_counter()
pdf = fitz.open(path) pdf = fitz.open(path)
page = pdf.load_page(page_number) page = pdf.load_page(page_number)
w, h = page.rect[2:4] w, h = page.rect[2:4]
zoom = min(maxsize / w, maxsize / h, maxzoom) zoom = min(maxsize / w, maxsize / h, maxzoom)
mat = fitz.Matrix(zoom, zoom) mat = fitz.Matrix(zoom, zoom)
pix = page.get_pixmap(matrix=mat) pix = page.get_pixmap(matrix=mat) # type: ignore[attr-defined]
return pix.pil_tobytes(format="webp", quality=quality, method=4) t_load_end = perf_counter()
t_save_start = perf_counter()
ret = pix.pil_tobytes(format="avif", quality=quality, speed=10, max_threads=1)
t_save_end = perf_counter()
logger.debug(
"Preview pdf %s: load+render=%.1fms save=%.1fms",
path.name,
(t_load_end - t_load_start) * 1000,
(t_save_end - t_save_start) * 1000,
)
return ret
def process_video(path, *, maxsize, quality):
-with av.open(str(path)) as container:
-stream = container.streams.video[0]
-rotation = (
-stream.side_data
-and stream.side_data.get(av.stream.SideData.DISPLAYMATRIX)
-or 0
-)
-stream.codec_context.skip_frame = "NONKEY"
-container.seek(container.duration // 8)
-frame = next(container.decode(stream))
-img = frame.to_image()
-img.thumbnail((maxsize, maxsize))
-imgdata = io.BytesIO()
-if rotation:
-img = img.rotate(rotation, expand=True)
-img.save(imgdata, format="webp", quality=quality, method=4)
-return imgdata.getvalue()
+frame = None
+imgdata = io.BytesIO()
+istream = ostream = icc = occ = frame = None
+t_load_start = perf_counter()
+# Initialize to avoid "possibly unbound" in static analysis when exceptions occur
+t_load_end = t_load_start
+t_save_start = t_load_start
+t_save_end = t_load_start
with (
av.open(str(path)) as icontainer,
av.open(imgdata, "w", format="avif") as ocontainer,
):
istream = icontainer.streams.video[0]
istream.codec_context.skip_frame = "NONKEY"
icontainer.seek((icontainer.duration or 0) // 8)
for frame in icontainer.decode(istream):
if frame.dts is not None:
break
else:
raise RuntimeError("No frames found in video")
# Resize frame to thumbnail size
if frame.width > maxsize or frame.height > maxsize:
scale_factor = min(maxsize / frame.width, maxsize / frame.height)
new_width = int(frame.width * scale_factor)
new_height = int(frame.height * scale_factor)
frame = frame.reformat(width=new_width, height=new_height)
# Simple rotation detection and logging
if frame.rotation:
try:
fplanes = frame.to_ndarray()
# Split into Y, U, V planes of proper dimensions
planes = [
fplanes[: frame.height],
fplanes[frame.height : frame.height + frame.height // 4].reshape(
frame.height // 2, frame.width // 2
),
fplanes[frame.height + frame.height // 4 :].reshape(
frame.height // 2, frame.width // 2
),
]
# Rotate
planes = [np.rot90(p, frame.rotation // 90) for p in planes]
# Restore PyAV format
planes = np.hstack([p.flat for p in planes]).reshape(
-1, planes[0].shape[1]
)
frame = av.VideoFrame.from_ndarray(planes, format=frame.format.name)
del planes, fplanes
except Exception as e:
if "not yet supported" in str(e):
logger.warning(
f"Not rotating {path.name} preview image by {frame.rotation}°:\n PyAV: {e}"
)
else:
logger.exception(f"Error rotating video frame: {e}")
t_load_end = perf_counter()
t_save_start = perf_counter()
crf = str(int(63 * (1 - quality / 100) ** 2)) # Closely matching PIL quality-%
ostream = ocontainer.add_stream(
"av1",
options={
"crf": crf,
"usage": "realtime",
"cpu-used": "8",
"threads": "1",
},
)
assert isinstance(ostream, av.VideoStream)
ostream.width = frame.width
ostream.height = frame.height
icc = istream.codec_context
occ = ostream.codec_context
# Copy HDR metadata from input video stream
occ.color_primaries = icc.color_primaries
occ.color_trc = icc.color_trc
occ.colorspace = icc.colorspace
occ.color_range = icc.color_range
ocontainer.mux(ostream.encode(frame))
ocontainer.mux(ostream.encode(None)) # Flush the stream
t_save_end = perf_counter()
# Capture frame dimensions before cleanup
ret = imgdata.getvalue()
logger.debug(
"Preview video %s: load+decode=%.1fms save=%.1fms",
path.name,
(t_load_end - t_load_start) * 1000,
(t_save_end - t_save_start) * 1000,
)
del imgdata, istream, ostream, icc, occ, frame
gc.collect()
return ret
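The `crf` line above converts the PIL-style 0–100 quality (higher is better) into AV1's 0–63 CRF (lower is better) along a quadratic curve, so mid-range qualities already land at quite low CRF values. A sketch of the mapping:

```python
def quality_to_crf(quality: int) -> int:
    """Map PIL-style quality (0-100, higher is better) to AV1 CRF (0-63, lower is better)."""
    return int(63 * (1 - quality / 100) ** 2)

assert quality_to_crf(100) == 0  # best quality -> CRF 0
assert quality_to_crf(0) == 63   # worst quality -> maximum CRF
assert quality_to_crf(50) == 15  # the default quality 50 lands at CRF 15
```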


@ -127,8 +127,7 @@ class FileEntry(msgspec.Struct, array_like=True, frozen=True):
return f"{self.name} ({self.size}, {self.mtime})"
-class Update(msgspec.Struct, array_like=True):
-...
+class Update(msgspec.Struct, array_like=True): ...
class UpdKeep(Update, tag="k"):


@ -26,7 +26,6 @@ def run(*, dev=False):
motd=False,
dev=dev,
auto_reload=dev,
-reload_dir={confdir},
access_log=True,
) # type: ignore
if dev:


@ -29,6 +29,8 @@ async def handle_sanic_exception(request, e):
if not message or not request.app.debug and code == 500:
message = "Internal Server Error"
message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
+if code == 500:
+logger.exception(e)
# Non-browsers get JSON errors
if "text/html" not in request.headers.accept:
return jres(


@ -48,6 +48,7 @@ def treeiter(rootmod):
def treeget(rootmod: list[FileEntry], path: PurePosixPath):
begin = None
ret = []
for i, relpath, entry in treeiter(rootmod):
if begin is None:
if relpath == path:
@ -57,6 +58,7 @@ def treeget(rootmod: list[FileEntry], path: PurePosixPath):
if entry.level <= len(path.parts):
break
ret.append(entry)
return begin, ret
@ -77,28 +79,36 @@ def treeinspos(rootmod: list[FileEntry], relpath: PurePosixPath, relfile: int):
# root
level += 1
continue
ename = rel.parts[level - 1]
name = relpath.parts[level - 1]
esort = sortkey(ename)
nsort = sortkey(name)
# Non-leaf are always folders, only use relfile at leaf
isfile = relfile if len(relpath.parts) == level else 0
# First compare by isfile, then by sorting order and if that too matches then case sensitive
cmp = (
entry.isfile - isfile
or (esort > nsort) - (esort < nsort)
or (ename > name) - (ename < name)
)
if cmp > 0:
return i
if cmp < 0:
continue
level += 1
if level > len(relpath.parts):
-print("ERROR: insertpos", relpath, i, entry.name, entry.level, level)
+logger.error(
+f"insertpos level overflow: relpath={relpath}, i={i}, entry.name={entry.name}, entry.level={entry.level}, level={level}"
+)
break
else:
i += 1
return i
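The `cmp` chain above emulates a three-way comparator: each `(a > b) - (a < b)` term yields -1, 0, or 1 (Python's replacement for the removed `cmp()`), and `or` falls through to the next criterion only on a tie (0). A standalone sketch of the same pattern (function names hypothetical):

```python
def three_way(a, b) -> int:
    """-1, 0 or 1, like the classic cmp()."""
    return (a > b) - (a < b)

def entry_cmp(isfile_a: int, key_a: str, name_a: str,
              isfile_b: int, key_b: str, name_b: str) -> int:
    # Folders before files, then sort key, then case-sensitive name.
    return (
        three_way(isfile_a, isfile_b)
        or three_way(key_a, key_b)
        or three_way(name_a, name_b)
    )

assert entry_cmp(0, "a", "A", 1, "a", "A") == -1  # folder sorts before file
assert entry_cmp(0, "a", "a", 0, "a", "A") == 1   # name decides only on key tie
```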
@ -179,21 +189,16 @@ def update_path(rootmod: list[FileEntry], relpath: PurePosixPath, loop):
"""Called on FS updates, check the filesystem and broadcast any changes."""
new = walk(relpath)
obegin, old = treeget(rootmod, relpath)
if old == new:
-logger.debug(
-f"Watch: Event without changes needed {relpath}"
-if old
-else f"Watch: Event with old and new missing: {relpath}"
-)
return
if obegin is not None:
del rootmod[obegin : obegin + len(old)]
if new:
-logger.debug(f"Watch: Update {relpath}" if old else f"Watch: Created {relpath}")
i = treeinspos(rootmod, relpath, new[0].isfile)
rootmod[i:i] = new
-else:
-logger.debug(f"Watch: Removed {relpath}")
def update_space(loop):
@ -218,17 +223,35 @@ def format_update(old, new):
oremain, nremain = set(old), set(new)
update = []
keep_count = 0
+iteration_count = 0
+# Precompute index maps to allow deterministic tie-breaking when both
+# candidates exist in both sequences but are not equal (rename/move cases)
+old_pos = {e: i for i, e in enumerate(old)}
+new_pos = {e: i for i, e in enumerate(new)}
while oidx < len(old) and nidx < len(new):
+iteration_count += 1
+# Emergency brake for potential infinite loops
+if iteration_count > 50000:
+logger.error(
+f"format_update potential infinite loop! iteration={iteration_count}, oidx={oidx}, nidx={nidx}"
+)
+raise Exception(
+f"format_update infinite loop detected at iteration {iteration_count}"
+)
modified = False
# Matching entries are kept
if old[oidx] == new[nidx]:
entry = old[oidx]
-oremain.remove(entry)
+oremain.discard(entry)
-nremain.remove(entry)
+nremain.discard(entry)
keep_count += 1
oidx += 1
nidx += 1
continue
if keep_count > 0:
modified = True
update.append(UpdKeep(keep_count))
@ -248,7 +271,7 @@ def format_update(old, new):
insert_items = []
while nidx < len(new) and new[nidx] not in oremain:
entry = new[nidx]
-nremain.remove(entry)
+nremain.discard(entry)
insert_items.append(entry)
nidx += 1
if insert_items:
@ -256,9 +279,32 @@ def format_update(old, new):
update.append(UpdIns(insert_items))
if not modified:
-raise Exception(
-f"Infinite loop in diff {nidx=} {oidx=} {len(old)=} {len(new)=}"
-)
+# Tie-break: both items exist in both lists but don't match here.
+# Decide whether to delete old[oidx] first or insert new[nidx] first
+# based on which alignment is closer.
if oidx >= len(old) or nidx >= len(new):
break
cur_old = old[oidx]
cur_new = new[nidx]
pos_old_in_new = new_pos.get(cur_old)
pos_new_in_old = old_pos.get(cur_new)
# Default distances if not present (shouldn't happen if in remain sets)
dist_del = (pos_old_in_new - nidx) if pos_old_in_new is not None else 1
dist_ins = (pos_new_in_old - oidx) if pos_new_in_old is not None else 1
# Prefer the operation with smaller forward distance; tie => delete
if dist_del <= dist_ins:
# Delete current old item
oremain.discard(cur_old)
update.append(UpdDel(1))
oidx += 1
else:
# Insert current new item
nremain.discard(cur_new)
update.append(UpdIns([cur_new]))
nidx += 1
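The tie-break above only fires for moved entries (present in both lists but mismatched at the cursors): it compares how far forward each candidate's match lies and performs the cheaper operation first, preferring deletion on a tie. A self-contained sketch of the distance heuristic (function name hypothetical):

```python
def tie_break(old: list, new: list, oidx: int, nidx: int) -> str:
    """Return 'delete' or 'insert' depending on which alignment is closer."""
    old_pos = {e: i for i, e in enumerate(old)}
    new_pos = {e: i for i, e in enumerate(new)}
    pos_old_in_new = new_pos.get(old[oidx])
    pos_new_in_old = old_pos.get(new[nidx])
    # Default distance 1 if not present (shouldn't happen if in remain sets)
    dist_del = (pos_old_in_new - nidx) if pos_old_in_new is not None else 1
    dist_ins = (pos_new_in_old - oidx) if pos_new_in_old is not None else 1
    return "delete" if dist_del <= dist_ins else "insert"

# "b" moved to the front: its match in new is close, so delete it first.
assert tie_break(["b", "c", "a"], ["a", "b", "c"], 0, 0) == "delete"
# "b" must appear earlier in output: inserting it first is the closer alignment.
assert tie_break(["a", "b", "c"], ["b", "c", "a"], 0, 0) == "insert"
```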
# Diff any remaining
if keep_count > 0:
@ -311,10 +357,7 @@ def watcher_inotify(loop):
while not quit.is_set():
i = inotify.adapters.InotifyTree(rootpath.as_posix())
# Initialize the tree from filesystem
-t0 = time.perf_counter()
update_root(loop)
-t1 = time.perf_counter()
-logger.debug(f"Root update took {t1 - t0:.1f}s")
trefresh = time.monotonic() + 300.0
tspace = time.monotonic() + 5.0
# Watch for changes (frequent wakeups needed for quiting)
@ -335,32 +378,52 @@ def watcher_inotify(loop):
if quit.is_set():
return
interesting = any(f in modified_flags for f in event[1])
-if event[2] == rootpath.as_posix() and event[3] == "zzz":
-logger.debug(f"Watch: {interesting=} {event=}")
if interesting:
# Update modified path
-t0 = time.perf_counter()
path = PurePosixPath(event[2]) / event[3]
-update_path(rootmod, path.relative_to(rootpath), loop)
-t1 = time.perf_counter()
-logger.debug(f"Watch: Update {event[3]} took {t1 - t0:.1f}s")
+try:
+rel_path = path.relative_to(rootpath)
+update_path(rootmod, rel_path, loop)
+except Exception as e:
+logger.error(
+f"Error processing inotify event for path {path}: {e}"
+)
+raise
if not dirty:
t = time.monotonic()
dirty = True
-# Wait a maximum of 0.5s to push the updates
-if dirty and time.monotonic() >= t + 0.5:
+# Wait a maximum of 0.2s to push the updates
+if dirty and time.monotonic() >= t + 0.2:
break
if dirty and state.root != rootmod:
-t0 = time.perf_counter()
-update = format_update(state.root, rootmod)
-t1 = time.perf_counter()
-with state.lock:
-broadcast(update, loop)
-state.root = rootmod
-t2 = time.perf_counter()
-logger.debug(
-f"Format update took {t1 - t0:.1f}s, broadcast {t2 - t1:.1f}s"
-)
+try:
+update = format_update(state.root, rootmod)
+with state.lock:
+broadcast(update, loop)
+state.root = rootmod
+except Exception:
+logger.exception(
+"format_update failed; falling back to full rescan"
+)
+# Fallback: full rescan and try diff again; last resort send full root
try:
fresh = walk(PurePosixPath())
try:
update = format_update(state.root, fresh)
with state.lock:
broadcast(update, loop)
state.root = fresh
except Exception:
logger.exception(
"Fallback diff failed; sending full root snapshot"
)
with state.lock:
broadcast(format_root(fresh), loop)
state.root = fresh
except Exception:
logger.exception(
"Full rescan failed; dropping this batch of updates"
)
del i # Free the inotify object
@ -377,7 +440,7 @@ def watcher_poll(loop):
quit.wait(0.1 + 8 * dur)
-async def start(app, loop):
+def start(app, loop):
global rootpath
config.load_config()
rootpath = config.config.path
@ -391,6 +454,6 @@ async def start(app, loop):
app.ctx.watcher.start()
-async def stop(app, loop):
+def stop(app):
quit.set()
app.ctx.watcher.join()

Binary file not shown (image: 363 KiB before, 40 KiB after).

frontend/.npmrc Normal file

@ -0,0 +1,2 @@
audit=false
fund=false


@ -22,25 +22,25 @@ If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has a
### Run the backend
```fish
-hatch shell
-cista --dev -l :8000
+uv sync --dev
+uv run cista --dev -l :8000
```
### And the Vite server (in another terminal)
```fish
cd frontend
-npm install
-npm run dev
+bun install
+bun run dev
```
Browse to Vite, which will proxy API requests to port 8000. Both servers live reload changes.
### Type-Check, Compile and Minify for Production
-This is also called by `hatch build` during Python packaging:
+This is also called by `uv build` during Python packaging:
```fish
-npm run build
+bun run build
```


@ -12,6 +12,9 @@
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix --ignore-path .gitignore",
"format": "prettier --write src/"
},
+"engines": {
+"node": ">=18.0.0"
+},
"dependencies": {
"@imengyu/vue3-context-menu": "^1.3.3",
"@vueuse/core": "^10.4.1",
@ -21,7 +24,6 @@
"pinia": "^2.1.6",
"pinia-plugin-persistedstate": "^3.2.0",
"unplugin-vue-components": "^0.25.2",
-"vite-plugin-rewrite-all": "^1.0.1",
"vite-svg-loader": "^4.0.0",
"vue": "^3.3.4",
"vue-router": "^4.2.4"


@ -10,6 +10,10 @@
<main>
<RouterView :path="path.pathList" :query="path.query" />
</main>
+<footer>
+<TransferBar :status=store.uprogress @cancel=store.cancelUploads class=upload />
+<TransferBar :status=store.dprogress @cancel=store.cancelDownloads class=download />
+</footer>
</template>
<script setup lang="ts">


@ -91,8 +91,7 @@
}
.headermain,
.menu,
-.rename-button,
-.suggest-gallery {
+.rename-button {
display: none !important;
}
.breadcrumb > a {


@ -1,6 +1,5 @@
<template>
<SvgButton name="download" data-tooltip="Download" @click="download" />
-<TransferBar :status=progress @cancel=cancelDownloads />
</template>
<script setup lang="ts">
@ -26,22 +25,22 @@ const status_init = {
filepos: 0,
status: 'idle',
}
-const progress = reactive({...status_init})
+store.dprogress = {...status_init}
setInterval(() => {
-if (Date.now() - progress.tlast > 3000) {
+if (Date.now() - store.dprogress.tlast > 3000) {
// Reset
-progress.statbytes = 0
-progress.statdur = 1
+store.dprogress.statbytes = 0
+store.dprogress.statdur = 1
} else {
// Running average by decay
-progress.statbytes *= .9
-progress.statdur *= .9
+store.dprogress.statbytes *= .9
+store.dprogress.statdur *= .9
}
}, 100)
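The interval above maintains an exponentially decayed rate estimate: every 100 ms both the byte and duration accumulators shrink by 10%, so `statbytes / statdur` tracks recent throughput rather than the whole transfer. The same idea sketched in Python (the component itself is TypeScript):

```python
def decay_step(statbytes: float, statdur: float, factor: float = 0.9):
    """One 100 ms tick of the running-average decay."""
    return statbytes * factor, statdur * factor

# Decay alone leaves the displayed speed (bytes / duration) unchanged...
b, d = decay_step(1000.0, 2.0)
assert abs(b - 900.0) < 1e-9 and abs(d - 1.8) < 1e-9
assert abs(b / d - 500.0) < 1e-9
# ...but new samples shift it toward the recent transfer rate.
b, d = b + 3000.0, d + 1.0
assert b / d > 500.0
```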
const statReset = () => {
-Object.assign(progress, status_init)
-progress.t0 = Date.now()
-progress.tlast = progress.t0 + 1
+Object.assign(store.dprogress, status_init)
+store.dprogress.t0 = Date.now()
+store.dprogress.tlast = store.dprogress.t0 + 1
}
const cancelDownloads = () => {
location.reload() // FIXME
@ -61,9 +60,9 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
console.log('Downloading to filesystem', sel.recursive)
for (const [rel, full, doc] of sel.recursive) {
if (doc.dir) continue
-progress.files.push(rel)
-++progress.filecount
-progress.total += doc.size
+store.dprogress.files.push(rel)
+++store.dprogress.filecount
+store.dprogress.total += doc.size
}
for (const [rel, full, doc] of sel.recursive) {
// Create any missing directories
@ -73,6 +72,7 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
}
const r = rel.slice(hdir.length)
for (const dir of r.split('/').slice(0, doc.dir ? undefined : -1)) {
+if (!dir) continue
hdir += `${dir}/`
try {
h = await h.getDirectoryHandle(dir.normalize('NFC'), { create: true })
@ -101,22 +101,22 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
}
if (res.body) {
-++progress.fileidx
+++store.dprogress.fileidx
const reader = res.body.getReader()
await writable.truncate(0)
store.error = "Direct download."
-progress.tlast = Date.now()
+store.dprogress.tlast = Date.now()
while (true) {
const { value, done } = await reader.read()
if (done) break
await writable.write(value)
const now = Date.now()
const size = value.byteLength
-progress.xfer += size
-progress.filepos += size
-progress.statbytes += size
-progress.statdur += now - progress.tlast
-progress.tlast = now
+store.dprogress.xfer += size
+store.dprogress.filepos += size
+store.dprogress.statbytes += size
+store.dprogress.statdur += now - store.dprogress.tlast
+store.dprogress.tlast = now
}
}
await writable.close()


@ -115,6 +115,7 @@ const rename = (doc: Doc, newName: string) => {
}
defineExpose({
newFolder() {
+console.log("New folder")
const now = Math.floor(Date.now() / 1000)
editing.value = new Doc({
loc: loc.value,
@ -124,6 +125,7 @@ defineExpose({
mtime: now,
size: 0,
})
+store.cursor = editing.value.key
},
toggleSelectAll() {
console.log('Select')


@ -2,8 +2,11 @@
<div v-if="props.documents.length || editing" class="gallery" ref="gallery">
<GalleryFigure v-if="editing?.key === 'new'" :doc="editing" :key=editing.key :editing="{rename: mkdir, exit}" />
<template v-for="(doc, index) in documents" :key=doc.key>
-<GalleryFigure :doc=doc :editing="editing === doc ? {rename, exit} : null">
-<BreadCrumb v-if=showFolderBreadcrumb(index) :path="doc.loc ? doc.loc.split('/') : []" class="folder-change"/>
+<GalleryFigure :doc=doc :editing="editing === doc ? {rename, exit} : null" @menu="contextMenu($event, doc)">
+<template v-if=showFolderBreadcrumb(index)>
+<BreadCrumb :path="doc.loc ? doc.loc.split('/') : []" class="folder-change"/>
+<div class="spacer"></div>
+</template>
</GalleryFigure>
</template>
</div>
@ -67,6 +70,7 @@ defineExpose({
mtime: now,
size: 0,
})
+store.cursor = editing.value.key
},
toggleSelectAll() {
console.log('Select')
@ -254,6 +258,9 @@ const contextMenu = (ev: MouseEvent, doc: Doc) => {
align-items: end;
}
.breadcrumb {
-border-radius: .5em;
+border-radius: .5em 0 0 .5em;
+}
+.spacer {
+flex: 0 1000000000 4rem;
}
</style>


@ -9,7 +9,7 @@
<slot></slot>
<MediaPreview ref=m :doc="doc" tabindex=-1 quality="sz=512" class="figcontent" />
<div class="titlespacer"></div>
-<figcaption @click.prevent>
+<figcaption @click.prevent @contextmenu.prevent="$emit('menu', $event)">
<template v-if="editing">
<FileRenameInput :doc=doc :rename=editing.rename :exit=editing.exit />
</template>


@ -8,7 +8,7 @@
<SvgButton
name="create-folder"
data-tooltip="New folder"
-@click="() => store.fileExplorer!.newFolder()"
+@click="() => { console.log('New', store.fileExplorer); store.fileExplorer!.newFolder(); console.log('Done')}"
/>
<slot></slot>
<div class="spacer smallgap"></div>


@ -101,7 +101,7 @@ const video = () => ['mkv', 'mp4', 'webm', 'mov', 'avi'].includes(props.doc.ext)
const audio = () => ['mp3', 'flac', 'ogg', 'aac'].includes(props.doc.ext)
const archive = () => ['zip', 'tar', 'gz', 'bz2', 'xz', '7z', 'rar'].includes(props.doc.ext)
const preview = () => (
-['bmp', 'ico', 'tif', 'tiff', 'pdf'].includes(props.doc.ext) ||
+['bmp', 'ico', 'tif', 'tiff', 'heic', 'heif', 'pdf', 'epub', 'mobi'].includes(props.doc.ext) ||
props.doc.size > 500000 &&
['avif', 'webp', 'png', 'jpg', 'jpeg'].includes(props.doc.ext)
)


@ -57,13 +57,12 @@ const speeddisp = computed(() => speed.value ? speed.value.toFixed(speed.value <
display: flex;
flex-direction: column;
color: var(--primary-color);
-position: fixed;
-left: 0;
-bottom: 0;
-width: 100vw;
+width: 100%;
}
.statustext {
display: flex;
+align-items: center;
+margin: 0 .5em;
padding: 0.5rem 0;
}
span {
@ -84,4 +83,12 @@ span {
.position { min-width: 4em }
.speed { min-width: 4em }
+.upload .statustext::before {
+font-size: 1.5em;
+content: '🔺'
+}
+.download .statustext::before {
+font-size: 1.5em;
+content: '🔻'
+}
</style>


@ -1,3 +1,12 @@
<template>
<template>
<input ref="fileInput" @change="uploadHandler" type="file" multiple>
<input ref="folderInput" @change="uploadHandler" type="file" webkitdirectory>
</template>
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileInput.click()" />
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderInput.click()" />
</template>
<script setup lang="ts">
import { connect, uploadUrl } from '@/repositories/WS';
import { useMainStore } from '@/stores/main'
@ -108,50 +117,50 @@ const uprogress_init = {
filepos: 0,
status: 'idle',
}
-const uprogress = reactive({...uprogress_init})
+store.uprogress = {...uprogress_init}
setInterval(() => {
-if (Date.now() - uprogress.tlast > 3000) {
+if (Date.now() - store.uprogress.tlast > 3000) {
// Reset
-uprogress.statbytes = 0
-uprogress.statdur = 1
+store.uprogress.statbytes = 0
+store.uprogress.statdur = 1
} else {
// Running average by decay
-uprogress.statbytes *= .9
-uprogress.statdur *= .9
+store.uprogress.statbytes *= .9
+store.uprogress.statdur *= .9
}
}, 100)
const statUpdate = ({name, size, start, end}: {name: string, size: number, start: number, end: number}) => {
-if (name !== uprogress.filename) return // If stats have been reset
+if (name !== store.uprogress.filename) return // If stats have been reset
const now = Date.now()
-uprogress.xfer = uprogress.filestart + end
-uprogress.filepos = end
-uprogress.statbytes += end - start
-uprogress.statdur += now - uprogress.tlast
-uprogress.tlast = now
+store.uprogress.xfer = store.uprogress.filestart + end
+store.uprogress.filepos = end
+store.uprogress.statbytes += end - start
+store.uprogress.statdur += now - store.uprogress.tlast
+store.uprogress.tlast = now
// File finished?
if (end === size) {
-uprogress.filestart += size
+store.uprogress.filestart += size
statNextFile()
-if (++uprogress.fileidx >= uprogress.filecount) statReset()
+if (++store.uprogress.fileidx >= store.uprogress.filecount) statReset()
}
}
const statNextFile = () => {
-const f = uprogress.files.shift()
+const f = store.uprogress.files.shift()
if (!f) return statReset()
-uprogress.filepos = 0
-uprogress.filesize = f.file.size
-uprogress.filename = f.cloudName
+store.uprogress.filepos = 0
+store.uprogress.filesize = f.file.size
+store.uprogress.filename = f.cloudName
}
const statReset = () => {
-Object.assign(uprogress, uprogress_init)
-uprogress.t0 = Date.now()
-uprogress.tlast = uprogress.t0 + 1
+Object.assign(store.uprogress, uprogress_init)
+store.uprogress.t0 = Date.now()
+store.uprogress.tlast = store.uprogress.t0 + 1
}
const statsAdd = (f: CloudFile[]) => {
-if (uprogress.files.length === 0) statReset()
-uprogress.total += f.reduce((a, b) => a + b.file.size, 0)
-uprogress.filecount += f.length
-uprogress.files = [...uprogress.files, ...f]
+if (store.uprogress.files.length === 0) statReset()
+store.uprogress.total += f.reduce((a, b) => a + b.file.size, 0)
+store.uprogress.filecount += f.length
+store.uprogress.files = [...store.uprogress.files, ...f]
statNextFile()
}
let upqueue = [] as CloudFile[]
@ -181,7 +190,7 @@ const WSCreate = async () => await new Promise<WebSocket>(resolve => {
// @ts-ignore
ws.sendData = async (data: any) => {
// Wait until the WS is ready to send another message
-uprogress.status = "uploading"
+store.uprogress.status = "uploading"
await new Promise(resolve => {
const t = setInterval(() => {
if (ws.bufferedAmount > 1<<20) return
@ -189,7 +198,7 @@ const WSCreate = async () => await new Promise<WebSocket>(resolve => {
clearInterval(t)
}, 1)
})
-uprogress.status = "processing"
+store.uprogress.status = "processing"
ws.send(data)
}
})
@ -210,7 +219,7 @@ const worker = async () => {
if (f.cloudPos === f.file.size) upqueue.shift()
}
if (upqueue.length) startWorker()
-uprogress.status = "idle"
+store.uprogress.status = "idle"
workerRunning = false
}
let workerRunning: any = false
@ -233,12 +242,3 @@ onUnmounted(() => {
removeEventListener('drop', uploadHandler)
})
</script>
<template>
<template>
<input ref="fileInput" @change="uploadHandler" type="file" multiple>
<input ref="folderInput" @change="uploadHandler" type="file" webkitdirectory>
</template>
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileInput.click()" />
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderInput.click()" />
<TransferBar :status=uprogress @cancel=cancelUploads />
</template>


@ -37,21 +37,24 @@ export class Doc {
return this.url.replace(/^\/#/, '')
}
get img(): boolean {
-const ext = this.name.split('.').pop()?.toLowerCase()
-return ['jpg', 'jpeg', 'png', 'gif', 'webp', 'avif', 'svg'].includes(ext || '')
+// Folders cannot be images
+if (this.dir) return false
+return ['jpg', 'jpeg', 'png', 'gif', 'webp', 'avif', 'heic', 'heif', 'svg'].includes(this.ext)
}
get previewable(): boolean {
+// Folders cannot be previewable
+if (this.dir) return false
if (this.img) return true
-const ext = this.name.split('.').pop()?.toLowerCase()
// Not a comprehensive list, but good enough for now
-return ['mp4', 'mkv', 'webm', 'ogg', 'mp3', 'flac', 'aac', 'pdf'].includes(ext || '')
+return ['mp4', 'mkv', 'webm', 'ogg', 'mp3', 'flac', 'aac', 'pdf'].includes(this.ext)
}
get previewurl(): string {
return this.url.replace(/^\/files/, '/preview')
}
get ext(): string {
-const ext = this.name.split('.').pop()
-return ext ? ext.toLowerCase() : ''
+const dotIndex = this.name.lastIndexOf('.')
+if (dotIndex === -1 || dotIndex === this.name.length - 1) return ''
+return this.name.slice(dotIndex + 1).toLowerCase()
}
}
export type errorEvent = {


@ -19,6 +19,8 @@ export const useMainStore = defineStore({
cursor: '' as string,
server: {} as Record<string, any>,
dialog: '' as '' | 'login' | 'settings',
+uprogress: {} as any,
+dprogress: {} as any,
prefs: {
gallery: false,
sortListing: '' as SortOrder,
@ -89,7 +91,13 @@
},
focusBreadcrumb() {
(document.querySelector('.breadcrumb') as HTMLAnchorElement).focus()
-}
+},
+cancelDownloads() {
+location.reload() // FIXME
+},
+cancelUploads() {
+location.reload() // FIXME
+},
},
getters: {
sortOrder(): SortOrder { return this.query ? this.prefs.sortFiltered : this.prefs.sortListing },


@@ -50,12 +50,11 @@ export function formatUnixDate(t: number) {
 }
 export function getFileExtension(filename: string) {
-  const parts = filename.split('.')
-  if (parts.length > 1) {
-    return parts[parts.length - 1]
-  } else {
-    return '' // No extension
+  const dotIndex = filename.lastIndexOf('.')
+  if (dotIndex === -1 || dotIndex === filename.length - 1) {
+    return '' // No extension
   }
+  return filename.slice(dotIndex + 1)
 }
 interface FileTypes {
   [key: string]: string[]
@@ -68,8 +67,9 @@ const filetypes: FileTypes = {
 }
 export function getFileType(name: string): string {
-  const ext = name.split('.').pop()?.toLowerCase()
-  if (!ext || ext.length === name.length) return 'unknown'
+  const dotIndex = name.lastIndexOf('.')
+  if (dotIndex === -1 || dotIndex === name.length - 1) return 'unknown'
+  const ext = name.slice(dotIndex + 1).toLowerCase()
   return Object.keys(filetypes).find(type => filetypes[type].includes(ext)) || 'unknown'
 }
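The rewritten helpers share one rule: take everything after the last dot, and treat a missing or trailing dot as "no extension". A small Python sketch of the same rule (the `file_extension` name is illustrative, not part of the codebase; `str.rfind` plays the role of `String.lastIndexOf`):

```python
def file_extension(name: str) -> str:
    """Everything after the last dot, lowercased; '' when there is no usable dot."""
    dot = name.rfind(".")  # like lastIndexOf: -1 when no dot exists
    if dot == -1 or dot == len(name) - 1:
        return ""  # "README" has no dot; "archive." ends with one
    return name[dot + 1:].lower()
```

Unlike the old `split('.').pop()` approach, this never returns the whole filename when there is no dot at all, which is what the `ext.length === name.length` check used to guard against.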


@@ -13,21 +13,16 @@
     :path="props.path"
     :documents="documents"
   />
-  <div v-if="!store.prefs.gallery && documents.some(doc => doc.previewable)" class="suggest-gallery">
-    <SvgButton name="eye" taborder=0 @click="() => { store.prefs.gallery = true }"></SvgButton>
-    Gallery View
-  </div>
   <EmptyFolder :documents=documents :path=props.path />
 </template>
 <script setup lang="ts">
-import { watchEffect, ref, computed } from 'vue'
+import { watchEffect, ref, computed, watch } from 'vue'
 import { useMainStore } from '@/stores/main'
 import Router from '@/router/index'
 import { needleFormat, localeIncludes, collator } from '@/utils'
 import { sorted } from '@/utils/docsort'
 import FileExplorer from '@/components/FileExplorer.vue'
-import cog from '@/assets/svg/cog.svg'
 const store = useMainStore()
 const fileExplorer = ref()
@@ -77,6 +72,10 @@ watchEffect(() => {
   store.fileExplorer = fileExplorer.value
   store.query = props.query
 })
+watch(documents, (docs) => {
+  store.prefs.gallery = docs.some(d => d.previewable)
+}, { immediate: true })
 </script>
 <style scoped>
@@ -90,15 +89,4 @@ watchEffect(() => {
   text-shadow: 0 0 .3rem #000, 0 0 2rem #0008;
   color: var(--accent-color);
 }
-.suggest-gallery p {
-  font-size: 2rem;
-  color: var(--accent-color);
-}
-.suggest-gallery {
-  display: flex;
-  flex-direction: column;
-  align-items: center;
-  justify-content: center;
-}
 </style>


@@ -4,12 +4,11 @@ import { defineConfig } from 'vite'
 import vue from '@vitejs/plugin-vue'
 // @ts-ignore
-import pluginRewriteAll from 'vite-plugin-rewrite-all'
 import svgLoader from 'vite-svg-loader'
 import Components from 'unplugin-vue-components/vite'
 // Development mode:
-// npm run dev # Run frontend that proxies to dev_backend
+// bun run dev # Run frontend that proxies to dev_backend
 // cista -l :8000 --dev # Run backend
 const dev_backend = {
   target: "http://localhost:8000",
@@ -21,7 +20,6 @@ const dev_backend = {
 export default defineConfig({
   plugins: [
     vue(),
-    pluginRewriteAll(),
     svgLoader(), // import svg files
     Components(), // auto import components
   ],


@@ -10,25 +10,38 @@ readme = "README.md"
 authors = [
   { name = "Vasanko" },
 ]
+maintainers = [
+  { name = "Vasanko" },
+]
+keywords = ["file-server", "web-interface", "dropbox", "storage"]
 classifiers = [
+  "Development Status :: 5 - Production/Stable",
+  "Environment :: Web Environment",
+  "Intended Audience :: End Users/Desktop",
+  "Intended Audience :: System Administrators",
+  "License :: Public Domain",
+  "License :: OSI Approved :: MIT License",
 ]
 requires-python = ">=3.11"
 dependencies = [
-  "argon2-cffi",
-  "blake3",
-  "brotli",
-  "docopt",
-  "inotify",
-  "msgspec",
-  "natsort",
-  "pathvalidate",
-  "pillow",
-  "pyav",
-  "pyjwt",
-  "pymupdf",
-  "sanic",
-  "stream-zip",
-  "tomli_w",
+  "argon2-cffi>=25.1.0",
+  "av>=15.0.0",
+  "blake3>=1.0.5",
+  "brotli>=1.1.0",
+  "docopt>=0.6.2",
+  "inotify>=0.2.12",
+  "msgspec>=0.19.0",
+  "natsort>=8.4.0",
+  "numpy>=2.3.2",
+  "pathvalidate>=3.3.1",
+  "pillow>=11.3.0",
+  "pillow-heif>=1.1.0",
+  "pyjwt>=2.10.1",
+  "pymupdf>=1.26.3",
+  "sanic>=25.3.0",
+  "setproctitle>=1.3.6",
+  "stream-zip>=0.0.83",
+  "tomli_w>=1.2.0",
 ]
 [project.urls]
@@ -39,8 +52,19 @@ cista = "cista.__main__:main"
 [project.optional-dependencies]
 dev = [
-  "pytest",
-  "ruff",
+  "pytest>=8.4.1",
+  "ruff>=0.8.0",
+  "mypy>=1.13.0",
+  "pre-commit>=4.0.0",
+]
+test = [
+  "pytest>=8.4.1",
+  "pytest-cov>=6.0.0",
+  "pytest-asyncio>=0.25.0",
+]
+docs = [
+  "sphinx>=8.0.0",
+  "sphinx-rtd-theme>=3.0.0",
 ]
 [tool.hatch.version]
@@ -48,57 +72,83 @@ source = "vcs"
 [tool.hatch.build]
 artifacts = ["cista/wwwroot"]
-hooks.custom.path = "scripts/build-frontend.py"
+targets.sdist.hooks.custom.path = "scripts/build-frontend.py"
+targets.sdist.include = [
+  "/cista",
+]
 hooks.vcs.version-file = "cista/_version.py"
 hooks.vcs.template = """
 # This file is automatically generated by hatch build.
 __version__ = {version!r}
 """
 only-packages = true
-targets.sdist.include = [
-  "/cista",
-]
 [tool.pytest.ini_options]
 addopts = [
   "--import-mode=importlib",
-  "--verbosity=-1",
-  "-p no:warnings",
+  "--verbosity=2",
+  "--strict-markers",
+  "--strict-config",
+  "--cov=cista",
+  "--cov-report=term-missing",
+  "--cov-report=html",
+  "--cov-branch",
 ]
-testpaths = [
-  "tests",
-]
+testpaths = ["tests"]
+python_files = ["test_*.py", "*_test.py"]
+python_classes = ["Test*"]
+python_functions = ["test_*"]
+markers = [
+  "slow: marks tests as slow (deselect with '-m \"not slow\"')",
+  "integration: marks tests as integration tests",
+]
+filterwarnings = [
+  "error",
+  "ignore::UserWarning",
+  "ignore::DeprecationWarning",
+]
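The new `filterwarnings` entries follow pytest's warnings-filter syntax: escalate every warning to an error, then carve out ignores (in pytest, later entries take precedence). The same policy can be sketched with the stdlib `warnings` module, whose `filterwarnings` likewise prepends each new filter so the carve-outs win:

```python
import warnings

warnings.simplefilter("error")  # like the "error" entry: any warning raises
warnings.filterwarnings("ignore", category=UserWarning)         # carve-out
warnings.filterwarnings("ignore", category=DeprecationWarning)  # carve-out

try:
    warnings.warn("something odd", RuntimeWarning)
    escalated = False
except RuntimeWarning:
    escalated = True  # unlisted warning escalated to an exception

warnings.warn("old api", DeprecationWarning)  # matches a carve-out, no raise
```

This is only an illustration of the filter semantics; pytest installs and restores these filters around each test itself.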
-[tool.ruff]
-select = ["ALL"]
-ignore = [
-  "A0",
-  "ARG001",
-  "ANN",
-  "B018",
-  "BLE001",
-  "C901",
-  "COM812", # conflicts with ruff format
-  "D",
-  "E501",
-  "EM1",
-  "FIX002",
-  "ISC001", # conflicts with ruff format
-  "PGH003",
-  "PLR0912",
-  "PLR2004",
-  "PLW0603",
-  "S101",
-  "SLF001",
-  "T201",
-  "TD0",
-  "TRY",
-]
-show-source = true
-show-fixes = true
-[tool.ruff.isort]
-known-first-party = ["cista"]
-[tool.ruff.per-file-ignores]
-"tests/*" = ["S", "ANN", "D", "INP"]
+[tool.ruff.lint]
+isort.known-first-party = ["cista"]
+per-file-ignores."tests/*" = ["S", "ANN", "D", "INP", "PLR2004"]
+per-file-ignores."scripts/*" = ["T20"]
+
+[dependency-groups]
+dev = [
+  "pytest>=8.4.1",
+  "ruff>=0.8.0",
+  "mypy>=1.13.0",
+  "pre-commit>=4.0.0",
+]
+test = [
+  "pytest>=8.4.1",
+  "pytest-cov>=6.0.0",
+  "pytest-asyncio>=0.25.0",
+]
+
+[tool.coverage.run]
+source = ["cista"]
+branch = true
+omit = [
+  "*/tests/*",
+  "*/test_*",
+  "*/__pycache__/*",
+  "cista/_version.py",
+]
+
+[tool.coverage.report]
+exclude_lines = [
+  "pragma: no cover",
+  "def __repr__",
+  "if self.debug:",
+  "if settings.DEBUG",
+  "raise AssertionError",
+  "raise NotImplementedError",
+  "if 0:",
+  "if __name__ == .__main__.:",
+  "class .*\\bProtocol\\):",
+  "@(abc\\.)?abstractmethod",
+]
+show_missing = true
+skip_covered = false
+precision = 2


@@ -1,5 +1,8 @@
 # noqa: INP001
+import os
+import shutil
 import subprocess
+from sys import stderr
 from hatchling.builders.hooks.plugin.interface import BuildHookInterface
@@ -7,6 +10,28 @@ from hatchling.builders.hooks.plugin.interface import BuildHookInterface
 class CustomBuildHook(BuildHookInterface):
     def initialize(self, version, build_data):
         super().initialize(version, build_data)
-        print("Building Cista frontend...")
-        subprocess.run("npm install --prefix frontend".split(" "), check=True)  # noqa: S603
-        subprocess.run("npm run build --prefix frontend".split(" "), check=True)  # noqa: S603
+        stderr.write(">>> Building Cista frontend\n")
+        npm = None
+        bun = shutil.which("bun")
+        if bun is None:
+            npm = shutil.which("npm")
+            if npm is None:
+                raise RuntimeError(
+                    "Bun or NodeJS `npm` is required for building but neither was found\n    Visit https://bun.com/"
+                )
+        # npm --prefix doesn't work on Windows, so we chdir instead
+        os.chdir("frontend")
+        try:
+            if npm:
+                stderr.write("### npm install\n")
+                subprocess.run([npm, "install"], check=True)  # noqa: S603
+                stderr.write("\n### npm run build\n")
+                subprocess.run([npm, "run", "build"], check=True)  # noqa: S603
+            else:
+                assert bun
+                stderr.write("### bun install\n")
+                subprocess.run([bun, "install"], check=True)  # noqa: S603
+                stderr.write("\n### bun run build\n")
+                subprocess.run([bun, "run", "build"], check=True)  # noqa: S603
+        finally:
+            os.chdir("..")
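The hook's tool selection boils down to "first available of bun, then npm, via `shutil.which`". A minimal sketch of that lookup, with the resolver injectable so the fallback order can be exercised without touching PATH (the `pick_build_tool` helper and its parameter are illustrative, not part of the hook):

```python
import shutil


def pick_build_tool(which=shutil.which):
    """Return the path of the first available frontend tool, or None.

    Mirrors the build hook's preference: bun if installed, else npm.
    """
    return which("bun") or which("npm")
```

Note that `subprocess.run(..., cwd="frontend")` could replace the `os.chdir` dance, but the hook chdirs deliberately because `npm --prefix` misbehaves on Windows.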