7 Commits

Author SHA1 Message Date
Leo Vasanko
fdbf9b2610 Attempt to reduce leak of ffmpeg previews 2023-11-21 12:26:16 +00:00
Leo Vasanko
8437c1f60e Attempt to reduce leak of ffmpeg previews 2023-11-21 12:23:24 +00:00
Leo Vasanko
02c5e484b5 Attempt to reduce leak of ffmpeg previews 2023-11-21 12:22:03 +00:00
Leo Vasanko
71eb252b8d Restrict the number of workers. 2023-11-21 12:13:57 +00:00
Leo Vasanko
27422ae1e2 Attempt to reduce leak of ffmpeg previews 2023-11-21 04:01:51 -08:00
Leo Vasanko
c3d6aecffd Attempt to reduce leak of ffmpeg previews 2023-11-21 04:00:20 -08:00
Leo Vasanko
e2a9a6903c Memtrace 2023-11-21 03:46:06 -08:00
31 changed files with 313 additions and 609 deletions

.gitignore vendored
View File

@@ -1,5 +1,4 @@
.*
*.lock
!.gitignore
__pycache__/
*.egg-info/

View File

@@ -6,8 +6,6 @@ Cista takes its name from the ancient *cistae*, metal containers used by Greeks
This is a cutting-edge **file and document server** designed for speed, efficiency, and unparalleled ease of use. Experience **lightning-fast browsing**, thanks to the file list maintained directly in your browser and updated from server filesystem events, coupled with our highly optimized code. Fully **keyboard-navigable** and with a responsive layout, Cista flawlessly adapts to your devices, providing a seamless experience wherever you are. Our powerful **instant search** means you're always just a few keystrokes away from finding exactly what you need. Press **1/2/3** to switch ordering, navigate with all four arrow keys (+Shift to select). Or click your way around on **breadcrumbs that remember where you were**.
**Built-in document and media previews** let you quickly view files without downloading them. Cista shows PDF and other documents, plus video and image thumbnails, with **HDR10 support** for video previews and modern image formats, including HEIC and AVIF. It also has a player for music and video files.
The Cista project started as an inevitable remake of [Droppy](https://github.com/droppyjs/droppy) which we used and loved despite its numerous bugs. Cista Storage stands out in handling even the most exotic filenames, ensuring a smooth experience where others falter.
All of this is wrapped in an intuitive interface with automatic light and dark themes, making Cista Storage the ideal choice for anyone seeking a reliable, versatile, and quick file storage solution. Quickly set up your own Cista where your files are just a click away, safe, and always accessible.
@@ -16,31 +14,39 @@ Experience Cista by visiting [Cista Demo](https://drop.zi.fi) for a test run and
## Getting Started
### Installation
To install the cista application, use:
```fish
pip install cista
```
Note: Some Linux distributions require `--break-system-packages` to install Python packages, which are safely placed in the user's home folder. Alternatively, skip installation and run it with `pipx run cista`.
### Running the Server
We recommend using [UV](https://docs.astral.sh/uv/getting-started/installation/) to directly run Cista:
Create an account: (otherwise the server is public for all)
Create an account: (or run a public server without authentication)
```fish
uvx cista --user yourname --privileged
cista --user yourname --privileged
```
Serve your files at http://localhost:8000:
```fish
uvx cista -l :8000 /path/to/files
```
Alternatively, you can install with `pip` or `uv pip`. This enables using the `cista` command directly without `uvx` or `uv run`.
```fish
pip install cista --break-system-packages
cista -l :8000 /path/to/files
```
The server remembers its settings in the config folder (default `~/.local/share/cista/`), including the listen port and directory, for future runs without arguments.
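For illustration, a minimal sketch of reading those persisted settings from Python, assuming the `config.load_config()` helper and the `path`/`listen` fields referenced in the backend diffs below (exact fields may differ):
```python
# Sketch only: load the saved db.toml the same way the server does.
# Assumes cista's config module API (load_config, config.path, config.listen)
# as shown in this changeset.
from cista import config

config.load_config()  # reads db.toml from the config folder
print("Serving", config.config.path, "listening on", config.config.listen)
```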
### Internet Access
Most admins find the [Caddy](https://caddyserver.com/) web server convenient for its auto TLS certificates and all. A proxy also allows running multiple web services or Cista instances on the same IP address but different (sub)domains.
To use your own TLS certificates, place them in the config folder and run:
```fish
cista -l cista.example.com
```
Most admins instead find the [Caddy](https://caddyserver.com/) web server convenient for its auto TLS certificates and all. A proxy also allows running multiple web services or Cista instances on the same IP address but different (sub)domains.
`/etc/caddy/Caddyfile`:
@@ -50,13 +56,33 @@ cista.example.com {
}
```
Nginx or another proxy may be used similarly; alternatively, you can place the cert and key in the Cista config dir and run `cista -l cista.example.com`
## Development setup
For rapid development, we use the Vite development server for the Vue frontend, while the backend runs on port 8000, to which Vite proxies API requests. Each server live reloads whenever its code or configuration is modified.
```fish
cd frontend
npm install
npm run dev
```
Concurrently, start the backend on another terminal:
```fish
hatch shell
pip install -e '.[dev]'
cista --dev -l :8000 /path/to/files
```
We use `hatch shell` to install into a virtual environment, so our hacking does not disturb the rest of the system.
Vue builds the files in `cista/wwwroot`, which ship prebuilt in the Python package. Running `hatch build` bundles the frontend and creates a NodeJS-independent Python package.
## System Deployment
This setup allows easy addition of storages, each with its own domain, configuration, and files.
Assuming a restricted user account `storage` for serving files and that UV is installed system-wide or on this account. Only UV is required: this does not use git or bun/npm.
Assuming a restricted user account `storage` for serving files and that cista is installed system-wide or on this account (check with `sudo -u storage -s`). Alternatively, use `pipx run cista` or `hatch run cista` as the ExecStart command.
Create `/etc/systemd/system/cista@.service`:
@@ -66,7 +92,7 @@ Description=Cista storage %i
[Service]
User=storage
ExecStart=uvx cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
ExecStart=cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
Restart=always
[Install]
@@ -90,34 +116,3 @@ foo.example.com, bar.example.com {
reverse_proxy unix//srv/cista/{host}/socket
}
```
## Development setup
For rapid development, we use the Vite development server for the Vue frontend, while the backend runs on port 8000, to which Vite proxies API requests. Each server live reloads whenever its code or configuration is modified.
Make sure you have git, uv and bun (or npm) installed.
Backend (Python) setup and run:
```fish
git clone https://git.zi.fi/Vasanko/cista-storage.git
cd cista-storage
uv sync --dev
uv run cista --dev -l :8000 /path/to/files
```
Frontend (Vue/Vite) run the dev server in another terminal:
```fish
cd frontend
bun install
bun run dev
```
Building the package for release (frontend + Python wheel/sdist):
```fish
uv build
```
Vue builds the files in `cista/wwwroot`, which ship prebuilt in the Python package. `uv build` runs the project build hooks to bundle the frontend and produce a NodeJS-independent Python package.

View File

@@ -1,4 +1,3 @@
import os
import sys
from pathlib import Path
@@ -62,7 +61,6 @@ def _main():
path = None
_confdir(args)
exists = config.conffile.exists()
print(config.conffile, exists)
import_droppy = args["--import-droppy"]
necessary_opts = exists or import_droppy or path
if not necessary_opts:
@@ -119,8 +117,7 @@ def _confdir(args):
raise ValueError("Config path is not a directory")
# Accidentally pointed to the db.toml, use parent
confdir = confdir.parent
os.environ["CISTA_HOME"] = confdir.as_posix()
config.init_confdir() # Uses environ if available
config.conffile = confdir / config.conffile.name
def _user(args):

View File

@@ -2,20 +2,20 @@ import asyncio
import datetime
import mimetypes
import threading
import tracemalloc
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import cpu_count
from pathlib import Path, PurePath, PurePosixPath
from stat import S_IFDIR, S_IFREG
from urllib.parse import unquote
from wsgiref.handlers import format_date_time
import brotli
import objgraph
import sanic.helpers
from blake3 import blake3
from sanic import Blueprint, Sanic, empty, raw, redirect
from sanic.exceptions import Forbidden, NotFound
from sanic.log import logger
from setproctitle import setproctitle
from stream_zip import ZIP_AUTO, stream_zip
from cista import auth, config, preview, session, watching
@@ -32,20 +32,27 @@ app.blueprint(bp)
app.exception(Exception)(handle_sanic_exception)
setproctitle("cista-main")
@app.before_server_start
async def main_start(app, loop):
tracemalloc.start()
config.load_config()
setproctitle(f"cista {config.config.path.name}")
workers = max(2, min(8, cpu_count()))
app.ctx.threadexec = ThreadPoolExecutor(
max_workers=workers, thread_name_prefix="cista-ioworker"
max_workers=3, thread_name_prefix="cista-ioworker"
)
await watching.start(app, loop)
@app.add_task
async def mem_task():
while True:
await asyncio.sleep(10)
snapshot = tracemalloc.take_snapshot()
top_stats = snapshot.statistics("lineno")
for stat in top_stats[:10]:
print(stat)
objgraph.show_growth(limit=10)
@app.after_server_stop
async def main_stop(app, loop):
quit.set()

View File

@@ -1,9 +1,7 @@
from __future__ import annotations
import os
import secrets
import sys
from contextlib import suppress
from functools import wraps
from hashlib import sha256
from pathlib import Path, PurePath
@@ -35,23 +33,7 @@ class Link(msgspec.Struct, omit_defaults=True):
config = None
conffile = None
def init_confdir():
if p := os.environ.get("CISTA_HOME"):
home = Path(p)
else:
xdg = os.environ.get("XDG_CONFIG_HOME")
home = (
Path(xdg).expanduser() / "cista" if xdg else Path.home() / ".config/cista"
)
if not home.is_dir():
home.mkdir(parents=True, exist_ok=True)
home.chmod(0o700)
global conffile
conffile = home / "db.toml"
conffile = Path.home() / ".local/share/cista/db.toml"
def derived_secret(*params, len=8) -> bytes:
@@ -79,8 +61,8 @@ def dec_hook(typ, obj):
def config_update(modify):
global config
if conffile is None:
init_confdir()
if not conffile.exists():
conffile.parent.mkdir(parents=True, exist_ok=True)
tmpname = conffile.with_suffix(".tmp")
try:
f = tmpname.open("xb")
@@ -94,6 +76,10 @@ def config_update(modify):
old = conffile.read_bytes()
c = msgspec.toml.decode(old, type=Config, dec_hook=dec_hook)
except FileNotFoundError:
# No existing config file, make sure we have a folder...
confdir = conffile.parent
confdir.mkdir(parents=True, exist_ok=True)
confdir.chmod(0o700)
old = b""
c = None
c = modify(c)
@@ -106,9 +92,7 @@ def config_update(modify):
f.write(new)
f.close()
if sys.platform == "win32":
# Windows doesn't support atomic replace
with suppress(FileNotFoundError):
conffile.unlink()
conffile.unlink() # Windows doesn't support atomic replace
tmpname.rename(conffile) # Atomic replace
except:
f.close()
@@ -136,8 +120,6 @@ def modifies_config(modify):
def load_config():
global config
if conffile is None:
init_confdir()
config = msgspec.toml.decode(conffile.read_bytes(), type=Config, dec_hook=dec_hook)

View File

@@ -4,14 +4,13 @@ import io
import mimetypes
import urllib.parse
from pathlib import PurePosixPath
from time import perf_counter
from urllib.parse import unquote
from wsgiref.handlers import format_date_time
import av
import av.datasets
import fitz # PyMuPDF
import numpy as np
import pillow_heif
from av.streams import SideData
from PIL import Image
from sanic import Blueprint, empty, raw
from sanic.exceptions import NotFound
@@ -20,8 +19,6 @@ from sanic.log import logger
from cista import config
from cista.util.filename import sanitize
pillow_heif.register_heif_opener()
bp = Blueprint("preview", url_prefix="/preview")
@@ -30,20 +27,20 @@ async def preview(req, path):
"""Preview a file"""
maxsize = int(req.args.get("px", 1024))
maxzoom = float(req.args.get("zoom", 2.0))
quality = int(req.args.get("q", 60))
quality = int(req.args.get("q", 40))
rel = PurePosixPath(sanitize(unquote(path)))
path = config.config.path / rel
stat = path.lstat()
etag = config.derived_secret(
"preview", rel, stat.st_mtime_ns, quality, maxsize, maxzoom
).hex()
savename = PurePosixPath(path.name).with_suffix(".avif")
savename = PurePosixPath(path.name).with_suffix(".webp")
headers = {
"etag": etag,
"last-modified": format_date_time(stat.st_mtime),
"cache-control": "max-age=604800, immutable"
+ ("" if config.config.public else ", private"),
"content-type": "image/avif",
"content-type": "image/webp",
"content-disposition": f"inline; filename*=UTF-8''{urllib.parse.quote(savename.as_posix())}",
}
if req.headers.if_none_match == etag:
@@ -62,164 +59,59 @@ async def preview(req, path):
def dispatch(path, quality, maxsize, maxzoom):
if path.suffix.lower() in (".pdf", ".xps", ".epub", ".mobi"):
return process_pdf(path, quality=quality, maxsize=maxsize, maxzoom=maxzoom)
type, _ = mimetypes.guess_type(path.name)
if type and type.startswith("video/"):
if mimetypes.guess_type(path.name)[0].startswith("video/"):
return process_video(path, quality=quality, maxsize=maxsize)
return process_image(path, quality=quality, maxsize=maxsize)
def process_image(path, *, maxsize, quality):
t_load_start = perf_counter()
img = Image.open(path)
# Force decode to include I/O in load timing
img.load()
t_load_end = perf_counter()
# Resize
orig_w, orig_h = img.size
t_proc_start = perf_counter()
img.thumbnail((min(orig_w, maxsize), min(orig_h, maxsize)))
t_proc_end = perf_counter()
# Save as AVIF
w, h = img.size
img.thumbnail((min(w, maxsize), min(h, maxsize)))
# Fix rotation based on EXIF data
try:
rotate_values = {3: 180, 6: 270, 8: 90}
orientation = img._getexif().get(274)
if orientation in rotate_values:
logger.debug(f"Rotating preview {path} by {rotate_values[orientation]}")
img = img.rotate(rotate_values[orientation], expand=True)
except AttributeError:
...
except Exception as e:
logger.error(f"Error rotating preview image: {e}")
# Save as webp
imgdata = io.BytesIO()
t_save_start = perf_counter()
img.save(imgdata, format="avif", quality=quality, speed=10, max_threads=1)
t_save_end = perf_counter()
ret = imgdata.getvalue()
load_ms = (t_load_end - t_load_start) * 1000
proc_ms = (t_proc_end - t_proc_start) * 1000
save_ms = (t_save_end - t_save_start) * 1000
logger.debug(
"Preview image %s: load=%.1fms process=%.1fms save=%.1fms out=%.1fKB",
path.name,
load_ms,
proc_ms,
save_ms,
len(ret) / 1024,
)
return ret
img.save(imgdata, format="webp", quality=quality, method=4)
return imgdata.getvalue()
def process_pdf(path, *, maxsize, maxzoom, quality, page_number=0):
t_load_start = perf_counter()
pdf = fitz.open(path)
page = pdf.load_page(page_number)
w, h = page.rect[2:4]
zoom = min(maxsize / w, maxsize / h, maxzoom)
mat = fitz.Matrix(zoom, zoom)
pix = page.get_pixmap(matrix=mat) # type: ignore[attr-defined]
t_load_end = perf_counter()
t_save_start = perf_counter()
ret = pix.pil_tobytes(format="avif", quality=quality, speed=10, max_threads=1)
t_save_end = perf_counter()
logger.debug(
"Preview pdf %s: load+render=%.1fms save=%.1fms",
path.name,
(t_load_end - t_load_start) * 1000,
(t_save_end - t_save_start) * 1000,
)
return ret
pix = page.get_pixmap(matrix=mat)
return pix.pil_tobytes(format="webp", quality=quality, method=4)
def process_video(path, *, maxsize, quality):
frame = None
with av.open(str(path)) as container:
stream = container.streams.video[0]
stream.codec_context.skip_frame = "NONKEY"
stream.codec_context.threads = 1
rot = stream.side_data and stream.side_data.get(SideData.DISPLAYMATRIX) or 0
container.seek(container.duration // 8)
img = next(container.decode(stream)).to_image()
del stream
img.thumbnail((maxsize, maxsize))
imgdata = io.BytesIO()
istream = ostream = icc = occ = frame = None
t_load_start = perf_counter()
# Initialize to avoid "possibly unbound" in static analysis when exceptions occur
t_load_end = t_load_start
t_save_start = t_load_start
t_save_end = t_load_start
with (
av.open(str(path)) as icontainer,
av.open(imgdata, "w", format="avif") as ocontainer,
):
istream = icontainer.streams.video[0]
istream.codec_context.skip_frame = "NONKEY"
icontainer.seek((icontainer.duration or 0) // 8)
for frame in icontainer.decode(istream):
if frame.dts is not None:
break
else:
raise RuntimeError("No frames found in video")
# Resize frame to thumbnail size
if frame.width > maxsize or frame.height > maxsize:
scale_factor = min(maxsize / frame.width, maxsize / frame.height)
new_width = int(frame.width * scale_factor)
new_height = int(frame.height * scale_factor)
frame = frame.reformat(width=new_width, height=new_height)
# Simple rotation detection and logging
if frame.rotation:
try:
fplanes = frame.to_ndarray()
# Split into Y, U, V planes of proper dimensions
planes = [
fplanes[: frame.height],
fplanes[frame.height : frame.height + frame.height // 4].reshape(
frame.height // 2, frame.width // 2
),
fplanes[frame.height + frame.height // 4 :].reshape(
frame.height // 2, frame.width // 2
),
]
# Rotate
planes = [np.rot90(p, frame.rotation // 90) for p in planes]
# Restore PyAV format
planes = np.hstack([p.flat for p in planes]).reshape(
-1, planes[0].shape[1]
)
frame = av.VideoFrame.from_ndarray(planes, format=frame.format.name)
del planes, fplanes
except Exception as e:
if "not yet supported" in str(e):
logger.warning(
f"Not rotating {path.name} preview image by {frame.rotation}°:\n PyAV: {e}"
)
else:
logger.exception(f"Error rotating video frame: {e}")
t_load_end = perf_counter()
t_save_start = perf_counter()
crf = str(int(63 * (1 - quality / 100) ** 2)) # Closely matching PIL quality-%
ostream = ocontainer.add_stream(
"av1",
options={
"crf": crf,
"usage": "realtime",
"cpu-used": "8",
"threads": "1",
},
)
assert isinstance(ostream, av.VideoStream)
ostream.width = frame.width
ostream.height = frame.height
icc = istream.codec_context
occ = ostream.codec_context
# Copy HDR metadata from input video stream
occ.color_primaries = icc.color_primaries
occ.color_trc = icc.color_trc
occ.colorspace = icc.colorspace
occ.color_range = icc.color_range
ocontainer.mux(ostream.encode(frame))
ocontainer.mux(ostream.encode(None)) # Flush the stream
t_save_end = perf_counter()
# Capture frame dimensions before cleanup
if rot:
img = img.rotate(rot, expand=True)
img.save(imgdata, format="webp", quality=quality, method=4)
del img
ret = imgdata.getvalue()
logger.debug(
"Preview video %s: load+decode=%.1fms save=%.1fms",
path.name,
(t_load_end - t_load_start) * 1000,
(t_save_end - t_save_start) * 1000,
)
del imgdata, istream, ostream, icc, occ, frame
del imgdata
gc.collect()
return ret

View File

@@ -127,7 +127,8 @@ class FileEntry(msgspec.Struct, array_like=True, frozen=True):
return f"{self.name} ({self.size}, {self.mtime})"
class Update(msgspec.Struct, array_like=True): ...
class Update(msgspec.Struct, array_like=True):
...
class UpdKeep(Update, tag="k"):

View File

@@ -1,6 +1,5 @@
import os
import re
import signal
from pathlib import Path
from sanic import Sanic
@@ -12,14 +11,6 @@ def run(*, dev=False):
"""Run Sanic main process that spawns worker processes to serve HTTP requests."""
from .app import app
# Set up immediate exit on Ctrl+C for faster termination
def signal_handler(signum, frame):
print("\nReceived interrupt signal, exiting immediately...")
os._exit(0)
signal.signal(signal.SIGINT, signal_handler)
signal.signal(signal.SIGTERM, signal_handler)
url, opts = parse_listen(config.config.listen)
# Silence Sanic's warning about running in production rather than debug
os.environ["SANIC_IGNORE_PRODUCTION_WARNING"] = "1"
@@ -35,6 +26,7 @@ def run(*, dev=False):
motd=False,
dev=dev,
auto_reload=dev,
reload_dir={confdir},
access_log=True,
) # type: ignore
if dev:

View File

@@ -29,8 +29,6 @@ async def handle_sanic_exception(request, e):
if not message or not request.app.debug and code == 500:
message = "Internal Server Error"
message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
if code == 500:
logger.exception(e)
# Non-browsers get JSON errors
if "text/html" not in request.headers.accept:
return jres(

View File

@@ -48,7 +48,6 @@ def treeiter(rootmod):
def treeget(rootmod: list[FileEntry], path: PurePosixPath):
begin = None
ret = []
for i, relpath, entry in treeiter(rootmod):
if begin is None:
if relpath == path:
@@ -58,7 +57,6 @@ def treeget(rootmod: list[FileEntry], path: PurePosixPath):
if entry.level <= len(path.parts):
break
ret.append(entry)
return begin, ret
@@ -79,36 +77,28 @@ def treeinspos(rootmod: list[FileEntry], relpath: PurePosixPath, relfile: int):
# root
level += 1
continue
ename = rel.parts[level - 1]
name = relpath.parts[level - 1]
esort = sortkey(ename)
nsort = sortkey(name)
# Non-leaf are always folders, only use relfile at leaf
isfile = relfile if len(relpath.parts) == level else 0
# First compare by isfile, then by sorting order and if that too matches then case sensitive
cmp = (
entry.isfile - isfile
or (esort > nsort) - (esort < nsort)
or (ename > name) - (ename < name)
)
if cmp > 0:
return i
if cmp < 0:
continue
level += 1
if level > len(relpath.parts):
logger.error(
f"insertpos level overflow: relpath={relpath}, i={i}, entry.name={entry.name}, entry.level={entry.level}, level={level}"
)
print("ERROR: insertpos", relpath, i, entry.name, entry.level, level)
break
else:
i += 1
return i
@@ -189,16 +179,21 @@ def update_path(rootmod: list[FileEntry], relpath: PurePosixPath, loop):
"""Called on FS updates, check the filesystem and broadcast any changes."""
new = walk(relpath)
obegin, old = treeget(rootmod, relpath)
if old == new:
logger.debug(
f"Watch: Event without changes needed {relpath}"
if old
else f"Watch: Event with old and new missing: {relpath}"
)
return
if obegin is not None:
del rootmod[obegin : obegin + len(old)]
if new:
logger.debug(f"Watch: Update {relpath}" if old else f"Watch: Created {relpath}")
i = treeinspos(rootmod, relpath, new[0].isfile)
rootmod[i:i] = new
else:
logger.debug(f"Watch: Removed {relpath}")
def update_space(loop):
@@ -223,35 +218,17 @@ def format_update(old, new):
oremain, nremain = set(old), set(new)
update = []
keep_count = 0
iteration_count = 0
# Precompute index maps to allow deterministic tie-breaking when both
# candidates exist in both sequences but are not equal (rename/move cases)
old_pos = {e: i for i, e in enumerate(old)}
new_pos = {e: i for i, e in enumerate(new)}
while oidx < len(old) and nidx < len(new):
iteration_count += 1
# Emergency brake for potential infinite loops
if iteration_count > 50000:
logger.error(
f"format_update potential infinite loop! iteration={iteration_count}, oidx={oidx}, nidx={nidx}"
)
raise Exception(
f"format_update infinite loop detected at iteration {iteration_count}"
)
modified = False
# Matching entries are kept
if old[oidx] == new[nidx]:
entry = old[oidx]
oremain.discard(entry)
nremain.discard(entry)
oremain.remove(entry)
nremain.remove(entry)
keep_count += 1
oidx += 1
nidx += 1
continue
if keep_count > 0:
modified = True
update.append(UpdKeep(keep_count))
@@ -271,7 +248,7 @@ def format_update(old, new):
insert_items = []
while nidx < len(new) and new[nidx] not in oremain:
entry = new[nidx]
nremain.discard(entry)
nremain.remove(entry)
insert_items.append(entry)
nidx += 1
if insert_items:
@@ -279,32 +256,9 @@ def format_update(old, new):
update.append(UpdIns(insert_items))
if not modified:
# Tie-break: both items exist in both lists but don't match here.
# Decide whether to delete old[oidx] first or insert new[nidx] first
# based on which alignment is closer.
if oidx >= len(old) or nidx >= len(new):
break
cur_old = old[oidx]
cur_new = new[nidx]
pos_old_in_new = new_pos.get(cur_old)
pos_new_in_old = old_pos.get(cur_new)
# Default distances if not present (shouldn't happen if in remain sets)
dist_del = (pos_old_in_new - nidx) if pos_old_in_new is not None else 1
dist_ins = (pos_new_in_old - oidx) if pos_new_in_old is not None else 1
# Prefer the operation with smaller forward distance; tie => delete
if dist_del <= dist_ins:
# Delete current old item
oremain.discard(cur_old)
update.append(UpdDel(1))
oidx += 1
else:
# Insert current new item
nremain.discard(cur_new)
update.append(UpdIns([cur_new]))
nidx += 1
raise Exception(
f"Infinite loop in diff {nidx=} {oidx=} {len(old)=} {len(new)=}"
)
# Diff any remaining
if keep_count > 0:
@@ -357,7 +311,10 @@ def watcher_inotify(loop):
while not quit.is_set():
i = inotify.adapters.InotifyTree(rootpath.as_posix())
# Initialize the tree from filesystem
t0 = time.perf_counter()
update_root(loop)
t1 = time.perf_counter()
logger.debug(f"Root update took {t1 - t0:.1f}s")
trefresh = time.monotonic() + 300.0
tspace = time.monotonic() + 5.0
# Watch for changes (frequent wakeups needed for quitting)
@@ -378,52 +335,32 @@ def watcher_inotify(loop):
if quit.is_set():
return
interesting = any(f in modified_flags for f in event[1])
if event[2] == rootpath.as_posix() and event[3] == "zzz":
logger.debug(f"Watch: {interesting=} {event=}")
if interesting:
# Update modified path
t0 = time.perf_counter()
path = PurePosixPath(event[2]) / event[3]
try:
rel_path = path.relative_to(rootpath)
update_path(rootmod, rel_path, loop)
except Exception as e:
logger.error(
f"Error processing inotify event for path {path}: {e}"
)
raise
update_path(rootmod, path.relative_to(rootpath), loop)
t1 = time.perf_counter()
logger.debug(f"Watch: Update {event[3]} took {t1 - t0:.1f}s")
if not dirty:
t = time.monotonic()
dirty = True
# Wait a maximum of 0.2s to push the updates
if dirty and time.monotonic() >= t + 0.2:
# Wait a maximum of 0.5s to push the updates
if dirty and time.monotonic() >= t + 0.5:
break
if dirty and state.root != rootmod:
try:
update = format_update(state.root, rootmod)
with state.lock:
broadcast(update, loop)
state.root = rootmod
except Exception:
logger.exception(
"format_update failed; falling back to full rescan"
)
# Fallback: full rescan and try diff again; last resort send full root
try:
fresh = walk(PurePosixPath())
try:
update = format_update(state.root, fresh)
with state.lock:
broadcast(update, loop)
state.root = fresh
except Exception:
logger.exception(
"Fallback diff failed; sending full root snapshot"
)
with state.lock:
broadcast(format_root(fresh), loop)
state.root = fresh
except Exception:
logger.exception(
"Full rescan failed; dropping this batch of updates"
)
t0 = time.perf_counter()
update = format_update(state.root, rootmod)
t1 = time.perf_counter()
with state.lock:
broadcast(update, loop)
state.root = rootmod
t2 = time.perf_counter()
logger.debug(
f"Format update took {t1 - t0:.1f}s, broadcast {t2 - t1:.1f}s"
)
del i # Free the inotify object

Binary file not shown (image: 40 KiB before, 363 KiB after).

View File

@@ -1,2 +0,0 @@
audit=false
fund=false

View File

@@ -22,25 +22,25 @@ If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has a
### Run the backend
```fish
uv sync --dev
uv run cista --dev -l :8000
hatch shell
cista --dev -l :8000
```
### And the Vite server (in another terminal)
```fish
cd frontend
bun install
bun run dev
npm install
npm run dev
```
Browse to Vite, which will proxy API requests to port 8000. Both servers live reload changes.
### Type-Check, Compile and Minify for Production
This is also called by `uv build` during Python packaging:
This is also called by `hatch build` during Python packaging:
```fish
bun run build
npm run build
```

View File

@@ -12,9 +12,6 @@
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix --ignore-path .gitignore",
"format": "prettier --write src/"
},
"engines": {
"node": ">=18.0.0"
},
"dependencies": {
"@imengyu/vue3-context-menu": "^1.3.3",
"@vueuse/core": "^10.4.1",
@@ -24,6 +21,7 @@
"pinia": "^2.1.6",
"pinia-plugin-persistedstate": "^3.2.0",
"unplugin-vue-components": "^0.25.2",
"vite-plugin-rewrite-all": "^1.0.1",
"vite-svg-loader": "^4.0.0",
"vue": "^3.3.4",
"vue-router": "^4.2.4"

View File

@@ -10,10 +10,6 @@
<main>
<RouterView :path="path.pathList" :query="path.query" />
</main>
<footer>
<TransferBar :status=store.uprogress @cancel=store.cancelUploads class=upload />
<TransferBar :status=store.dprogress @cancel=store.cancelDownloads class=download />
</footer>
</template>
<script setup lang="ts">

View File

@@ -91,7 +91,8 @@
}
.headermain,
.menu,
.rename-button {
.rename-button,
.suggest-gallery {
display: none !important;
}
.breadcrumb > a {

View File

@@ -1,5 +1,6 @@
<template>
<SvgButton name="download" data-tooltip="Download" @click="download" />
<TransferBar :status=progress @cancel=cancelDownloads />
</template>
<script setup lang="ts">
@@ -25,22 +26,22 @@ const status_init = {
filepos: 0,
status: 'idle',
}
store.dprogress = {...status_init}
const progress = reactive({...status_init})
setInterval(() => {
if (Date.now() - store.dprogress.tlast > 3000) {
if (Date.now() - progress.tlast > 3000) {
// Reset
store.dprogress.statbytes = 0
store.dprogress.statdur = 1
progress.statbytes = 0
progress.statdur = 1
} else {
// Running average by decay
store.dprogress.statbytes *= .9
store.dprogress.statdur *= .9
progress.statbytes *= .9
progress.statdur *= .9
}
}, 100)
const statReset = () => {
Object.assign(store.dprogress, status_init)
store.dprogress.t0 = Date.now()
store.dprogress.tlast = store.dprogress.t0 + 1
Object.assign(progress, status_init)
progress.t0 = Date.now()
progress.tlast = progress.t0 + 1
}
const cancelDownloads = () => {
location.reload() // FIXME
@@ -60,9 +61,9 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
console.log('Downloading to filesystem', sel.recursive)
for (const [rel, full, doc] of sel.recursive) {
if (doc.dir) continue
store.dprogress.files.push(rel)
++store.dprogress.filecount
store.dprogress.total += doc.size
progress.files.push(rel)
++progress.filecount
progress.total += doc.size
}
for (const [rel, full, doc] of sel.recursive) {
// Create any missing directories
@@ -72,7 +73,6 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
}
const r = rel.slice(hdir.length)
for (const dir of r.split('/').slice(0, doc.dir ? undefined : -1)) {
if (!dir) continue
hdir += `${dir}/`
try {
h = await h.getDirectoryHandle(dir.normalize('NFC'), { create: true })
@@ -101,22 +101,22 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
}
if (res.body) {
++store.dprogress.fileidx
++progress.fileidx
const reader = res.body.getReader()
await writable.truncate(0)
store.error = "Direct download."
store.dprogress.tlast = Date.now()
progress.tlast = Date.now()
while (true) {
const { value, done } = await reader.read()
if (done) break
await writable.write(value)
const now = Date.now()
const size = value.byteLength
store.dprogress.xfer += size
store.dprogress.filepos += size
store.dprogress.statbytes += size
store.dprogress.statdur += now - store.dprogress.tlast
store.dprogress.tlast = now
progress.xfer += size
progress.filepos += size
progress.statbytes += size
progress.statdur += now - progress.tlast
progress.tlast = now
}
}
await writable.close()

View File

@@ -115,7 +115,6 @@ const rename = (doc: Doc, newName: string) => {
}
defineExpose({
newFolder() {
console.log("New folder")
const now = Math.floor(Date.now() / 1000)
editing.value = new Doc({
loc: loc.value,
@@ -125,7 +124,6 @@ defineExpose({
mtime: now,
size: 0,
})
store.cursor = editing.value.key
},
toggleSelectAll() {
console.log('Select')

View File

@@ -2,11 +2,8 @@
<div v-if="props.documents.length || editing" class="gallery" ref="gallery">
<GalleryFigure v-if="editing?.key === 'new'" :doc="editing" :key=editing.key :editing="{rename: mkdir, exit}" />
<template v-for="(doc, index) in documents" :key=doc.key>
<GalleryFigure :doc=doc :editing="editing === doc ? {rename, exit} : null" @menu="contextMenu($event, doc)">
<template v-if=showFolderBreadcrumb(index)>
<BreadCrumb :path="doc.loc ? doc.loc.split('/') : []" class="folder-change"/>
<div class="spacer"></div>
</template>
<GalleryFigure :doc=doc :editing="editing === doc ? {rename, exit} : null">
<BreadCrumb v-if=showFolderBreadcrumb(index) :path="doc.loc ? doc.loc.split('/') : []" class="folder-change"/>
</GalleryFigure>
</template>
</div>
@@ -70,7 +67,6 @@ defineExpose({
mtime: now,
size: 0,
})
store.cursor = editing.value.key
},
toggleSelectAll() {
console.log('Select')
@@ -258,9 +254,6 @@ const contextMenu = (ev: MouseEvent, doc: Doc) => {
align-items: end;
}
.breadcrumb {
border-radius: .5em 0 0 .5em;
}
.spacer {
flex: 0 1000000000 4rem;
border-radius: .5em;
}
</style>

View File

@@ -9,7 +9,7 @@
<slot></slot>
<MediaPreview ref=m :doc="doc" tabindex=-1 quality="sz=512" class="figcontent" />
<div class="titlespacer"></div>
<figcaption @click.prevent @contextmenu.prevent="$emit('menu', $event)">
<figcaption @click.prevent>
<template v-if="editing">
<FileRenameInput :doc=doc :rename=editing.rename :exit=editing.exit />
</template>

View File

@@ -8,7 +8,7 @@
<SvgButton
name="create-folder"
data-tooltip="New folder"
@click="() => { console.log('New', store.fileExplorer); store.fileExplorer!.newFolder(); console.log('Done')}"
@click="() => store.fileExplorer!.newFolder()"
/>
<slot></slot>
<div class="spacer smallgap"></div>

View File

@@ -101,7 +101,7 @@ const video = () => ['mkv', 'mp4', 'webm', 'mov', 'avi'].includes(props.doc.ext)
const audio = () => ['mp3', 'flac', 'ogg', 'aac'].includes(props.doc.ext)
const archive = () => ['zip', 'tar', 'gz', 'bz2', 'xz', '7z', 'rar'].includes(props.doc.ext)
const preview = () => (
['bmp', 'ico', 'tif', 'tiff', 'heic', 'heif', 'pdf', 'epub', 'mobi'].includes(props.doc.ext) ||
['bmp', 'ico', 'tif', 'tiff', 'pdf'].includes(props.doc.ext) ||
props.doc.size > 500000 &&
['avif', 'webp', 'png', 'jpg', 'jpeg'].includes(props.doc.ext)
)

View File

@@ -57,12 +57,13 @@ const speeddisp = computed(() => speed.value ? speed.value.toFixed(speed.value <
display: flex;
flex-direction: column;
color: var(--primary-color);
width: 100%;
position: fixed;
left: 0;
bottom: 0;
width: 100vw;
}
.statustext {
display: flex;
align-items: center;
margin: 0 .5em;
padding: 0.5rem 0;
}
span {
@@ -83,12 +84,4 @@ span {
.position { min-width: 4em }
.speed { min-width: 4em }
.upload .statustext::before {
font-size: 1.5em;
content: '🔺'
}
.download .statustext::before {
font-size: 1.5em;
content: '🔻'
}
</style>

View File

@@ -1,12 +1,3 @@
<template>
<template>
<input ref="fileInput" @change="uploadHandler" type="file" multiple>
<input ref="folderInput" @change="uploadHandler" type="file" webkitdirectory>
</template>
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileInput.click()" />
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderInput.click()" />
</template>
<script setup lang="ts">
import { connect, uploadUrl } from '@/repositories/WS';
import { useMainStore } from '@/stores/main'
@@ -117,50 +108,50 @@ const uprogress_init = {
filepos: 0,
status: 'idle',
}
store.uprogress = {...uprogress_init}
const uprogress = reactive({...uprogress_init})
setInterval(() => {
if (Date.now() - store.uprogress.tlast > 3000) {
if (Date.now() - uprogress.tlast > 3000) {
// Reset
store.uprogress.statbytes = 0
store.uprogress.statdur = 1
uprogress.statbytes = 0
uprogress.statdur = 1
} else {
// Running average by decay
store.uprogress.statbytes *= .9
store.uprogress.statdur *= .9
uprogress.statbytes *= .9
uprogress.statdur *= .9
}
}, 100)
const statUpdate = ({name, size, start, end}: {name: string, size: number, start: number, end: number}) => {
if (name !== store.uprogress.filename) return // If stats have been reset
if (name !== uprogress.filename) return // If stats have been reset
const now = Date.now()
store.uprogress.xfer = store.uprogress.filestart + end
store.uprogress.filepos = end
store.uprogress.statbytes += end - start
store.uprogress.statdur += now - store.uprogress.tlast
store.uprogress.tlast = now
uprogress.xfer = uprogress.filestart + end
uprogress.filepos = end
uprogress.statbytes += end - start
uprogress.statdur += now - uprogress.tlast
uprogress.tlast = now
// File finished?
if (end === size) {
store.uprogress.filestart += size
uprogress.filestart += size
statNextFile()
if (++store.uprogress.fileidx >= store.uprogress.filecount) statReset()
if (++uprogress.fileidx >= uprogress.filecount) statReset()
}
}
const statNextFile = () => {
const f = store.uprogress.files.shift()
const f = uprogress.files.shift()
if (!f) return statReset()
store.uprogress.filepos = 0
store.uprogress.filesize = f.file.size
store.uprogress.filename = f.cloudName
uprogress.filepos = 0
uprogress.filesize = f.file.size
uprogress.filename = f.cloudName
}
const statReset = () => {
Object.assign(store.uprogress, uprogress_init)
store.uprogress.t0 = Date.now()
store.uprogress.tlast = store.uprogress.t0 + 1
Object.assign(uprogress, uprogress_init)
uprogress.t0 = Date.now()
uprogress.tlast = uprogress.t0 + 1
}
const statsAdd = (f: CloudFile[]) => {
if (store.uprogress.files.length === 0) statReset()
store.uprogress.total += f.reduce((a, b) => a + b.file.size, 0)
store.uprogress.filecount += f.length
store.uprogress.files = [...store.uprogress.files, ...f]
if (uprogress.files.length === 0) statReset()
uprogress.total += f.reduce((a, b) => a + b.file.size, 0)
uprogress.filecount += f.length
uprogress.files = [...uprogress.files, ...f]
statNextFile()
}
let upqueue = [] as CloudFile[]
@@ -190,7 +181,7 @@ const WSCreate = async () => await new Promise<WebSocket>(resolve => {
// @ts-ignore
ws.sendData = async (data: any) => {
// Wait until the WS is ready to send another message
store.uprogress.status = "uploading"
uprogress.status = "uploading"
await new Promise(resolve => {
const t = setInterval(() => {
if (ws.bufferedAmount > 1<<20) return
@@ -198,7 +189,7 @@ const WSCreate = async () => await new Promise<WebSocket>(resolve => {
clearInterval(t)
}, 1)
})
store.uprogress.status = "processing"
uprogress.status = "processing"
ws.send(data)
}
})
@@ -219,7 +210,7 @@ const worker = async () => {
if (f.cloudPos === f.file.size) upqueue.shift()
}
if (upqueue.length) startWorker()
store.uprogress.status = "idle"
uprogress.status = "idle"
workerRunning = false
}
let workerRunning: any = false
@@ -242,3 +233,12 @@ onUnmounted(() => {
removeEventListener('drop', uploadHandler)
})
</script>
<template>
<template>
<input ref="fileInput" @change="uploadHandler" type="file" multiple>
<input ref="folderInput" @change="uploadHandler" type="file" webkitdirectory>
</template>
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileInput.click()" />
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderInput.click()" />
<TransferBar :status=uprogress @cancel=cancelUploads />
</template>

View File

@@ -37,24 +37,21 @@ export class Doc {
return this.url.replace(/^\/#/, '')
}
get img(): boolean {
// Folders cannot be images
if (this.dir) return false
return ['jpg', 'jpeg', 'png', 'gif', 'webp', 'avif', 'heic', 'heif', 'svg'].includes(this.ext)
const ext = this.name.split('.').pop()?.toLowerCase()
return ['jpg', 'jpeg', 'png', 'gif', 'webp', 'avif', 'svg'].includes(ext || '')
}
get previewable(): boolean {
// Folders cannot be previewable
if (this.dir) return false
if (this.img) return true
const ext = this.name.split('.').pop()?.toLowerCase()
// Not a comprehensive list, but good enough for now
return ['mp4', 'mkv', 'webm', 'ogg', 'mp3', 'flac', 'aac', 'pdf'].includes(this.ext)
return ['mp4', 'mkv', 'webm', 'ogg', 'mp3', 'flac', 'aac', 'pdf'].includes(ext || '')
}
get previewurl(): string {
return this.url.replace(/^\/files/, '/preview')
}
get ext(): string {
const dotIndex = this.name.lastIndexOf('.')
if (dotIndex === -1 || dotIndex === this.name.length - 1) return ''
return this.name.slice(dotIndex + 1).toLowerCase()
const ext = this.name.split('.').pop()
return ext ? ext.toLowerCase() : ''
}
}
export type errorEvent = {

View File

@@ -19,8 +19,6 @@ export const useMainStore = defineStore({
cursor: '' as string,
server: {} as Record<string, any>,
dialog: '' as '' | 'login' | 'settings',
uprogress: {} as any,
dprogress: {} as any,
prefs: {
gallery: false,
sortListing: '' as SortOrder,
@@ -91,13 +89,7 @@ export const useMainStore = defineStore({
},
focusBreadcrumb() {
(document.querySelector('.breadcrumb') as HTMLAnchorElement).focus()
},
cancelDownloads() {
location.reload() // FIXME
},
cancelUploads() {
location.reload() // FIXME
},
}
},
getters: {
sortOrder(): SortOrder { return this.query ? this.prefs.sortFiltered : this.prefs.sortListing },

View File

@@ -50,11 +50,12 @@ export function formatUnixDate(t: number) {
}
export function getFileExtension(filename: string) {
const dotIndex = filename.lastIndexOf('.')
if (dotIndex === -1 || dotIndex === filename.length - 1) {
return '' // No extension
const parts = filename.split('.')
if (parts.length > 1) {
return parts[parts.length - 1]
} else {
return '' // No extension
}
return filename.slice(dotIndex + 1)
}
interface FileTypes {
[key: string]: string[]
@@ -67,9 +68,8 @@ const filetypes: FileTypes = {
}
export function getFileType(name: string): string {
const dotIndex = name.lastIndexOf('.')
if (dotIndex === -1 || dotIndex === name.length - 1) return 'unknown'
const ext = name.slice(dotIndex + 1).toLowerCase()
const ext = name.split('.').pop()?.toLowerCase()
if (!ext || ext.length === name.length) return 'unknown'
return Object.keys(filetypes).find(type => filetypes[type].includes(ext)) || 'unknown'
}

View File

@@ -13,11 +13,15 @@
:path="props.path"
:documents="documents"
/>
<div v-if="!store.prefs.gallery && documents.some(doc => doc.previewable)" class="suggest-gallery">
<SvgButton name="eye" taborder=0 @click="() => { store.prefs.gallery = true }"></SvgButton>
Gallery View
</div>
<EmptyFolder :documents=documents :path=props.path />
</template>
<script setup lang="ts">
import { watchEffect, ref, computed, watch } from 'vue'
import { watchEffect, ref, computed } from 'vue'
import { useMainStore } from '@/stores/main'
import Router from '@/router/index'
import { needleFormat, localeIncludes, collator } from '@/utils'
@@ -72,10 +76,6 @@ watchEffect(() => {
store.fileExplorer = fileExplorer.value
store.query = props.query
})
watch(documents, (docs) => {
store.prefs.gallery = docs.some(d => d.previewable)
}, { immediate: true })
</script>
<style scoped>
@@ -89,4 +89,15 @@ watch(documents, (docs) => {
text-shadow: 0 0 .3rem #000, 0 0 2rem #0008;
color: var(--accent-color);
}
.suggest-gallery p {
font-size: 2rem;
color: var(--accent-color);
}
.suggest-gallery {
display: flex;
flex-direction: column;
align-items: center;
justify-content: center;
}
</style>

View File

@@ -4,11 +4,12 @@ import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'
// @ts-ignore
import pluginRewriteAll from 'vite-plugin-rewrite-all'
import svgLoader from 'vite-svg-loader'
import Components from 'unplugin-vue-components/vite'
// Development mode:
// bun run dev # Run frontend that proxies to dev_backend
// npm run dev # Run frontend that proxies to dev_backend
// cista -l :8000 --dev # Run backend
const dev_backend = {
target: "http://localhost:8000",
@@ -20,6 +21,7 @@ const dev_backend = {
export default defineConfig({
plugins: [
vue(),
pluginRewriteAll(),
svgLoader(), // import svg files
Components(), // auto import components
],

View File

@@ -10,38 +10,25 @@ readme = "README.md"
authors = [
{ name = "Vasanko" },
]
maintainers = [
{ name = "Vasanko" },
]
keywords = ["file-server", "web-interface", "dropbox", "storage"]
classifiers = [
"Development Status :: 5 - Production/Stable",
"Environment :: Web Environment",
"Intended Audience :: End Users/Desktop",
"Intended Audience :: System Administrators",
"License :: Public Domain",
"License :: OSI Approved :: MIT License",
]
requires-python = ">=3.11"
dependencies = [
"argon2-cffi>=25.1.0",
"av>=15.0.0",
"blake3>=1.0.5",
"brotli>=1.1.0",
"docopt>=0.6.2",
"inotify>=0.2.12",
"msgspec>=0.19.0",
"natsort>=8.4.0",
"numpy>=2.3.2",
"pathvalidate>=3.3.1",
"pillow>=11.3.0",
"pillow-heif>=1.1.0",
"pyjwt>=2.10.1",
"pymupdf>=1.26.3",
"sanic>=25.3.0",
"setproctitle>=1.3.6",
"stream-zip>=0.0.83",
"tomli_w>=1.2.0",
"argon2-cffi",
"blake3",
"brotli",
"docopt",
"inotify",
"msgspec",
"natsort",
"pathvalidate",
"pillow",
"pyav",
"pyjwt",
"pymupdf",
"sanic",
"stream-zip",
"tomli_w",
]
[project.urls]
@@ -52,19 +39,8 @@ cista = "cista.__main__:main"
[project.optional-dependencies]
dev = [
"pytest>=8.4.1",
"ruff>=0.8.0",
"mypy>=1.13.0",
"pre-commit>=4.0.0",
]
test = [
"pytest>=8.4.1",
"pytest-cov>=6.0.0",
"pytest-asyncio>=0.25.0",
]
docs = [
"sphinx>=8.0.0",
"sphinx-rtd-theme>=3.0.0",
"pytest",
"ruff",
]
[tool.hatch.version]
@@ -72,83 +48,57 @@ source = "vcs"
[tool.hatch.build]
artifacts = ["cista/wwwroot"]
targets.sdist.hooks.custom.path = "scripts/build-frontend.py"
targets.sdist.include = [
"/cista",
]
hooks.custom.path = "scripts/build-frontend.py"
hooks.vcs.version-file = "cista/_version.py"
hooks.vcs.template = """
# This file is automatically generated by hatch build.
__version__ = {version!r}
"""
only-packages = true
targets.sdist.include = [
"/cista",
]
[tool.pytest.ini_options]
addopts = [
"--import-mode=importlib",
"--verbosity=2",
"--strict-markers",
"--strict-config",
"--cov=cista",
"--cov-report=term-missing",
"--cov-report=html",
"--cov-branch",
"--verbosity=-1",
"-p no:warnings",
]
testpaths = ["tests"]
python_files = ["test_*.py", "*_test.py"]
python_classes = ["Test*"]
python_functions = ["test_*"]
markers = [
"slow: marks tests as slow (deselect with '-m \"not slow\"')",
"integration: marks tests as integration tests",
]
filterwarnings = [
"error",
"ignore::UserWarning",
"ignore::DeprecationWarning",
testpaths = [
"tests",
]
[tool.ruff.lint]
isort.known-first-party = ["cista"]
per-file-ignores."tests/*" = ["S", "ANN", "D", "INP", "PLR2004"]
per-file-ignores."scripts/*" = ["T20"]
[tool.ruff]
select = ["ALL"]
ignore = [
"A0",
"ARG001",
"ANN",
"B018",
"BLE001",
"C901",
"COM812", # conflicts with ruff format
"D",
"E501",
"EM1",
"FIX002",
"ISC001", # conflicts with ruff format
"PGH003",
"PLR0912",
"PLR2004",
"PLW0603",
"S101",
"SLF001",
"T201",
"TD0",
"TRY",
]
show-source = true
show-fixes = true
[dependency-groups]
dev = [
"pytest>=8.4.1",
"ruff>=0.8.0",
"mypy>=1.13.0",
"pre-commit>=4.0.0",
]
test = [
"pytest>=8.4.1",
"pytest-cov>=6.0.0",
"pytest-asyncio>=0.25.0",
]
[tool.ruff.isort]
known-first-party = ["cista"]
[tool.coverage.run]
source = ["cista"]
branch = true
omit = [
"*/tests/*",
"*/test_*",
"*/__pycache__/*",
"cista/_version.py",
]
[tool.coverage.report]
exclude_lines = [
"pragma: no cover",
"def __repr__",
"if self.debug:",
"if settings.DEBUG",
"raise AssertionError",
"raise NotImplementedError",
"if 0:",
"if __name__ == .__main__.:",
"class .*\\bProtocol\\):",
"@(abc\\.)?abstractmethod",
]
show_missing = true
skip_covered = false
precision = 2
[tool.ruff.per-file-ignores]
"tests/*" = ["S", "ANN", "D", "INP"]

View File

@@ -1,8 +1,5 @@
# noqa: INP001
import os
import shutil
import subprocess
from sys import stderr
from hatchling.builders.hooks.plugin.interface import BuildHookInterface
@@ -10,28 +7,6 @@ from hatchling.builders.hooks.plugin.interface import BuildHookInterface
class CustomBuildHook(BuildHookInterface):
def initialize(self, version, build_data):
super().initialize(version, build_data)
stderr.write(">>> Building Cista frontend\n")
npm = None
bun = shutil.which("bun")
if bun is None:
npm = shutil.which("npm")
if npm is None:
raise RuntimeError(
"Bun or NodeJS `npm` is required for building but neither was found\n Visit https://bun.com/"
)
# npm --prefix doesn't work on Windows, so we chdir instead
os.chdir("frontend")
try:
if npm:
stderr.write("### npm install\n")
subprocess.run([npm, "install"], check=True) # noqa: S603
stderr.write("\n### npm run build\n")
subprocess.run([npm, "run", "build"], check=True) # noqa: S603
else:
assert bun
stderr.write("### bun install\n")
subprocess.run([bun, "install"], check=True) # noqa: S603
stderr.write("\n### bun run build\n")
subprocess.run([bun, "run", "build"], check=True) # noqa: S603
finally:
os.chdir("..")
print("Building Cista frontend...")
subprocess.run("npm install --prefix frontend".split(" "), check=True) # noqa: S603
subprocess.run("npm run build --prefix frontend".split(" "), check=True) # noqa: S603