Compare commits

84 commits: 32fa005c62 ... v1.0.0

Commits (SHA1): ce6d60e588, 073f1a8707, f38bb4bab9, 26f9bef087, 634dabe52d, a383358369, 369dc3ecaf, 0cf9c254e5, 58b9dd3dd4, 0965a56204, 52beedcef0, 8592d462f2, b2a24fca57, 7cc7e32c33, fa98cb9177, b3ab09a614, a49dd2f111, dbb06e111c, 667e31aa08, 007f021d6f, b2188eee68, 7311ffdff1, 102a970174, a9d713dbd0, 434e55f303, be9772c90e, bb2448dc24, 115f3e20d0, a366a0bcc6, 2ff0f78759, 9a2d6e8065, 62388eb555, 53778543bf, 8dda230510, 696e3ab568, 85ac12ad33, e56cc47105, ebbd96bc94, a9b6d04361, 5808fe17ad, 671359e327, ba9495eb65, de482afd60, a547052e29, 07c2ff4c15, e20b04189f, 8da141744e, 11887edde3, 034c6fdea9, c5083f0f2b, f8a9197474, 5285cb2fb5, b6b387d09b, 669762dfe7, 51fd07d4fa, c40c245ce6, 1fdd00b833, 520a9dff47, c5c65d136a, 61f9026e23, 3e50149d4d, 7077b21159, 938c5ca657, e0aef07783, 36826a83c1, 6880f82c19, 5dd1bd9bdc, 41e8c78ecd, dc4bb494f3, 9b58b887b4, 07848907f3, 7a08f7cbe2, dd37238510, c8d5f335b1, bb80b3ee54, 06d860c601, c321de13fd, 278e8303c4, 9854dd01cc, fb03fa5430, e26cb8f70a, 9bbbc829a1, 876d76bc1f, 4a53d0b8e2
.gitignore (vendored, 1 line changed) @@ -1,4 +1,5 @@

.*
*.lock
!.gitignore
__pycache__/
*.egg-info/
README.md (148 lines changed) @@ -1,25 +1,123 @@

Removed (old README):

# Web File Storage

Run directly from repository with Hatch (or use pip install as usual):

```sh
hatch run cista -l :3000 /path/to/files
```

Settings, including these arguments, are stored to the config file on first startup, and later `hatch run cista` is sufficient. If the `cista` script is missing, consider `pip install -e .` (within `hatch shell`) or some other workaround (a known issue with installs made prior to adding the startup script).

Create your user account:

```sh
hatch run cista --user admin --privileged
```

## Build frontend

A prebuilt frontend is provided in the repository, but after any changes it needs to be rebuilt manually:

```sh
cd cista-front
npm install
npm run build
```

This places the frontend in `cista/wwwroot`, from where the backend server delivers it; it also gets included in the Python package built via `hatch build`.

Added (new README):

# Cista Web Storage

<img src="https://git.zi.fi/Vasanko/cista-storage/raw/branch/main/docs/cista.webp" align=left width=250>

Cista takes its name from the ancient *cistae*, metal containers used by Greeks and Egyptians to safeguard valuable items. This modern application provides a browser interface for secure and accessible file storage, echoing the trust and reliability of its historical namesake.

This is a cutting-edge **file and document server** designed for speed, efficiency, and unparalleled ease of use. Experience **lightning-fast browsing** thanks to the file list maintained directly in your browser and updated from server filesystem events, coupled with highly optimized code. Fully **keyboard-navigable** and with a responsive layout, Cista adapts flawlessly to your devices, providing a seamless experience wherever you are. The powerful **instant search** means you're always just a few keystrokes away from finding exactly what you need. Press **1/2/3** to switch ordering, navigate with all four arrow keys (+Shift to select), or click your way around on **breadcrumbs that remember where you were**.

**Built-in document and media previews** let you quickly view files without downloading them. Cista shows PDFs and other documents, plus video and image thumbnails, with video previews supporting **HDR10** and modern image formats including HEIC and AVIF. It also has a player for music and video files.

The Cista project started as an inevitable remake of [Droppy](https://github.com/droppyjs/droppy), which we used and loved despite its numerous bugs. Cista Storage stands out in handling even the most exotic filenames, ensuring a smooth experience where others falter.

All of this is wrapped in an intuitive interface with automatic light and dark themes, making Cista Storage the ideal choice for anyone seeking a reliable, versatile, and quick file storage solution. Quickly set up your own Cista where your files are just a click away, safe, and always accessible.

Experience Cista by visiting the [Cista Demo](https://drop.zi.fi) for a test run and perhaps upload something...

## Getting Started

### Running the Server

We recommend using [UV](https://docs.astral.sh/uv/getting-started/installation/) to run Cista directly.

Create an account (otherwise the server is public to everyone):

```fish
uvx cista --user yourname --privileged
```

Serve your files at http://localhost:8000:

```fish
uvx cista -l :8000 /path/to/files
```

Alternatively, you can install with `pip` or `uv pip`. This enables using the `cista` command directly, without `uvx` or `uv run`.

```fish
pip install cista --break-system-packages
```

The server remembers its settings in the config folder (default `~/.local/share/cista/`), including the listen port and directory, so future runs need no arguments.
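Should you want a separate configuration (for example a second instance on another port), the `-c` flag used in the systemd unit below points Cista at an alternative config directory. A minimal sketch; the directory and port here are placeholders, not part of the documented setup:

```fish
# Hypothetical second instance: keep its settings out of ~/.local/share/cista/
uvx cista -c ~/.config/cista-demo -l :9000 /path/to/other/files
```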
### Internet Access

Most admins find the [Caddy](https://caddyserver.com/) web server convenient for its automatic TLS certificates. A proxy also allows running multiple web services or Cista instances on the same IP address under different (sub)domains.

`/etc/caddy/Caddyfile`:

```Caddyfile
cista.example.com {
  reverse_proxy :8000
}
```

Nginx or another proxy may be used similarly. Alternatively, place the TLS certificate and key in the Cista config dir and run `cista -l cista.example.com`.

## System Deployment

This setup allows easy addition of storages, each with its own domain, configuration, and files.

It assumes a restricted user account `storage` for serving files, with UV installed system-wide or for that account. Only UV is required; git and bun/npm are not needed.

Create `/etc/systemd/system/cista@.service`:

```ini
[Unit]
Description=Cista storage %i

[Service]
User=storage
ExecStart=uvx cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
Restart=always

[Install]
WantedBy=multi-user.target
```

This setup supports multiple storages, each under `/media/storage/<domain>` for files and `/srv/cista/<domain>/` for configuration. UNIX sockets are used instead of numeric ports for convenience.

```fish
systemctl daemon-reload
systemctl enable --now cista@foo.example.com
systemctl enable --now cista@bar.example.com
```

Public exposure is easiest using the Caddy web server.

`/etc/caddy/Caddyfile`:

```Caddyfile
foo.example.com, bar.example.com {
  reverse_proxy unix//srv/cista/{host}/socket
}
```
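To verify that an instance is running before exposing it publicly, you can talk to its UNIX socket directly. A quick check with curl, assuming the socket path from the unit above; it should print an HTTP status line if the service is up:

```fish
# Sanity check against one instance's UNIX socket (path as configured above)
curl --unix-socket /srv/cista/foo.example.com/socket -sI http://localhost/ | head -n 1
```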
## Development setup

For rapid development we use the Vite development server for the Vue frontend, with the backend running on port 8000, to which Vite proxies API requests. Each server live reloads whenever its code or configuration is modified.

Make sure you have git, uv, and bun (or npm) installed.

Backend (Python) – setup and run:

```fish
git clone https://git.zi.fi/Vasanko/cista-storage.git
cd cista-storage
uv sync --dev
uv run cista --dev -l :8000 /path/to/files
```

Frontend (Vue/Vite) – run the dev server in another terminal:

```fish
cd frontend
bun install
bun run dev
```

Building the package for release (frontend + Python wheel/sdist):

```fish
uv build
```

Vue builds its files into `cista/wwwroot`, which is included prebuilt in the Python package. `uv build` runs the project build hooks to bundle the frontend and produce a NodeJS-independent Python package.
Image file changed (before: 4.2 KiB)

@@ -1,241 +0,0 @@
|
||||
<!DOCTYPE html>
|
||||
<title>Storage</title>
|
||||
<style>
|
||||
body {
|
||||
font-family: sans-serif;
|
||||
max-width: 100ch;
|
||||
margin: 0 auto;
|
||||
padding: 1em;
|
||||
background-color: #333;
|
||||
color: #eee;
|
||||
}
|
||||
td {
|
||||
text-align: right;
|
||||
padding: .5em;
|
||||
}
|
||||
td:first-child {
|
||||
text-align: left;
|
||||
}
|
||||
a {
|
||||
color: inherit;
|
||||
text-decoration: none;
|
||||
}
|
||||
</style>
|
||||
<div>
|
||||
<h2>Quick file upload</h2>
|
||||
<p>Uses parallel WebSocket connections to /api/upload for increased bandwidth</p>
|
||||
<input type=file id=fileInput>
|
||||
<progress id=progressBar value=0 max=1></progress>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h2>Files</h2>
|
||||
<ul id=file_list></ul>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
let files = {}
|
||||
let flatfiles = {}
|
||||
|
||||
function createWatchSocket() {
|
||||
const wsurl = new URL("/api/watch", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.update) {
|
||||
tree_update(msg.update)
|
||||
file_list(files)
|
||||
} else {
|
||||
console.log("Unknown message from watch socket", msg)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
createWatchSocket()
|
||||
|
||||
function tree_update(msg) {
|
||||
console.log("Tree update", msg)
|
||||
let node = files
|
||||
for (const elem of msg) {
|
||||
if (elem.deleted) {
|
||||
const p = node.dir[elem.name].path
|
||||
delete node.dir[elem.name]
|
||||
delete flatfiles[p]
|
||||
break
|
||||
}
|
||||
if (elem.name !== undefined) node = node.dir[elem.name] ||= {}
|
||||
if (elem.size !== undefined) node.size = elem.size
|
||||
if (elem.mtime !== undefined) node.mtime = elem.mtime
|
||||
if (elem.dir !== undefined) node.dir = elem.dir
|
||||
}
|
||||
// Update paths and flatfiles
|
||||
files.path = "/"
|
||||
const nodes = [files]
|
||||
flatfiles = {}
|
||||
while (node = nodes.pop()) {
|
||||
flatfiles[node.path] = node
|
||||
if (node.dir === undefined) continue
|
||||
for (const name of Object.keys(node.dir)) {
|
||||
const child = node.dir[name]
|
||||
child.path = node.path + name + (child.dir === undefined ? "" : "/")
|
||||
nodes.push(child)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
var collator = new Intl.Collator(undefined, {numeric: true, sensitivity: 'base'});
|
||||
|
||||
const compare_path = (a, b) => collator.compare(a.path, b.path)
|
||||
const compare_time = (a, b) => a.mtime - b.mtime  // numeric comparator; a boolean return breaks Array.sort
|
||||
|
||||
function file_list(files) {
|
||||
const table = document.getElementById("file_list")
|
||||
const sorted = Object.values(flatfiles).sort(compare_time)
|
||||
table.innerHTML = ""
|
||||
for (const f of sorted) {
|
||||
const {path, size, mtime} = f
|
||||
const tr = document.createElement("tr")
|
||||
const name_td = document.createElement("td")
|
||||
const size_td = document.createElement("td")
|
||||
const mtime_td = document.createElement("td")
|
||||
const a = document.createElement("a")
|
||||
table.appendChild(tr)
|
||||
tr.appendChild(name_td)
|
||||
tr.appendChild(size_td)
|
||||
tr.appendChild(mtime_td)
|
||||
name_td.appendChild(a)
|
||||
size_td.textContent = size
|
||||
mtime_td.textContent = formatUnixDate(mtime)
|
||||
a.textContent = path
|
||||
a.href = `/files${path}`
|
||||
/*a.onclick = event => {
|
||||
if (window.showSaveFilePicker) {
|
||||
event.preventDefault()
|
||||
download_ws(name, size)
|
||||
}
|
||||
}
|
||||
a.download = ""*/
|
||||
}
|
||||
}
|
||||
|
||||
function formatUnixDate(t) {
|
||||
const date = new Date(t * 1000)
|
||||
const now = new Date()
|
||||
const diff = date - now
|
||||
const formatter = new Intl.RelativeTimeFormat('en', { numeric: 'auto' })
|
||||
|
||||
if (Math.abs(diff) <= 60000) {
|
||||
return formatter.format(Math.round(diff / 1000), 'second')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 3600000) {
|
||||
return formatter.format(Math.round(diff / 60000), 'minute')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 86400000) {
|
||||
return formatter.format(Math.round(diff / 3600000), 'hour')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 604800000) {
|
||||
return formatter.format(Math.round(diff / 86400000), 'day')
|
||||
}
|
||||
|
||||
return date.toLocaleDateString()
|
||||
}
|
||||
|
||||
async function download_ws(name, size) {
|
||||
const fh = await window.showSaveFilePicker({
|
||||
suggestedName: name,
|
||||
})
|
||||
const writer = await fh.createWritable()
|
||||
writer.truncate(size)
|
||||
const wsurl = new URL("/api/download", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
let pos = 0
|
||||
ws.onopen = () => {
|
||||
console.log("Downloading over WebSocket", name, size)
|
||||
ws.send(JSON.stringify({name, start: 0, end: size, size}))
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
if (typeof event.data === 'string') {
|
||||
const msg = JSON.parse(event.data)
|
||||
console.log("Download finished", msg)
|
||||
ws.close()
|
||||
return
|
||||
}
|
||||
console.log("Received chunk", name, pos, pos + event.data.size)
|
||||
pos += event.data.size
|
||||
writer.write(event.data)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
if (pos < size) {
|
||||
console.log("Download aborted", name, pos)
|
||||
writer.truncate(pos)
|
||||
}
|
||||
writer.close()
|
||||
}
|
||||
}
|
||||
|
||||
const fileInput = document.getElementById("fileInput")
|
||||
const progress = document.getElementById("progressBar")
|
||||
const numConnections = 2
|
||||
const chunkSize = 1<<20
|
||||
const wsConnections = new Set()
|
||||
|
||||
//for (let i = 0; i < numConnections; i++) createUploadWS()
|
||||
|
||||
function createUploadWS() {
|
||||
const wsurl = new URL("/api/upload", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.binaryType = 'arraybuffer'
|
||||
ws.onopen = () => {
|
||||
wsConnections.add(ws)
|
||||
console.log("Upload socket connected")
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.written) progress.value += +msg.written
|
||||
else console.log(`Error: ${msg.error}`)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
wsConnections.delete(ws)
|
||||
console.log("Upload socket disconnected, reconnecting...")
|
||||
setTimeout(createUploadWS, 1000)
|
||||
}
|
||||
}
|
||||
|
||||
async function load(file, start, end) {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise(resolve => reader.onload = resolve)
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
return event.target.result
|
||||
}
|
||||
|
||||
async function sendChunk(file, start, end, ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
ws.send(JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
}))
|
||||
ws.send(chunk)
|
||||
}
|
||||
|
||||
fileInput.addEventListener("change", async function() {
|
||||
const file = this.files[0]
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
progress.value = 0
|
||||
progress.max = file.size
|
||||
|
||||
console.log(wsConnections)
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const ws = Array.from(wsConnections)[i % wsConnections.size]
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end, ws)
|
||||
}
|
||||
})
|
||||
|
||||
</script>
|
||||
@@ -1,140 +0,0 @@
|
||||
<template>
|
||||
<LoginModal />
|
||||
<header>
|
||||
<HeaderMain ref="headerMain">
|
||||
<HeaderSelected :path="path.pathList" />
|
||||
</HeaderMain>
|
||||
<BreadCrumb :path="path.pathList" tabindex="-1"/>
|
||||
</header>
|
||||
<main>
|
||||
<RouterView :path="path.pathList" />
|
||||
</main>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { RouterView } from 'vue-router'
|
||||
import type { ComputedRef } from 'vue'
|
||||
import type HeaderMain from '@/components/HeaderMain.vue'
|
||||
import { onMounted, onUnmounted, ref, watchEffect } from 'vue'
|
||||
import createWebSocket from '@/repositories/WS'
|
||||
import {
|
||||
url_document_watch_ws,
|
||||
url_document_upload_ws,
|
||||
DocumentHandler,
|
||||
DocumentUploadHandler
|
||||
} from '@/repositories/Document'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
|
||||
import { computed } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
|
||||
interface Path {
|
||||
path: string
|
||||
pathList: string[]
|
||||
}
|
||||
const documentStore = useDocumentStore()
|
||||
const path: ComputedRef<Path> = computed(() => {
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path)
|
||||
const pathList = p.split('/').filter(value => value !== '')
|
||||
return {
|
||||
path: p,
|
||||
pathList
|
||||
}
|
||||
})
|
||||
// Update human-readable x seconds ago messages from mtimes
|
||||
setInterval(documentStore.updateModified, 1000)
|
||||
watchEffect(() => {
|
||||
const documentHandler = new DocumentHandler()
|
||||
const documentUploadHandler = new DocumentUploadHandler()
|
||||
const wsWatch = createWebSocket(
|
||||
url_document_watch_ws,
|
||||
documentHandler.handleWebSocketMessage
|
||||
)
|
||||
const wsUpload = createWebSocket(
|
||||
url_document_upload_ws,
|
||||
documentUploadHandler.handleWebSocketMessage
|
||||
)
|
||||
|
||||
documentStore.wsWatch = wsWatch
|
||||
documentStore.wsUpload = wsUpload
|
||||
})
|
||||
const headerMain = ref<typeof HeaderMain | null>(null)
|
||||
let vert = 0
|
||||
let timer: any = null
|
||||
const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
const fileExplorer = documentStore.fileExplorer as any
|
||||
if (!fileExplorer) return
|
||||
const c = fileExplorer.isCursor()
|
||||
const keyup = event.type === 'keyup'
|
||||
if (event.repeat) {
|
||||
if (
|
||||
event.key === 'ArrowUp' ||
|
||||
event.key === 'ArrowDown' ||
|
||||
(c && event.code === 'Space')
|
||||
) {
|
||||
event.preventDefault()
|
||||
}
|
||||
return
|
||||
}
|
||||
//console.log("key pressed", event)
|
||||
// For up/down implement custom fast repeat
|
||||
if (event.key === 'ArrowUp') vert = keyup ? 0 : event.altKey ? -10 : -1
|
||||
else if (event.key === 'ArrowDown') vert = keyup ? 0 : event.altKey ? 10 : 1
|
||||
// Find: process on keydown so that we can bypass the built-in search hotkey
|
||||
else if (!keyup && event.key === 'f' && (event.ctrlKey || event.metaKey)) {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Select all (toggle); keydown to prevent builtin
|
||||
else if (!keyup && event.key === 'a' && (event.ctrlKey || event.metaKey)) {
|
||||
fileExplorer.toggleSelectAll()
|
||||
}
|
||||
// Keys 1-3 to sort columns
|
||||
else if (
|
||||
c &&
|
||||
keyup &&
|
||||
(event.key === '1' || event.key === '2' || event.key === '3')
|
||||
) {
|
||||
fileExplorer.toggleSortColumn(+event.key)
|
||||
}
|
||||
// Rename
|
||||
else if (c && keyup && !event.ctrlKey && (event.key === 'F2' || event.key === 'r')) {
|
||||
fileExplorer.cursorRename()
|
||||
}
|
||||
// Toggle selections on file explorer; ignore all spaces to prevent scrolling built-in hotkey
|
||||
else if (c && event.code === 'Space') {
|
||||
if (keyup && !event.altKey && !event.ctrlKey)
|
||||
fileExplorer.cursorSelect()
|
||||
} else return
|
||||
event.preventDefault()
|
||||
if (!vert) {
|
||||
if (timer) {
|
||||
clearTimeout(timer) // Good for either timeout or interval
|
||||
timer = null
|
||||
}
|
||||
return
|
||||
}
|
||||
if (!timer) {
|
||||
// Initial move, then t0 delay until repeats at tr intervals
|
||||
const select = event.shiftKey
|
||||
fileExplorer.cursorMove(vert, select)
|
||||
const t0 = 200,
|
||||
tr = 30
|
||||
timer = setTimeout(
|
||||
() =>
|
||||
(timer = setInterval(() => {
|
||||
fileExplorer.cursorMove(vert, select)
|
||||
}, tr)),
|
||||
t0 - tr
|
||||
)
|
||||
}
|
||||
}
|
||||
onMounted(() => {
|
||||
window.addEventListener('keydown', globalShortcutHandler)
|
||||
window.addEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
onUnmounted(() => {
|
||||
window.removeEventListener('keydown', globalShortcutHandler)
|
||||
window.removeEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
export type { Path }
|
||||
</script>
|
||||
@@ -1,505 +0,0 @@
|
||||
<template>
|
||||
<table v-if="props.documents.length || editing">
|
||||
<thead>
|
||||
<tr>
|
||||
<th class="selection">
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
v-model="allSelected"
|
||||
:indeterminate="selectionIndeterminate"
|
||||
/>
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn"
|
||||
:class="{ sortactive: sort === 'name' }"
|
||||
@click="toggleSort('name')"
|
||||
>
|
||||
Name
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn modified right"
|
||||
:class="{ sortactive: sort === 'modified' }"
|
||||
@click="toggleSort('modified')"
|
||||
>
|
||||
Modified
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn size right"
|
||||
:class="{ sortactive: sort === 'size' }"
|
||||
@click="toggleSort('size')"
|
||||
>
|
||||
Size
|
||||
</th>
|
||||
<th class="menu"></th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
<tr v-if="editing?.key === 'new'" class="folder">
|
||||
<td class="selection"></td>
|
||||
<td class="name">
|
||||
<FileRenameInput
|
||||
:doc="editing"
|
||||
:rename="mkdir"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/>
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time :datetime="new Date(editing.mtime).toISOString().replace('.000', '')">{{
|
||||
editing.modified
|
||||
}}</time>
|
||||
</td>
|
||||
<td class="size right">{{ editing.sizedisp }}</td>
|
||||
<td class="menu"></td>
|
||||
</tr>
|
||||
<template
|
||||
v-for="doc of sorted(props.documents as Document[])"
|
||||
:key="doc.key">
|
||||
<tr v-if="doc.loc !== prevloc && ((prevloc = doc.loc) || true)">
|
||||
<th colspan="5"><BreadCrumb :path="doc.loc ? doc.loc.split('/') : []" /></th>
|
||||
</tr>
|
||||
|
||||
<tr
|
||||
:id="`file-${doc.key}`"
|
||||
:class="{
|
||||
file: doc.type === 'file',
|
||||
folder: doc.type === 'folder',
|
||||
cursor: cursor === doc
|
||||
}"
|
||||
@click="cursor = cursor === doc ? null : doc"
|
||||
@contextmenu.prevent="contextMenu($event, doc)"
|
||||
>
|
||||
<td class="selection" @click.up.stop="cursor = cursor === doc ? doc : null">
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
:checked="documentStore.selected.has(doc.key)"
|
||||
@change="
|
||||
($event.target as HTMLInputElement).checked
|
||||
? documentStore.selected.add(doc.key)
|
||||
: documentStore.selected.delete(doc.key)
|
||||
"
|
||||
/>
|
||||
</td>
|
||||
<td class="name">
|
||||
<template v-if="editing === doc"
|
||||
><FileRenameInput
|
||||
:doc="doc"
|
||||
:rename="rename"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/></template>
|
||||
<template v-else>
|
||||
<a
|
||||
:href="url_for(doc)"
|
||||
tabindex="-1"
|
||||
@contextmenu.prevent
|
||||
@focus.stop="cursor = doc"
|
||||
@blur="ev => { if (!editing) cursor = null }"
|
||||
@keyup.left="router.back()"
|
||||
@keyup.right.stop="ev => { if (doc.type === 'folder') (ev.target as HTMLElement).click() }"
|
||||
>{{ doc.name }}</a
|
||||
>
|
||||
<button
|
||||
v-if="cursor == doc"
|
||||
class="rename-button"
|
||||
@click="() => (editing = doc)"
|
||||
>
|
||||
🖊️
|
||||
</button>
|
||||
</template>
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time
|
||||
:data-tooltip="new Date(1000 * doc.mtime).toISOString().replace('T', '\n').replace('.000Z', ' UTC')"
|
||||
>{{ doc.modified }}</time
|
||||
>
|
||||
</td>
|
||||
<td class="size right">{{ doc.sizedisp }}</td>
|
||||
<td class="menu">
|
||||
<button
|
||||
tabindex="-1"
|
||||
@click.stop="contextMenu($event, doc)"
|
||||
>
|
||||
⋮
|
||||
</button>
|
||||
</td>
|
||||
</tr>
|
||||
</template>
|
||||
<tr>
|
||||
<td colspan="3" class="right">{{props.documents.length}} items shown:</td>
|
||||
<td class="size right">{{ formatSize(props.documents.reduce((a, b) => a + b.size, 0)) }}</td>
|
||||
<td class="menu"></td>
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<div v-else class="empty-container">Nothing to see here</div>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { ref, computed, watchEffect, onBeforeUpdate } from 'vue'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import type { Document } from '@/repositories/Document'
|
||||
import FileRenameInput from './FileRenameInput.vue'
|
||||
import createWebSocket from '@/repositories/WS'
|
||||
import { collator, formatSize, formatUnixDate } from '@/utils'
|
||||
import { useRouter } from 'vue-router'
|
||||
|
||||
const props = withDefaults(
|
||||
defineProps<{
|
||||
path: Array<string>
|
||||
documents: Document[]
|
||||
}>(),
|
||||
{}
|
||||
)
|
||||
const documentStore = useDocumentStore()
|
||||
const router = useRouter()
|
||||
const url_for = (doc: Document) => {
|
||||
const p = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
return doc.type === 'folder' ? `#/${p}/` : `/files/${p}`
|
||||
}
|
||||
const cursor = ref<Document | null>(null)
|
||||
// File rename
|
||||
const editing = ref<Document | null>(null)
|
||||
const rename = (doc: Document, newName: string) => {
|
||||
const oldName = doc.name
|
||||
const control = createWebSocket('/api/control', (ev: MessageEvent) => {
|
||||
const msg = JSON.parse(ev.data)
|
||||
if ('error' in msg) {
|
||||
console.error('Rename failed', msg.error.message, msg.error)
|
||||
doc.name = oldName
|
||||
} else {
|
||||
console.log('Rename succeeded', msg)
|
||||
}
|
||||
})
|
||||
control.onopen = () => {
|
||||
control.send(
|
||||
JSON.stringify({
|
||||
op: 'rename',
|
||||
path: `${doc.loc}/${oldName}`,
|
||||
to: newName
|
||||
})
|
||||
)
|
||||
}
|
||||
doc.name = newName // We should get an update from watch but this is quicker
|
||||
}
|
||||
defineExpose({
|
||||
newFolder() {
|
||||
const now = Date.now() / 1000
|
||||
editing.value = {
|
||||
loc: loc.value,
|
||||
key: 'new',
|
||||
name: 'New Folder',
|
||||
type: 'folder',
|
||||
mtime: now,
|
||||
size: 0,
|
||||
sizedisp: formatSize(0),
|
||||
modified: formatUnixDate(now),
|
||||
haystack: '',
|
||||
}
|
||||
},
|
||||
toggleSelectAll() {
|
||||
console.log('Select')
|
||||
allSelected.value = !allSelected.value
|
||||
},
|
||||
toggleSortColumn(column: number) {
|
||||
const columns = ['', 'name', 'modified', 'size', '']
|
||||
toggleSort(columns[column])
|
||||
},
|
||||
isCursor() {
|
||||
return cursor.value !== null && editing.value === null
|
||||
},
|
||||
cursorRename() {
|
||||
editing.value = cursor.value
|
||||
},
|
||||
cursorSelect() {
|
||||
const doc = cursor.value
|
||||
if (!doc) return
|
||||
if (documentStore.selected.has(doc.key)) {
|
||||
documentStore.selected.delete(doc.key)
|
||||
} else {
|
||||
documentStore.selected.add(doc.key)
|
||||
}
|
||||
this.cursorMove(1)
|
||||
},
|
||||
cursorMove(d: number, select = false) {
|
||||
// Move cursor up or down (keyboard navigation)
|
||||
const documents = sorted(props.documents as Document[])
|
||||
if (documents.length === 0) {
|
||||
cursor.value = null
|
||||
return
|
||||
}
|
||||
const N = documents.length
|
||||
const mod = (a: number, b: number) => ((a % b) + b) % b
|
||||
const increment = (i: number, d: number) => mod(i + d, N + 1)
|
||||
const index =
|
||||
cursor.value !== null ? documents.indexOf(cursor.value) : documents.length
|
||||
const moveto = increment(index, d)
|
||||
cursor.value = documents[moveto] ?? null
|
||||
const tr = cursor.value ? document.getElementById(`file-${cursor.value.key}`) : null
|
||||
if (select) {
|
||||
// Go forwards, possibly wrapping over the end; the last entry is not toggled
|
||||
let [begin, end] = d > 0 ? [index, moveto] : [moveto, index]
|
||||
for (let p = begin; p !== end; p = increment(p, 1)) {
|
||||
if (p === N) continue
|
||||
const key = documents[p].key
|
||||
if (documentStore.selected.has(key)) documentStore.selected.delete(key)
|
||||
else documentStore.selected.add(key)
|
||||
}
|
||||
}
|
||||
// @ts-ignore
|
||||
scrolltr = tr
|
||||
if (!scrolltimer) {
|
||||
scrolltimer = setTimeout(() => {
|
||||
if (scrolltr)
|
||||
scrolltr.scrollIntoView({ block: 'center', behavior: 'smooth' })
|
||||
scrolltimer = null
|
||||
}, 300)
|
||||
}
|
||||
if (moveto === N) focusBreadcrumb()
|
||||
}
|
||||
})
|
||||
const focusBreadcrumb = () => {
|
||||
const el = document.querySelector('.breadcrumb') as HTMLElement | null
|
||||
if (el) el.focus()
|
||||
}
|
||||
let scrolltimer: any = null
|
||||
let scrolltr: any = null
|
||||
watchEffect(() => {
|
||||
if (cursor.value && cursor.value !== editing.value) editing.value = null
|
||||
if (editing.value) cursor.value = editing.value
|
||||
if (cursor.value) {
|
||||
const a = document.querySelector(
|
||||
`#file-${cursor.value.key} .name a`
|
||||
) as HTMLAnchorElement | null
|
||||
if (a) a.focus()
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
if (!props.documents.length && cursor.value) {
|
||||
cursor.value = null
|
||||
focusBreadcrumb()
|
||||
}
|
||||
})
|
||||
const mkdir = (doc: Document, name: string) => {
|
||||
const control = createWebSocket('/api/control', (ev: MessageEvent) => {
|
||||
const msg = JSON.parse(ev.data)
|
||||
if ('error' in msg) {
|
||||
console.error('Mkdir failed', msg.error.message, msg.error)
|
||||
editing.value = null
|
||||
} else {
|
||||
console.log('mkdir', msg)
|
||||
router.push(`/${doc.loc}/${name}/`)
|
||||
}
|
||||
})
|
||||
control.onopen = () => {
|
||||
control.send(
|
||||
JSON.stringify({
|
||||
op: 'mkdir',
|
||||
path: `${doc.loc}/${name}`
|
||||
})
|
||||
)
|
||||
}
|
||||
doc.name = name // We should get an update from watch but this is quicker
|
||||
}
|
||||
|
||||
// Column sort
|
||||
const toggleSort = (name: string) => {
|
||||
sort.value = sort.value === name ? '' : name
|
||||
}
|
||||
const sort = ref<string>('')
|
||||
const sortCompare = {
|
||||
name: (a: Document, b: Document) => collator.compare(a.name, b.name),
|
||||
modified: (a: Document, b: Document) => b.mtime - a.mtime,
|
||||
size: (a: Document, b: Document) => b.size - a.size
|
||||
}
|
||||
const sorted = (documents: Document[]) => {
|
||||
const cmp = sortCompare[sort.value as keyof typeof sortCompare]
|
||||
const sorted = [...documents]
|
||||
if (cmp) sorted.sort(cmp)
|
||||
return sorted
|
||||
}
|
||||
const selectionIndeterminate = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.some((doc: Document) => documentStore.selected.has(doc.key)) &&
|
||||
!allSelected.value
|
||||
)
|
||||
},
|
||||
// eslint-disable-next-line @typescript-eslint/no-unused-vars
|
||||
set: (value: boolean) => {}
|
||||
})
|
||||
const allSelected = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.every((doc: Document) => documentStore.selected.has(doc.key))
|
||||
)
|
||||
},
|
||||
set: (value: boolean) => {
|
||||
console.log('Setting allSelected', value)
|
||||
for (const doc of props.documents) {
|
||||
if (value) {
|
||||
documentStore.selected.add(doc.key)
|
||||
} else {
|
||||
documentStore.selected.delete(doc.key)
|
||||
}
|
||||
}
|
||||
}
|
||||
})
|
||||
|
||||
const loc = computed(() => props.path.join('/'))
|
||||
let prevloc = ''
|
||||
onBeforeUpdate(() => { prevloc = loc.value })
|
||||
|
||||
const contextMenu = (ev: Event, doc: Document) => {
|
||||
cursor.value = doc
|
||||
console.log('Context menu', ev, doc)
|
||||
}
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
table {
|
||||
width: 100%;
|
||||
table-layout: fixed;
|
||||
}
|
||||
thead tr {
|
||||
position: sticky;
|
||||
top: 0;
|
||||
z-index: 2;
|
||||
}
|
||||
tbody tr {
|
||||
position: relative;
|
||||
z-index: auto;
|
||||
}
|
||||
table thead input[type='checkbox'] {
|
||||
position: inherit;
|
||||
width: 1em;
|
||||
height: 1em;
|
||||
padding: 0.5rem 0.5em;
|
||||
}
|
||||
table tbody input[type='checkbox'] {
|
||||
width: 2rem;
|
||||
height: 2rem;
|
||||
}
|
||||
table .selection {
|
||||
width: 2rem;
|
||||
text-align: center;
|
||||
text-overflow: clip;
|
||||
}
|
||||
table .modified {
|
||||
width: 8em;
|
||||
}
|
||||
table .size {
|
||||
width: 5em;
|
||||
}
|
||||
table .menu {
|
||||
width: 1rem;
|
||||
}
|
||||
tbody td {
|
||||
font-size: 1.2rem;
|
||||
}
|
||||
table th,
|
||||
table td {
|
||||
padding: 0 0.5rem;
|
||||
font-weight: normal;
|
||||
text-align: left;
|
||||
white-space: nowrap;
|
||||
overflow: hidden;
|
||||
text-overflow: ellipsis;
|
||||
}
|
||||
.name {
|
||||
white-space: nowrap;
|
||||
position: relative;
|
||||
}
|
||||
.name .rename-button {
|
||||
position: absolute;
|
||||
right: 0;
|
||||
animation: appear calc(5 * var(--transition-time)) linear;
|
||||
}
|
||||
@keyframes appear {
|
||||
from {
|
||||
opacity: 0;
|
||||
}
|
||||
80% {
|
||||
opacity: 0;
|
||||
}
|
||||
to {
|
||||
opacity: 1;
|
||||
}
|
||||
}
|
||||
thead tr {
|
||||
background: linear-gradient(to bottom, #eee, #fff 30%, #ddd);
|
||||
color: #000;
|
||||
box-shadow: 0 0 .2rem black;
|
||||
}
|
||||
tbody tr.cursor {
|
||||
background: var(--accent-color);
|
||||
}
|
||||
.right {
|
||||
text-align: right;
|
||||
}
|
||||
.sortcolumn:hover {
|
||||
cursor: pointer;
|
||||
}
|
||||
.sortcolumn:hover::after {
|
||||
color: var(--accent-color);
|
||||
}
|
||||
.sortcolumn {
|
||||
padding-right: 1.5rem;
|
||||
}
|
||||
.sortcolumn::after {
|
||||
content: '▸';
|
||||
color: #888;
|
||||
margin-left: 0.5em;
|
||||
position: absolute;
|
||||
transition: all var(--transition-time) linear;
|
||||
}
|
||||
.sortactive::after {
|
||||
transform: rotate(90deg);
|
||||
color: var(--accent-color);
|
||||
}
|
||||
.name a {
|
||||
text-decoration: none;
|
||||
}
|
||||
tbody .selection input {
|
||||
z-index: 1;
|
||||
position: absolute;
|
||||
opacity: 0;
|
||||
left: 0.5rem;
|
||||
top: 0;
|
||||
}
|
||||
.selection {
|
||||
width: 2em;
|
||||
height: 2em;
|
||||
}
|
||||
.selection input:checked {
|
||||
opacity: 0.7;
|
||||
}
|
||||
.file .selection::before {
|
||||
content: '📄';
|
||||
font-size: 1.5rem;
|
||||
}
|
||||
.folder .selection::before {
|
||||
height: 2rem;
|
||||
content: '📁';
|
||||
font-size: 1.5rem;
|
||||
}
|
||||
.empty-container {
|
||||
padding-top: 3rem;
|
||||
text-align: center;
|
||||
font-size: 3rem;
|
||||
color: var(--accent-color);
|
||||
}
|
||||
.loc {
|
||||
color: #888;
|
||||
}
|
||||
</style>
|
||||
@@ -1,52 +0,0 @@
<template>
  <object
    v-if="props.type === 'pdf'"
    :data="dataURL"
    type="application/pdf"
    width="100%"
    height="100%"
  ></object>
  <a-image
    v-else-if="props.type === 'image'"
    width="50%"
    :src="dataURL"
    @click="() => setVisible(true)"
    :previewMask="false"
    :preview="{
      visibleImg,
      onVisibleChange: setVisible
    }"
  />
  <!-- Unknown case -->
  <h1 v-else>Unsupported file type</h1>
</template>

<script setup lang="ts">
import { watchEffect, ref } from 'vue'
import Router from '@/router/index'
import { url_document_get } from '@/repositories/Document'

const dataURL = ref('')
watchEffect(() => {
  dataURL.value = new URL(
    url_document_get + Router.currentRoute.value.path,
    location.origin
  ).toString()
})
const emit = defineEmits({
  visibleImg(value: boolean) {
    return value
  }
})

function setVisible(value: boolean) {
  emit('visibleImg', value)
}

const props = defineProps<{
  type?: string
  visibleImg: boolean
}>()
</script>

<style></style>
@@ -1,79 +0,0 @@
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { ref, nextTick } from 'vue'
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const showSearchInput = ref<boolean>(false)
|
||||
const search = ref<HTMLInputElement | null>()
|
||||
const searchButton = ref<HTMLButtonElement | null>()
|
||||
|
||||
const toggleSearchInput = () => {
|
||||
showSearchInput.value = !showSearchInput.value
|
||||
nextTick(() => {
|
||||
const input = search.value
|
||||
if (input) input.focus()
|
||||
//else if (searchButton.value) document.querySelector('.breadcrumb')!.focus()
|
||||
})
|
||||
}
|
||||
|
||||
defineExpose({
|
||||
toggleSearchInput
|
||||
})
|
||||
</script>
|
||||
|
||||
<template>
|
||||
<nav class="headermain">
|
||||
<div class="buttons">
|
||||
<UploadButton />
|
||||
<SvgButton
|
||||
name="create-folder"
|
||||
data-tooltip="New folder"
|
||||
@click="() => documentStore.fileExplorer.newFolder()"
|
||||
/>
|
||||
<slot></slot>
|
||||
<div class="spacer smallgap"></div>
|
||||
<template v-if="showSearchInput">
|
||||
<input
|
||||
ref="search"
|
||||
type="search"
|
||||
v-model="documentStore.search"
|
||||
placeholder="Search words"
|
||||
class="margin-input"
|
||||
@blur="() => { if (documentStore.search === '') toggleSearchInput() }"
|
||||
@keyup.esc="toggleSearchInput"
|
||||
/>
|
||||
</template>
|
||||
<SvgButton ref="searchButton" name="find" @click="toggleSearchInput" />
|
||||
<SvgButton name="cog" @click="console.log('settings menu')" />
|
||||
</div>
|
||||
</nav>
|
||||
</template>
|
||||
|
||||
<style scoped>
|
||||
.buttons {
|
||||
padding: 0;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
height: 3.5em;
|
||||
z-index: 10;
|
||||
}
|
||||
.buttons > * {
|
||||
flex-shrink: 1;
|
||||
}
|
||||
.spacer {
|
||||
flex-grow: 1;
|
||||
}
|
||||
.smallgap {
|
||||
margin-left: 2em;
|
||||
}
|
||||
input[type='search'] {
|
||||
background: var(--primary-background);
|
||||
color: var(--primary-color);
|
||||
border: 0;
|
||||
border-radius: 0.1em;
|
||||
padding: 0.5em;
|
||||
outline: none;
|
||||
font-size: 1.5em;
|
||||
max-width: 30vw;
|
||||
}
|
||||
</style>
|
||||
@@ -1,152 +0,0 @@
|
||||
<template>
|
||||
<template v-if="documentStore.selected.size">
|
||||
<div class="smallgap"></div>
|
||||
<p class="select-text">{{ documentStore.selected.size }} selected ➤</p>
|
||||
<SvgButton name="download" data-tooltip="Download" @click="download" />
|
||||
<SvgButton name="copy" data-tooltip="Copy here" @click="op('cp', dst)" />
|
||||
<SvgButton name="paste" data-tooltip="Move here" @click="op('mv', dst)" />
|
||||
<SvgButton name="trash" data-tooltip="Delete ⚠️" @click="op('rm')" />
|
||||
<button class="action-button unselect" data-tooltip="Unselect all" @click="documentStore.selected.clear()">❌</button>
|
||||
</template>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import createWebSocket from '@/repositories/WS'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { computed } from 'vue'
|
||||
import type { SelectedItems } from '@/repositories/Document'
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const props = defineProps({
|
||||
path: Array<string>
|
||||
})
|
||||
|
||||
const dst = computed(() => props.path!.join('/'))
|
||||
const op = (op: string, dst?: string) => {
|
||||
const sel = documentStore.selectedFiles
|
||||
const msg = {
|
||||
op,
|
||||
sel: sel.ids.filter(id => sel.selected.has(id)).map(id => sel.fullpath[id])
|
||||
}
|
||||
// @ts-ignore
|
||||
if (dst !== undefined) msg.dst = dst
|
||||
const control = createWebSocket('/api/control', ev => {
|
||||
const res = JSON.parse(ev.data)
|
||||
if ('error' in res) {
|
||||
console.error('Control socket error', msg, res.error)
|
||||
return
|
||||
} else if (res.status === 'ack') {
|
||||
console.log('Control ack OK', res)
|
||||
control.close()
|
||||
documentStore.selected.clear()
|
||||
return
|
||||
} else console.log('Unknown control response', msg, res)
|
||||
})
|
||||
control.onopen = () => {
|
||||
control.send(JSON.stringify(msg))
|
||||
}
|
||||
}
|
||||
|
||||
const linkdl = (href: string) => {
|
||||
const a = document.createElement('a')
|
||||
a.href = href
|
||||
a.download = ''
|
||||
a.click()
|
||||
}
|
||||
|
||||
const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandle) => {
|
||||
let hdir = ''
|
||||
let h = handle
|
||||
let filelist = []
|
||||
for (const id of sel.ids) {
|
||||
filelist.push(sel.relpath[id])
|
||||
}
|
||||
console.log('Downloading to filesystem', filelist)
|
||||
for (const id of sel.ids) {
|
||||
const rel = sel.relpath[id]
|
||||
const url = sel.url[id] // Only files, not folders
|
||||
// Create any missing directories
|
||||
if (!rel.startsWith(hdir)) {
|
||||
hdir = ''
|
||||
h = handle
|
||||
}
|
||||
const r = rel.slice(hdir.length)
|
||||
for (const dir of r.split('/').slice(0, url ? -1 : undefined)) {
|
||||
hdir += `${dir}/`
|
||||
try {
|
||||
h = await h.getDirectoryHandle(dir.normalize('NFC'), { create: true })
|
||||
} catch (error) {
|
||||
console.error('Failed to create directory', hdir, error)
|
||||
return
|
||||
}
|
||||
console.log('Created', hdir)
|
||||
}
|
||||
if (!url) continue // Target was a folder and was created
|
||||
const name = rel.split('/').pop()!.normalize('NFC')
|
||||
// Download file
|
||||
let fileHandle
|
||||
try {
|
||||
fileHandle = await h.getFileHandle(name, { create: true })
|
||||
} catch (error) {
|
||||
console.error('Failed to create file', hdir + name, error)
|
||||
return
|
||||
}
|
||||
const writable = await fileHandle.createWritable()
|
||||
console.log('Fetching', url)
|
||||
const res = await fetch(url)
|
||||
if (!res.ok)
|
||||
throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
|
||||
if (res.body) await res.body.pipeTo(writable)
|
||||
else {
|
||||
// Zero-sized files don't have a body, so we need to create an empty file
|
||||
await writable.truncate(0)
|
||||
await writable.close()
|
||||
}
|
||||
console.log('Saved', hdir + name)
|
||||
}
|
||||
}
|
||||
|
||||
const download = async () => {
|
||||
const sel = documentStore.selectedFiles
|
||||
console.log('Download', sel)
|
||||
if (sel.selected.size === 0) {
|
||||
console.warn('Attempted download but no files found. Missing:', sel.missing)
|
||||
documentStore.selected.clear()
|
||||
return
|
||||
}
|
||||
// Plain old a href download if only one file (ignoring any folders)
|
||||
const urls = Object.values(sel.url)
|
||||
if (urls.length === 1) {
|
||||
documentStore.selected.clear()
|
||||
return linkdl(urls[0] as string)
|
||||
}
|
||||
// Use FileSystem API if multiple files and the browser supports it
|
||||
if ('showDirectoryPicker' in window) {
|
||||
try {
|
||||
// @ts-ignore
|
||||
const handle = await window.showDirectoryPicker({
|
||||
startIn: 'downloads',
|
||||
mode: 'readwrite'
|
||||
})
|
||||
filesystemdl(sel, handle).then(() => {
|
||||
documentStore.selected.clear()
|
||||
})
|
||||
return
|
||||
} catch (e) {
|
||||
console.error('Download to folder aborted', e)
|
||||
}
|
||||
}
|
||||
// Otherwise, zip and download
|
||||
linkdl(`/zip/${Array.from(sel.selected).join('+')}/download.zip`)
|
||||
documentStore.selected.clear()
|
||||
}
|
||||
</script>
|
||||
|
||||
<style>
|
||||
.select-text {
|
||||
color: var(--accent-color);
|
||||
text-wrap: nowrap;
|
||||
overflow: hidden;
|
||||
text-overflow: ellipsis;
|
||||
}
|
||||
</style>
|
||||
@@ -1,71 +0,0 @@
|
||||
<template>
|
||||
<dialog ref="dialog">
|
||||
<h1 v-if="props.title">{{ props.title }}</h1>
|
||||
<div>
|
||||
<slot>
|
||||
Dialog with no content
|
||||
<button onclick="dialog.close()">OK</button>
|
||||
</slot>
|
||||
</div>
|
||||
</dialog>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { ref, onMounted } from 'vue'
|
||||
|
||||
const dialog = ref<HTMLDialogElement | null>(null)
|
||||
|
||||
const props = withDefaults(
|
||||
defineProps<{
|
||||
title: string
|
||||
}>(),
|
||||
{
|
||||
title: ''
|
||||
}
|
||||
)
|
||||
|
||||
onMounted(() => {
|
||||
dialog.value!.showModal()
|
||||
})
|
||||
</script>
|
||||
|
||||
<style>
|
||||
/* Style for the background */
|
||||
body:has(dialog[open])::before {
|
||||
content: '';
|
||||
display: block;
|
||||
position: fixed;
|
||||
top: 0;
|
||||
left: 0;
|
||||
width: 100%;
|
||||
height: 100%;
|
||||
background: #0008;
|
||||
backdrop-filter: blur(0.2em);
|
||||
z-index: 1000;
|
||||
}
|
||||
|
||||
/* Hide the dialog by default */
|
||||
dialog[open] {
|
||||
display: block;
|
||||
border: none;
|
||||
border-radius: 0.5rem;
|
||||
box-shadow: 0.2rem 0.2rem 1rem #000;
|
||||
padding: 1rem;
|
||||
position: fixed;
|
||||
top: 0;
|
||||
left: 0;
|
||||
z-index: 1001;
|
||||
}
|
||||
|
||||
dialog[open] > h1 {
|
||||
background: #00f;
|
||||
color: #fff;
|
||||
font-size: 1rem;
|
||||
margin: -1rem -1rem 0 -1rem;
|
||||
padding: 0.5rem 1rem 0.5rem 1rem;
|
||||
}
|
||||
|
||||
dialog[open] > div {
|
||||
padding: 1em 0;
|
||||
}
|
||||
</style>
|
||||
@@ -1,27 +0,0 @@
<template>
  <template v-for="upload in documentStore.uploadingDocuments" :key="upload.key">
    <span>{{ upload.name }}</span>
    <div class="progress-container">
      <a-progress :percent="upload.progress" />
      <CloseCircleOutlined class="close-button" @click="dismissUpload(upload.key)" />
    </div>
  </template>
</template>
<script setup lang="ts">
import { useDocumentStore } from '@/stores/documents'
const documentStore = useDocumentStore()

function dismissUpload(key: number) {
  documentStore.deleteUploadingDocument(key)
}
</script>

<style scoped>
.progress-container {
  display: flex;
  align-items: center;
}
.close-button:hover {
  color: #b81414;
}
</style>
@@ -1,96 +0,0 @@
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { h, ref } from 'vue'
|
||||
|
||||
const fileUploadButton = ref()
|
||||
const folderUploadButton = ref()
|
||||
const documentStore = useDocumentStore()
|
||||
const open = (placement: any) => openNotification(placement)
|
||||
|
||||
const isNotificationOpen = ref(false)
|
||||
const openNotification = (placement: any) => {
|
||||
if (!isNotificationOpen.value) {
|
||||
/*
|
||||
api.open({
|
||||
message: `Uploading documents`,
|
||||
description: h(NotificationLoading),
|
||||
placement,
|
||||
duration: 0,
|
||||
onClose: () => { isNotificationOpen.value = false }
|
||||
});*/
|
||||
isNotificationOpen.value = true
|
||||
}
|
||||
}
|
||||
|
||||
function uploadFileHandler() {
|
||||
fileUploadButton.value.click()
|
||||
}
|
||||
|
||||
async function load(file: File, start: number, end: number): Promise<ArrayBuffer> {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise<Event>(resolve => (reader.onload = resolve))
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
if (event.target && event.target instanceof FileReader) {
|
||||
return event.target.result as ArrayBuffer
|
||||
} else {
|
||||
throw new Error('Error loading file')
|
||||
}
|
||||
}
|
||||
|
||||
async function sendChunk(file: File, start: number, end: number) {
|
||||
const ws = documentStore.wsUpload
|
||||
if (ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
|
||||
ws.send(
|
||||
JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
})
|
||||
)
|
||||
ws.send(chunk)
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadFileChangeHandler(event: Event) {
|
||||
const target = event.target as HTMLInputElement
|
||||
const chunkSize = 1 << 20
|
||||
if (target && target.files && target.files.length > 0) {
|
||||
const file = target.files[0]
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
const document = documentStore.pushUploadingDocuments(file.name)
|
||||
open('bottomRight')
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end)
|
||||
console.log('progress: ' + (100 * (i + 1)) / numChunks)
|
||||
console.log('Num Chunks: ' + numChunks)
|
||||
documentStore.updateUploadingDocuments(document.key, (100 * (i + 1)) / numChunks)
|
||||
}
|
||||
}
|
||||
}
|
||||
</script>
|
||||
<template>
|
||||
<template>
|
||||
<input
|
||||
ref="fileUploadButton"
|
||||
@change="uploadFileChangeHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
multiple
|
||||
/>
|
||||
<input
|
||||
ref="folderUploadButton"
|
||||
@change="uploadFileChangeHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
webkitdirectory
|
||||
/>
|
||||
</template>
|
||||
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileUploadButton.click()" />
|
||||
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderUploadButton.click()" />
|
||||
</template>
|
||||
@@ -1,162 +0,0 @@
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import createWebSocket from './WS'
|
||||
|
||||
export type FUID = string
|
||||
|
||||
export type Document = {
|
||||
loc: string
|
||||
name: string
|
||||
key: FUID
|
||||
type: 'folder' | 'file'
|
||||
size: number
|
||||
sizedisp: string
|
||||
mtime: number
|
||||
modified: string
|
||||
haystack: string
|
||||
dir?: DirList
|
||||
}
|
||||
|
||||
export type errorEvent = {
|
||||
error: {
|
||||
code: number
|
||||
message: string
|
||||
redirect: string
|
||||
}
|
||||
}
|
||||
|
||||
// Raw types the backend /api/watch sends us
|
||||
|
||||
export type FileEntry = {
|
||||
key: FUID
|
||||
size: number
|
||||
mtime: number
|
||||
}
|
||||
|
||||
export type DirEntry = {
|
||||
key: FUID
|
||||
size: number
|
||||
mtime: number
|
||||
dir: DirList
|
||||
}
|
||||
|
||||
export type DirList = Record<string, FileEntry | DirEntry>
|
||||
|
||||
export type UpdateEntry = {
|
||||
name: string
|
||||
deleted?: boolean
|
||||
key?: FUID
|
||||
size?: number
|
||||
mtime?: number
|
||||
dir?: DirList
|
||||
}
|
||||
|
||||
// Helper structure for selections
|
||||
export interface SelectedItems {
|
||||
selected: Set<FUID>
|
||||
missing: Set<FUID>
|
||||
rootdir: DirList
|
||||
entries: Record<FUID, FileEntry | DirEntry>
|
||||
fullpath: Record<FUID, string>
|
||||
relpath: Record<FUID, string>
|
||||
url: Record<FUID, string>
|
||||
ids: FUID[]
|
||||
}
|
||||
|
||||
export const url_document_watch_ws = '/api/watch'
|
||||
export const url_document_upload_ws = '/api/upload'
|
||||
export const url_document_get = '/files'
|
||||
|
||||
export class DocumentHandler {
|
||||
constructor(private store = useDocumentStore()) {
|
||||
this.handleWebSocketMessage = this.handleWebSocketMessage.bind(this)
|
||||
}
|
||||
|
||||
handleWebSocketMessage(event: MessageEvent) {
|
||||
const msg = JSON.parse(event.data)
|
||||
if ('error' in msg) {
|
||||
if (msg.error.code === 401) {
|
||||
this.store.user.isLoggedIn = false
|
||||
this.store.user.isOpenLoginModal = true
|
||||
} else {
|
||||
this.store.error = msg.error.message
|
||||
}
|
||||
// The server closes the websocket after errors, so we need to reopen it
|
||||
setTimeout(() => {
|
||||
this.store.wsWatch = createWebSocket(
|
||||
url_document_watch_ws,
|
||||
this.handleWebSocketMessage
|
||||
)
|
||||
}, 1000)
|
||||
}
|
||||
switch (true) {
|
||||
case !!msg.root:
|
||||
this.handleRootMessage(msg)
|
||||
break
|
||||
case !!msg.update:
|
||||
this.handleUpdateMessage(msg)
|
||||
break
|
||||
case !!msg.space:
|
||||
console.log('Watch space', msg.space)
|
||||
break
|
||||
case !!msg.error:
|
||||
this.handleError(msg)
|
||||
break
|
||||
default:
|
||||
}
|
||||
}
|
||||
|
||||
private handleRootMessage({ root }: { root: DirEntry }) {
|
||||
console.log('Watch root', root)
|
||||
if (this.store) {
|
||||
this.store.user.isLoggedIn = true
|
||||
this.store.updateRoot(root)
|
||||
}
|
||||
}
|
||||
private handleUpdateMessage(updateData: { update: UpdateEntry[] }) {
|
||||
console.log('Watch update', updateData.update)
|
||||
let node: DirEntry = this.store.root
|
||||
for (const elem of updateData.update) {
|
||||
if (elem.deleted) {
|
||||
delete node.dir[elem.name]
|
||||
break // Deleted elements can't have further children
|
||||
}
|
||||
if (elem.name !== undefined) {
|
||||
// @ts-ignore
|
||||
node = node.dir[elem.name] ||= {}
|
||||
}
|
||||
if (elem.key !== undefined) node.key = elem.key
|
||||
if (elem.size !== undefined) node.size = elem.size
|
||||
if (elem.mtime !== undefined) node.mtime = elem.mtime
|
||||
if (elem.dir !== undefined) node.dir = elem.dir
|
||||
}
|
||||
this.store.updateRoot()
|
||||
}
|
||||
private handleError(msg: errorEvent) {
|
||||
if (msg.error.code === 401) {
|
||||
this.store.user.isOpenLoginModal = true
|
||||
this.store.user.isLoggedIn = false
|
||||
return
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
export class DocumentUploadHandler {
|
||||
constructor(private store = useDocumentStore()) {
|
||||
this.handleWebSocketMessage = this.handleWebSocketMessage.bind(this)
|
||||
}
|
||||
|
||||
handleWebSocketMessage(event: MessageEvent) {
|
||||
const msg = JSON.parse(event.data)
|
||||
switch (true) {
|
||||
case !!msg.written:
|
||||
this.handleWrittenMessage(msg)
|
||||
break
|
||||
default:
|
||||
}
|
||||
}
|
||||
|
||||
private handleWrittenMessage(msg: { written: number }) {
|
||||
// if (this.store && this.store.root) this.store.root = root;
|
||||
console.log('Written message', msg.written)
|
||||
}
|
||||
}
|
||||
@@ -1,15 +0,0 @@
import Client from '@/repositories/Client'
export const url_login = '/login'
export const url_logout = '/logout'

export async function loginUser(username: string, password: string) {
  const user = await Client.post(url_login, {
    username,
    password
  })
  return user
}
export async function logoutUser() {
  const data = await Client.post(url_logout)
  return data
}
@@ -1,8 +0,0 @@
function createWebSocket(url: string, eventHandler: (event: MessageEvent) => void) {
  const urlObject = new URL(url, location.origin.replace(/^http/, 'ws'))
  const webSocket = new WebSocket(urlObject)
  webSocket.onmessage = eventHandler
  return webSocket
}

export default createWebSocket
@@ -1,172 +0,0 @@
import type {
  Document,
  DirEntry,
  FileEntry,
  FUID,
  DirList,
  SelectedItems
} from '@/repositories/Document'
import { formatSize, formatUnixDate, haystackFormat } from '@/utils'
import { defineStore } from 'pinia'
import { collator } from '@/utils'

type FileData = { id: string; mtime: number; size: number; dir: DirectoryData }
type DirectoryData = {
  [filename: string]: FileData
}
type User = {
  username: string
  privileged: boolean
  isOpenLoginModal: boolean
  isLoggedIn: boolean
}

export const useDocumentStore = defineStore({
  id: 'documents',
  state: () => ({
    root: {} as DirEntry,
    document: [] as Document[],
    search: "" as string,
    selected: new Set<FUID>(),
    uploadingDocuments: [],
    uploadCount: 0 as number,
    wsWatch: undefined,
    wsUpload: undefined,
    fileExplorer: null,
    error: '' as string,
    user: {
      username: '',
      privileged: false,
      isLoggedIn: false,
      isOpenLoginModal: false
    } as User
  }),

  actions: {
    updateRoot(root: DirEntry | null = null) {
      root ??= this.root
      // Transform tree data to flat documents array
      let loc = ""
      const mapper = ([name, attr]: [string, FileEntry | DirEntry]) => ({
        loc,
        name,
        type: 'dir' in attr ? 'folder' : 'file' as 'folder' | 'file',
        ...attr,
        sizedisp: formatSize(attr.size),
        modified: formatUnixDate(attr.mtime),
        haystack: haystackFormat(name),
      })
      const queue = [...Object.entries(root.dir ?? {}).map(mapper)]
      const docs = []
      for (let doc; (doc = queue.shift()) !== undefined;) {
        docs.push(doc)
        if ("dir" in doc) {
          loc = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
          queue.push(...Object.entries(doc.dir).map(mapper))
        }
      }
      // Pre sort directory entries folders first then files, names in natural ordering
      docs.sort((a, b) =>
        // @ts-ignore
        (a.type === "file") - (b.type === "file") ||
        collator.compare(a.name, b.name)
      )
      this.root = root
      this.document = docs
    },
    updateUploadingDocuments(key: number, progress: number) {
      for (const d of this.uploadingDocuments) {
        if (d.key === key) d.progress = progress
      }
    },
    pushUploadingDocuments(name: string) {
      this.uploadCount++
      const document = {
        key: this.uploadCount,
        name: name,
        progress: 0
      }
      this.uploadingDocuments.push(document)
      return document
    },
    deleteUploadingDocument(key: number) {
      this.uploadingDocuments = this.uploadingDocuments.filter(e => e.key !== key)
    },
    updateModified() {
      for (const d of this.document) {
        if ('mtime' in d) d.modified = formatUnixDate(d.mtime)
      }
    },
    login(username: string, privileged: boolean) {
      this.user.username = username
      this.user.privileged = privileged
      this.user.isLoggedIn = true
      this.user.isOpenLoginModal = false
    }
  },
  getters: {
    isUserLogged(): boolean {
      return this.user.isLoggedIn
    },
    recentDocuments(): Document[] {
      const ret = [...this.document]
      ret.sort((a, b) => b.mtime - a.mtime)
      return ret
    },
    largeDocuments(): Document[] {
      const ret = [...this.document]
      ret.sort((a, b) => b.size - a.size)
      return ret
    },
    selectedFiles(): SelectedItems {
      function traverseDir(data: DirEntry | FileEntry, path: string, relpath: string) {
        if (!('dir' in data)) return
        for (const [name, attr] of Object.entries(data.dir)) {
          const fullname = path ? `${path}/${name}` : name
          const key = attr.key
          // Is this the file we are looking for? Ignore if nested within another selection.
          let r = relpath
          if (selected.has(key) && !relpath) {
            ret.selected.add(key)
            ret.rootdir[name] = attr
            r = name
          } else if (relpath) {
            r = `${relpath}/${name}`
          }
          if (r) {
            ret.entries[key] = attr
            ret.fullpath[key] = fullname
            ret.relpath[key] = r
            ret.ids.push(key)
            if (!('dir' in attr)) ret.url[key] = `/files/${fullname}`
          }
          traverseDir(attr, fullname, r)
        }
      }
      const selected = this.selected
      const ret: SelectedItems = {
        selected: new Set<FUID>(),
        missing: new Set<FUID>(),
        rootdir: {} as DirList,
        entries: {} as Record<FUID, FileEntry | DirEntry>,
        fullpath: {} as Record<FUID, string>,
        relpath: {} as Record<FUID, string>,
        url: {} as Record<FUID, string>,
        ids: [] as FUID[]
      }
      traverseDir(this.root, '', '')
      // What did we not select?
      for (const id of selected) {
        if (!ret.selected.has(id)) ret.missing.add(id)
      }
      // Sorted array of FUIDs for easy traversal
      ret.ids.sort((a, b) =>
        ret.relpath[a].localeCompare(ret.relpath[b], undefined, {
          numeric: true,
          sensitivity: 'base'
        })
      )
      return ret
    }
  }
})
@@ -1,58 +0,0 @@
<template>
  <FileExplorer
    ref="fileExplorer"
    :key="Router.currentRoute.value.path"
    :path="props.path"
    :documents="documents"
    v-if="props.path"
  />
</template>

<script setup lang="ts">
import { watchEffect, ref, computed } from 'vue'
import { useDocumentStore } from '@/stores/documents'
import Router from '@/router/index'
import { needleFormat, localeIncludes, collator } from '@/utils';

const documentStore = useDocumentStore()
const fileExplorer = ref()
const props = defineProps({
  path: Array<string>
})
const documents = computed(() => {
  if (!props.path) return []
  const loc = props.path.join('/')
  // List the current location
  if (!documentStore.search) return documentStore.document.filter(doc => doc.loc === loc)
  // Find up to 100 newest documents that match the search
  const search = documentStore.search
  const needle = needleFormat(search)
  let limit = 100
  let docs = []
  for (const doc of documentStore.recentDocuments) {
    if (localeIncludes(doc.haystack, needle)) {
      docs.push(doc)
      if (--limit === 0) break
    }
  }
  // Organize by folder, by relevance
  const locsub = loc + '/'
  docs.sort((a, b) => (
    // @ts-ignore
    (b.loc === loc) - (a.loc === loc) ||
    // @ts-ignore
    (b.loc.slice(0, locsub.length) === locsub) - (a.loc.slice(0, locsub.length) === locsub) ||
    collator.compare(a.loc, b.loc) ||
    // @ts-ignore
    (a.type === 'file') - (b.type === 'file') ||
    // @ts-ignore
    b.name.includes(search) - a.name.includes(search) ||
    collator.compare(a.name, b.name)
  ))
  return docs
})

watchEffect(() => {
  documentStore.fileExplorer = fileExplorer.value
})
</script>
@@ -1,3 +1,4 @@
import os
import sys
from pathlib import Path

@@ -61,13 +62,17 @@ def _main():
        path = None
    _confdir(args)
    exists = config.conffile.exists()
    print(config.conffile, exists)
    import_droppy = args["--import-droppy"]
    necessary_opts = exists or import_droppy or path and listen
    necessary_opts = exists or import_droppy or path
    if not necessary_opts:
        # Maybe run without arguments
        print(doc)
        print(
            "No config file found! Get started with:\n  cista -l :8000 /path/to/files, or\n  cista -l example.com --import-droppy  # Uses Droppy files\n",
            "No config file found! Get started with one of:\n"
            "  cista --user yourname --privileged\n"
            "  cista --import-droppy\n"
            "  cista -l :8000 /path/to/files\n"
        )
        return 1
    settings = {}
@@ -79,8 +84,15 @@ def _main():
        settings = droppy.readconf()
    if path:
        settings["path"] = path
    elif not exists:
        settings["path"] = Path.home() / "Downloads"
    if listen:
        settings["listen"] = listen
    elif not exists:
        settings["listen"] = ":8000"
    if not exists and not import_droppy:
        # We have no users, so make it public
        settings["public"] = True
    operation = config.update_config(settings)
    print(f"Config {operation}: {config.conffile}")
    # Prepare to serve
@@ -105,18 +117,31 @@ def _confdir(args):
    if confdir.exists() and not confdir.is_dir():
        if confdir.name != config.conffile.name:
            raise ValueError("Config path is not a directory")
        # Accidentally pointed to the cista.toml, use parent
        # Accidentally pointed to the db.toml, use parent
        confdir = confdir.parent
    config.conffile = config.conffile.with_parent(confdir)
    os.environ["CISTA_HOME"] = confdir.as_posix()
    config.init_confdir()  # Uses environ if available


def _user(args):
    _confdir(args)
    config.load_config()
    if config.conffile.exists():
        config.load_config()
        operation = False
    else:
        # Defaults for new config when user is created
        operation = config.update_config(
            {
                "listen": ":8000",
                "path": Path.home() / "Downloads",
                "public": False,
            }
        )
        print(f"Config {operation}: {config.conffile}\n")

    name = args["--user"]
    if not name or not name.isidentifier():
        raise ValueError("Invalid username")
    config.load_config()
    u = config.config.users.get(name)
    info = f"User {name}" if u else f"New user {name}"
    changes = {}
@@ -128,12 +153,17 @@ def _user(args):
    info += " (admin)" if oldadmin else ""
    if args["--password"] or not u:
        changes["password"] = pw = pwgen.generate()
        info += f"\n  Password: {pw}"
    res = config.update_user(args["--user"], changes)
        info += f"\n  Password: {pw}\n"
    res = config.update_user(name, changes)
    print(info)
    if res == "read":
        print("  No changes")

    if operation == "created":
        print(
            "Now you can run the server:\n  cista  # defaults set: -l :8000 ~/Downloads\n"
        )


if __name__ == "__main__":
    sys.exit(main())
56
cista/api.py
@@ -1,10 +1,11 @@
import asyncio
import typing
from secrets import token_bytes

import msgspec
from sanic import Blueprint

from cista import watching
from cista import __version__, config, watching
from cista.fileio import FileServer
from cista.protocol import ControlTypes, FileRange, StatusMsg
from cista.util.apphelpers import asend, websocket_wrapper
@@ -36,10 +37,18 @@ async def upload(req, ws):
            )
        req = msgspec.json.decode(text, type=FileRange)
        pos = req.start
        data = None
        while pos < req.end and (data := await ws.recv()) and isinstance(data, bytes):
        while True:
            data = await ws.recv()
            if not isinstance(data, bytes):
                break
            if len(data) > req.end - pos:
                raise ValueError(
                    f"Expected up to {req.end - pos} bytes, got {len(data)} bytes"
                )
            sentsize = await alink(("upload", req.name, pos, data, req.size))
            pos += typing.cast(int, sentsize)
            if pos >= req.end:
                break
        if pos != req.end:
            d = f"{len(data)} bytes" if isinstance(data, bytes) else data
            raise ValueError(f"Expected {req.end - pos} more bytes, got {d}")
@@ -83,14 +92,43 @@ async def control(req, ws):
@bp.websocket("watch")
@websocket_wrapper
async def watch(req, ws):
    await ws.send(
        msgspec.json.encode(
            {
                "server": {
                    "name": config.config.name or config.config.path.name,
                    "version": __version__,
                    "public": config.config.public,
                },
                "user": {
                    "username": req.ctx.username,
                    "privileged": req.ctx.user.privileged,
                }
                if req.ctx.user
                else None,
            }
        ).decode()
    )
    uuid = token_bytes(16)
    try:
        with watching.tree_lock:
            q = watching.pubsub[ws] = asyncio.Queue()
            # Init with disk usage and full tree
            await ws.send(watching.format_du())
            await ws.send(watching.format_tree())
        q, space, root = await asyncio.get_event_loop().run_in_executor(
            req.app.ctx.threadexec, subscribe, uuid, ws
        )
        await ws.send(space)
        await ws.send(root)
        # Send updates
        while True:
            await ws.send(await q.get())
    finally:
        del watching.pubsub[ws]
        del watching.pubsub[uuid]


def subscribe(uuid, ws):
    with watching.state.lock:
        q = watching.pubsub[uuid] = asyncio.Queue()
        # Init with disk usage and full tree
        return (
            q,
            watching.format_space(watching.state.space),
            watching.format_root(watching.state.root),
        )
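The new watch handshake above sends a server/user hello, then disk space, then the full file listing, and only afterwards the incremental updates. A minimal client sketch of that sequence is below; the `/api/watch` path, host and port are assumptions (the blueprint mount point is not part of this diff), and the `websockets` package is a third-party dependency chosen only for illustration.

```python
import asyncio
import json

import websockets  # pip install websockets (not a cista dependency)


async def watch_client():
    # Adjust the URL to wherever the watch endpoint is actually mounted.
    async with websockets.connect("ws://localhost:8000/api/watch") as ws:
        hello = json.loads(await ws.recv())   # {"server": {...}, "user": ...}
        space = json.loads(await ws.recv())   # disk usage message
        root = json.loads(await ws.recv())    # full file listing
        print(hello, space, type(root))
        async for msg in ws:                  # subsequent incremental updates
            print(json.loads(msg))


if __name__ == "__main__":
    asyncio.run(watch_client())
```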
207
cista/app.py
@@ -1,16 +1,24 @@
import asyncio
import datetime
import mimetypes
from importlib.resources import files
import threading
from concurrent.futures import ThreadPoolExecutor
from multiprocessing import cpu_count
from pathlib import Path, PurePath, PurePosixPath
from stat import S_IFDIR, S_IFREG
from urllib.parse import unquote
from wsgiref.handlers import format_date_time

import brotli
import sanic.helpers
from blake3 import blake3
from sanic import Blueprint, Sanic, empty, raw
from sanic import Blueprint, Sanic, empty, raw, redirect
from sanic.exceptions import Forbidden, NotFound
from sanic.log import logger
from setproctitle import setproctitle
from stream_zip import ZIP_AUTO, stream_zip

from cista import auth, config, session, watching
from cista import auth, config, preview, session, watching
from cista.api import bp
from cista.util.apphelpers import handle_sanic_exception

@@ -19,27 +27,40 @@ sanic.helpers._ENTITY_HEADERS = frozenset()

app = Sanic("cista", strict_slashes=True)
app.blueprint(auth.bp)
app.blueprint(preview.bp)
app.blueprint(bp)
app.exception(Exception)(handle_sanic_exception)


setproctitle("cista-main")


@app.before_server_start
async def main_start(app, loop):
    config.load_config()
    setproctitle(f"cista {config.config.path.name}")
    workers = max(2, min(8, cpu_count()))
    app.ctx.threadexec = ThreadPoolExecutor(
        max_workers=workers, thread_name_prefix="cista-ioworker"
    )
    await watching.start(app, loop)


@app.after_server_stop
async def main_stop(app, loop):
    quit.set()
    await watching.stop(app, loop)
    app.ctx.threadexec.shutdown()


@app.on_request
async def use_session(req):
    req.ctx.session = session.get(req)
    try:
        req.ctx.user = config.config.users[req.ctx.session["username"]]  # type: ignore
        req.ctx.username = req.ctx.session["username"]  # type: ignore
        req.ctx.user = config.config.users[req.ctx.username]
    except (AttributeError, KeyError, TypeError):
        req.ctx.username = None
        req.ctx.user = None
    # CSRF protection
    if req.method == "GET" and req.headers.upgrade != "websocket":
@@ -68,22 +89,16 @@ def http_fileserver(app, _):
www = {}


@app.before_server_start
async def load_wwwroot(*_ignored):
    global www
    www = await asyncio.get_event_loop().run_in_executor(None, _load_wwwroot, www)


def _load_wwwroot(www):
    wwwnew = {}
    base = files("cista") / "wwwroot"
    paths = ["."]
    base = Path(__file__).with_name("wwwroot")
    paths = [PurePath()]
    while paths:
        path = paths.pop(0)
        current = base / path
        for p in current.iterdir():
            if p.is_dir():
                paths.append(current / p.parts[-1])
                paths.append(p.relative_to(base))
                continue
            name = p.relative_to(base).as_posix()
            mime = mimetypes.guess_type(name)[0] or "application/octet-stream"
@@ -114,31 +129,63 @@ def _load_wwwroot(www):
            if len(br) >= len(data):
                br = False
            wwwnew[name] = data, br, headers
    if not wwwnew:
        msg = f"Web frontend missing from {base}\n  Did you forget: hatch build\n"
        if not www:
            logger.warning(msg)
        if not app.debug:
            msg = "Web frontend missing. Cista installation is broken.\n"
        wwwnew[""] = (
            msg.encode(),
            False,
            {
                "etag": "error",
                "content-type": "text/plain",
                "cache-control": "no-store",
            },
        )
    return wwwnew


@app.add_task
@app.before_server_start
async def start(app):
    await load_wwwroot(app)
    if app.debug:
        app.add_task(refresh_wwwroot(), name="refresh_wwwroot")


async def load_wwwroot(app):
    global www
    www = await asyncio.get_event_loop().run_in_executor(
        app.ctx.threadexec, _load_wwwroot, www
    )


quit = threading.Event()


async def refresh_wwwroot():
    while True:
        try:
            wwwold = www
            await load_wwwroot()
            changes = ""
            for name in sorted(www):
                attr = www[name]
                if wwwold.get(name) == attr:
                    continue
                headers = attr[2]
                changes += f"{headers['last-modified']} {headers['etag']} /{name}\n"
            for name in sorted(set(wwwold) - set(www)):
                changes += f"Deleted /{name}\n"
            if changes:
                print(f"Updated wwwroot:\n{changes}", end="", flush=True)
        except Exception as e:
            print("Error loading wwwroot", e)
            if not app.debug:
                return
        await asyncio.sleep(0.5)
    try:
        while not quit.is_set():
            try:
                wwwold = www
                await load_wwwroot(app)
                changes = ""
                for name in sorted(www):
                    attr = www[name]
                    if wwwold.get(name) == attr:
                        continue
                    headers = attr[2]
                    changes += f"{headers['last-modified']} {headers['etag']} /{name}\n"
                for name in sorted(set(wwwold) - set(www)):
                    changes += f"Deleted /{name}\n"
                if changes:
                    print(f"Updated wwwroot:\n{changes}", end="", flush=True)
            except Exception as e:
                print(f"Error loading wwwroot: {e!r}")
            await asyncio.sleep(0.5)
    except asyncio.CancelledError:
        pass


@app.route("/<path:path>", methods=["GET", "HEAD"])
@@ -153,9 +200,93 @@ async def wwwroot(req, path=""):
        return empty(304, headers=headers)
    # Brotli compressed?
    if br and "br" in req.headers.accept_encoding.split(", "):
        headers = {
            **headers,
            "content-encoding": "br",
        }
        headers = {**headers, "content-encoding": "br"}
        data = br
    return raw(data, headers=headers)


@app.route("/favicon.ico", methods=["GET", "HEAD"])
async def favicon(req):
    # Browsers keep asking for it when viewing files (not HTML with icon link)
    return redirect("/assets/logo-97d1d7eb.svg", status=308)


def get_files(wanted: set) -> list[tuple[PurePosixPath, Path]]:
    loc = PurePosixPath()
    idx = 0
    ret = []
    level: int | None = None
    parent: PurePosixPath | None = None
    with watching.state.lock:
        root = watching.state.root
        while idx < len(root):
            f = root[idx]
            loc = PurePosixPath(*loc.parts[: f.level - 1]) / f.name
            if parent is not None and f.level <= level:
                level = parent = None
            if f.key in wanted:
                level, parent = f.level, loc.parent
            if parent is not None:
                wanted.discard(f.key)
                ret.append((loc.relative_to(parent), watching.rootpath / loc))
            idx += 1
    return ret


@app.get("/zip/<keys>/<zipfile:ext=zip>")
async def zip_download(req, keys, zipfile, ext):
    """Download a zip archive of the given keys"""

    wanted = set(keys.split("+"))
    files = get_files(wanted)

    if not files:
        raise NotFound(
            "No files found",
            context={"keys": keys, "zipfile": f"{zipfile}.{ext}", "wanted": wanted},
        )
    if wanted:
        raise NotFound("Files not found", context={"missing": wanted})

    def local_files(files):
        for rel, p in files:
            s = p.stat()
            size = s.st_size
            modified = datetime.datetime.fromtimestamp(s.st_mtime, datetime.UTC)
            name = rel.as_posix()
            if p.is_dir():
                yield f"{name}/", modified, S_IFDIR | 0o755, ZIP_AUTO(size), iter(b"")
            else:
                yield name, modified, S_IFREG | 0o644, ZIP_AUTO(size), contents(p, size)

    def contents(name, size):
        with name.open("rb") as f:
            while size > 0 and (chunk := f.read(min(size, 1 << 20))):
                size -= len(chunk)
                yield chunk
        assert size == 0

    def worker():
        try:
            for chunk in stream_zip(local_files(files)):
                asyncio.run_coroutine_threadsafe(queue.put(chunk), loop).result()
        except Exception:
            logger.exception("Error streaming ZIP")
            raise
        finally:
            asyncio.run_coroutine_threadsafe(queue.put(None), loop)

    # Don't block the event loop: run in a thread
    queue = asyncio.Queue(maxsize=1)
    loop = asyncio.get_event_loop()
    thread = loop.run_in_executor(app.ctx.threadexec, worker)

    # Stream the response
    res = await req.respond(
        content_type="application/zip",
        headers={"cache-control": "no-store"},
    )
    while chunk := await queue.get():
        await res.send(chunk)

    await thread  # If it raises, the response will fail download
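The zip route above joins file keys with `+` in the URL and streams the archive without buffering it server-side. A hedged usage sketch follows; the host, port, keys and output filename are made up for illustration, the keys would in practice come from the file listing sent over the watch socket.

```python
import urllib.request

# Hypothetical keys; replace with real FUIDs from the file listing.
url = "http://localhost:8000/zip/a1b2c3+d4e5f6/download.zip"
with urllib.request.urlopen(url) as resp, open("download.zip", "wb") as f:
    # Read in chunks, mirroring the streamed response.
    while chunk := resp.read(1 << 20):
        f.write(chunk)
```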
@@ -68,10 +68,10 @@ def verify(request, *, privileged=False):
    if request.ctx.user:
        if request.ctx.user.privileged:
            return
        raise Forbidden("Access Forbidden: Only for privileged users")
        raise Forbidden("Access Forbidden: Only for privileged users", quiet=True)
    elif config.config.public or request.ctx.user:
        return
    raise Unauthorized("Login required", "cookie", context={"redirect": "/login"})
    raise Unauthorized(f"Login required for {request.path}", "cookie", quiet=True)


bp = Blueprint("auth")
@@ -159,3 +159,35 @@ async def logout_post(request):
    res = json({"message": msg})
    session.delete(res)
    return res


@bp.post("/password-change")
async def change_password(request):
    try:
        if request.headers.content_type == "application/json":
            username = request.json["username"]
            pwchange = request.json["passwordChange"]
            password = request.json["password"]
        else:
            username = request.form["username"][0]
            pwchange = request.form["passwordChange"][0]
            password = request.form["password"][0]
        if not username or not password:
            raise KeyError
    except KeyError:
        raise BadRequest(
            "Missing username, passwordChange or password",
        ) from None
    try:
        user = login(username, password)
        set_password(user, pwchange)
    except ValueError as e:
        raise Forbidden(str(e), context={"redirect": "/login"}) from e

    if "text/html" in request.headers.accept:
        res = redirect("/")
        session.flash(res, "Password updated")
    else:
        res = json({"message": "Password updated"})
    session.create(res, username)
    return res
@@ -1,6 +1,9 @@
from __future__ import annotations

import os
import secrets
import sys
from contextlib import suppress
from functools import wraps
from hashlib import sha256
from pathlib import Path, PurePath
@@ -14,6 +17,7 @@ class Config(msgspec.Struct):
    listen: str
    secret: str = secrets.token_hex(12)
    public: bool = False
    name: str = ""
    users: dict[str, User] = {}
    links: dict[str, Link] = {}

@@ -31,7 +35,23 @@ class Link(msgspec.Struct, omit_defaults=True):


config = None
conffile = Path.home() / ".local/share/cista/db.toml"
conffile = None


def init_confdir():
    if p := os.environ.get("CISTA_HOME"):
        home = Path(p)
    else:
        xdg = os.environ.get("XDG_CONFIG_HOME")
        home = (
            Path(xdg).expanduser() / "cista" if xdg else Path.home() / ".config/cista"
        )
    if not home.is_dir():
        home.mkdir(parents=True, exist_ok=True)
        home.chmod(0o700)

    global conffile
    conffile = home / "db.toml"


def derived_secret(*params, len=8) -> bytes:
@@ -59,8 +79,8 @@ def dec_hook(typ, obj):

def config_update(modify):
    global config
    if not conffile.exists():
        conffile.parent.mkdir(parents=True, exist_ok=True)
    if conffile is None:
        init_confdir()
    tmpname = conffile.with_suffix(".tmp")
    try:
        f = tmpname.open("xb")
@@ -74,10 +94,6 @@ def config_update(modify):
            old = conffile.read_bytes()
            c = msgspec.toml.decode(old, type=Config, dec_hook=dec_hook)
        except FileNotFoundError:
            # No existing config file, make sure we have a folder...
            confdir = conffile.parent
            confdir.mkdir(parents=True, exist_ok=True)
            confdir.chmod(0o700)
            old = b""
            c = None
        c = modify(c)
@@ -89,6 +105,10 @@ def config_update(modify):
            return "read"
        f.write(new)
        f.close()
        if sys.platform == "win32":
            # Windows doesn't support atomic replace
            with suppress(FileNotFoundError):
                conffile.unlink()
        tmpname.rename(conffile)  # Atomic replace
    except:
        f.close()
@@ -116,6 +136,8 @@ def modifies_config(modify):

def load_config():
    global config
    if conffile is None:
        init_confdir()
    config = msgspec.toml.decode(conffile.read_bytes(), type=Config, dec_hook=dec_hook)


@@ -134,7 +156,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
    # Encode into dict, update values with new, convert to Config
    try:
        u = conf.users[name].__copy__()
    except KeyError:
    except (KeyError, AttributeError):
        u = User()
    if "password" in changes:
        from . import auth
@@ -143,7 +165,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
        del changes["password"]
    udict = msgspec.to_builtins(u, enc_hook=enc_hook)
    udict.update(changes)
    settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
    settings = msgspec.to_builtins(conf, enc_hook=enc_hook) if conf else {"users": {}}
    settings["users"][name] = msgspec.convert(udict, User, dec_hook=dec_hook)
    return msgspec.convert(settings, Config, dec_hook=dec_hook)
@@ -34,9 +34,11 @@ class File:
            self.open_rw()
        assert self.fd is not None
        if file_size is not None:
            assert pos + len(buffer) <= file_size
            os.ftruncate(self.fd, file_size)
        os.lseek(self.fd, pos, os.SEEK_SET)
        os.write(self.fd, buffer)
        if buffer:
            os.lseek(self.fd, pos, os.SEEK_SET)
            os.write(self.fd, buffer)

    def __getitem__(self, slice):
        if self.fd is None:
225
cista/preview.py
Normal file
@@ -0,0 +1,225 @@
import asyncio
import gc
import io
import mimetypes
import urllib.parse
from pathlib import PurePosixPath
from time import perf_counter
from urllib.parse import unquote
from wsgiref.handlers import format_date_time

import av
import fitz  # PyMuPDF
import numpy as np
import pillow_heif
from PIL import Image
from sanic import Blueprint, empty, raw
from sanic.exceptions import NotFound
from sanic.log import logger

from cista import config
from cista.util.filename import sanitize

pillow_heif.register_heif_opener()

bp = Blueprint("preview", url_prefix="/preview")


@bp.get("/<path:path>")
async def preview(req, path):
    """Preview a file"""
    maxsize = int(req.args.get("px", 1024))
    maxzoom = float(req.args.get("zoom", 2.0))
    quality = int(req.args.get("q", 60))
    rel = PurePosixPath(sanitize(unquote(path)))
    path = config.config.path / rel
    stat = path.lstat()
    etag = config.derived_secret(
        "preview", rel, stat.st_mtime_ns, quality, maxsize, maxzoom
    ).hex()
    savename = PurePosixPath(path.name).with_suffix(".avif")
    headers = {
        "etag": etag,
        "last-modified": format_date_time(stat.st_mtime),
        "cache-control": "max-age=604800, immutable"
        + ("" if config.config.public else ", private"),
        "content-type": "image/avif",
        "content-disposition": f"inline; filename*=UTF-8''{urllib.parse.quote(savename.as_posix())}",
    }
    if req.headers.if_none_match == etag:
        # The client has it cached, respond 304 Not Modified
        return empty(304, headers=headers)

    if not path.is_file():
        raise NotFound("File not found")

    img = await asyncio.get_event_loop().run_in_executor(
        req.app.ctx.threadexec, dispatch, path, quality, maxsize, maxzoom
    )
    return raw(img, headers=headers)


def dispatch(path, quality, maxsize, maxzoom):
    if path.suffix.lower() in (".pdf", ".xps", ".epub", ".mobi"):
        return process_pdf(path, quality=quality, maxsize=maxsize, maxzoom=maxzoom)
    type, _ = mimetypes.guess_type(path.name)
    if type and type.startswith("video/"):
        return process_video(path, quality=quality, maxsize=maxsize)
    return process_image(path, quality=quality, maxsize=maxsize)


def process_image(path, *, maxsize, quality):
    t_load_start = perf_counter()
    img = Image.open(path)
    # Force decode to include I/O in load timing
    img.load()
    t_load_end = perf_counter()
    # Resize
    orig_w, orig_h = img.size
    t_proc_start = perf_counter()
    img.thumbnail((min(orig_w, maxsize), min(orig_h, maxsize)))
    t_proc_end = perf_counter()
    # Save as AVIF
    imgdata = io.BytesIO()
    t_save_start = perf_counter()
    img.save(imgdata, format="avif", quality=quality, speed=10, max_threads=1)
    t_save_end = perf_counter()

    ret = imgdata.getvalue()

    load_ms = (t_load_end - t_load_start) * 1000
    proc_ms = (t_proc_end - t_proc_start) * 1000
    save_ms = (t_save_end - t_save_start) * 1000
    logger.debug(
        "Preview image %s: load=%.1fms process=%.1fms save=%.1fms out=%.1fKB",
        path.name,
        load_ms,
        proc_ms,
        save_ms,
        len(ret) / 1024,
    )

    return ret


def process_pdf(path, *, maxsize, maxzoom, quality, page_number=0):
    t_load_start = perf_counter()
    pdf = fitz.open(path)
    page = pdf.load_page(page_number)
    w, h = page.rect[2:4]
    zoom = min(maxsize / w, maxsize / h, maxzoom)
    mat = fitz.Matrix(zoom, zoom)
    pix = page.get_pixmap(matrix=mat)  # type: ignore[attr-defined]
    t_load_end = perf_counter()

    t_save_start = perf_counter()
    ret = pix.pil_tobytes(format="avif", quality=quality, speed=10, max_threads=1)
    t_save_end = perf_counter()

    logger.debug(
        "Preview pdf %s: load+render=%.1fms save=%.1fms",
        path.name,
        (t_load_end - t_load_start) * 1000,
        (t_save_end - t_save_start) * 1000,
    )
    return ret


def process_video(path, *, maxsize, quality):
    frame = None
    imgdata = io.BytesIO()
    istream = ostream = icc = occ = frame = None
    t_load_start = perf_counter()
    # Initialize to avoid "possibly unbound" in static analysis when exceptions occur
    t_load_end = t_load_start
    t_save_start = t_load_start
    t_save_end = t_load_start
    with (
        av.open(str(path)) as icontainer,
        av.open(imgdata, "w", format="avif") as ocontainer,
    ):
        istream = icontainer.streams.video[0]
        istream.codec_context.skip_frame = "NONKEY"
        icontainer.seek((icontainer.duration or 0) // 8)
        for frame in icontainer.decode(istream):
            if frame.dts is not None:
                break
        else:
            raise RuntimeError("No frames found in video")

        # Resize frame to thumbnail size
        if frame.width > maxsize or frame.height > maxsize:
            scale_factor = min(maxsize / frame.width, maxsize / frame.height)
            new_width = int(frame.width * scale_factor)
            new_height = int(frame.height * scale_factor)
            frame = frame.reformat(width=new_width, height=new_height)

        # Simple rotation detection and logging
        if frame.rotation:
            try:
                fplanes = frame.to_ndarray()
                # Split into Y, U, V planes of proper dimensions
                planes = [
                    fplanes[: frame.height],
                    fplanes[frame.height : frame.height + frame.height // 4].reshape(
                        frame.height // 2, frame.width // 2
                    ),
                    fplanes[frame.height + frame.height // 4 :].reshape(
                        frame.height // 2, frame.width // 2
                    ),
                ]
                # Rotate
                planes = [np.rot90(p, frame.rotation // 90) for p in planes]
                # Restore PyAV format
                planes = np.hstack([p.flat for p in planes]).reshape(
                    -1, planes[0].shape[1]
                )
                frame = av.VideoFrame.from_ndarray(planes, format=frame.format.name)
                del planes, fplanes
            except Exception as e:
                if "not yet supported" in str(e):
                    logger.warning(
                        f"Not rotating {path.name} preview image by {frame.rotation}°:\n  PyAV: {e}"
                    )
                else:
                    logger.exception(f"Error rotating video frame: {e}")
        t_load_end = perf_counter()

        t_save_start = perf_counter()
        crf = str(int(63 * (1 - quality / 100) ** 2))  # Closely matching PIL quality-%
        ostream = ocontainer.add_stream(
            "av1",
            options={
                "crf": crf,
                "usage": "realtime",
                "cpu-used": "8",
                "threads": "1",
            },
        )
        assert isinstance(ostream, av.VideoStream)
        ostream.width = frame.width
        ostream.height = frame.height
        icc = istream.codec_context
        occ = ostream.codec_context

        # Copy HDR metadata from input video stream
        occ.color_primaries = icc.color_primaries
        occ.color_trc = icc.color_trc
        occ.colorspace = icc.colorspace
        occ.color_range = icc.color_range

        ocontainer.mux(ostream.encode(frame))
        ocontainer.mux(ostream.encode(None))  # Flush the stream
        t_save_end = perf_counter()

    # Capture frame dimensions before cleanup
    ret = imgdata.getvalue()
    logger.debug(
        "Preview video %s: load+decode=%.1fms save=%.1fms",
        path.name,
        (t_load_end - t_load_start) * 1000,
        (t_save_end - t_save_start) * 1000,
    )
    del imgdata, istream, ostream, icc, occ, frame
    gc.collect()
    return ret
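The preview endpoint above reads `px`, `zoom` and `q` query parameters and returns an AVIF thumbnail. A hedged usage sketch follows; the host, port and example file path are assumptions, only the query parameters come from the handler itself.

```python
import urllib.request

# Hypothetical file under the configured storage path.
url = "http://localhost:8000/preview/photos/holiday.jpg?px=512&q=60"
with urllib.request.urlopen(url) as resp:
    avif = resp.read()  # AVIF bytes, served with content-type image/avif
    print(resp.headers.get("etag"), len(avif), "bytes")
```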
@@ -22,7 +22,7 @@ class MkDir(ControlBase):

    def __call__(self):
        path = config.config.path / filename.sanitize(self.path)
        path.mkdir(parents=False, exist_ok=False)
        path.mkdir(parents=True, exist_ok=False)


class Rename(ControlBase):
@@ -45,7 +45,7 @@ class Rm(ControlBase):
        sel = [root / filename.sanitize(p) for p in self.sel]
        for p in sel:
            if p.is_dir():
                shutil.rmtree(p, ignore_errors=True)
                shutil.rmtree(p)
            else:
                p.unlink()

@@ -112,56 +112,42 @@ class ErrorMsg(msgspec.Struct):
## Directory listings


class FileEntry(msgspec.Struct):
class FileEntry(msgspec.Struct, array_like=True, frozen=True):
    level: int
    name: str
    key: str
    size: int
    mtime: int


class DirEntry(msgspec.Struct):
    key: str
    size: int
    mtime: int
    dir: DirList
    isfile: int

    def __getitem__(self, name):
        return self.dir[name]
    def __str__(self):
        return self.key or "FileEntry()"

    def __setitem__(self, name, value):
        self.dir[name] = value

    def __contains__(self, name):
        return name in self.dir

    def __delitem__(self, name):
        del self.dir[name]

    @property
    def props(self):
        return {k: v for k, v in self.__struct_fields__ if k != "dir"}
    def __repr__(self):
        return f"{self.name} ({self.size}, {self.mtime})"


DirList = dict[str, FileEntry | DirEntry]
class Update(msgspec.Struct, array_like=True): ...


class UpdateEntry(msgspec.Struct, omit_defaults=True):
    """Updates the named entry in the tree. Fields that are set replace old values. A list of entries recurses directories."""

    name: str = ""
    deleted: bool = False
    key: str | None = None
    size: int | None = None
    mtime: int | None = None
    dir: DirList | None = None
class UpdKeep(Update, tag="k"):
    count: int


def make_dir_data(root):
    if len(root) == 3:
        return FileEntry(*root)
    id_, size, mtime, listing = root
    converted = {}
    for name, data in listing.items():
        converted[name] = make_dir_data(data)
    sz = sum(x.size for x in converted.values())
    mt = max(x.mtime for x in converted.values())
    return DirEntry(id_, sz, max(mt, mtime), converted)
class UpdDel(Update, tag="d"):
    count: int


class UpdIns(Update, tag="i"):
    items: list[FileEntry]


class UpdateMessage(msgspec.Struct):
    update: list[UpdKeep | UpdDel | UpdIns]


class Space(msgspec.Struct):
    disk: int
    free: int
    usage: int
    storage: int
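The new update protocol replaces the nested `UpdateEntry` tree with a flat keep/delete/insert diff over the file list. A small sketch of building such a message is below; the entry values are made up, and the tagged array encoding shown in the comment is an assumption based on msgspec's `array_like` + `tag` behaviour rather than captured cista output.

```python
import msgspec

from cista.protocol import FileEntry, UpdateMessage, UpdDel, UpdIns, UpdKeep

# Keep the first three entries, drop one, then insert a new file entry.
msg = UpdateMessage(
    update=[
        UpdKeep(3),
        UpdDel(1),
        UpdIns([
            FileEntry(level=1, name="notes.txt", key="abc123",
                      size=42, mtime=1700000000, isfile=1),
        ]),
    ]
)
print(msgspec.json.encode(msg).decode())
# Expected to serialize as compact tagged arrays, e.g. ["k",3], ["d",1], ["i",[...]]
```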
@@ -1,6 +1,7 @@
import os
import re
from pathlib import Path, PurePath
import signal
from pathlib import Path

from sanic import Sanic

@@ -11,11 +12,18 @@ def run(*, dev=False):
    """Run Sanic main process that spawns worker processes to serve HTTP requests."""
    from .app import app

    # Set up immediate exit on Ctrl+C for faster termination
    def signal_handler(signum, frame):
        print("\nReceived interrupt signal, exiting immediately...")
        os._exit(0)

    signal.signal(signal.SIGINT, signal_handler)
    signal.signal(signal.SIGTERM, signal_handler)

    url, opts = parse_listen(config.config.listen)
    # Silence Sanic's warning about running in production rather than debug
    os.environ["SANIC_IGNORE_PRODUCTION_WARNING"] = "1"
    confdir = config.conffile.parent
    wwwroot = PurePath(__file__).parent / "wwwroot"
    if opts.get("ssl"):
        # Run plain HTTP redirect/acme server on port 80
        server80.app.prepare(port=80, motd=False)
@@ -27,10 +35,12 @@ def run(*, dev=False):
        motd=False,
        dev=dev,
        auto_reload=dev,
        reload_dir={confdir, wwwroot},
        access_log=True,
    )  # type: ignore
    Sanic.serve()
    if dev:
        Sanic.serve()
    else:
        Sanic.serve_single()


def check_cert(certdir, domain):
@@ -49,7 +59,7 @@ def parse_listen(listen):
        raise ValueError(
            f"Directory for unix socket does not exist: {unix.parent}/",
        )
        return "http://localhost", {"unix": unix}
        return "http://localhost", {"unix": unix.as_posix()}
    if re.fullmatch(r"(\w+(-\w+)*\.)+\w{2,}", listen, re.UNICODE):
        return f"https://{listen}", {"host": listen, "port": 443, "ssl": True}
    try:
@@ -21,7 +21,6 @@ def jres(data, **kwargs):


async def handle_sanic_exception(request, e):
    logger.exception(e)
    context, code = {}, 500
    message = str(e)
    if isinstance(e, SanicException):
@@ -30,6 +29,8 @@ async def handle_sanic_exception(request, e):
    if not message or not request.app.debug and code == 500:
        message = "Internal Server Error"
    message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
    if code == 500:
        logger.exception(e)
    # Non-browsers get JSON errors
    if "text/html" not in request.headers.accept:
        return jres(
@@ -42,7 +43,7 @@ async def handle_sanic_exception(request, e):
        res.cookies.add_cookie("message", message, max_age=5)
        return res
    # Otherwise use Sanic's default error page
    return errorpages.HTMLRenderer(request, e, debug=request.app.debug).full()
    return errorpages.HTMLRenderer(request, e, debug=request.app.debug).render()


def websocket_wrapper(handler):
@@ -54,13 +55,14 @@ def websocket_wrapper(handler):
            auth.verify(request)
            await handler(request, ws, *args, **kwargs)
        except Exception as e:
            logger.exception(e)
            context, code, message = {}, 500, str(e) or "Internal Server Error"
            if isinstance(e, SanicException):
                context = e.context or {}
                code = e.status_code
            message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
            await asend(ws, ErrorMsg({"code": code, "message": message, **context}))
            if not getattr(e, "quiet", False) or code == 500:
                logger.exception(f"{code} {e!r}")
            raise

    return wrapper
@@ -10,4 +10,7 @@ def sanitize(filename: str) -> str:
    filename = filename.replace("\\", "-")
    filename = sanitize_filepath(filename)
    filename = filename.strip("/")
    return PurePosixPath(filename).as_posix()
    p = PurePosixPath(filename)
    if any(n.startswith(".") for n in p.parts):
        raise ValueError("Filenames starting with dot are not allowed")
    return p.as_posix()
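A quick illustration of the stricter `sanitize()` behaviour added above; the input strings are hypothetical and shown only to document the new dotfile rejection alongside the existing backslash and slash cleanup.

```python
from cista.util.filename import sanitize

print(sanitize("docs/Report.pdf"))   # passes through unchanged
print(sanitize("a\\b"))              # backslashes are replaced with dashes
try:
    sanitize("docs/.hidden/secret")  # any dot-prefixed path component
except ValueError as e:
    print(e)                         # "Filenames starting with dot are not allowed"
```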
@@ -1,211 +1,459 @@
|
||||
import asyncio
|
||||
import shutil
|
||||
import sys
|
||||
import threading
|
||||
import time
|
||||
from contextlib import suppress
|
||||
from os import stat_result
|
||||
from pathlib import Path, PurePosixPath
|
||||
from stat import S_ISDIR, S_ISREG
|
||||
|
||||
import inotify.adapters
|
||||
import msgspec
|
||||
from natsort import humansorted, natsort_keygen, ns
|
||||
from sanic.log import logger
|
||||
|
||||
from cista import config
|
||||
from cista.fileio import fuid
|
||||
from cista.protocol import DirEntry, FileEntry, UpdateEntry
|
||||
from cista.protocol import FileEntry, Space, UpdDel, UpdIns, UpdKeep
|
||||
|
||||
pubsub = {}
|
||||
tree = {"": None}
|
||||
tree_lock = threading.Lock()
|
||||
rootpath: Path = None # type: ignore
|
||||
quit = False
|
||||
modified_flags = (
|
||||
"IN_CREATE",
|
||||
"IN_DELETE",
|
||||
"IN_DELETE_SELF",
|
||||
"IN_MODIFY",
|
||||
"IN_MOVE_SELF",
|
||||
"IN_MOVED_FROM",
|
||||
"IN_MOVED_TO",
|
||||
)
|
||||
disk_usage = None
|
||||
sortkey = natsort_keygen(alg=ns.LOCALE)
|
||||
|
||||
|
||||
def watcher_thread(loop):
|
||||
global disk_usage
|
||||
class State:
|
||||
def __init__(self):
|
||||
self.lock = threading.RLock()
|
||||
self._space = Space(0, 0, 0, 0)
|
||||
self.root: list[FileEntry] = []
|
||||
|
||||
while True:
|
||||
rootpath = config.config.path
|
||||
i = inotify.adapters.InotifyTree(rootpath.as_posix())
|
||||
old = format_tree() if tree[""] else None
|
||||
with tree_lock:
|
||||
# Initialize the tree from filesystem
|
||||
tree[""] = walk(rootpath)
|
||||
msg = format_tree()
|
||||
if msg != old:
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
@property
|
||||
def space(self):
|
||||
with self.lock:
|
||||
return self._space
|
||||
|
||||
# The watching is not entirely reliable, so do a full refresh every minute
|
||||
refreshdl = time.monotonic() + 60.0
|
||||
|
||||
for event in i.event_gen():
|
||||
if quit:
|
||||
return
|
||||
# Disk usage update
|
||||
du = shutil.disk_usage(rootpath)
|
||||
if du != disk_usage:
|
||||
disk_usage = du
|
||||
asyncio.run_coroutine_threadsafe(broadcast(format_du()), loop)
|
||||
break
|
||||
# Do a full refresh?
|
||||
if time.monotonic() > refreshdl:
|
||||
break
|
||||
if event is None:
|
||||
continue
|
||||
_, flags, path, filename = event
|
||||
if not any(f in modified_flags for f in flags):
|
||||
continue
|
||||
# Update modified path
|
||||
path = PurePosixPath(path) / filename
|
||||
try:
|
||||
update(path.relative_to(rootpath), loop)
|
||||
except Exception as e:
|
||||
print("Watching error", e)
|
||||
break
|
||||
i = None # Free the inotify object
|
||||
@space.setter
|
||||
def space(self, space):
|
||||
with self.lock:
|
||||
self._space = space
|
||||
|
||||
|
||||
def format_du():
|
||||
return msgspec.json.encode(
|
||||
{
|
||||
"space": {
|
||||
"disk": disk_usage.total,
|
||||
"used": disk_usage.used,
|
||||
"free": disk_usage.free,
|
||||
"storage": tree[""].size,
|
||||
},
|
||||
},
|
||||
).decode()
|
||||
def treeiter(rootmod):
|
||||
relpath = PurePosixPath()
|
||||
for i, entry in enumerate(rootmod):
|
||||
if entry.level > 0:
|
||||
relpath = PurePosixPath(*relpath.parts[: entry.level - 1]) / entry.name
|
||||
yield i, relpath, entry
|
||||
|
||||
|
||||
def format_tree():
|
||||
root = tree[""]
|
||||
return msgspec.json.encode(
|
||||
{
|
||||
"update": [
|
||||
UpdateEntry(
|
||||
key=root.key, size=root.size, mtime=root.mtime, dir=root.dir
|
||||
),
|
||||
],
|
||||
},
|
||||
).decode()
|
||||
def treeget(rootmod: list[FileEntry], path: PurePosixPath):
|
||||
begin = None
|
||||
ret = []
|
||||
|
||||
|
||||
def walk(path: Path) -> DirEntry | FileEntry | None:
|
||||
try:
|
||||
s = path.stat()
|
||||
key = fuid(s)
|
||||
assert key, repr(key)
|
||||
mtime = int(s.st_mtime)
|
||||
if path.is_file():
|
||||
return FileEntry(key, s.st_size, mtime)
|
||||
|
||||
tree = {
|
||||
p.name: v
|
||||
for p in path.iterdir()
|
||||
if not p.name.startswith(".")
|
||||
if (v := walk(p)) is not None
|
||||
}
|
||||
if tree:
|
||||
size = sum(v.size for v in tree.values())
|
||||
mtime = max(mtime, *(v.mtime for v in tree.values()))
|
||||
else:
|
||||
size = 0
|
||||
return DirEntry(key, size, mtime, tree)
|
||||
except FileNotFoundError:
|
||||
return None
|
||||
except OSError as e:
|
||||
print("OS error walking path", path, e)
|
||||
return None
|
||||
|
||||
|
||||
def update(relpath: PurePosixPath, loop):
|
||||
"""Called by inotify updates, check the filesystem and broadcast any changes."""
|
||||
new = walk(rootpath / relpath)
|
||||
with tree_lock:
|
||||
update = update_internal(relpath, new)
|
||||
if not update:
|
||||
return # No changes
|
||||
msg = msgspec.json.encode({"update": update}).decode()
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
|
||||
|
||||
def update_internal(
|
||||
relpath: PurePosixPath,
|
||||
new: DirEntry | FileEntry | None,
|
||||
) -> list[UpdateEntry]:
|
||||
path = "", *relpath.parts
|
||||
old = tree
|
||||
elems = []
|
||||
for name in path:
|
||||
if name not in old:
|
||||
# File or folder created
|
||||
old = None
|
||||
elems.append((name, None))
|
||||
if len(elems) < len(path):
|
||||
# We got a notify for an item whose parent is not in tree
|
||||
print("Tree out of sync DEBUG", relpath)
|
||||
print(elems)
|
||||
print("Current tree:")
|
||||
print(tree[""])
|
||||
print("Walking all:")
|
||||
print(walk(rootpath))
|
||||
raise ValueError("Tree out of sync")
|
||||
for i, relpath, entry in treeiter(rootmod):
|
||||
if begin is None:
|
||||
if relpath == path:
|
||||
begin = i
|
||||
ret.append(entry)
|
||||
continue
|
||||
if entry.level <= len(path.parts):
|
||||
break
|
||||
ret.append(entry)
|
||||
|
||||
return begin, ret
|
||||
|
||||
|
||||
def treeinspos(rootmod: list[FileEntry], relpath: PurePosixPath, relfile: int):
|
||||
# Find the first entry greater than the new one
|
||||
# precondition: the new entry doesn't exist
|
||||
isfile = 0
|
||||
level = 0
|
||||
i = 0
|
||||
for i, rel, entry in treeiter(rootmod):
|
||||
if entry.level > level:
|
||||
# We haven't found item at level, skip subdirectories
|
||||
continue
|
||||
if entry.level < level:
|
||||
# We have passed the level, so the new item is the first
|
||||
return i
|
||||
if level == 0:
|
||||
# root
|
||||
level += 1
|
||||
continue
|
||||
|
||||
ename = rel.parts[level - 1]
|
||||
name = relpath.parts[level - 1]
|
||||
|
||||
esort = sortkey(ename)
|
||||
nsort = sortkey(name)
|
||||
# Non-leaf are always folders, only use relfile at leaf
|
||||
isfile = relfile if len(relpath.parts) == level else 0
|
||||
|
||||
# First compare by isfile, then by sorting order and if that too matches then case sensitive
|
||||
cmp = (
|
||||
entry.isfile - isfile
|
||||
or (esort > nsort) - (esort < nsort)
|
||||
or (ename > name) - (ename < name)
|
||||
)
|
||||
|
||||
if cmp > 0:
|
||||
return i
|
||||
if cmp < 0:
|
||||
continue
|
||||
|
||||
level += 1
|
||||
if level > len(relpath.parts):
|
||||
logger.error(
|
||||
f"insertpos level overflow: relpath={relpath}, i={i}, entry.name={entry.name}, entry.level={entry.level}, level={level}"
|
||||
)
|
||||
break
|
||||
old = old[name]
|
||||
elems.append((name, old))
|
||||
if old == new:
|
||||
return []
|
||||
mt = new.mtime if new else 0
|
||||
szdiff = (new.size if new else 0) - (old.size if old else 0)
|
||||
# Update parents
|
||||
update = []
|
||||
for name, entry in elems[:-1]:
|
||||
u = UpdateEntry(name)
|
||||
if szdiff:
|
||||
entry.size += szdiff
|
||||
u.size = entry.size
|
||||
if mt > entry.mtime:
|
||||
u.mtime = entry.mtime = mt
|
||||
update.append(u)
|
||||
# The last element is the one that changed
|
||||
name, entry = elems[-1]
|
||||
parent = elems[-2][1] if len(elems) > 1 else tree
|
||||
u = UpdateEntry(name)
|
||||
if new:
|
||||
parent[name] = new
|
||||
if u.size != new.size:
|
||||
u.size = new.size
|
||||
if u.mtime != new.mtime:
|
||||
u.mtime = new.mtime
|
||||
if isinstance(new, DirEntry) and u.dir == new.dir:
|
||||
u.dir = new.dir
|
||||
else:
|
||||
del parent[name]
|
||||
u.deleted = True
|
||||
update.append(u)
|
||||
return update
|
||||
i += 1
|
||||
|
||||
return i
|
||||
|
||||
|
||||
async def broadcast(msg):
|
||||
for queue in pubsub.values():
|
||||
await queue.put_nowait(msg)
|
||||
state = State()
|
||||
rootpath: Path = None # type: ignore
|
||||
quit = threading.Event()
|
||||
|
||||
## Filesystem scanning
|
||||
|
||||
|
||||
def walk(rel: PurePosixPath, stat: stat_result | None = None) -> list[FileEntry]:
|
||||
path = rootpath / rel
|
||||
ret = []
|
||||
try:
|
||||
st = stat or path.stat()
|
||||
isfile = int(not S_ISDIR(st.st_mode))
|
||||
entry = FileEntry(
|
||||
level=len(rel.parts),
|
||||
name=rel.name,
|
||||
key=fuid(st),
|
||||
mtime=int(st.st_mtime),
|
||||
size=st.st_size if isfile else 0,
|
||||
isfile=isfile,
|
||||
)
|
||||
if isfile:
|
||||
return [entry]
|
||||
# Walk all entries of the directory
|
||||
ret: list[FileEntry] = [...] # type: ignore
|
||||
li = []
|
||||
for f in path.iterdir():
|
||||
if quit.is_set():
|
||||
raise SystemExit("quit")
|
||||
if f.name.startswith("."):
|
||||
continue # No dotfiles
|
||||
with suppress(FileNotFoundError):
|
||||
s = f.lstat()
|
||||
isfile = S_ISREG(s.st_mode)
|
||||
isdir = S_ISDIR(s.st_mode)
|
||||
if not isfile and not isdir:
|
||||
continue
|
||||
li.append((int(isfile), f.name, s))
|
||||
# Build the tree as a list of FileEntries
|
||||
for [_, name, s] in humansorted(li):
|
||||
sub = walk(rel / name, stat=s)
|
||||
child = sub[0]
|
||||
entry = FileEntry(
|
||||
level=entry.level,
|
||||
name=entry.name,
|
||||
key=entry.key,
|
||||
size=entry.size + child.size,
|
||||
mtime=max(entry.mtime, child.mtime),
|
||||
isfile=entry.isfile,
|
||||
)
|
||||
ret.extend(sub)
|
||||
except FileNotFoundError:
|
||||
pass # Things may be rapidly in motion
|
||||
except OSError as e:
|
||||
if e.errno == 13: # Permission denied
|
||||
pass
|
||||
logger.error(f"Watching {path=}: {e!r}")
|
||||
if ret:
|
||||
ret[0] = entry
|
||||
return ret
|
||||
|
||||
|
||||
def update_root(loop):
|
||||
"""Full filesystem scan"""
|
||||
old = state.root
|
||||
new = walk(PurePosixPath())
|
||||
if old != new:
|
||||
update = format_update(old, new)
|
||||
with state.lock:
|
||||
broadcast(update, loop)
|
||||
state.root = new
|
||||
|
||||
|
||||
def update_path(rootmod: list[FileEntry], relpath: PurePosixPath, loop):
|
||||
"""Called on FS updates, check the filesystem and broadcast any changes."""
|
||||
new = walk(relpath)
|
||||
obegin, old = treeget(rootmod, relpath)
|
||||
|
||||
if old == new:
|
||||
return
|
||||
|
||||
if obegin is not None:
|
||||
del rootmod[obegin : obegin + len(old)]
|
||||
|
||||
if new:
|
||||
i = treeinspos(rootmod, relpath, new[0].isfile)
|
||||
rootmod[i:i] = new
|
||||
|
||||
|
||||
def update_space(loop):
|
||||
"""Called periodically to update the disk usage."""
|
||||
du = shutil.disk_usage(rootpath)
|
||||
space = Space(*du, storage=state.root[0].size)
|
||||
# Update only on difference above 1 MB
|
||||
tol = 10**6
|
||||
old = msgspec.structs.astuple(state.space)
|
||||
new = msgspec.structs.astuple(space)
|
||||
if any(abs(o - n) > tol for o, n in zip(old, new, strict=True)):
|
||||
state.space = space
|
||||
broadcast(format_space(space), loop)
|
||||
|
||||
|
||||
## Messaging
|
||||
|
||||
|
||||
def format_update(old, new):
|
||||
# Make keep/del/insert diff until one of the lists ends
|
||||
oidx, nidx = 0, 0
|
||||
oremain, nremain = set(old), set(new)
|
||||
update = []
|
||||
keep_count = 0
|
||||
iteration_count = 0
|
||||
# Precompute index maps to allow deterministic tie-breaking when both
|
||||
# candidates exist in both sequences but are not equal (rename/move cases)
|
||||
old_pos = {e: i for i, e in enumerate(old)}
|
||||
new_pos = {e: i for i, e in enumerate(new)}
|
||||
|
||||
while oidx < len(old) and nidx < len(new):
|
||||
iteration_count += 1
|
||||
|
||||
# Emergency brake for potential infinite loops
|
||||
if iteration_count > 50000:
|
||||
logger.error(
|
||||
f"format_update potential infinite loop! iteration={iteration_count}, oidx={oidx}, nidx={nidx}"
|
||||
)
|
||||
raise Exception(
|
||||
f"format_update infinite loop detected at iteration {iteration_count}"
|
||||
)
|
||||
|
||||
modified = False
|
||||
# Matching entries are kept
|
||||
if old[oidx] == new[nidx]:
|
||||
entry = old[oidx]
|
||||
oremain.discard(entry)
|
||||
nremain.discard(entry)
|
||||
keep_count += 1
|
||||
oidx += 1
|
||||
nidx += 1
|
||||
continue
|
||||
|
||||
if keep_count > 0:
|
||||
modified = True
|
||||
update.append(UpdKeep(keep_count))
|
||||
keep_count = 0
|
||||
|
||||
# Items only in old are deleted
|
||||
del_count = 0
|
||||
while oidx < len(old) and old[oidx] not in nremain:
|
||||
oremain.remove(old[oidx])
|
||||
del_count += 1
|
||||
oidx += 1
|
||||
if del_count:
|
||||
update.append(UpdDel(del_count))
|
||||
continue
|
||||
|
||||
# Items only in new are inserted
|
||||
insert_items = []
|
||||
while nidx < len(new) and new[nidx] not in oremain:
|
||||
entry = new[nidx]
|
||||
nremain.discard(entry)
|
||||
insert_items.append(entry)
|
||||
nidx += 1
|
||||
if insert_items:
|
||||
modified = True
|
||||
update.append(UpdIns(insert_items))
|
||||
|
||||
if not modified:
|
||||
# Tie-break: both items exist in both lists but don't match here.
|
||||
# Decide whether to delete old[oidx] first or insert new[nidx] first
|
||||
# based on which alignment is closer.
|
||||
if oidx >= len(old) or nidx >= len(new):
|
||||
break
|
||||
cur_old = old[oidx]
|
||||
cur_new = new[nidx]
|
||||
|
||||
pos_old_in_new = new_pos.get(cur_old)
|
||||
pos_new_in_old = old_pos.get(cur_new)
|
||||
|
||||
# Default distances if not present (shouldn't happen if in remain sets)
|
||||
dist_del = (pos_old_in_new - nidx) if pos_old_in_new is not None else 1
|
||||
dist_ins = (pos_new_in_old - oidx) if pos_new_in_old is not None else 1
|
||||
|
||||
# Prefer the operation with smaller forward distance; tie => delete
|
||||
if dist_del <= dist_ins:
|
||||
# Delete current old item
|
||||
oremain.discard(cur_old)
|
||||
update.append(UpdDel(1))
|
||||
oidx += 1
|
||||
else:
|
||||
# Insert current new item
|
||||
nremain.discard(cur_new)
|
||||
update.append(UpdIns([cur_new]))
|
||||
nidx += 1
|
||||
|
||||
# Diff any remaining
|
||||
if keep_count > 0:
|
||||
update.append(UpdKeep(keep_count))
|
||||
if oremain:
|
||||
update.append(UpdDel(len(oremain)))
|
||||
elif nremain:
|
||||
update.append(UpdIns(new[nidx:]))
|
||||
|
||||
return msgspec.json.encode({"update": update}).decode()
|
||||
|
||||
|
||||
def format_space(usage):
    return msgspec.json.encode({"space": usage}).decode()


def format_root(root):
    return msgspec.json.encode({"root": root}).decode()


def broadcast(msg, loop):
    return asyncio.run_coroutine_threadsafe(abroadcast(msg), loop).result()


async def abroadcast(msg):
    try:
        for queue in pubsub.values():
            queue.put_nowait(msg)
    except Exception:
        # Log because asyncio would silently eat the error
        logger.exception("Broadcast error")

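For context, a minimal sketch of how a consumer on the asyncio side could subscribe to these broadcasts, assuming `pubsub` is a plain dict of per-connection `asyncio.Queue` objects, as the loop in `abroadcast` suggests. The handler name and registration key below are hypothetical; the real watch endpoint lives elsewhere in cista.

```python
import asyncio
import uuid

async def watch_subscriber(send):
    # Hypothetical consumer: register a queue, forward broadcast messages, clean up.
    queue: asyncio.Queue = asyncio.Queue()
    key = uuid.uuid4()
    pubsub[key] = queue
    try:
        while True:
            msg = await queue.get()  # JSON string from format_update/format_root/format_space
            await send(msg)
    finally:
        del pubsub[key]
```
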
## Watcher thread
def watcher_inotify(loop):
    """Inotify watcher thread (Linux only)"""
    import inotify.adapters

    modified_flags = (
        "IN_CREATE",
        "IN_DELETE",
        "IN_DELETE_SELF",
        "IN_MODIFY",
        "IN_MOVE_SELF",
        "IN_MOVED_FROM",
        "IN_MOVED_TO",
    )
    while not quit.is_set():
        i = inotify.adapters.InotifyTree(rootpath.as_posix())
        # Initialize the tree from filesystem
        update_root(loop)
        trefresh = time.monotonic() + 300.0
        tspace = time.monotonic() + 5.0
        # Watch for changes (frequent wakeups needed for quitting)
        while not quit.is_set():
            t = time.monotonic()
            # The watching is not entirely reliable, so do a full refresh every five minutes
            if t >= trefresh:
                break
            # Disk usage update
            if t >= tspace:
                tspace = time.monotonic() + 5.0
                update_space(loop)
            # Inotify events, update the tree
            dirty = False
            rootmod = state.root[:]
            for event in i.event_gen(yield_nones=False, timeout_s=0.1):
                assert event
                if quit.is_set():
                    return
                interesting = any(f in modified_flags for f in event[1])
                if interesting:
                    # Update modified path
                    path = PurePosixPath(event[2]) / event[3]
                    try:
                        rel_path = path.relative_to(rootpath)
                        update_path(rootmod, rel_path, loop)
                    except Exception as e:
                        logger.error(
                            f"Error processing inotify event for path {path}: {e}"
                        )
                        raise
                    if not dirty:
                        t = time.monotonic()
                        dirty = True
                # Wait a maximum of 0.2s to push the updates
                if dirty and time.monotonic() >= t + 0.2:
                    break
            if dirty and state.root != rootmod:
                try:
                    update = format_update(state.root, rootmod)
                    with state.lock:
                        broadcast(update, loop)
                        state.root = rootmod
                except Exception:
                    logger.exception(
                        "format_update failed; falling back to full rescan"
                    )
                    # Fallback: full rescan and try diff again; last resort send full root
                    try:
                        fresh = walk(PurePosixPath())
                        try:
                            update = format_update(state.root, fresh)
                            with state.lock:
                                broadcast(update, loop)
                                state.root = fresh
                        except Exception:
                            logger.exception(
                                "Fallback diff failed; sending full root snapshot"
                            )
                            with state.lock:
                                broadcast(format_root(fresh), loop)
                                state.root = fresh
                    except Exception:
                        logger.exception(
                            "Full rescan failed; dropping this batch of updates"
                        )

        del i  # Free the inotify object

def watcher_poll(loop):
    """Polling version of the watcher thread."""
    while not quit.is_set():
        t0 = time.perf_counter()
        update_root(loop)
        update_space(loop)
        dur = time.perf_counter() - t0
        if dur > 1.0:
            logger.debug(f"Reading the full file list took {dur:.1f}s")
        quit.wait(0.1 + 8 * dur)

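The `0.1 + 8 * dur` backoff keeps the polling fallback cheap on large trees; a rough worked example (numbers illustrative only):

```python
# dur = 0.05 s  -> quit.wait(0.5)   # small trees rescan roughly twice per second
# dur = 2.0  s  -> quit.wait(16.1)  # slow scans idle most of the time (~11% duty cycle)
```
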
async def start(app, loop):
    global rootpath
    config.load_config()
    app.ctx.watcher = threading.Thread(target=watcher_thread, args=[loop])
    rootpath = config.config.path
    use_inotify = sys.platform == "linux"
    app.ctx.watcher = threading.Thread(
        target=watcher_inotify if use_inotify else watcher_poll,
        args=[loop],
        # Descriptive name for system monitoring
        name=f"cista-watcher {rootpath}",
    )
    app.ctx.watcher.start()

async def stop(app, loop):
    global quit
    quit = True
    quit.set()
    app.ctx.watcher.join()

BIN
docs/cista.webp
Normal file
|
After Width: | Height: | Size: 40 KiB |
2
frontend/.npmrc
Normal file
@@ -0,0 +1,2 @@
|
||||
audit=false
|
||||
fund=false
|
||||
@@ -1,40 +1,46 @@
|
||||
# cista-front
|
||||
|
||||
This template should help get you started developing with Vue 3 in Vite.
|
||||
|
||||
## Recommended IDE Setup
|
||||
|
||||
[VSCode](https://code.visualstudio.com/) + [Volar](https://marketplace.visualstudio.com/items?itemName=Vue.volar) (and disable Vetur) + [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin).
|
||||
|
||||
## Type Support for `.vue` Imports in TS
|
||||
|
||||
TypeScript cannot handle type information for `.vue` imports by default, so we replace the `tsc` CLI with `vue-tsc` for type checking. In editors, we need [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin) to make the TypeScript language service aware of `.vue` types.
|
||||
|
||||
If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has also implemented a [Take Over Mode](https://github.com/johnsoncodehk/volar/discussions/471#discussioncomment-1361669) that is more performant. You can enable it by the following steps:
|
||||
|
||||
1. Disable the built-in TypeScript Extension
|
||||
1) Run `Extensions: Show Built-in Extensions` from VSCode's command palette
|
||||
2) Find `TypeScript and JavaScript Language Features`, right click and select `Disable (Workspace)`
|
||||
2. Reload the VSCode window by running `Developer: Reload Window` from the command palette.
|
||||
|
||||
## Customize configuration
|
||||
|
||||
See [Vite Configuration Reference](https://vitejs.dev/config/).
|
||||
|
||||
## Project Setup
|
||||
|
||||
```sh
|
||||
npm install
|
||||
```
|
||||
|
||||
### Compile and Hot-Reload for Development
|
||||
|
||||
```sh
|
||||
npm run dev
|
||||
```
|
||||
|
||||
### Type-Check, Compile and Minify for Production
|
||||
|
||||
```sh
|
||||
npm run build
|
||||
```
|
||||
# Cista Vue Frontend

The frontend is a Single-Page App implemented with Vue 3. Development uses the Vite server together with the main Python backend, but in production the latter also serves the prebuilt frontend files.

## Recommended IDE Setup

[VSCode](https://code.visualstudio.com/) + [Volar](https://marketplace.visualstudio.com/items?itemName=Vue.volar) (and disable Vetur) + [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin).

## Type Support for `.vue` Imports in TS

TypeScript cannot handle type information for `.vue` imports by default, so we replace the `tsc` CLI with `vue-tsc` for type checking. In editors, we need [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin) to make the TypeScript language service aware of `.vue` types.

If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has also implemented a [Take Over Mode](https://github.com/johnsoncodehk/volar/discussions/471#discussioncomment-1361669) that is more performant. You can enable it by the following steps:

1. Disable the built-in TypeScript Extension
   1) Run `Extensions: Show Built-in Extensions` from VSCode's command palette
   2) Find `TypeScript and JavaScript Language Features`, right click and select `Disable (Workspace)`
2. Reload the VSCode window by running `Developer: Reload Window` from the command palette.

## Hot-Reload for Development

### Run the backend

```fish
uv sync --dev
uv run cista --dev -l :8000
```

### And the Vite server (in another terminal)

```fish
cd frontend
bun install
bun run dev
```

Browse to the Vite dev server, which proxies API requests to the backend on port 8000. Both servers live-reload on changes.

### Type-Check, Compile and Minify for Production

This is also called by `uv build` during Python packaging:

```fish
bun run build
```

@@ -1,12 +1,11 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang=en>
|
||||
<meta charset=UTF-8>
|
||||
<title>Cista</title>
|
||||
<title>Cista Storage</title>
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
|
||||
<link rel="icon" href="/favicon.ico">
|
||||
<link rel="icon" href="/src/assets/logo.svg">
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=Roboto+Mono&family=Roboto:wght@400;700&display=swap" rel="stylesheet">
|
||||
<script type="module" src="/src/main.ts"></script>
|
||||
|
||||
<div id="app"></div>
|
||||
<body id="app">
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"name": "front",
|
||||
"name": "cista-frontend",
|
||||
"version": "0.0.0",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
@@ -12,7 +12,11 @@
|
||||
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix --ignore-path .gitignore",
|
||||
"format": "prettier --write src/"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=18.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"@imengyu/vue3-context-menu": "^1.3.3",
|
||||
"@vueuse/core": "^10.4.1",
|
||||
"esbuild": "^0.19.5",
|
||||
"lodash": "^4.17.21",
|
||||
@@ -20,7 +24,6 @@
|
||||
"pinia": "^2.1.6",
|
||||
"pinia-plugin-persistedstate": "^3.2.0",
|
||||
"unplugin-vue-components": "^0.25.2",
|
||||
"vite-plugin-rewrite-all": "^1.0.1",
|
||||
"vite-svg-loader": "^4.0.0",
|
||||
"vue": "^3.3.4",
|
||||
"vue-router": "^4.2.4"
|
||||
2
frontend/public/robots.txt
Normal file
@@ -0,0 +1,2 @@
|
||||
User-agent: *
|
||||
Disallow: /
|
||||
158
frontend/src/App.vue
Normal file
@@ -0,0 +1,158 @@
|
||||
<template>
|
||||
<LoginModal />
|
||||
<SettingsModal />
|
||||
<header>
|
||||
<HeaderMain ref="headerMain" :path="path.pathList" :query="path.query">
|
||||
<HeaderSelected :path="path.pathList" />
|
||||
</HeaderMain>
|
||||
<BreadCrumb :path="path.pathList" primary />
|
||||
</header>
|
||||
<main>
|
||||
<RouterView :path="path.pathList" :query="path.query" />
|
||||
</main>
|
||||
<footer>
|
||||
<TransferBar :status=store.uprogress @cancel=store.cancelUploads class=upload />
|
||||
<TransferBar :status=store.dprogress @cancel=store.cancelDownloads class=download />
|
||||
</footer>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { RouterView } from 'vue-router'
|
||||
import type { ComputedRef } from 'vue'
|
||||
import type HeaderMain from '@/components/HeaderMain.vue'
|
||||
import { onMounted, onUnmounted, ref, watchEffect } from 'vue'
|
||||
import { loadSession, watchConnect, watchDisconnect } from '@/repositories/WS'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
|
||||
import { computed } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
import type { SortOrder } from './utils/docsort'
|
||||
import type SettingsModalVue from './components/SettingsModal.vue'
|
||||
|
||||
interface Path {
|
||||
path: string
|
||||
pathList: string[]
|
||||
query: string
|
||||
}
|
||||
const store = useMainStore()
|
||||
const path: ComputedRef<Path> = computed(() => {
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path).split('//')
|
||||
const pathList = p[0].split('/').filter(value => value !== '')
|
||||
const query = p.slice(1).join('//')
|
||||
return {
|
||||
path: p[0],
|
||||
pathList,
|
||||
query
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
document.title = path.value.path.replace(/\/$/, '').split('/').pop() || store.server.name || 'Cista Storage'
|
||||
})
|
||||
onMounted(loadSession)
|
||||
onMounted(watchConnect)
|
||||
onUnmounted(watchDisconnect)
|
||||
const headerMain = ref<typeof HeaderMain | null>(null)
|
||||
let vert = 0
|
||||
let timer: any = null
|
||||
const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
if (store.dialog) {
|
||||
if (timer) {
|
||||
clearTimeout(timer)
|
||||
timer = null
|
||||
}
|
||||
return
|
||||
}
|
||||
const fileExplorer = store.fileExplorer as any
|
||||
if (!fileExplorer) return
|
||||
const c = fileExplorer.isCursor()
|
||||
const input = (event.target as HTMLElement).tagName === 'INPUT'
|
||||
const keyup = event.type === 'keyup'
|
||||
if (event.repeat) {
|
||||
if (
|
||||
event.key === 'ArrowUp' ||
|
||||
event.key === 'ArrowDown' ||
|
||||
event.key === 'ArrowLeft' ||
|
||||
event.key === 'ArrowRight' ||
|
||||
(c && event.code === 'Space')
|
||||
) {
|
||||
if (!input) event.preventDefault()
|
||||
}
|
||||
return
|
||||
}
|
||||
//console.log("key pressed", event)
|
||||
/// Long if-else machinery for all keys we handle here
|
||||
let arrow = ''
|
||||
if (!input && event.key.startsWith("Arrow")) arrow = event.key.slice(5).toLowerCase()
|
||||
// Find: process on keydown so that we can bypass the built-in search hotkey
|
||||
else if (!keyup && event.key === 'f' && (event.ctrlKey || event.metaKey)) {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Search also on / (UNIX style)
|
||||
else if (!input && keyup && event.key === '/') {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Globally close search, clear errors on Escape
|
||||
else if (keyup && event.key === 'Escape') {
|
||||
store.error = ''
|
||||
headerMain.value!.closeSearch(event)
|
||||
store.focusBreadcrumb()
|
||||
}
|
||||
else if (!input && keyup && event.key === 'Backspace') {
|
||||
Router.back()
|
||||
}
|
||||
// Select all (toggle); keydown to precede and prevent builtin
|
||||
else if (!input && !keyup && event.key === 'a' && (event.ctrlKey || event.metaKey)) {
|
||||
fileExplorer.toggleSelectAll()
|
||||
}
|
||||
// G toggles Gallery
|
||||
else if (!input && keyup && event.key === 'g') {
|
||||
store.prefs.gallery = !store.prefs.gallery
|
||||
}
|
||||
// Keys Backquote-1-2-3 to sort columns
|
||||
else if (
|
||||
!input &&
|
||||
keyup &&
|
||||
(event.code === 'Backquote' || event.key === '1' || event.key === '2' || event.key === '3')
|
||||
) {
|
||||
store.sort(['', 'name', 'modified', 'size'][+event.key || 0] as SortOrder)
|
||||
}
|
||||
// Rename
|
||||
else if (!input && c && keyup && !event.ctrlKey && (event.key === 'F2' || event.key === 'r')) {
|
||||
fileExplorer.cursorRename()
|
||||
}
|
||||
// Toggle selections on file explorer; ignore all spaces to prevent scrolling built-in hotkey
|
||||
else if (!input && c && event.code === 'Space') {
|
||||
if (keyup && !event.altKey && !event.ctrlKey)
|
||||
fileExplorer.cursorSelect()
|
||||
}
|
||||
else return
|
||||
/// We are handling this!
|
||||
event.preventDefault()
|
||||
if (timer) {
|
||||
clearTimeout(timer) // Good for either timeout or interval
|
||||
timer = null
|
||||
}
|
||||
let f: any
|
||||
switch (arrow) {
|
||||
case 'up': f = () => fileExplorer.up(event); break
|
||||
case 'down': f = () => fileExplorer.down(event); break
|
||||
case 'left': f = () => fileExplorer.left(event); break
|
||||
case 'right': f = () => fileExplorer.right(event); break
|
||||
}
|
||||
if (f && !keyup) {
|
||||
// Initial move, then t0 delay until repeats at tr intervals
|
||||
const t0 = 200, tr = event.altKey ? 20 : 100
|
||||
f()
|
||||
timer = setTimeout(() => { timer = setInterval(f, tr) }, t0 - tr)
|
||||
}
|
||||
}
|
||||
onMounted(() => {
|
||||
window.addEventListener('keydown', globalShortcutHandler)
|
||||
window.addEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
onUnmounted(() => {
|
||||
window.removeEventListener('keydown', globalShortcutHandler)
|
||||
window.removeEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
export type { Path }
|
||||
</script>
|
||||
1
frontend/src/assets/logo.svg
Normal file
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="512" height="512" viewBox="0 0 512 512"><rect width="512" height="512" fill="#f80" rx="64" ry="64"/><path fill="#fff" d="M381 298h-84V167h-66L339 35l108 132h-66zm-168-84h-84v131H63l108 132 108-132h-66z"/></svg>
|
||||
|
After Width: | Height: | Size: 258 B |
@@ -3,68 +3,83 @@
|
||||
:root {
|
||||
--primary-color: #000;
|
||||
--primary-background: #ddd;
|
||||
--header-background: #246;
|
||||
--header-background: var(--soft-color);
|
||||
--header-color: #ccc;
|
||||
--input-background: #fff;
|
||||
--input-color: #000;
|
||||
--primary-color: #000;
|
||||
--soft-color: #146;
|
||||
--accent-color: #f80;
|
||||
--transition-time: 0.2s;
|
||||
/* The following are overridden by responsive layouts */
|
||||
--root-font-size: 1rem;
|
||||
--header-font-size: 1rem;
|
||||
--header-height: calc(8 * var(--header-font-size));
|
||||
--header-height: 4rem;
|
||||
}
|
||||
@media (prefers-color-scheme: dark) {
|
||||
:root {
|
||||
--primary-color: #ddd;
|
||||
--primary-background: #003;
|
||||
--primary-background: var(--soft-color);
|
||||
--header-background: #000;
|
||||
--header-color: #ccc;
|
||||
}
|
||||
--input-background: var(--soft-color);
|
||||
--input-color: #ddd;
|
||||
}
|
||||
}
|
||||
@media screen and (max-width: 600px) {
|
||||
.size,
|
||||
.modified {
|
||||
.modified,
|
||||
.summary {
|
||||
display: none;
|
||||
}
|
||||
}
|
||||
@media screen and (orientation: landscape) and (min-width: 1200px) {
|
||||
/* Breadcrumbs and buttons side by side */
|
||||
header {
|
||||
display: flex;
|
||||
flex-direction: row-reverse;
|
||||
justify-content: space-between;
|
||||
align-items: end;
|
||||
}
|
||||
header .breadcrumb {
|
||||
font-size: 1.7em;
|
||||
flex-shrink: 10;
|
||||
}
|
||||
}
|
||||
@media screen and (min-width: 800px) and (--webkit-min-device-pixel-ratio: 2) {
|
||||
@media screen and (min-width: 1000px) {
|
||||
:root {
|
||||
--root-font-size: calc(16 * 100vw / 800);
|
||||
}
|
||||
header .buttons:has(input[type='search']) > div {
|
||||
display: none;
|
||||
}
|
||||
header .buttons > div:has(input[type='search']) {
|
||||
display: inherit;
|
||||
--root-font-size: calc(8px + 8 * 100vw / 1000);
|
||||
}
|
||||
}
|
||||
@media screen and (min-width: 1600px) and (--webkit-min-device-pixel-ratio: 3) {
|
||||
@media screen and (min-width: 2000px) {
|
||||
:root {
|
||||
--root-font-size: 2rem;
|
||||
--root-font-size: 1.5rem;
|
||||
}
|
||||
}
|
||||
/* Low (landscape) screens: smaller header */
|
||||
@media screen and (max-height: 600px) {
|
||||
:root {
|
||||
--header-font-size: calc(16 * 100vh / 600); /* 16px (1rem nominal) at 600px height */
|
||||
--header-font-size: calc(10px + 10 * 100vh / 600); /* 20px at 600px height */
|
||||
--root-font-size: 0.8rem;
|
||||
--header-height: 2rem;
|
||||
}
|
||||
header .breadcrumb > * {
|
||||
padding-top: calc(8px + 8 * 100vh / 600) !important;
padding-bottom: calc(8px + 8 * 100vh / 600) !important;
|
||||
}
|
||||
}
|
||||
@media screen and (max-height: 300px) {
|
||||
:root {
|
||||
--header-font-size: 0.5rem; /* Don't go smaller than this, no benefit */
|
||||
--header-font-size: 15px; /* Don't go smaller than this, no benefit */
|
||||
--header-height: calc(1.75 * 16px);
|
||||
--root-font-size: 0.6rem;
|
||||
}
|
||||
header .breadcrumb > * {
|
||||
padding-top: 14px !important;
|
||||
padding-bottom: 14px !important;
|
||||
}
|
||||
}
|
||||
@media screen and (orientation: landscape) and (min-width: 700px) {
|
||||
/* Breadcrumbs and buttons side by side */
|
||||
:root {
|
||||
--header-font-size: calc(8px + 8 * 100vh / 600); /* 16px (1rem nominal) at 600px height */
|
||||
}
|
||||
header {
|
||||
display: flex;
|
||||
justify-content: space-between;
|
||||
}
|
||||
header .headermain { order: 1; }
|
||||
header .breadcrumb { align-self: stretch; }
|
||||
header .action-button {
|
||||
width: 2em;
|
||||
height: 2em;
|
||||
}
|
||||
}
|
||||
@media print {
|
||||
@@ -74,10 +89,10 @@
|
||||
--header-background: none;
|
||||
--header-color: black;
|
||||
}
|
||||
nav,
|
||||
.headermain,
|
||||
.menu,
|
||||
.rename-button {
|
||||
display: none;
|
||||
display: none !important;
|
||||
}
|
||||
.breadcrumb > a {
|
||||
color: black !important;
|
||||
@@ -92,16 +107,31 @@
|
||||
}
|
||||
.breadcrumb svg {
|
||||
fill: black !important;
|
||||
margin: 0 .5rem 0 1rem !important;
|
||||
}
|
||||
body#app {
|
||||
height: auto !important;
|
||||
}
|
||||
main {
|
||||
height: auto !important;
|
||||
padding-bottom: 0 !important;
|
||||
}
|
||||
thead tr {
|
||||
font-size: 1rem !important;
|
||||
position: static !important;
|
||||
background: none !important;
|
||||
border-bottom: 1pt solid black !important;
|
||||
}
|
||||
audio::-webkit-media-controls-timeline,
|
||||
video::-webkit-media-controls-timeline {
|
||||
display: none;
|
||||
}
|
||||
audio::-webkit-media-controls,
|
||||
video::-webkit-media-controls {
|
||||
display: none;
|
||||
}
|
||||
tr, figure {
|
||||
page-break-inside: avoid;
|
||||
}
|
||||
.selection {
|
||||
min-width: 0 !important;
|
||||
padding: 0 !important;
|
||||
@@ -118,12 +148,12 @@
|
||||
left: 0;
|
||||
}
|
||||
}
|
||||
html {
|
||||
overflow: hidden;
|
||||
* {
|
||||
box-sizing: border-box;
|
||||
}
|
||||
/* Hide scrollbar for all browsers */
|
||||
main::-webkit-scrollbar {
|
||||
display: none;
|
||||
html {
|
||||
font-size: var(--root-font-size);
|
||||
overflow: hidden;
|
||||
}
|
||||
main {
|
||||
-ms-overflow-style: none; /* IE and Edge */
|
||||
@@ -141,6 +171,7 @@ tbody .modified {
|
||||
font-family: 'Roboto Mono';
|
||||
}
|
||||
header {
|
||||
flex: 0 0 auto;
|
||||
background-color: var(--header-background);
|
||||
color: var(--header-color);
|
||||
font-size: var(--header-font-size);
|
||||
@@ -182,21 +213,26 @@ table {
|
||||
border: 0;
|
||||
gap: 0;
|
||||
}
|
||||
#app {
|
||||
height: 100%;
|
||||
body#app {
|
||||
height: 100vh;
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
}
|
||||
main {
|
||||
flex: 1 1 auto;
|
||||
padding-bottom: 3em; /* convenience space on the bottom */
|
||||
overflow-y: scroll;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
header nav.headermain {
|
||||
/* Position so that tooltips can appear on top of other positioned elements */
|
||||
position: relative;
|
||||
z-index: 100;
|
||||
}
|
||||
main {
|
||||
height: calc(100svh - var(--header-height));
|
||||
padding-bottom: 3em; /* convenience space on the bottom */
|
||||
overflow-y: scroll;
|
||||
}
|
||||
.spacer { flex-grow: 1 }
|
||||
.smallgap { flex-shrink: 1; width: 2em }
|
||||
|
||||
[data-tooltip]:hover:after {
|
||||
z-index: 101;
|
||||
content: attr(data-tooltip);
|
||||
@@ -206,7 +242,7 @@ main {
|
||||
padding: .5rem 1rem;
|
||||
border-radius: 3rem 0 3rem 0;
|
||||
box-shadow: 0 0 1rem var(--accent-color);
|
||||
transform: translate(calc(1rem + -50%), 100%);
|
||||
transform: translate(calc(1rem + -50%), 150%);
|
||||
background-color: var(--accent-color);
|
||||
color: var(--primary-color);
|
||||
white-space: pre;
|
||||
@@ -232,3 +268,9 @@ main {
|
||||
opacity: 0;
|
||||
}
|
||||
}
|
||||
.error-message {
|
||||
padding: .5em;
|
||||
font-weight: bold;
|
||||
background: var(--accent-color);
|
||||
color: #000;
|
||||
}
|
||||
(Several dozen image assets moved to new paths; before/after sizes unchanged.)
@@ -2,14 +2,19 @@
|
||||
<nav
|
||||
class="breadcrumb"
|
||||
aria-label="Breadcrumb"
|
||||
@keyup.left.stop="move(-1)"
|
||||
@keyup.right.stop="move(1)"
|
||||
@focus="move(0)"
|
||||
@keydown.left.stop="move(-1)"
|
||||
@keydown.right.stop="move(1)"
|
||||
@keyup.enter="move(0)"
|
||||
@focus=focusCurrent
|
||||
tabindex=0
|
||||
>
|
||||
<a href="#/"
|
||||
:ref="el => setLinkRef(0, el)"
|
||||
class="home"
|
||||
:class="{ current: !!isCurrent(0) }"
|
||||
:aria-current="isCurrent(0)"
|
||||
@click.prevent="navigate(0)"
|
||||
title="/"
|
||||
>
|
||||
<component :is="home" />
|
||||
</a>
|
||||
@@ -19,6 +24,7 @@
|
||||
:aria-current="isCurrent(index + 1)"
|
||||
@click.prevent="navigate(index + 1)"
|
||||
:ref="el => setLinkRef(index + 1, el)"
|
||||
:title="`/${longest.slice(0, index + 1).join('/')}`"
|
||||
>{{ location }}</a>
|
||||
</template>
|
||||
</nav>
|
||||
@@ -26,8 +32,9 @@
|
||||
|
||||
<script setup lang="ts">
|
||||
import home from '@/assets/svg/home.svg'
|
||||
import { onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { nextTick, onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { useRouter } from 'vue-router'
|
||||
import { exists } from '@/utils/fileutil'
|
||||
|
||||
const router = useRouter()
|
||||
|
||||
@@ -37,17 +44,33 @@ onBeforeUpdate(() => { links.length = 1 }) // 1 to keep home
|
||||
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
primary?: boolean
|
||||
}>()
|
||||
|
||||
const longest = ref<Array<string>>([])
|
||||
|
||||
const isCurrent = (index: number) => index == props.path.length ? 'location' : undefined
|
||||
|
||||
const focusCurrent = () => {
|
||||
nextTick(() => {
|
||||
const index = props.path.length
|
||||
if (index < links.length) links[index].focus()
|
||||
})
|
||||
}
|
||||
|
||||
const navigate = (index: number) => {
|
||||
const link = links[index]
|
||||
if (!link) throw Error(`No link at index ${index} (path: ${props.path})`)
|
||||
link.focus()
|
||||
router.replace(`/${longest.value.slice(0, index).join('/')}`)
|
||||
const url = index ? `/${longest.value.slice(0, index).join('/')}/` : '/'
|
||||
const long = longest.value.length ? `/${longest.value.join('/')}/` : '/'
|
||||
const browser = decodeURIComponent(location.hash.slice(1).split('//')[0])
|
||||
const u = url.replaceAll('?', '%3F').replaceAll('#', '%23')
|
||||
// Clicking on current link clears the rest of the path and adds new history
|
||||
if (isCurrent(index)) { longest.value.splice(index); router.push(u) }
|
||||
// Moving along breadcrumbs doesn't create new history
|
||||
else if (long.startsWith(browser)) router.replace(u)
|
||||
// Normal navigation from elsewhere (e.g. search result breadcrumbs)
|
||||
else router.push(u)
|
||||
}
|
||||
|
||||
const move = (dir: number) => {
|
||||
@@ -59,13 +82,25 @@ const move = (dir: number) => {
|
||||
watchEffect(() => {
|
||||
const longcut = longest.value.slice(0, props.path.length)
|
||||
const same = longcut.every((value, index) => value === props.path[index])
|
||||
// Navigated out of previous path, reset longest to current
|
||||
if (!same) longest.value = props.path
|
||||
else if (props.path.length > longcut.length) {
|
||||
longest.value = longcut.concat(props.path.slice(longcut.length))
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
if (links.length) navigate(props.path.length)
|
||||
else {
|
||||
// Prune deleted folders from longest
|
||||
for (let i = props.path.length; i < longest.value.length; ++i) {
|
||||
if (!exists(longest.value.slice(0, i + 1))) {
|
||||
longest.value = longest.value.slice(0, i)
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
// If needed, focus primary navigation to new location
|
||||
if (props.primary) nextTick(() => {
|
||||
const act = document.activeElement as HTMLElement
|
||||
if (!act || [...links, document.body].includes(act)) focusCurrent()
|
||||
})
|
||||
})
|
||||
</script>
|
||||
|
||||
@@ -80,31 +115,36 @@ watchEffect(() => {
|
||||
--breadcrumb-transtime: 0.3s;
|
||||
}
|
||||
.breadcrumb {
|
||||
flex: 1 1 auto;
|
||||
display: flex;
|
||||
list-style: none;
|
||||
min-width: 20%;
|
||||
max-width: 100%;
|
||||
min-height: 2em;
|
||||
margin: 0;
|
||||
padding: 0 1em 0 0;
|
||||
overflow: hidden;
|
||||
}
|
||||
.breadcrumb > a {
|
||||
flex: 0 4 auto;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
margin: 0 -0.5em 0 -0.5em;
|
||||
padding: 0;
|
||||
max-width: 8em;
|
||||
white-space: nowrap;
|
||||
text-overflow: ellipsis;
|
||||
overflow: hidden;
|
||||
height: 1.5em;
|
||||
color: var(--breadcrumb-color);
|
||||
padding: 0.3em 1.5em;
|
||||
clip-path: polygon(0 0, 1em 50%, 0 100%, 100% 100%, 100% 0, 0 0);
|
||||
transition: all var(--breadcrumb-transtime);
|
||||
}
|
||||
.breadcrumb a:first-child {
|
||||
margin-left: 0;
|
||||
padding-left: .2em;
|
||||
.breadcrumb > a:first-child {
|
||||
flex: 0 0 auto;
|
||||
padding-left: 1.5em;
|
||||
padding-right: 1.7em;
|
||||
clip-path: none;
|
||||
}
|
||||
.breadcrumb a:last-child {
|
||||
max-width: none;
|
||||
.breadcrumb > a:last-child {
|
||||
clip-path: polygon(
|
||||
0 0,
|
||||
calc(100% - 1em) 0,
|
||||
@@ -115,7 +155,7 @@ watchEffect(() => {
|
||||
0 0
|
||||
);
|
||||
}
|
||||
.breadcrumb a:only-child {
|
||||
.breadcrumb > a:only-child {
|
||||
clip-path: polygon(
|
||||
0 0,
|
||||
calc(100% - 1em) 0,
|
||||
@@ -127,9 +167,9 @@ watchEffect(() => {
|
||||
}
|
||||
.breadcrumb svg {
|
||||
/* FIXME: Custom positioning to align it well; needs proper solution */
|
||||
padding-left: 0.8em;
|
||||
width: 1.3em;
|
||||
height: 1.3em;
|
||||
margin: -.5em;
|
||||
fill: var(--breadcrumb-color);
|
||||
transition: fill var(--breadcrumb-transtime);
|
||||
}
|
||||
@@ -149,6 +189,6 @@ watchEffect(() => {
|
||||
}
|
||||
.breadcrumb a:hover { color: var(--breadcrumb-hover-color) }
|
||||
.breadcrumb a:hover svg { fill: var(--breadcrumb-hover-color) }
|
||||
.breadcrumb a.current { color: var(--accent-color) }
|
||||
.breadcrumb a.current { color: var(--accent-color); max-width: none; flex: 0 1 auto; }
|
||||
.breadcrumb a.current svg { fill: var(--accent-color) }
|
||||
</style>
|
||||
171
frontend/src/components/DownloadButton.vue
Normal file
@@ -0,0 +1,171 @@
|
||||
<template>
|
||||
<SvgButton name="download" data-tooltip="Download" @click="download" />
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import type { SelectedItems } from '@/repositories/Document'
|
||||
import { reactive } from 'vue';
|
||||
|
||||
const store = useMainStore()
|
||||
|
||||
const status_init = {
|
||||
total: 0,
|
||||
xfer: 0,
|
||||
t0: 0,
|
||||
tlast: 0,
|
||||
statbytes: 0,
|
||||
statdur: 0,
|
||||
files: [] as string[],
|
||||
filestart: 0,
|
||||
fileidx: 0,
|
||||
filecount: 0,
|
||||
filename: '',
|
||||
filesize: 0,
|
||||
filepos: 0,
|
||||
status: 'idle',
|
||||
}
|
||||
store.dprogress = {...status_init}
|
||||
setInterval(() => {
|
||||
if (Date.now() - store.dprogress.tlast > 3000) {
|
||||
// Reset
|
||||
store.dprogress.statbytes = 0
|
||||
store.dprogress.statdur = 1
|
||||
} else {
|
||||
// Running average by decay
|
||||
store.dprogress.statbytes *= .9
|
||||
store.dprogress.statdur *= .9
|
||||
}
|
||||
}, 100)
|
||||
const statReset = () => {
|
||||
Object.assign(store.dprogress, status_init)
|
||||
store.dprogress.t0 = Date.now()
|
||||
store.dprogress.tlast = store.dprogress.t0 + 1
|
||||
}
|
||||
const cancelDownloads = () => {
|
||||
location.reload() // FIXME
|
||||
}
|
||||
|
||||
|
||||
const linkdl = (href: string) => {
|
||||
const a = document.createElement('a')
|
||||
a.href = href
|
||||
a.download = ''
|
||||
a.click()
|
||||
}
|
||||
|
||||
const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandle) => {
|
||||
let hdir = ''
|
||||
let h = handle
|
||||
console.log('Downloading to filesystem', sel.recursive)
|
||||
for (const [rel, full, doc] of sel.recursive) {
|
||||
if (doc.dir) continue
|
||||
store.dprogress.files.push(rel)
|
||||
++store.dprogress.filecount
|
||||
store.dprogress.total += doc.size
|
||||
}
|
||||
for (const [rel, full, doc] of sel.recursive) {
|
||||
// Create any missing directories
|
||||
if (hdir && !rel.startsWith(hdir + '/')) {
|
||||
hdir = ''
|
||||
h = handle
|
||||
}
|
||||
const r = rel.slice(hdir.length)
|
||||
for (const dir of r.split('/').slice(0, doc.dir ? undefined : -1)) {
|
||||
if (!dir) continue
|
||||
hdir += `${dir}/`
|
||||
try {
|
||||
h = await h.getDirectoryHandle(dir.normalize('NFC'), { create: true })
|
||||
} catch (error) {
|
||||
console.error('Failed to create directory', hdir, error)
|
||||
return
|
||||
}
|
||||
console.log('Created', hdir)
|
||||
}
|
||||
if (doc.dir) continue // Target was a folder and was created
|
||||
const name = rel.split('/').pop()!.normalize('NFC')
|
||||
// Download file
|
||||
let fileHandle
|
||||
try {
|
||||
fileHandle = await h.getFileHandle(name, { create: true })
|
||||
} catch (error) {
|
||||
console.error('Failed to create file', rel, full, hdir + name, error)
|
||||
return
|
||||
}
|
||||
const writable = await fileHandle.createWritable()
|
||||
const url = `/files/${rel}`
|
||||
console.log('Fetching', url)
|
||||
const res = await fetch(url)
|
||||
if (!res.ok) {
|
||||
store.error = `Failed to download ${url}: ${res.status} ${res.statusText}`
|
||||
throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
|
||||
}
|
||||
if (res.body) {
|
||||
++store.dprogress.fileidx
|
||||
const reader = res.body.getReader()
|
||||
await writable.truncate(0)
|
||||
store.error = "Direct download."
|
||||
store.dprogress.tlast = Date.now()
|
||||
while (true) {
|
||||
const { value, done } = await reader.read()
|
||||
if (done) break
|
||||
await writable.write(value)
|
||||
const now = Date.now()
|
||||
const size = value.byteLength
|
||||
store.dprogress.xfer += size
|
||||
store.dprogress.filepos += size
|
||||
store.dprogress.statbytes += size
|
||||
store.dprogress.statdur += now - store.dprogress.tlast
|
||||
store.dprogress.tlast = now
|
||||
}
|
||||
}
|
||||
await writable.close()
|
||||
console.log('Saved', hdir + name)
|
||||
}
|
||||
statReset()
|
||||
}
|
||||
|
||||
const download = async () => {
|
||||
const sel = store.selectedFiles
|
||||
console.log('Download', sel)
|
||||
if (sel.keys.length === 0) {
|
||||
console.warn('Attempted download but no files found. Missing selected keys:', sel.missing)
|
||||
store.error = 'No existing files selected'
|
||||
store.selected.clear()
|
||||
return
|
||||
}
|
||||
// Plain old a href download if only one file (ignoring any folders)
|
||||
const files = sel.recursive.filter(([rel, full, doc]) => !doc.dir)
|
||||
if (files.length === 1) {
|
||||
store.selected.clear()
|
||||
store.error = "Single file via browser downloads"
|
||||
return linkdl(`/files/${files[0][1]}`)
|
||||
}
|
||||
// Use FileSystem API if multiple files and the browser supports it
|
||||
if ('showDirectoryPicker' in window) {
|
||||
try {
|
||||
// @ts-ignore
|
||||
const handle = await window.showDirectoryPicker({
|
||||
startIn: 'downloads',
|
||||
mode: 'readwrite'
|
||||
})
|
||||
await filesystemdl(sel, handle)
|
||||
store.selected.clear()
|
||||
return
|
||||
} catch (e) {
|
||||
console.error('Download to folder aborted', e)
|
||||
}
|
||||
}
|
||||
// Otherwise, zip and download
|
||||
console.log("Falling back to zip download")
|
||||
const name = sel.keys.length === 1 ? sel.docs[sel.keys[0]].name : 'download'
|
||||
linkdl(`/zip/${Array.from(sel.keys).join('+')}/${name}.zip`)
|
||||
store.error = "Downloading as ZIP via browser downloads"
|
||||
store.selected.clear()
|
||||
}
|
||||
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
|
||||
</style>
|
||||