Compare commits: 37167a41a6...v1.1.0 — 86 commits

33db2c01b4, 26addb2f7b, ce6d60e588, 073f1a8707, f38bb4bab9, 26f9bef087, 634dabe52d, a383358369, 369dc3ecaf, 0cf9c254e5,
58b9dd3dd4, 0965a56204, 52beedcef0, 8592d462f2, b2a24fca57, 7cc7e32c33, fa98cb9177, b3ab09a614, a49dd2f111, dbb06e111c,
667e31aa08, 007f021d6f, b2188eee68, 7311ffdff1, 102a970174, a9d713dbd0, 434e55f303, be9772c90e, bb2448dc24, 115f3e20d0,
a366a0bcc6, 2ff0f78759, 9a2d6e8065, 62388eb555, 53778543bf, 8dda230510, 696e3ab568, 85ac12ad33, e56cc47105, ebbd96bc94,
a9b6d04361, 5808fe17ad, 671359e327, ba9495eb65, de482afd60, a547052e29, 07c2ff4c15, e20b04189f, 8da141744e, 11887edde3,
034c6fdea9, c5083f0f2b, f8a9197474, 5285cb2fb5, b6b387d09b, 669762dfe7, 51fd07d4fa, c40c245ce6, 1fdd00b833, 520a9dff47,
c5c65d136a, 61f9026e23, 3e50149d4d, 7077b21159, 938c5ca657, e0aef07783, 36826a83c1, 6880f82c19, 5dd1bd9bdc, 41e8c78ecd,
dc4bb494f3, 9b58b887b4, 07848907f3, 7a08f7cbe2, dd37238510, c8d5f335b1, bb80b3ee54, 06d860c601, c321de13fd, 278e8303c4,
9854dd01cc, fb03fa5430, e26cb8f70a, 9bbbc829a1, 876d76bc1f, 4a53d0b8e2
.gitignore (vendored; 1 line changed)

```
@@ -1,4 +1,5 @@
.*
*.lock
!.gitignore
__pycache__/
*.egg-info/
```
README.md (148 lines changed)

@@ -1,25 +1,123 @@

Removed (old README):

# Web File Storage

Run directly from repository with Hatch (or use pip install as usual):
```sh
hatch run cista -l :3000 /path/to/files
```

Settings incl. these arguments are stored to config file on the first startup and later `hatch run cista` is sufficient. If the `cista` script is missing, consider `pip install -e .` (within `hatch shell`) or some other trickery (known issue with installs made prior to adding the startup script).

Create your user account:
```sh
hatch run cista --user admin --privileged
```

## Build frontend

Prebuilt frontend is provided in repository but for any changes it will need to be manually rebuilt:

```sh
cd cista-front
npm install
npm run build
```

This will place the front in `cista/wwwroot` from where the backend server delivers it, and that also gets included in the Python package built via `hatch build`.

Added (new README):

# Cista Web Storage

<img src="https://git.zi.fi/Vasanko/cista-storage/raw/branch/main/docs/cista.webp" align=left width=250>

Cista takes its name from the ancient *cistae*, metal containers used by Greeks and Egyptians to safeguard valuable items. This modern application provides a browser interface for secure and accessible file storage, echoing the trust and reliability of its historical namesake.

This is a cutting-edge **file and document server** designed for speed, efficiency, and unparalleled ease of use. Experience **lightning-fast browsing**, thanks to the file list maintained directly in your browser and updated from server filesystem events, coupled with our highly optimized code. Fully **keyboard-navigable** and with a responsive layout, Cista flawlessly adapts to your devices, providing a seamless experience wherever you are. Our powerful **instant search** means you're always just a few keystrokes away from finding exactly what you need. Press **1/2/3** to switch ordering, navigate with all four arrow keys (+Shift to select), or click your way around on **breadcrumbs that remember where you were**.

**Built-in document and media previews** let you quickly view files without downloading them. Cista shows PDFs and other documents, video and image thumbnails, and video previews with **HDR10 support**, and handles modern image formats including HEIC and AVIF. It also has a player for music and video files.

The Cista project started as an inevitable remake of [Droppy](https://github.com/droppyjs/droppy), which we used and loved despite its numerous bugs. Cista Storage stands out in handling even the most exotic filenames, ensuring a smooth experience where others falter.

All of this is wrapped in an intuitive interface with automatic light and dark themes, making Cista Storage the ideal choice for anyone seeking a reliable, versatile, and quick file storage solution. Quickly set up your own Cista where your files are just a click away, safe, and always accessible.

Experience Cista by visiting the [Cista Demo](https://drop.zi.fi) for a test run and perhaps upload something...

## Getting Started

### Running the Server

We recommend using [UV](https://docs.astral.sh/uv/getting-started/installation/) to run Cista directly.

Create an account (otherwise the server is public to everyone):
```fish
uvx cista --user yourname --privileged
```

Serve your files at http://localhost:8000:
```fish
uvx cista -l :8000 /path/to/files
```

Alternatively, you can install with `pip` or `uv pip`. This lets you use the `cista` command directly, without `uvx` or `uv run`.

```fish
pip install cista --break-system-packages
```

The server remembers its settings, including the listen address and served directory, in its config folder (default `~/.config/cista/`, overridable with the `CISTA_HOME` environment variable), so later runs need no arguments.
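The config file itself is a small TOML document, `db.toml`, inside that folder. The following is only an illustrative sketch: the field names follow Cista's config model, but the actual file — notably `secret` and the hashed user passwords — is generated and managed by Cista itself.

```toml
# ~/.config/cista/db.toml — illustrative sketch, not a file you need to write by hand
path = "/path/to/files"   # directory being served
listen = ":8000"          # listen address, port or unix socket path
public = false            # when true, files are accessible without login
name = "My Storage"       # optional display name shown to clients

[users.yourname]
privileged = true
# password hash and other fields are written by `cista --user yourname`
```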
### Internet Access

Most admins find the [Caddy](https://caddyserver.com/) web server convenient for its automatic TLS certificates, among other things. A proxy also allows running multiple web services or Cista instances on the same IP address but different (sub)domains.

`/etc/caddy/Caddyfile`:

```Caddyfile
cista.example.com {
    reverse_proxy :8000
}
```

Nginx or another proxy may be used similarly. Alternatively, you can place a cert and key in the Cista config dir and run `cista -l cista.example.com`.
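For example, an Nginx reverse proxy needs the WebSocket upgrade headers, since Cista uses WebSockets for `/api/watch` and `/api/upload`. A minimal sketch (certificate directives omitted; adjust names and the backend port to your installation):

```nginx
server {
    listen 443 ssl;
    server_name cista.example.com;
    # ssl_certificate and ssl_certificate_key go here

    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;
        proxy_set_header Connection "upgrade";
        proxy_set_header Host $host;
    }
}
```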
## System Deployment

This setup allows easy addition of storages, each with its own domain, configuration, and files.

It assumes a restricted user account `storage` for serving the files, with UV installed system-wide or on that account. Only UV is required: this setup does not use git or bun/npm.

Create `/etc/systemd/system/cista@.service`:

```ini
[Unit]
Description=Cista storage %i

[Service]
User=storage
ExecStart=uvx cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/%i
Restart=always

[Install]
WantedBy=multi-user.target
```

This supports multiple storages, each with its files under `/media/storage/<domain>` and its configuration under `/srv/cista/<domain>/`. UNIX sockets are used instead of numeric ports for convenience.

```fish
systemctl daemon-reload
systemctl enable --now cista@foo.example.com
systemctl enable --now cista@bar.example.com
```

Public exposure is easiest using the Caddy web server.

`/etc/caddy/Caddyfile`:

```Caddyfile
foo.example.com, bar.example.com {
    reverse_proxy unix//srv/cista/{host}/socket
}
```

## Development setup

For rapid development, we use the Vite development server for the Vue frontend, while running the backend on port 8000, to which Vite proxies backend requests (an illustrative proxy sketch follows after the commands below). Each server live-reloads whenever its code or configuration is modified.

Make sure you have git, uv and bun (or npm) installed.

Backend (Python) – setup and run:

```fish
git clone https://git.zi.fi/Vasanko/cista-storage.git
cd cista-storage
uv sync --dev
uv run cista --dev -l :8000 /path/to/files
```

Frontend (Vue/Vite) – run the dev server in another terminal:

```fish
cd frontend
bun install
bun run dev
```
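The repository's actual Vite configuration is not shown in this diff, so the following is only a hypothetical sketch of what such a dev proxy can look like, assuming the backend listens on port 8000 as above:

```ts
// vite.config.ts — hypothetical sketch of the dev proxy, not the project's actual config
import { defineConfig } from 'vite'
import vue from '@vitejs/plugin-vue'

export default defineConfig({
  plugins: [vue()],
  server: {
    proxy: {
      // Forward API calls to the backend; ws: true also proxies the WebSockets
      '/api': { target: 'http://localhost:8000', ws: true },
      // Plain HTTP routes served by the backend
      '/files': 'http://localhost:8000',
      '/login': 'http://localhost:8000',
      '/logout': 'http://localhost:8000',
    },
  },
})
```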
Building the package for release (frontend + Python wheel/sdist):

```fish
uv build
```

Vue builds the files into `cista/wwwroot`, which are included prebuilt in the Python package. `uv build` runs the project build hooks to bundle the frontend and produce a NodeJS-independent Python package.
|
(binary image diff — before: 4.2 KiB)
@@ -1,241 +0,0 @@
|
||||
<!DOCTYPE html>
|
||||
<title>Storage</title>
|
||||
<style>
|
||||
body {
|
||||
font-family: sans-serif;
|
||||
max-width: 100ch;
|
||||
margin: 0 auto;
|
||||
padding: 1em;
|
||||
background-color: #333;
|
||||
color: #eee;
|
||||
}
|
||||
td {
|
||||
text-align: right;
|
||||
padding: .5em;
|
||||
}
|
||||
td:first-child {
|
||||
text-align: left;
|
||||
}
|
||||
a {
|
||||
color: inherit;
|
||||
text-decoration: none;
|
||||
}
|
||||
</style>
|
||||
<div>
|
||||
<h2>Quick file upload</h2>
|
||||
<p>Uses parallel WebSocket connections for increased bandwidth /api/upload</p>
|
||||
<input type=file id=fileInput>
|
||||
<progress id=progressBar value=0 max=1></progress>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h2>Files</h2>
|
||||
<ul id=file_list></ul>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
let files = {}
|
||||
let flatfiles = {}
|
||||
|
||||
function createWatchSocket() {
|
||||
const wsurl = new URL("/api/watch", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.update) {
|
||||
tree_update(msg.update)
|
||||
file_list(files)
|
||||
} else {
|
||||
console.log("Unkonwn message from watch socket", msg)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
createWatchSocket()
|
||||
|
||||
function tree_update(msg) {
|
||||
console.log("Tree update", msg)
|
||||
let node = files
|
||||
for (const elem of msg) {
|
||||
if (elem.deleted) {
|
||||
const p = node.dir[elem.name].path
|
||||
delete node.dir[elem.name]
|
||||
delete flatfiles[p]
|
||||
break
|
||||
}
|
||||
if (elem.name !== undefined) node = node.dir[elem.name] ||= {}
|
||||
if (elem.size !== undefined) node.size = elem.size
|
||||
if (elem.mtime !== undefined) node.mtime = elem.mtime
|
||||
if (elem.dir !== undefined) node.dir = elem.dir
|
||||
}
|
||||
// Update paths and flatfiles
|
||||
files.path = "/"
|
||||
const nodes = [files]
|
||||
flatfiles = {}
|
||||
while (node = nodes.pop()) {
|
||||
flatfiles[node.path] = node
|
||||
if (node.dir === undefined) continue
|
||||
for (const name of Object.keys(node.dir)) {
|
||||
const child = node.dir[name]
|
||||
child.path = node.path + name + (child.dir === undefined ? "" : "/")
|
||||
nodes.push(child)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
var collator = new Intl.Collator(undefined, {numeric: true, sensitivity: 'base'});
|
||||
|
||||
const compare_path = (a, b) => collator.compare(a.path, b.path)
|
||||
const compare_time = (a, b) => a.mtime - b.mtime  // numeric comparator (the boolean form did not sort reliably)
|
||||
|
||||
function file_list(files) {
|
||||
const table = document.getElementById("file_list")
|
||||
const sorted = Object.values(flatfiles).sort(compare_time)
|
||||
table.innerHTML = ""
|
||||
for (const f of sorted) {
|
||||
const {path, size, mtime} = f
|
||||
const tr = document.createElement("tr")
|
||||
const name_td = document.createElement("td")
|
||||
const size_td = document.createElement("td")
|
||||
const mtime_td = document.createElement("td")
|
||||
const a = document.createElement("a")
|
||||
table.appendChild(tr)
|
||||
tr.appendChild(name_td)
|
||||
tr.appendChild(size_td)
|
||||
tr.appendChild(mtime_td)
|
||||
name_td.appendChild(a)
|
||||
size_td.textContent = size
|
||||
mtime_td.textContent = formatUnixDate(mtime)
|
||||
a.textContent = path
|
||||
a.href = `/files${path}`
|
||||
/*a.onclick = event => {
|
||||
if (window.showSaveFilePicker) {
|
||||
event.preventDefault()
|
||||
download_ws(name, size)
|
||||
}
|
||||
}
|
||||
a.download = ""*/
|
||||
}
|
||||
}
|
||||
|
||||
function formatUnixDate(t) {
|
||||
const date = new Date(t * 1000)
|
||||
const now = new Date()
|
||||
const diff = date - now
|
||||
const formatter = new Intl.RelativeTimeFormat('en', { numeric: 'auto' })
|
||||
|
||||
if (Math.abs(diff) <= 60000) {
|
||||
return formatter.format(Math.round(diff / 1000), 'second')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 3600000) {
|
||||
return formatter.format(Math.round(diff / 60000), 'minute')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 86400000) {
|
||||
return formatter.format(Math.round(diff / 3600000), 'hour')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 604800000) {
|
||||
return formatter.format(Math.round(diff / 86400000), 'day')
|
||||
}
|
||||
|
||||
return date.toLocaleDateString()
|
||||
}
|
||||
|
||||
async function download_ws(name, size) {
|
||||
const fh = await window.showSaveFilePicker({
|
||||
suggestedName: name,
|
||||
})
|
||||
const writer = await fh.createWritable()
|
||||
writer.truncate(size)
|
||||
const wsurl = new URL("/api/download", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
let pos = 0
|
||||
ws.onopen = () => {
|
||||
console.log("Downloading over WebSocket", name, size)
|
||||
ws.send(JSON.stringify({name, start: 0, end: size, size}))
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
if (typeof event.data === 'string') {
|
||||
const msg = JSON.parse(event.data)
|
||||
console.log("Download finished", msg)
|
||||
ws.close()
|
||||
return
|
||||
}
|
||||
console.log("Received chunk", name, pos, pos + event.data.size)
|
||||
pos += event.data.size
|
||||
writer.write(event.data)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
if (pos < size) {
|
||||
console.log("Download aborted", name, pos)
|
||||
writer.truncate(pos)
|
||||
}
|
||||
writer.close()
|
||||
}
|
||||
}
|
||||
|
||||
const fileInput = document.getElementById("fileInput")
|
||||
const progress = document.getElementById("progressBar")
|
||||
const numConnections = 2
|
||||
const chunkSize = 1<<20
|
||||
const wsConnections = new Set()
|
||||
|
||||
//for (let i = 0; i < numConnections; i++) createUploadWS()
|
||||
|
||||
function createUploadWS() {
|
||||
const wsurl = new URL("/api/upload", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.binaryType = 'arraybuffer'
|
||||
ws.onopen = () => {
|
||||
wsConnections.add(ws)
|
||||
console.log("Upload socket connected")
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.written) progress.value += +msg.written
|
||||
else console.log(`Error: ${msg.error}`)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
wsConnections.delete(ws)
|
||||
console.log("Upload socket disconnected, reconnecting...")
|
||||
setTimeout(createUploadWS, 1000)
|
||||
}
|
||||
}
|
||||
|
||||
async function load(file, start, end) {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise(resolve => reader.onload = resolve)
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
return event.target.result
|
||||
}
|
||||
|
||||
async function sendChunk(file, start, end, ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
ws.send(JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
}))
|
||||
ws.send(chunk)
|
||||
}
|
||||
|
||||
fileInput.addEventListener("change", async function() {
|
||||
const file = this.files[0]
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
progress.value = 0
|
||||
progress.max = file.size
|
||||
|
||||
console.log(wsConnections)
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const ws = Array.from(wsConnections)[i % wsConnections.size]
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end, ws)
|
||||
}
|
||||
})
|
||||
|
||||
</script>
|
||||
@@ -1,121 +0,0 @@
|
||||
<template>
|
||||
<LoginModal />
|
||||
<header>
|
||||
<HeaderMain ref="headerMain">
|
||||
<HeaderSelected :path="path.pathList" />
|
||||
</HeaderMain>
|
||||
<BreadCrumb :path="path.pathList" tabindex="-1"/>
|
||||
</header>
|
||||
<main>
|
||||
<RouterView :path="path.pathList" />
|
||||
</main>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { RouterView } from 'vue-router'
|
||||
import type { ComputedRef } from 'vue'
|
||||
import type HeaderMain from '@/components/HeaderMain.vue'
|
||||
import { onMounted, onUnmounted, ref } from 'vue'
|
||||
import { watchConnect, watchDisconnect } from '@/repositories/WS'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
|
||||
import { computed } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
|
||||
interface Path {
|
||||
path: string
|
||||
pathList: string[]
|
||||
}
|
||||
const documentStore = useDocumentStore()
|
||||
const path: ComputedRef<Path> = computed(() => {
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path)
|
||||
const pathList = p.split('/').filter(value => value !== '')
|
||||
return {
|
||||
path: p,
|
||||
pathList
|
||||
}
|
||||
})
|
||||
onMounted(watchConnect)
|
||||
onUnmounted(watchDisconnect)
|
||||
// Update human-readable x seconds ago messages from mtimes
|
||||
setInterval(documentStore.updateModified, 1000)
|
||||
const headerMain = ref<typeof HeaderMain | null>(null)
|
||||
let vert = 0
|
||||
let timer: any = null
|
||||
const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
const fileExplorer = documentStore.fileExplorer as any
|
||||
if (!fileExplorer) return
|
||||
const c = fileExplorer.isCursor()
|
||||
const keyup = event.type === 'keyup'
|
||||
if (event.repeat) {
|
||||
if (
|
||||
event.key === 'ArrowUp' ||
|
||||
event.key === 'ArrowDown' ||
|
||||
(c && event.code === 'Space')
|
||||
) {
|
||||
event.preventDefault()
|
||||
}
|
||||
return
|
||||
}
|
||||
//console.log("key pressed", event)
|
||||
// For up/down implement custom fast repeat
|
||||
if (event.key === 'ArrowUp') vert = keyup ? 0 : event.altKey ? -10 : -1
|
||||
else if (event.key === 'ArrowDown') vert = keyup ? 0 : event.altKey ? 10 : 1
|
||||
// Find: process on keydown so that we can bypass the built-in search hotkey
|
||||
else if (!keyup && event.key === 'f' && (event.ctrlKey || event.metaKey)) {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Select all (toggle); keydown to prevent builtin
|
||||
else if (!keyup && event.key === 'a' && (event.ctrlKey || event.metaKey)) {
|
||||
fileExplorer.toggleSelectAll()
|
||||
}
|
||||
// Keys 1-3 to sort columns
|
||||
else if (
|
||||
c &&
|
||||
keyup &&
|
||||
(event.key === '1' || event.key === '2' || event.key === '3')
|
||||
) {
|
||||
fileExplorer.toggleSortColumn(+event.key)
|
||||
}
|
||||
// Rename
|
||||
else if (c && keyup && !event.ctrlKey && (event.key === 'F2' || event.key === 'r')) {
|
||||
fileExplorer.cursorRename()
|
||||
}
|
||||
// Toggle selections on file explorer; ignore all spaces to prevent scrolling built-in hotkey
|
||||
else if (c && event.code === 'Space') {
|
||||
if (keyup && !event.altKey && !event.ctrlKey)
|
||||
fileExplorer.cursorSelect()
|
||||
} else return
|
||||
event.preventDefault()
|
||||
if (!vert) {
|
||||
if (timer) {
|
||||
clearTimeout(timer) // Good for either timeout or interval
|
||||
timer = null
|
||||
}
|
||||
return
|
||||
}
|
||||
if (!timer) {
|
||||
// Initial move, then t0 delay until repeats at tr intervals
|
||||
const select = event.shiftKey
|
||||
fileExplorer.cursorMove(vert, select)
|
||||
const t0 = 200,
|
||||
tr = 30
|
||||
timer = setTimeout(
|
||||
() =>
|
||||
(timer = setInterval(() => {
|
||||
fileExplorer.cursorMove(vert, select)
|
||||
}, tr)),
|
||||
t0 - tr
|
||||
)
|
||||
}
|
||||
}
|
||||
onMounted(() => {
|
||||
window.addEventListener('keydown', globalShortcutHandler)
|
||||
window.addEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
onUnmounted(() => {
|
||||
window.removeEventListener('keydown', globalShortcutHandler)
|
||||
window.removeEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
export type { Path }
|
||||
</script>
|
||||
@@ -1,52 +0,0 @@

```vue
<template>
  <object
    v-if="props.type === 'pdf'"
    :data="dataURL"
    type="application/pdf"
    width="100%"
    height="100%"
  ></object>
  <a-image
    v-else-if="props.type === 'image'"
    width="50%"
    :src="dataURL"
    @click="() => setVisible(true)"
    :previewMask="false"
    :preview="{
      visibleImg,
      onVisibleChange: setVisible
    }"
  />
  <!-- Unknown case -->
  <h1 v-else>Unsupported file type</h1>
</template>

<script setup lang="ts">
import { watchEffect, ref } from 'vue'
import Router from '@/router/index'
import { url_document_get } from '@/repositories/Document'

const dataURL = ref('')
watchEffect(() => {
  dataURL.value = new URL(
    url_document_get + Router.currentRoute.value.path,
    location.origin
  ).toString()
})
const emit = defineEmits({
  visibleImg(value: boolean) {
    return value
  }
})

function setVisible(value: boolean) {
  emit('visibleImg', value)
}

const props = defineProps<{
  type?: string
  visibleImg: boolean
}>()
</script>

<style></style>
```
@@ -1,100 +0,0 @@
|
||||
<template>
|
||||
<nav class="headermain">
|
||||
<div class="buttons">
|
||||
<template v-if="documentStore.error">
|
||||
<div class="error-message" @click="documentStore.error = ''">{{ documentStore.error }}</div>
|
||||
<div class="smallgap"></div>
|
||||
</template>
|
||||
<UploadButton />
|
||||
<SvgButton
|
||||
name="create-folder"
|
||||
data-tooltip="New folder"
|
||||
@click="() => documentStore.fileExplorer.newFolder()"
|
||||
/>
|
||||
<slot></slot>
|
||||
<div class="spacer smallgap"></div>
|
||||
<template v-if="showSearchInput">
|
||||
<input
|
||||
ref="search"
|
||||
type="search"
|
||||
v-model="documentStore.search"
|
||||
placeholder="Search words"
|
||||
class="margin-input"
|
||||
@keyup.escape="closeSearch"
|
||||
/>
|
||||
</template>
|
||||
<SvgButton ref="searchButton" name="find" @click.prevent="toggleSearchInput" />
|
||||
<SvgButton name="cog" @click="settingsMenu" />
|
||||
</div>
|
||||
</nav>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { ref, nextTick } from 'vue'
|
||||
import ContextMenu from '@imengyu/vue3-context-menu'
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const showSearchInput = ref<boolean>(false)
|
||||
const search = ref<HTMLInputElement | null>()
|
||||
const searchButton = ref<HTMLButtonElement | null>()
|
||||
|
||||
const closeSearch = () => {
|
||||
if (!showSearchInput.value) return // Already closing
|
||||
showSearchInput.value = false
|
||||
documentStore.search = ''
|
||||
const breadcrumb = document.querySelector('.breadcrumb') as HTMLElement
|
||||
breadcrumb.focus()
|
||||
}
|
||||
const toggleSearchInput = () => {
|
||||
showSearchInput.value = !showSearchInput.value
|
||||
if (!showSearchInput.value) return closeSearch()
|
||||
nextTick(() => {
|
||||
const input = search.value
|
||||
if (input) input.focus()
|
||||
})
|
||||
}
|
||||
|
||||
const settingsMenu = (e: Event) => {
|
||||
// show the context menu
|
||||
const items = []
|
||||
if (documentStore.user.isLoggedIn) {
|
||||
items.push({ label: `Logout ${documentStore.user.username ?? ''}`, onClick: () => documentStore.logout() })
|
||||
} else {
|
||||
items.push({ label: 'Login', onClick: () => documentStore.loginDialog() })
|
||||
}
|
||||
ContextMenu.showContextMenu({
|
||||
// @ts-ignore
|
||||
x: e.target.getBoundingClientRect().right, y: e.target.getBoundingClientRect().bottom,
|
||||
items,
|
||||
})
|
||||
}
|
||||
|
||||
defineExpose({
|
||||
toggleSearchInput,
|
||||
closeSearch,
|
||||
})
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
.buttons {
|
||||
padding: 0;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
height: 3.5em;
|
||||
z-index: 10;
|
||||
}
|
||||
.buttons > * {
|
||||
flex-shrink: 1;
|
||||
}
|
||||
input[type='search'] {
|
||||
background: var(--input-background);
|
||||
color: var(--input-color);
|
||||
border: 0;
|
||||
border-radius: 0.1em;
|
||||
padding: 0.5em;
|
||||
outline: none;
|
||||
font-size: 1.5em;
|
||||
max-width: 30vw;
|
||||
}
|
||||
</style>
|
||||
@@ -1,27 +0,0 @@

```vue
<template>
  <template v-for="upload in documentStore.uploadingDocuments" :key="upload.key">
    <span>{{ upload.name }}</span>
    <div class="progress-container">
      <a-progress :percent="upload.progress" />
      <CloseCircleOutlined class="close-button" @click="dismissUpload(upload.key)" />
    </div>
  </template>
</template>

<script setup lang="ts">
import { useDocumentStore } from '@/stores/documents'
const documentStore = useDocumentStore()

function dismissUpload(key: number) {
  documentStore.deleteUploadingDocument(key)
}
</script>

<style scoped>
.progress-container {
  display: flex;
  align-items: center;
}
.close-button:hover {
  color: #b81414;
}
</style>
```
@@ -1,101 +0,0 @@
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { h, ref } from 'vue'
|
||||
|
||||
const fileUploadButton = ref()
|
||||
const folderUploadButton = ref()
|
||||
const documentStore = useDocumentStore()
|
||||
const open = (placement: any) => openNotification(placement)
|
||||
|
||||
const isNotificationOpen = ref(false)
|
||||
const openNotification = (placement: any) => {
|
||||
if (!isNotificationOpen.value) {
|
||||
/*
|
||||
api.open({
|
||||
message: `Uploading documents`,
|
||||
description: h(NotificationLoading),
|
||||
placement,
|
||||
duration: 0,
|
||||
onClose: () => { isNotificationOpen.value = false }
|
||||
});*/
|
||||
isNotificationOpen.value = true
|
||||
}
|
||||
}
|
||||
|
||||
function uploadFileHandler() {
|
||||
fileUploadButton.value.click()
|
||||
}
|
||||
|
||||
async function load(file: File, start: number, end: number): Promise<ArrayBuffer> {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise<Event>(resolve => (reader.onload = resolve))
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
if (event.target && event.target instanceof FileReader) {
|
||||
return event.target.result as ArrayBuffer
|
||||
} else {
|
||||
throw new Error('Error loading file')
|
||||
}
|
||||
}
|
||||
|
||||
async function sendChunk(file: File, start: number, end: number) {
|
||||
const ws = documentStore.wsUpload
|
||||
if (ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
|
||||
ws.send(
|
||||
JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
})
|
||||
)
|
||||
ws.send(chunk)
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadHandler(event: Event) {
|
||||
const target = event.target as HTMLInputElement
|
||||
const chunkSize = 1 << 20
|
||||
if (!target?.files?.length) {
|
||||
documentStore.error = 'No files selected'
|
||||
return
|
||||
}
|
||||
for (const idx in target.files) {
|
||||
const file = target.files[idx]
|
||||
console.log('Uploading', file)
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
const document = documentStore.pushUploadingDocuments(file.name)
|
||||
open('bottomRight')
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end)
|
||||
console.log('progress: ' + (100 * (i + 1)) / numChunks)
|
||||
console.log('Num Chunks: ' + numChunks)
|
||||
documentStore.updateUploadingDocuments(document.key, (100 * (i + 1)) / numChunks)
|
||||
}
|
||||
}
|
||||
}
|
||||
</script>
|
||||
<template>
|
||||
<template>
|
||||
<input
|
||||
ref="fileUploadButton"
|
||||
@change="uploadHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
multiple
|
||||
/>
|
||||
<input
|
||||
ref="folderUploadButton"
|
||||
@change="uploadHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
webkitdirectory
|
||||
/>
|
||||
</template>
|
||||
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileUploadButton.click()" />
|
||||
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderUploadButton.click()" />
|
||||
</template>
|
||||
@@ -1,55 +0,0 @@

```ts
export type FUID = string

export type Document = {
  loc: string
  name: string
  key: FUID
  size: number
  sizedisp: string
  mtime: number
  modified: string
  haystack: string
  dir: boolean
}

export type errorEvent = {
  error: {
    code: number
    message: string
    redirect: string
  }
}

// Raw types the backend /api/watch sends us

export type FileEntry = {
  key: FUID
  size: number
  mtime: number
}

export type DirEntry = {
  key: FUID
  size: number
  mtime: number
  dir: DirList
}

export type DirList = Record<string, FileEntry | DirEntry>

export type UpdateEntry = {
  name: string
  deleted?: boolean
  key?: FUID
  size?: number
  mtime?: number
  dir?: DirList
}

// Helper structure for selections
export interface SelectedItems {
  keys: FUID[]
  docs: Record<FUID, Document>
  recursive: Array<[string, string, Document]>
  missing: Set<FUID>
}
```
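As a purely hypothetical illustration of how these types fit together, one `/api/watch` update message could carry a list like the following (the values are made up; the real wire format is defined by the backend):

```ts
import type { UpdateEntry } from '@/repositories/Document'

// Walk into directory "docs", then update the entry "notes.txt" inside it
const update: UpdateEntry[] = [
  { name: 'docs', key: '1a2b3c4d', size: 4096, mtime: 1700000000 },
  { name: 'notes.txt', key: '5e6f7a8b', size: 1234, mtime: 1700000100 },
]
```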
@@ -1,15 +0,0 @@

```ts
import Client from '@/repositories/Client'
export const url_login = '/login'
export const url_logout = '/logout'

export async function loginUser(username: string, password: string) {
  const user = await Client.post(url_login, {
    username,
    password
  })
  return user
}

export async function logoutUser() {
  const data = await Client.post(url_logout)
  return data
}
```
@@ -1,183 +0,0 @@
|
||||
import type {
|
||||
Document,
|
||||
DirEntry,
|
||||
FileEntry,
|
||||
FUID,
|
||||
SelectedItems
|
||||
} from '@/repositories/Document'
|
||||
import { formatSize, formatUnixDate, haystackFormat } from '@/utils'
|
||||
import { defineStore } from 'pinia'
|
||||
import { collator } from '@/utils'
|
||||
import { logoutUser } from '@/repositories/User'
|
||||
import { watchConnect } from '@/repositories/WS'
|
||||
|
||||
type FileData = { id: string; mtime: number; size: number; dir: DirectoryData }
|
||||
type DirectoryData = {
|
||||
[filename: string]: FileData
|
||||
}
|
||||
type User = {
|
||||
username: string
|
||||
privileged: boolean
|
||||
isOpenLoginModal: boolean
|
||||
isLoggedIn: boolean
|
||||
}
|
||||
|
||||
export const useDocumentStore = defineStore({
|
||||
id: 'documents',
|
||||
state: () => ({
|
||||
document: [] as Document[],
|
||||
search: "" as string,
|
||||
selected: new Set<FUID>(),
|
||||
uploadingDocuments: [],
|
||||
uploadCount: 0 as number,
|
||||
fileExplorer: null,
|
||||
error: '' as string,
|
||||
connected: false,
|
||||
user: {
|
||||
username: '',
|
||||
privileged: false,
|
||||
isLoggedIn: false,
|
||||
isOpenLoginModal: false
|
||||
} as User
|
||||
}),
|
||||
persist: {
|
||||
storage: sessionStorage,
|
||||
paths: ['document'],
|
||||
},
|
||||
actions: {
|
||||
updateRoot(root: DirEntry | null = null) {
|
||||
if (!root) {
|
||||
this.document = []
|
||||
return
|
||||
}
|
||||
// Transform tree data to flat documents array
|
||||
let loc = ""
|
||||
const mapper = ([name, attr]: [string, FileEntry | DirEntry]) => ({
|
||||
...attr,
|
||||
loc,
|
||||
name,
|
||||
sizedisp: formatSize(attr.size),
|
||||
modified: formatUnixDate(attr.mtime),
|
||||
haystack: haystackFormat(name),
|
||||
})
|
||||
const queue = [...Object.entries(root.dir ?? {}).map(mapper)]
|
||||
const docs = []
|
||||
for (let doc; (doc = queue.shift()) !== undefined;) {
|
||||
docs.push(doc)
|
||||
if ("dir" in doc) {
|
||||
// Recurse but replace recursive structure with boolean
|
||||
loc = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
queue.push(...Object.entries(doc.dir).map(mapper))
|
||||
// @ts-ignore
|
||||
doc.dir = true
|
||||
}
|
||||
// @ts-ignore
|
||||
else doc.dir = false
|
||||
}
|
||||
// Pre sort directory entries folders first then files, names in natural ordering
|
||||
docs.sort((a, b) =>
|
||||
// @ts-ignore
|
||||
b.dir - a.dir ||
|
||||
collator.compare(a.name, b.name)
|
||||
)
|
||||
this.document = docs as Document[]
|
||||
},
|
||||
updateUploadingDocuments(key: number, progress: number) {
|
||||
for (const d of this.uploadingDocuments) {
|
||||
if (d.key === key) d.progress = progress
|
||||
}
|
||||
},
|
||||
pushUploadingDocuments(name: string) {
|
||||
this.uploadCount++
|
||||
const document = {
|
||||
key: this.uploadCount,
|
||||
name: name,
|
||||
progress: 0
|
||||
}
|
||||
this.uploadingDocuments.push(document)
|
||||
return document
|
||||
},
|
||||
deleteUploadingDocument(key: number) {
|
||||
this.uploadingDocuments = this.uploadingDocuments.filter(e => e.key !== key)
|
||||
},
|
||||
updateModified() {
|
||||
for (const d of this.document) {
|
||||
if ('mtime' in d) d.modified = formatUnixDate(d.mtime)
|
||||
}
|
||||
},
|
||||
login(username: string, privileged: boolean) {
|
||||
this.user.username = username
|
||||
this.user.privileged = privileged
|
||||
this.user.isLoggedIn = true
|
||||
this.user.isOpenLoginModal = false
|
||||
if (!this.connected) watchConnect()
|
||||
},
|
||||
loginDialog() {
|
||||
this.user.isOpenLoginModal = true
|
||||
},
|
||||
async logout() {
|
||||
console.log("Logout")
|
||||
await logoutUser()
|
||||
this.$reset()
|
||||
history.go() // Reload page
|
||||
}
|
||||
},
|
||||
getters: {
|
||||
isUserLogged(): boolean {
|
||||
return this.user.isLoggedIn
|
||||
},
|
||||
recentDocuments(): Document[] {
|
||||
const ret = [...this.document]
|
||||
ret.sort((a, b) => b.mtime - a.mtime)
|
||||
return ret
|
||||
},
|
||||
largeDocuments(): Document[] {
|
||||
const ret = [...this.document]
|
||||
ret.sort((a, b) => b.size - a.size)
|
||||
return ret
|
||||
},
|
||||
selectedFiles(): SelectedItems {
|
||||
const selected = this.selected
|
||||
const found = new Set<FUID>()
|
||||
const ret: SelectedItems = {
|
||||
missing: new Set(),
|
||||
docs: {},
|
||||
keys: [],
|
||||
recursive: [],
|
||||
}
|
||||
for (const doc of this.document) {
|
||||
if (selected.has(doc.key)) {
|
||||
found.add(doc.key)
|
||||
ret.keys.push(doc.key)
|
||||
ret.docs[doc.key] = doc
|
||||
}
|
||||
}
|
||||
// What did we not select?
|
||||
for (const key of selected) if (!found.has(key)) ret.missing.add(key)
|
||||
// Build a flat list including contents recursively
|
||||
const relnames = new Set<string>()
|
||||
function add(rel: string, full: string, doc: Document) {
|
||||
if (!doc.dir && relnames.has(rel)) throw Error(`Multiple selections conflict for: ${rel}`)
|
||||
relnames.add(rel)
|
||||
ret.recursive.push([rel, full, doc])
|
||||
}
|
||||
for (const key of ret.keys) {
|
||||
const base = ret.docs[key]
|
||||
const basepath = base.loc ? `${base.loc}/${base.name}` : base.name
|
||||
const nremove = base.loc.length
|
||||
add(base.name, basepath, base)
|
||||
for (const doc of this.document) {
|
||||
if (doc.loc === basepath || doc.loc.startsWith(basepath) && doc.loc[basepath.length] === '/') {
|
||||
const full = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
const rel = full.slice(nremove)
|
||||
add(rel, full, doc)
|
||||
}
|
||||
}
|
||||
}
|
||||
// Sort by rel (name stored as on download)
|
||||
ret.recursive.sort((a, b) => collator.compare(a[0], b[0]))
|
||||
|
||||
return ret
|
||||
}
|
||||
}
|
||||
})
|
||||
@@ -1,58 +0,0 @@

```vue
<template>
  <FileExplorer
    ref="fileExplorer"
    :key="Router.currentRoute.value.path"
    :path="props.path"
    :documents="documents"
    v-if="props.path"
  />
</template>

<script setup lang="ts">
import { watchEffect, ref, computed } from 'vue'
import { useDocumentStore } from '@/stores/documents'
import Router from '@/router/index'
import { needleFormat, localeIncludes, collator } from '@/utils'

const documentStore = useDocumentStore()
const fileExplorer = ref()
const props = defineProps({
  path: Array<string>
})
const documents = computed(() => {
  if (!props.path) return []
  const loc = props.path.join('/')
  // List the current location
  if (!documentStore.search) return documentStore.document.filter(doc => doc.loc === loc)
  // Find up to 100 newest documents that match the search
  const search = documentStore.search
  const needle = needleFormat(search)
  let limit = 100
  let docs = []
  for (const doc of documentStore.recentDocuments) {
    if (localeIncludes(doc.haystack, needle)) {
      docs.push(doc)
      if (--limit === 0) break
    }
  }
  // Organize by folder, by relevance
  const locsub = loc + '/'
  docs.sort((a, b) => (
    // @ts-ignore
    (b.loc === loc) - (a.loc === loc) ||
    // @ts-ignore
    (b.loc.slice(0, locsub.length) === locsub) - (a.loc.slice(0, locsub.length) === locsub) ||
    collator.compare(a.loc, b.loc) ||
    // @ts-ignore
    (a.type === 'file') - (b.type === 'file') ||
    // @ts-ignore
    b.name.includes(search) - a.name.includes(search) ||
    collator.compare(a.name, b.name)
  ))
  return docs
})

watchEffect(() => {
  documentStore.fileExplorer = fileExplorer.value
})
</script>
```
```diff
@@ -1,3 +1,4 @@
import os
import sys
from pathlib import Path

@@ -61,13 +62,17 @@ def _main():
        path = None
    _confdir(args)
    exists = config.conffile.exists()
    print(config.conffile, exists)
    import_droppy = args["--import-droppy"]
    necessary_opts = exists or import_droppy or path and listen
    necessary_opts = exists or import_droppy or path
    if not necessary_opts:
        # Maybe run without arguments
        print(doc)
        print(
            "No config file found! Get started with:\n cista -l :8000 /path/to/files, or\n cista -l example.com --import-droppy # Uses Droppy files\n",
            "No config file found! Get started with one of:\n"
            " cista --user yourname --privileged\n"
            " cista --import-droppy\n"
            " cista -l :8000 /path/to/files\n"
        )
        return 1
    settings = {}

@@ -79,8 +84,15 @@ def _main():
        settings = droppy.readconf()
    if path:
        settings["path"] = path
    elif not exists:
        settings["path"] = Path.home() / "Downloads"
    if listen:
        settings["listen"] = listen
    elif not exists:
        settings["listen"] = ":8000"
    if not exists and not import_droppy:
        # We have no users, so make it public
        settings["public"] = True
    operation = config.update_config(settings)
    print(f"Config {operation}: {config.conffile}")
    # Prepare to serve

@@ -105,18 +117,31 @@ def _confdir(args):
    if confdir.exists() and not confdir.is_dir():
        if confdir.name != config.conffile.name:
            raise ValueError("Config path is not a directory")
        # Accidentally pointed to the cista.toml, use parent
        # Accidentally pointed to the db.toml, use parent
        confdir = confdir.parent
    config.conffile = config.conffile.with_parent(confdir)
    os.environ["CISTA_HOME"] = confdir.as_posix()
    config.init_confdir()  # Uses environ if available


def _user(args):
    _confdir(args)
    config.load_config()
    if config.conffile.exists():
        config.load_config()
        operation = False
    else:
        # Defaults for new config when user is created
        operation = config.update_config(
            {
                "listen": ":8000",
                "path": Path.home() / "Downloads",
                "public": False,
            }
        )
        print(f"Config {operation}: {config.conffile}\n")

    name = args["--user"]
    if not name or not name.isidentifier():
        raise ValueError("Invalid username")
    config.load_config()
    u = config.config.users.get(name)
    info = f"User {name}" if u else f"New user {name}"
    changes = {}

@@ -128,12 +153,17 @@ def _user(args):
    info += " (admin)" if oldadmin else ""
    if args["--password"] or not u:
        changes["password"] = pw = pwgen.generate()
        info += f"\n Password: {pw}"
    res = config.update_user(args["--user"], changes)
        info += f"\n Password: {pw}\n"
    res = config.update_user(name, changes)
    print(info)
    if res == "read":
        print(" No changes")

    if operation == "created":
        print(
            "Now you can run the server:\n cista # defaults set: -l :8000 ~/Downloads\n"
        )


if __name__ == "__main__":
    sys.exit(main())
```
cista/api.py (41 lines changed)

```diff
@@ -37,10 +37,18 @@ async def upload(req, ws):
        )
        req = msgspec.json.decode(text, type=FileRange)
        pos = req.start
        data = None
        while pos < req.end and (data := await ws.recv()) and isinstance(data, bytes):
        while True:
            data = await ws.recv()
            if not isinstance(data, bytes):
                break
            if len(data) > req.end - pos:
                raise ValueError(
                    f"Expected up to {req.end - pos} bytes, got {len(data)} bytes"
                )
            sentsize = await alink(("upload", req.name, pos, data, req.size))
            pos += typing.cast(int, sentsize)
            if pos >= req.end:
                break
        if pos != req.end:
            d = f"{len(data)} bytes" if isinstance(data, bytes) else data
            raise ValueError(f"Expected {req.end - pos} more bytes, got {d}")

@@ -88,7 +96,7 @@ async def watch(req, ws):
        msgspec.json.encode(
            {
                "server": {
                    "name": "Cista",  # Should be configurable
                    "name": config.config.name or config.config.path.name,
                    "version": __version__,
                    "public": config.config.public,
                },

@@ -103,13 +111,28 @@ async def watch(req, ws):
    )
    uuid = token_bytes(16)
    try:
        with watching.tree_lock:
            q = watching.pubsub[uuid] = asyncio.Queue()
            # Init with disk usage and full tree
            await ws.send(watching.format_du())
            await ws.send(watching.format_tree())
        q, space, root = await asyncio.get_event_loop().run_in_executor(
            req.app.ctx.threadexec, subscribe, uuid, ws
        )
        await ws.send(space)
        await ws.send(root)
        # Send updates
        while True:
            await ws.send(await q.get())
    except RuntimeError as e:
        if str(e) == "cannot schedule new futures after shutdown":
            return  # Server shutting down, drop the WebSocket
        raise
    finally:
        del watching.pubsub[uuid]
        watching.pubsub.pop(uuid, None)  # Remove whether it got added yet or not


def subscribe(uuid, ws):
    with watching.state.lock:
        q = watching.pubsub[uuid] = asyncio.Queue()
        # Init with disk usage and full tree
        return (
            q,
            watching.format_space(watching.state.space),
            watching.format_root(watching.state.root),
        )
```
201
cista/app.py
@@ -1,10 +1,10 @@
|
||||
import asyncio
|
||||
import datetime
|
||||
import mimetypes
|
||||
from collections import deque
|
||||
import threading
|
||||
from concurrent.futures import ThreadPoolExecutor
|
||||
from importlib.resources import files
|
||||
from pathlib import Path
|
||||
from multiprocessing import cpu_count
|
||||
from pathlib import Path, PurePath, PurePosixPath
|
||||
from stat import S_IFDIR, S_IFREG
|
||||
from urllib.parse import unquote
|
||||
from wsgiref.handlers import format_date_time
|
||||
@@ -12,15 +12,14 @@ from wsgiref.handlers import format_date_time
|
||||
import brotli
|
||||
import sanic.helpers
|
||||
from blake3 import blake3
|
||||
from natsort import natsorted, ns
|
||||
from sanic import Blueprint, Sanic, empty, raw
|
||||
from sanic import Blueprint, Sanic, empty, raw, redirect
|
||||
from sanic.exceptions import Forbidden, NotFound
|
||||
from sanic.log import logging
|
||||
from sanic.log import logger
|
||||
from setproctitle import setproctitle
|
||||
from stream_zip import ZIP_AUTO, stream_zip
|
||||
|
||||
from cista import auth, config, session, watching
|
||||
from cista import auth, config, preview, session, watching
|
||||
from cista.api import bp
|
||||
from cista.protocol import DirEntry
|
||||
from cista.util.apphelpers import handle_sanic_exception
|
||||
|
||||
# Workaround until Sanic PR #2824 is merged
|
||||
@@ -28,29 +27,40 @@ sanic.helpers._ENTITY_HEADERS = frozenset()
|
||||
|
||||
app = Sanic("cista", strict_slashes=True)
|
||||
app.blueprint(auth.bp)
|
||||
app.blueprint(preview.bp)
|
||||
app.blueprint(bp)
|
||||
app.exception(Exception)(handle_sanic_exception)
|
||||
|
||||
|
||||
setproctitle("cista-main")
|
||||
|
||||
|
||||
@app.before_server_start
|
||||
async def main_start(app, loop):
|
||||
config.load_config()
|
||||
await watching.start(app, loop)
|
||||
app.ctx.threadexec = ThreadPoolExecutor(max_workers=8)
|
||||
setproctitle(f"cista {config.config.path.name}")
|
||||
workers = max(2, min(8, cpu_count()))
|
||||
app.ctx.threadexec = ThreadPoolExecutor(
|
||||
max_workers=workers, thread_name_prefix="cista-ioworker"
|
||||
)
|
||||
watching.start(app, loop)
|
||||
|
||||
|
||||
@app.after_server_stop
|
||||
# Sanic sometimes fails to execute after_server_stop, so we do it before instead (potentially interrupting handlers)
|
||||
@app.before_server_stop
|
||||
async def main_stop(app, loop):
|
||||
await watching.stop(app, loop)
|
||||
quit.set()
|
||||
watching.stop(app)
|
||||
app.ctx.threadexec.shutdown()
|
||||
logger.debug("Cista worker threads all finished")
|
||||
|
||||
|
||||
@app.on_request
|
||||
async def use_session(req):
|
||||
req.ctx.session = session.get(req)
|
||||
try:
|
||||
req.ctx.username = req.ctx.session["username"]
|
||||
req.ctx.user = config.config.users[req.ctx.session["username"]] # type: ignore
|
||||
req.ctx.username = req.ctx.session["username"] # type: ignore
|
||||
req.ctx.user = config.config.users[req.ctx.username]
|
||||
except (AttributeError, KeyError, TypeError):
|
||||
req.ctx.username = None
|
||||
req.ctx.user = None
|
||||
@@ -81,22 +91,16 @@ def http_fileserver(app, _):
|
||||
www = {}
|
||||
|
||||
|
||||
@app.before_server_start
|
||||
async def load_wwwroot(*_ignored):
|
||||
global www
|
||||
www = await asyncio.get_event_loop().run_in_executor(None, _load_wwwroot, www)
|
||||
|
||||
|
||||
def _load_wwwroot(www):
|
||||
wwwnew = {}
|
||||
base = files("cista") / "wwwroot"
|
||||
paths = ["."]
|
||||
base = Path(__file__).with_name("wwwroot")
|
||||
paths = [PurePath()]
|
||||
while paths:
|
||||
path = paths.pop(0)
|
||||
current = base / path
|
||||
for p in current.iterdir():
|
||||
if p.is_dir():
|
||||
paths.append(current / p.parts[-1])
|
||||
paths.append(p.relative_to(base))
|
||||
continue
|
||||
name = p.relative_to(base).as_posix()
|
||||
mime = mimetypes.guess_type(name)[0] or "application/octet-stream"
|
||||
@@ -127,31 +131,63 @@ def _load_wwwroot(www):
|
||||
if len(br) >= len(data):
|
||||
br = False
|
||||
wwwnew[name] = data, br, headers
|
||||
if not wwwnew:
|
||||
msg = f"Web frontend missing from {base}\n Did you forget: hatch build\n"
|
||||
if not www:
|
||||
logger.warning(msg)
|
||||
if not app.debug:
|
||||
msg = "Web frontend missing. Cista installation is broken.\n"
|
||||
wwwnew[""] = (
|
||||
msg.encode(),
|
||||
False,
|
||||
{
|
||||
"etag": "error",
|
||||
"content-type": "text/plain",
|
||||
"cache-control": "no-store",
|
||||
},
|
||||
)
|
||||
return wwwnew
|
||||
|
||||
|
||||
@app.add_task
|
||||
@app.before_server_start
|
||||
async def start(app):
|
||||
await load_wwwroot(app)
|
||||
if app.debug:
|
||||
app.add_task(refresh_wwwroot(), name="refresh_wwwroot")
|
||||
|
||||
|
||||
async def load_wwwroot(app):
|
||||
global www
|
||||
www = await asyncio.get_event_loop().run_in_executor(
|
||||
app.ctx.threadexec, _load_wwwroot, www
|
||||
)
|
||||
|
||||
|
||||
quit = threading.Event()
|
||||
|
||||
|
||||
async def refresh_wwwroot():
|
||||
while True:
|
||||
try:
|
||||
wwwold = www
|
||||
await load_wwwroot()
|
||||
changes = ""
|
||||
for name in sorted(www):
|
||||
attr = www[name]
|
||||
if wwwold.get(name) == attr:
|
||||
continue
|
||||
headers = attr[2]
|
||||
changes += f"{headers['last-modified']} {headers['etag']} /{name}\n"
|
||||
for name in sorted(set(wwwold) - set(www)):
|
||||
changes += f"Deleted /{name}\n"
|
||||
if changes:
|
||||
print(f"Updated wwwroot:\n{changes}", end="", flush=True)
|
||||
except Exception as e:
|
||||
print("Error loading wwwroot", e)
|
||||
if not app.debug:
|
||||
return
|
||||
await asyncio.sleep(0.5)
|
||||
try:
|
||||
while not quit.is_set():
|
||||
try:
|
||||
wwwold = www
|
||||
await load_wwwroot(app)
|
||||
changes = ""
|
||||
for name in sorted(www):
|
||||
attr = www[name]
|
||||
if wwwold.get(name) == attr:
|
||||
continue
|
||||
headers = attr[2]
|
||||
changes += f"{headers['last-modified']} {headers['etag']} /{name}\n"
|
||||
for name in sorted(set(wwwold) - set(www)):
|
||||
changes += f"Deleted /{name}\n"
|
||||
if changes:
|
||||
print(f"Updated wwwroot:\n{changes}", end="", flush=True)
|
||||
except Exception as e:
|
||||
print(f"Error loading wwwroot: {e!r}")
|
||||
await asyncio.sleep(0.5)
|
||||
except asyncio.CancelledError:
|
||||
pass
|
||||
|
||||
|
||||
@app.route("/<path:path>", methods=["GET", "HEAD"])
|
||||
@@ -166,68 +202,78 @@ async def wwwroot(req, path=""):
|
||||
return empty(304, headers=headers)
|
||||
# Brotli compressed?
|
||||
if br and "br" in req.headers.accept_encoding.split(", "):
|
||||
headers = {
|
||||
**headers,
|
||||
"content-encoding": "br",
|
||||
}
|
||||
headers = {**headers, "content-encoding": "br"}
|
||||
data = br
|
||||
return raw(data, headers=headers)
|
||||
|
||||
|
||||
@app.route("/favicon.ico", methods=["GET", "HEAD"])
|
||||
async def favicon(req):
|
||||
# Browsers keep asking for it when viewing files (not HTML with icon link)
|
||||
return redirect("/assets/logo-97d1d7eb.svg", status=308)
|
||||
|
||||
|
||||
def get_files(wanted: set) -> list[tuple[PurePosixPath, Path]]:
|
||||
loc = PurePosixPath()
|
||||
idx = 0
|
||||
ret = []
|
||||
level: int | None = None
|
||||
parent: PurePosixPath | None = None
|
||||
with watching.state.lock:
|
||||
root = watching.state.root
|
||||
while idx < len(root):
|
||||
f = root[idx]
|
||||
loc = PurePosixPath(*loc.parts[: f.level - 1]) / f.name
|
||||
if parent is not None and f.level <= level:
|
||||
level = parent = None
|
||||
if f.key in wanted:
|
||||
level, parent = f.level, loc.parent
|
||||
if parent is not None:
|
||||
wanted.discard(f.key)
|
||||
ret.append((loc.relative_to(parent), watching.rootpath / loc))
|
||||
idx += 1
|
||||
return ret
|
||||
|
||||
|
||||
@app.get("/zip/<keys>/<zipfile:ext=zip>")
|
||||
async def zip_download(req, keys, zipfile, ext):
|
||||
"""Download a zip archive of the given keys"""
|
||||
|
||||
wanted = set(keys.split("+"))
|
||||
with watching.tree_lock:
|
||||
q = deque([([], None, watching.tree[""].dir)])
|
||||
files = []
|
||||
while q:
|
||||
locpar, relpar, d = q.pop()
|
||||
for name, attr in d.items():
|
||||
loc = [*locpar, name]
|
||||
rel = None
|
||||
if relpar or attr.key in wanted:
|
||||
rel = [*relpar, name] if relpar else [name]
|
||||
wanted.discard(attr.key)
|
||||
isdir = isinstance(attr, DirEntry)
|
||||
if isdir:
|
||||
q.append((loc, rel, attr.dir))
|
||||
if rel:
|
||||
files.append(
|
||||
("/".join(rel), Path(watching.rootpath.joinpath(*loc)))
|
||||
)
|
||||
files = get_files(wanted)
|
||||
|
||||
if not files:
|
||||
raise NotFound(
|
||||
"No files found",
|
||||
context={"keys": keys, "zipfile": zipfile, "wanted": wanted},
|
||||
context={"keys": keys, "zipfile": f"{zipfile}.{ext}", "wanted": wanted},
|
||||
)
|
||||
if wanted:
|
||||
raise NotFound("Files not found", context={"missing": wanted})
|
||||
|
||||
files = natsorted(files, key=lambda f: f[0], alg=ns.IGNORECASE)
|
||||
|
||||
def local_files(files):
|
||||
for rel, p in files:
|
||||
s = p.stat()
|
||||
size = s.st_size
|
||||
modified = datetime.datetime.fromtimestamp(s.st_mtime, datetime.UTC)
|
||||
name = rel.as_posix()
|
||||
if p.is_dir():
|
||||
yield rel, modified, S_IFDIR | 0o755, ZIP_AUTO(size), b""
|
||||
yield f"{name}/", modified, S_IFDIR | 0o755, ZIP_AUTO(size), iter(b"")
|
||||
else:
|
||||
yield rel, modified, S_IFREG | 0o644, ZIP_AUTO(size), contents(p)
|
||||
yield name, modified, S_IFREG | 0o644, ZIP_AUTO(size), contents(p, size)
|
||||
|
||||
def contents(name):
|
||||
def contents(name, size):
|
||||
with name.open("rb") as f:
|
||||
while chunk := f.read(65536):
|
||||
while size > 0 and (chunk := f.read(min(size, 1 << 20))):
|
||||
size -= len(chunk)
|
||||
yield chunk
|
||||
assert size == 0
|
||||
|
||||
def worker():
|
||||
try:
|
||||
for chunk in stream_zip(local_files(files)):
|
||||
asyncio.run_coroutine_threadsafe(queue.put(chunk), loop)
|
||||
asyncio.run_coroutine_threadsafe(queue.put(chunk), loop).result()
|
||||
except Exception:
|
||||
logging.exception("Error streaming ZIP")
|
||||
logger.exception("Error streaming ZIP")
|
||||
raise
|
||||
finally:
|
||||
asyncio.run_coroutine_threadsafe(queue.put(None), loop)
|
||||
@@ -238,7 +284,10 @@ async def zip_download(req, keys, zipfile, ext):
|
||||
thread = loop.run_in_executor(app.ctx.threadexec, worker)
|
||||
|
||||
# Stream the response
|
||||
res = await req.respond(content_type="application/zip")
|
||||
res = await req.respond(
|
||||
content_type="application/zip",
|
||||
headers={"cache-control": "no-store"},
|
||||
)
|
||||
while chunk := await queue.get():
|
||||
await res.send(chunk)
|
||||
|
||||
|
||||
```diff
@@ -68,10 +68,10 @@ def verify(request, *, privileged=False):
    if request.ctx.user:
        if request.ctx.user.privileged:
            return
        raise Forbidden("Access Forbidden: Only for privileged users")
        raise Forbidden("Access Forbidden: Only for privileged users", quiet=True)
    elif config.config.public or request.ctx.user:
        return
    raise Unauthorized("Login required", "cookie")
    raise Unauthorized(f"Login required for {request.path}", "cookie", quiet=True)


bp = Blueprint("auth")

@@ -159,3 +159,35 @@ async def logout_post(request):
    res = json({"message": msg})
    session.delete(res)
    return res


@bp.post("/password-change")
async def change_password(request):
    try:
        if request.headers.content_type == "application/json":
            username = request.json["username"]
            pwchange = request.json["passwordChange"]
            password = request.json["password"]
        else:
            username = request.form["username"][0]
            pwchange = request.form["passwordChange"][0]
            password = request.form["password"][0]
        if not username or not password:
            raise KeyError
    except KeyError:
        raise BadRequest(
            "Missing username, passwordChange or password",
        ) from None
    try:
        user = login(username, password)
        set_password(user, pwchange)
    except ValueError as e:
        raise Forbidden(str(e), context={"redirect": "/login"}) from e

    if "text/html" in request.headers.accept:
        res = redirect("/")
        session.flash(res, "Password updated")
    else:
        res = json({"message": "Password updated"})
    session.create(res, username)
    return res
```
@@ -1,6 +1,9 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import os
|
||||
import secrets
|
||||
import sys
|
||||
from contextlib import suppress
|
||||
from functools import wraps
|
||||
from hashlib import sha256
|
||||
from pathlib import Path, PurePath
|
||||
@@ -14,6 +17,7 @@ class Config(msgspec.Struct):
|
||||
listen: str
|
||||
secret: str = secrets.token_hex(12)
|
||||
public: bool = False
|
||||
name: str = ""
|
||||
users: dict[str, User] = {}
|
||||
links: dict[str, Link] = {}
|
||||
|
||||
@@ -31,7 +35,23 @@ class Link(msgspec.Struct, omit_defaults=True):
|
||||
|
||||
|
||||
config = None
|
||||
conffile = Path.home() / ".local/share/cista/db.toml"
|
||||
conffile = None
|
||||
|
||||
|
||||
def init_confdir():
|
||||
if p := os.environ.get("CISTA_HOME"):
|
||||
home = Path(p)
|
||||
else:
|
||||
xdg = os.environ.get("XDG_CONFIG_HOME")
|
||||
home = (
|
||||
Path(xdg).expanduser() / "cista" if xdg else Path.home() / ".config/cista"
|
||||
)
|
||||
if not home.is_dir():
|
||||
home.mkdir(parents=True, exist_ok=True)
|
||||
home.chmod(0o700)
|
||||
|
||||
global conffile
|
||||
conffile = home / "db.toml"
|
||||
|
||||
|
||||
def derived_secret(*params, len=8) -> bytes:
|
||||
@@ -59,8 +79,8 @@ def dec_hook(typ, obj):
|
||||
|
||||
def config_update(modify):
|
||||
global config
|
||||
if not conffile.exists():
|
||||
conffile.parent.mkdir(parents=True, exist_ok=True)
|
||||
if conffile is None:
|
||||
init_confdir()
|
||||
tmpname = conffile.with_suffix(".tmp")
|
||||
try:
|
||||
f = tmpname.open("xb")
|
||||
@@ -74,10 +94,6 @@ def config_update(modify):
|
||||
old = conffile.read_bytes()
|
||||
c = msgspec.toml.decode(old, type=Config, dec_hook=dec_hook)
|
||||
except FileNotFoundError:
|
||||
# No existing config file, make sure we have a folder...
|
||||
confdir = conffile.parent
|
||||
confdir.mkdir(parents=True, exist_ok=True)
|
||||
confdir.chmod(0o700)
|
||||
old = b""
|
||||
c = None
|
||||
c = modify(c)
|
||||
@@ -89,6 +105,10 @@ def config_update(modify):
|
||||
return "read"
|
||||
f.write(new)
|
||||
f.close()
|
||||
if sys.platform == "win32":
|
||||
# Windows doesn't support atomic replace
|
||||
with suppress(FileNotFoundError):
|
||||
conffile.unlink()
|
||||
tmpname.rename(conffile) # Atomic replace
|
||||
except:
|
||||
f.close()
|
||||
@@ -116,6 +136,8 @@ def modifies_config(modify):
|
||||
|
||||
def load_config():
|
||||
global config
|
||||
if conffile is None:
|
||||
init_confdir()
|
||||
config = msgspec.toml.decode(conffile.read_bytes(), type=Config, dec_hook=dec_hook)
|
||||
|
||||
|
||||
@@ -134,7 +156,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
|
||||
# Encode into dict, update values with new, convert to Config
|
||||
try:
|
||||
u = conf.users[name].__copy__()
|
||||
except KeyError:
|
||||
except (KeyError, AttributeError):
|
||||
u = User()
|
||||
if "password" in changes:
|
||||
from . import auth
|
||||
@@ -143,7 +165,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
|
||||
del changes["password"]
|
||||
udict = msgspec.to_builtins(u, enc_hook=enc_hook)
|
||||
udict.update(changes)
|
||||
settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
|
||||
settings = msgspec.to_builtins(conf, enc_hook=enc_hook) if conf else {"users": {}}
|
||||
settings["users"][name] = msgspec.convert(udict, User, dec_hook=dec_hook)
|
||||
return msgspec.convert(settings, Config, dec_hook=dec_hook)
|
||||
|
||||
|
||||
@@ -34,9 +34,11 @@ class File:
|
||||
self.open_rw()
|
||||
assert self.fd is not None
|
||||
if file_size is not None:
|
||||
assert pos + len(buffer) <= file_size
|
||||
os.ftruncate(self.fd, file_size)
|
||||
os.lseek(self.fd, pos, os.SEEK_SET)
|
||||
os.write(self.fd, buffer)
|
||||
if buffer:
|
||||
os.lseek(self.fd, pos, os.SEEK_SET)
|
||||
os.write(self.fd, buffer)
|
||||
|
||||
def __getitem__(self, slice):
|
||||
if self.fd is None:
|
||||
|
||||
237
cista/preview.py
Normal file
@@ -0,0 +1,237 @@
|
||||
import asyncio
|
||||
import gc
|
||||
import io
|
||||
import mimetypes
|
||||
import urllib.parse
|
||||
from pathlib import PurePosixPath
|
||||
from time import perf_counter
|
||||
from urllib.parse import unquote
|
||||
from wsgiref.handlers import format_date_time
|
||||
|
||||
import av
|
||||
import fitz # PyMuPDF
|
||||
import numpy as np
|
||||
import pillow_heif
|
||||
from PIL import Image
|
||||
from sanic import Blueprint, empty, raw
|
||||
from sanic.exceptions import NotFound
|
||||
from sanic.log import logger
|
||||
|
||||
from cista import config
|
||||
from cista.util.filename import sanitize
|
||||
|
||||
pillow_heif.register_heif_opener()
|
||||
|
||||
bp = Blueprint("preview", url_prefix="/preview")
|
||||
|
||||
# Map EXIF Orientation value to a corresponding PIL transpose
|
||||
EXIF_ORI = {
|
||||
2: Image.Transpose.FLIP_LEFT_RIGHT,
|
||||
3: Image.Transpose.ROTATE_180,
|
||||
4: Image.Transpose.FLIP_TOP_BOTTOM,
|
||||
5: Image.Transpose.TRANSPOSE,
|
||||
6: Image.Transpose.ROTATE_270,
|
||||
7: Image.Transpose.TRANSVERSE,
|
||||
8: Image.Transpose.ROTATE_90,
|
||||
}
|
||||
|
||||
|
||||
@bp.get("/<path:path>")
|
||||
async def preview(req, path):
|
||||
"""Preview a file"""
|
||||
maxsize = int(req.args.get("px", 1024))
|
||||
maxzoom = float(req.args.get("zoom", 2.0))
|
||||
quality = int(req.args.get("q", 60))
|
||||
rel = PurePosixPath(sanitize(unquote(path)))
|
||||
path = config.config.path / rel
|
||||
stat = path.lstat()
|
||||
etag = config.derived_secret(
|
||||
"preview", rel, stat.st_mtime_ns, quality, maxsize, maxzoom
|
||||
).hex()
|
||||
savename = PurePosixPath(path.name).with_suffix(".avif")
|
||||
headers = {
|
||||
"etag": etag,
|
||||
"last-modified": format_date_time(stat.st_mtime),
|
||||
"cache-control": "max-age=604800, immutable"
|
||||
+ ("" if config.config.public else ", private"),
|
||||
"content-type": "image/avif",
|
||||
"content-disposition": f"inline; filename*=UTF-8''{urllib.parse.quote(savename.as_posix())}",
|
||||
}
|
||||
if req.headers.if_none_match == etag:
|
||||
# The client has it cached, respond 304 Not Modified
|
||||
return empty(304, headers=headers)
|
||||
|
||||
if not path.is_file():
|
||||
raise NotFound("File not found")
|
||||
|
||||
img = await asyncio.get_event_loop().run_in_executor(
|
||||
req.app.ctx.threadexec, dispatch, path, quality, maxsize, maxzoom
|
||||
)
|
||||
return raw(img, headers=headers)
|
||||
|
||||
|
||||
def dispatch(path, quality, maxsize, maxzoom):
|
||||
if path.suffix.lower() in (".pdf", ".xps", ".epub", ".mobi"):
|
||||
return process_pdf(path, quality=quality, maxsize=maxsize, maxzoom=maxzoom)
|
||||
type, _ = mimetypes.guess_type(path.name)
|
||||
if type and type.startswith("video/"):
|
||||
return process_video(path, quality=quality, maxsize=maxsize)
|
||||
return process_image(path, quality=quality, maxsize=maxsize)
|
||||
|
||||
|
||||
def process_image(path, *, maxsize, quality):
|
||||
t_load = perf_counter()
|
||||
with Image.open(path) as img:
|
||||
# Force decode to include I/O in load timing
|
||||
img.load()
|
||||
t_proc = perf_counter()
|
||||
# Resize
|
||||
w, h = img.size
|
||||
img.thumbnail((min(w, maxsize), min(h, maxsize)))
|
||||
# Transpose pixels according to EXIF Orientation
|
||||
orientation = img.getexif().get(274, 1)
|
||||
if orientation in EXIF_ORI:
|
||||
img = img.transpose(EXIF_ORI[orientation])
|
||||
# Save as AVIF
|
||||
imgdata = io.BytesIO()
|
||||
t_save = perf_counter()
|
||||
img.save(imgdata, format="avif", quality=quality, speed=10, max_threads=1)
|
||||
|
||||
t_end = perf_counter()
|
||||
ret = imgdata.getvalue()
|
||||
|
||||
load_ms = (t_proc - t_load) * 1000
|
||||
proc_ms = (t_save - t_proc) * 1000
|
||||
save_ms = (t_end - t_save) * 1000
|
||||
logger.debug(
|
||||
"Preview image %s: load=%.1fms process=%.1fms save=%.1fms",
|
||||
path.name,
|
||||
load_ms,
|
||||
proc_ms,
|
||||
save_ms,
|
||||
)
|
||||
|
||||
return ret
|
||||
|
||||
|
||||
def process_pdf(path, *, maxsize, maxzoom, quality, page_number=0):
|
||||
t_load_start = perf_counter()
|
||||
pdf = fitz.open(path)
|
||||
page = pdf.load_page(page_number)
|
||||
w, h = page.rect[2:4]
|
||||
zoom = min(maxsize / w, maxsize / h, maxzoom)
|
||||
mat = fitz.Matrix(zoom, zoom)
|
||||
pix = page.get_pixmap(matrix=mat) # type: ignore[attr-defined]
|
||||
t_load_end = perf_counter()
|
||||
|
||||
t_save_start = perf_counter()
|
||||
ret = pix.pil_tobytes(format="avif", quality=quality, speed=10, max_threads=1)
|
||||
t_save_end = perf_counter()
|
||||
|
||||
logger.debug(
|
||||
"Preview pdf %s: load+render=%.1fms save=%.1fms",
|
||||
path.name,
|
||||
(t_load_end - t_load_start) * 1000,
|
||||
(t_save_end - t_save_start) * 1000,
|
||||
)
|
||||
return ret
|
||||
|
||||
|
||||
def process_video(path, *, maxsize, quality):
|
||||
frame = None
|
||||
imgdata = io.BytesIO()
|
||||
istream = ostream = icc = occ = frame = None
|
||||
t_load_start = perf_counter()
|
||||
# Initialize to avoid "possibly unbound" in static analysis when exceptions occur
|
||||
t_load_end = t_load_start
|
||||
t_save_start = t_load_start
|
||||
t_save_end = t_load_start
|
||||
with (
|
||||
av.open(str(path)) as icontainer,
|
||||
av.open(imgdata, "w", format="avif") as ocontainer,
|
||||
):
|
||||
istream = icontainer.streams.video[0]
|
||||
istream.codec_context.skip_frame = "NONKEY"
|
||||
icontainer.seek((icontainer.duration or 0) // 8)
|
||||
for frame in icontainer.decode(istream):
|
||||
if frame.dts is not None:
|
||||
break
|
||||
else:
|
||||
raise RuntimeError("No frames found in video")
|
||||
|
||||
# Resize frame to thumbnail size
|
||||
if frame.width > maxsize or frame.height > maxsize:
|
||||
scale_factor = min(maxsize / frame.width, maxsize / frame.height)
|
||||
new_width = int(frame.width * scale_factor)
|
||||
new_height = int(frame.height * scale_factor)
|
||||
frame = frame.reformat(width=new_width, height=new_height)
|
||||
|
||||
# Simple rotation detection and logging
|
||||
if frame.rotation:
|
||||
try:
|
||||
fplanes = frame.to_ndarray()
|
||||
# Split into Y, U, V planes of proper dimensions
|
||||
planes = [
|
||||
fplanes[: frame.height],
|
||||
fplanes[frame.height : frame.height + frame.height // 4].reshape(
|
||||
frame.height // 2, frame.width // 2
|
||||
),
|
||||
fplanes[frame.height + frame.height // 4 :].reshape(
|
||||
frame.height // 2, frame.width // 2
|
||||
),
|
||||
]
|
||||
# Rotate
|
||||
planes = [np.rot90(p, frame.rotation // 90) for p in planes]
|
||||
# Restore PyAV format
|
||||
planes = np.hstack([p.flat for p in planes]).reshape(
|
||||
-1, planes[0].shape[1]
|
||||
)
|
||||
frame = av.VideoFrame.from_ndarray(planes, format=frame.format.name)
|
||||
del planes, fplanes
|
||||
except Exception as e:
|
||||
if "not yet supported" in str(e):
|
||||
logger.warning(
|
||||
f"Not rotating {path.name} preview image by {frame.rotation}°:\n PyAV: {e}"
|
||||
)
|
||||
else:
|
||||
logger.exception(f"Error rotating video frame: {e}")
|
||||
t_load_end = perf_counter()
|
||||
|
||||
t_save_start = perf_counter()
|
||||
crf = str(int(63 * (1 - quality / 100) ** 2)) # Closely matching PIL quality-%
|
||||
ostream = ocontainer.add_stream(
|
||||
"av1",
|
||||
options={
|
||||
"crf": crf,
|
||||
"usage": "realtime",
|
||||
"cpu-used": "8",
|
||||
"threads": "1",
|
||||
},
|
||||
)
|
||||
assert isinstance(ostream, av.VideoStream)
|
||||
ostream.width = frame.width
|
||||
ostream.height = frame.height
|
||||
icc = istream.codec_context
|
||||
occ = ostream.codec_context
|
||||
|
||||
# Copy HDR metadata from input video stream
|
||||
occ.color_primaries = icc.color_primaries
|
||||
occ.color_trc = icc.color_trc
|
||||
occ.colorspace = icc.colorspace
|
||||
occ.color_range = icc.color_range
|
||||
|
||||
ocontainer.mux(ostream.encode(frame))
|
||||
ocontainer.mux(ostream.encode(None)) # Flush the stream
|
||||
t_save_end = perf_counter()
|
||||
|
||||
# Capture frame dimensions before cleanup
|
||||
ret = imgdata.getvalue()
|
||||
logger.debug(
|
||||
"Preview video %s: load+decode=%.1fms save=%.1fms",
|
||||
path.name,
|
||||
(t_load_end - t_load_start) * 1000,
|
||||
(t_save_end - t_save_start) * 1000,
|
||||
)
|
||||
del imgdata, istream, ostream, icc, occ, frame
|
||||
gc.collect()
|
||||
return ret
|
||||
@@ -22,7 +22,7 @@ class MkDir(ControlBase):
|
||||
|
||||
def __call__(self):
|
||||
path = config.config.path / filename.sanitize(self.path)
|
||||
path.mkdir(parents=False, exist_ok=False)
|
||||
path.mkdir(parents=True, exist_ok=False)
|
||||
|
||||
|
||||
class Rename(ControlBase):
|
||||
@@ -112,56 +112,42 @@ class ErrorMsg(msgspec.Struct):
|
||||
## Directory listings
|
||||
|
||||
|
||||
class FileEntry(msgspec.Struct):
|
||||
key: str
|
||||
size: int
|
||||
mtime: int
|
||||
|
||||
|
||||
class DirEntry(msgspec.Struct):
|
||||
key: str
|
||||
size: int
|
||||
mtime: int
|
||||
dir: DirList
|
||||
|
||||
def __getitem__(self, name):
|
||||
return self.dir[name]
|
||||
|
||||
def __setitem__(self, name, value):
|
||||
self.dir[name] = value
|
||||
|
||||
def __contains__(self, name):
|
||||
return name in self.dir
|
||||
|
||||
def __delitem__(self, name):
|
||||
del self.dir[name]
|
||||
|
||||
@property
|
||||
def props(self):
|
||||
return {k: v for k, v in self.__struct_fields__ if k != "dir"}
|
||||
|
||||
|
||||
DirList = dict[str, FileEntry | DirEntry]
|
||||
|
||||
|
||||
class UpdateEntry(msgspec.Struct, omit_defaults=True):
|
||||
"""Updates the named entry in the tree. Fields that are set replace old values. A list of entries recurses directories."""
|
||||
|
||||
class FileEntry(msgspec.Struct, array_like=True, frozen=True):
|
||||
level: int
|
||||
name: str
|
||||
key: str
|
||||
deleted: bool = False
|
||||
size: int | None = None
|
||||
mtime: int | None = None
|
||||
dir: DirList | None = None
|
||||
mtime: int
|
||||
size: int
|
||||
isfile: int
|
||||
|
||||
def __str__(self):
|
||||
return self.key or "FileEntry()"
|
||||
|
||||
def __repr__(self):
|
||||
return f"{self.name} ({self.size}, {self.mtime})"
|
||||
|
||||
|
||||
def make_dir_data(root):
|
||||
if len(root) == 3:
|
||||
return FileEntry(*root)
|
||||
id_, size, mtime, listing = root
|
||||
converted = {}
|
||||
for name, data in listing.items():
|
||||
converted[name] = make_dir_data(data)
|
||||
sz = sum(x.size for x in converted.values())
|
||||
mt = max(x.mtime for x in converted.values())
|
||||
return DirEntry(id_, sz, max(mt, mtime), converted)
|
||||
class Update(msgspec.Struct, array_like=True): ...
|
||||
|
||||
|
||||
class UpdKeep(Update, tag="k"):
|
||||
count: int
|
||||
|
||||
|
||||
class UpdDel(Update, tag="d"):
|
||||
count: int
|
||||
|
||||
|
||||
class UpdIns(Update, tag="i"):
|
||||
items: list[FileEntry]
|
||||
|
||||
|
||||
class UpdateMessage(msgspec.Struct):
|
||||
update: list[UpdKeep | UpdDel | UpdIns]
|
||||
|
||||
|
||||
class Space(msgspec.Struct):
|
||||
disk: int
|
||||
free: int
|
||||
usage: int
|
||||
storage: int
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
import os
|
||||
import re
|
||||
from pathlib import Path, PurePath
|
||||
from pathlib import Path
|
||||
|
||||
from sanic import Sanic
|
||||
|
||||
@@ -15,7 +15,6 @@ def run(*, dev=False):
|
||||
# Silence Sanic's warning about running in production rather than debug
|
||||
os.environ["SANIC_IGNORE_PRODUCTION_WARNING"] = "1"
|
||||
confdir = config.conffile.parent
|
||||
wwwroot = PurePath(__file__).parent / "wwwroot"
|
||||
if opts.get("ssl"):
|
||||
# Run plain HTTP redirect/acme server on port 80
|
||||
server80.app.prepare(port=80, motd=False)
|
||||
@@ -27,7 +26,6 @@ def run(*, dev=False):
|
||||
motd=False,
|
||||
dev=dev,
|
||||
auto_reload=dev,
|
||||
reload_dir={confdir, wwwroot},
|
||||
access_log=True,
|
||||
) # type: ignore
|
||||
if dev:
|
||||
@@ -52,7 +50,7 @@ def parse_listen(listen):
|
||||
raise ValueError(
|
||||
f"Directory for unix socket does not exist: {unix.parent}/",
|
||||
)
|
||||
return "http://localhost", {"unix": unix}
|
||||
return "http://localhost", {"unix": unix.as_posix()}
|
||||
if re.fullmatch(r"(\w+(-\w+)*\.)+\w{2,}", listen, re.UNICODE):
|
||||
return f"https://{listen}", {"host": listen, "port": 443, "ssl": True}
|
||||
try:
|
||||
|
||||
@@ -21,7 +21,6 @@ def jres(data, **kwargs):
|
||||
|
||||
|
||||
async def handle_sanic_exception(request, e):
|
||||
logger.exception(e)
|
||||
context, code = {}, 500
|
||||
message = str(e)
|
||||
if isinstance(e, SanicException):
|
||||
@@ -30,6 +29,8 @@ async def handle_sanic_exception(request, e):
|
||||
if not message or not request.app.debug and code == 500:
|
||||
message = "Internal Server Error"
|
||||
message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
|
||||
if code == 500:
|
||||
logger.exception(e)
|
||||
# Non-browsers get JSON errors
|
||||
if "text/html" not in request.headers.accept:
|
||||
return jres(
|
||||
@@ -42,7 +43,7 @@ async def handle_sanic_exception(request, e):
|
||||
res.cookies.add_cookie("message", message, max_age=5)
|
||||
return res
|
||||
# Otherwise use Sanic's default error page
|
||||
return errorpages.HTMLRenderer(request, e, debug=request.app.debug).full()
|
||||
return errorpages.HTMLRenderer(request, e, debug=request.app.debug).render()
|
||||
|
||||
|
||||
def websocket_wrapper(handler):
|
||||
@@ -54,13 +55,14 @@ def websocket_wrapper(handler):
|
||||
auth.verify(request)
|
||||
await handler(request, ws, *args, **kwargs)
|
||||
except Exception as e:
|
||||
logger.exception(e)
|
||||
context, code, message = {}, 500, str(e) or "Internal Server Error"
|
||||
if isinstance(e, SanicException):
|
||||
context = e.context or {}
|
||||
code = e.status_code
|
||||
message = f"⚠️ {message}" if code < 500 else f"🛑 {message}"
|
||||
await asend(ws, ErrorMsg({"code": code, "message": message, **context}))
|
||||
if not getattr(e, "quiet", False) or code == 500:
|
||||
logger.exception(f"{code} {e!r}")
|
||||
raise
|
||||
|
||||
return wrapper
|
||||
|
||||
@@ -3,234 +3,457 @@ import shutil
|
||||
import sys
|
||||
import threading
|
||||
import time
|
||||
from contextlib import suppress
|
||||
from os import stat_result
|
||||
from pathlib import Path, PurePosixPath
|
||||
from stat import S_ISDIR, S_ISREG
|
||||
|
||||
import msgspec
|
||||
from sanic.log import logging
|
||||
from natsort import humansorted, natsort_keygen, ns
|
||||
from sanic.log import logger
|
||||
|
||||
from cista import config
|
||||
from cista.fileio import fuid
|
||||
from cista.protocol import DirEntry, FileEntry, UpdateEntry
|
||||
from cista.protocol import FileEntry, Space, UpdDel, UpdIns, UpdKeep
|
||||
|
||||
pubsub = {}
|
||||
tree = {"": None}
|
||||
tree_lock = threading.Lock()
|
||||
sortkey = natsort_keygen(alg=ns.LOCALE)
|
||||
|
||||
|
||||
class State:
|
||||
def __init__(self):
|
||||
self.lock = threading.RLock()
|
||||
self._space = Space(0, 0, 0, 0)
|
||||
self.root: list[FileEntry] = []
|
||||
|
||||
@property
|
||||
def space(self):
|
||||
with self.lock:
|
||||
return self._space
|
||||
|
||||
@space.setter
|
||||
def space(self, space):
|
||||
with self.lock:
|
||||
self._space = space
|
||||
|
||||
|
||||
def treeiter(rootmod):
|
||||
relpath = PurePosixPath()
|
||||
for i, entry in enumerate(rootmod):
|
||||
if entry.level > 0:
|
||||
relpath = PurePosixPath(*relpath.parts[: entry.level - 1]) / entry.name
|
||||
yield i, relpath, entry
|
||||
|
||||
|
||||
def treeget(rootmod: list[FileEntry], path: PurePosixPath):
|
||||
begin = None
|
||||
ret = []
|
||||
|
||||
for i, relpath, entry in treeiter(rootmod):
|
||||
if begin is None:
|
||||
if relpath == path:
|
||||
begin = i
|
||||
ret.append(entry)
|
||||
continue
|
||||
if entry.level <= len(path.parts):
|
||||
break
|
||||
ret.append(entry)
|
||||
|
||||
return begin, ret
|
||||
|
||||
|
||||
def treeinspos(rootmod: list[FileEntry], relpath: PurePosixPath, relfile: int):
|
||||
# Find the first entry greater than the new one
|
||||
# precondition: the new entry doesn't exist
|
||||
isfile = 0
|
||||
level = 0
|
||||
i = 0
|
||||
for i, rel, entry in treeiter(rootmod):
|
||||
if entry.level > level:
|
||||
# We haven't found item at level, skip subdirectories
|
||||
continue
|
||||
if entry.level < level:
|
||||
# We have passed the level, so the new item is the first
|
||||
return i
|
||||
if level == 0:
|
||||
# root
|
||||
level += 1
|
||||
continue
|
||||
|
||||
ename = rel.parts[level - 1]
|
||||
name = relpath.parts[level - 1]
|
||||
|
||||
esort = sortkey(ename)
|
||||
nsort = sortkey(name)
|
||||
# Non-leaf are always folders, only use relfile at leaf
|
||||
isfile = relfile if len(relpath.parts) == level else 0
|
||||
|
||||
# First compare by isfile, then by sorting order and if that too matches then case sensitive
|
||||
cmp = (
|
||||
entry.isfile - isfile
|
||||
or (esort > nsort) - (esort < nsort)
|
||||
or (ename > name) - (ename < name)
|
||||
)
|
||||
|
||||
if cmp > 0:
|
||||
return i
|
||||
if cmp < 0:
|
||||
continue
|
||||
|
||||
level += 1
|
||||
if level > len(relpath.parts):
|
||||
logger.error(
|
||||
f"insertpos level overflow: relpath={relpath}, i={i}, entry.name={entry.name}, entry.level={entry.level}, level={level}"
|
||||
)
|
||||
break
|
||||
else:
|
||||
i += 1
|
||||
|
||||
return i
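The comparison above keeps the flat listing in one global order: within a level, directories (`isfile == 0`) sort before files, then natural order decides, with a plain string comparison as the final tiebreaker. As a standalone illustration of that ordering (sample names are assumptions, not project code):

```python
from natsort import natsort_keygen, ns

sortkey = natsort_keygen(alg=ns.LOCALE)

# (name, isfile) pairs, compared the same way treeinspos does within one level
entries = [("file10.txt", 1), ("file2.txt", 1), ("Photos", 0), ("notes.txt", 1)]
entries.sort(key=lambda e: (e[1], sortkey(e[0]), e[0]))
# Directories come first, then files in natural order,
# e.g. file2.txt sorts before file10.txt.
```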
|
||||
|
||||
|
||||
state = State()
|
||||
rootpath: Path = None # type: ignore
|
||||
quit = False
|
||||
modified_flags = (
|
||||
"IN_CREATE",
|
||||
"IN_DELETE",
|
||||
"IN_DELETE_SELF",
|
||||
"IN_MODIFY",
|
||||
"IN_MOVE_SELF",
|
||||
"IN_MOVED_FROM",
|
||||
"IN_MOVED_TO",
|
||||
)
|
||||
disk_usage = None
|
||||
quit = threading.Event()
|
||||
|
||||
## Filesystem scanning
|
||||
|
||||
|
||||
def watcher_thread(loop):
|
||||
global disk_usage, rootpath
|
||||
import inotify.adapters
|
||||
def walk(rel: PurePosixPath, stat: stat_result | None = None) -> list[FileEntry]:
|
||||
path = rootpath / rel
|
||||
ret = []
|
||||
try:
|
||||
st = stat or path.stat()
|
||||
isfile = int(not S_ISDIR(st.st_mode))
|
||||
entry = FileEntry(
|
||||
level=len(rel.parts),
|
||||
name=rel.name,
|
||||
key=fuid(st),
|
||||
mtime=int(st.st_mtime),
|
||||
size=st.st_size if isfile else 0,
|
||||
isfile=isfile,
|
||||
)
|
||||
if isfile:
|
||||
return [entry]
|
||||
# Walk all entries of the directory
|
||||
ret: list[FileEntry] = [...] # type: ignore
|
||||
li = []
|
||||
for f in path.iterdir():
|
||||
if quit.is_set():
|
||||
raise SystemExit("quit")
|
||||
if f.name.startswith("."):
|
||||
continue # No dotfiles
|
||||
with suppress(FileNotFoundError):
|
||||
s = f.lstat()
|
||||
isfile = S_ISREG(s.st_mode)
|
||||
isdir = S_ISDIR(s.st_mode)
|
||||
if not isfile and not isdir:
|
||||
continue
|
||||
li.append((int(isfile), f.name, s))
|
||||
# Build the tree as a list of FileEntries
|
||||
for [_, name, s] in humansorted(li):
|
||||
sub = walk(rel / name, stat=s)
|
||||
child = sub[0]
|
||||
entry = FileEntry(
|
||||
level=entry.level,
|
||||
name=entry.name,
|
||||
key=entry.key,
|
||||
size=entry.size + child.size,
|
||||
mtime=max(entry.mtime, child.mtime),
|
||||
isfile=entry.isfile,
|
||||
)
|
||||
ret.extend(sub)
|
||||
except FileNotFoundError:
|
||||
pass # Things may be rapidly in motion
|
||||
except OSError as e:
|
||||
if e.errno == 13: # Permission denied
|
||||
pass
|
||||
logger.error(f"Watching {path=}: {e!r}")
|
||||
if ret:
|
||||
ret[0] = entry
|
||||
return ret
|
||||
|
||||
while True:
|
||||
rootpath = config.config.path
|
||||
i = inotify.adapters.InotifyTree(rootpath.as_posix())
|
||||
old = format_tree() if tree[""] else None
|
||||
with tree_lock:
|
||||
# Initialize the tree from filesystem
|
||||
tree[""] = walk(rootpath)
|
||||
msg = format_tree()
|
||||
if msg != old:
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
|
||||
# The watching is not entirely reliable, so do a full refresh every minute
|
||||
refreshdl = time.monotonic() + 60.0
|
||||
def update_root(loop):
|
||||
"""Full filesystem scan"""
|
||||
old = state.root
|
||||
new = walk(PurePosixPath())
|
||||
if old != new:
|
||||
update = format_update(old, new)
|
||||
with state.lock:
|
||||
broadcast(update, loop)
|
||||
state.root = new
|
||||
|
||||
for event in i.event_gen():
|
||||
if quit:
|
||||
return
|
||||
# Disk usage update
|
||||
du = shutil.disk_usage(rootpath)
|
||||
if du != disk_usage:
|
||||
disk_usage = du
|
||||
asyncio.run_coroutine_threadsafe(broadcast(format_du()), loop)
|
||||
|
||||
def update_path(rootmod: list[FileEntry], relpath: PurePosixPath, loop):
|
||||
"""Called on FS updates, check the filesystem and broadcast any changes."""
|
||||
new = walk(relpath)
|
||||
obegin, old = treeget(rootmod, relpath)
|
||||
|
||||
if old == new:
|
||||
return
|
||||
|
||||
if obegin is not None:
|
||||
del rootmod[obegin : obegin + len(old)]
|
||||
|
||||
if new:
|
||||
i = treeinspos(rootmod, relpath, new[0].isfile)
|
||||
rootmod[i:i] = new
|
||||
|
||||
|
||||
def update_space(loop):
|
||||
"""Called periodically to update the disk usage."""
|
||||
du = shutil.disk_usage(rootpath)
|
||||
space = Space(*du, storage=state.root[0].size)
|
||||
# Update only on difference above 1 MB
|
||||
tol = 10**6
|
||||
old = msgspec.structs.astuple(state.space)
|
||||
new = msgspec.structs.astuple(space)
|
||||
if any(abs(o - n) > tol for o, n in zip(old, new, strict=True)):
|
||||
state.space = space
|
||||
broadcast(format_space(space), loop)
|
||||
|
||||
|
||||
## Messaging
|
||||
|
||||
|
||||
def format_update(old, new):
|
||||
# Make keep/del/insert diff until one of the lists ends
|
||||
oidx, nidx = 0, 0
|
||||
oremain, nremain = set(old), set(new)
|
||||
update = []
|
||||
keep_count = 0
|
||||
iteration_count = 0
|
||||
# Precompute index maps to allow deterministic tie-breaking when both
|
||||
# candidates exist in both sequences but are not equal (rename/move cases)
|
||||
old_pos = {e: i for i, e in enumerate(old)}
|
||||
new_pos = {e: i for i, e in enumerate(new)}
|
||||
|
||||
while oidx < len(old) and nidx < len(new):
|
||||
iteration_count += 1
|
||||
|
||||
# Emergency brake for potential infinite loops
|
||||
if iteration_count > 50000:
|
||||
logger.error(
|
||||
f"format_update potential infinite loop! iteration={iteration_count}, oidx={oidx}, nidx={nidx}"
|
||||
)
|
||||
raise Exception(
|
||||
f"format_update infinite loop detected at iteration {iteration_count}"
|
||||
)
|
||||
|
||||
modified = False
|
||||
# Matching entries are kept
|
||||
if old[oidx] == new[nidx]:
|
||||
entry = old[oidx]
|
||||
oremain.discard(entry)
|
||||
nremain.discard(entry)
|
||||
keep_count += 1
|
||||
oidx += 1
|
||||
nidx += 1
|
||||
continue
|
||||
|
||||
if keep_count > 0:
|
||||
modified = True
|
||||
update.append(UpdKeep(keep_count))
|
||||
keep_count = 0
|
||||
|
||||
# Items only in old are deleted
|
||||
del_count = 0
|
||||
while oidx < len(old) and old[oidx] not in nremain:
|
||||
oremain.remove(old[oidx])
|
||||
del_count += 1
|
||||
oidx += 1
|
||||
if del_count:
|
||||
update.append(UpdDel(del_count))
|
||||
continue
|
||||
|
||||
# Items only in new are inserted
|
||||
insert_items = []
|
||||
while nidx < len(new) and new[nidx] not in oremain:
|
||||
entry = new[nidx]
|
||||
nremain.discard(entry)
|
||||
insert_items.append(entry)
|
||||
nidx += 1
|
||||
if insert_items:
|
||||
modified = True
|
||||
update.append(UpdIns(insert_items))
|
||||
|
||||
if not modified:
|
||||
# Tie-break: both items exist in both lists but don't match here.
|
||||
# Decide whether to delete old[oidx] first or insert new[nidx] first
|
||||
# based on which alignment is closer.
|
||||
if oidx >= len(old) or nidx >= len(new):
|
||||
break
|
||||
# Do a full refresh?
|
||||
if time.monotonic() > refreshdl:
|
||||
break
|
||||
if event is None:
|
||||
continue
|
||||
_, flags, path, filename = event
|
||||
if not any(f in modified_flags for f in flags):
|
||||
continue
|
||||
# Update modified path
|
||||
path = PurePosixPath(path) / filename
|
||||
try:
|
||||
update(path.relative_to(rootpath), loop)
|
||||
except Exception as e:
|
||||
print("Watching error", e, path, rootpath)
|
||||
raise
|
||||
i = None # Free the inotify object
|
||||
cur_old = old[oidx]
|
||||
cur_new = new[nidx]
|
||||
|
||||
pos_old_in_new = new_pos.get(cur_old)
|
||||
pos_new_in_old = old_pos.get(cur_new)
|
||||
|
||||
# Default distances if not present (shouldn't happen if in remain sets)
|
||||
dist_del = (pos_old_in_new - nidx) if pos_old_in_new is not None else 1
|
||||
dist_ins = (pos_new_in_old - oidx) if pos_new_in_old is not None else 1
|
||||
|
||||
# Prefer the operation with smaller forward distance; tie => delete
|
||||
if dist_del <= dist_ins:
|
||||
# Delete current old item
|
||||
oremain.discard(cur_old)
|
||||
update.append(UpdDel(1))
|
||||
oidx += 1
|
||||
else:
|
||||
# Insert current new item
|
||||
nremain.discard(cur_new)
|
||||
update.append(UpdIns([cur_new]))
|
||||
nidx += 1
|
||||
|
||||
# Diff any remaining
|
||||
if keep_count > 0:
|
||||
update.append(UpdKeep(keep_count))
|
||||
if oremain:
|
||||
update.append(UpdDel(len(oremain)))
|
||||
elif nremain:
|
||||
update.append(UpdIns(new[nidx:]))
|
||||
|
||||
return msgspec.json.encode({"update": update}).decode()
|
||||
|
||||
|
||||
def watcher_thread_poll(loop):
|
||||
global disk_usage, rootpath
|
||||
|
||||
while not quit:
|
||||
rootpath = config.config.path
|
||||
old = format_tree() if tree[""] else None
|
||||
with tree_lock:
|
||||
# Initialize the tree from filesystem
|
||||
tree[""] = walk(rootpath)
|
||||
msg = format_tree()
|
||||
if msg != old:
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
|
||||
# Disk usage update
|
||||
du = shutil.disk_usage(rootpath)
|
||||
if du != disk_usage:
|
||||
disk_usage = du
|
||||
asyncio.run_coroutine_threadsafe(broadcast(format_du()), loop)
|
||||
|
||||
time.sleep(1.0)
|
||||
def format_space(usage):
|
||||
return msgspec.json.encode({"space": usage}).decode()
|
||||
|
||||
|
||||
def format_du():
|
||||
return msgspec.json.encode(
|
||||
{
|
||||
"space": {
|
||||
"disk": disk_usage.total,
|
||||
"used": disk_usage.used,
|
||||
"free": disk_usage.free,
|
||||
"storage": tree[""].size,
|
||||
},
|
||||
},
|
||||
).decode()
|
||||
|
||||
|
||||
def format_tree():
|
||||
root = tree[""]
|
||||
def format_root(root):
|
||||
return msgspec.json.encode({"root": root}).decode()
|
||||
|
||||
|
||||
def walk(path: Path) -> DirEntry | FileEntry | None:
|
||||
try:
|
||||
s = path.stat()
|
||||
key = fuid(s)
|
||||
assert key, repr(key)
|
||||
mtime = int(s.st_mtime)
|
||||
if path.is_file():
|
||||
return FileEntry(key, s.st_size, mtime)
|
||||
|
||||
tree = {
|
||||
p.name: v
|
||||
for p in path.iterdir()
|
||||
if not p.name.startswith(".")
|
||||
if (v := walk(p)) is not None
|
||||
}
|
||||
if tree:
|
||||
size = sum(v.size for v in tree.values())
|
||||
mtime = max(mtime, *(v.mtime for v in tree.values()))
|
||||
else:
|
||||
size = 0
|
||||
return DirEntry(key, size, mtime, tree)
|
||||
except FileNotFoundError:
|
||||
return None
|
||||
except OSError as e:
|
||||
print("OS error walking path", path, e)
|
||||
return None
|
||||
def broadcast(msg, loop):
|
||||
return asyncio.run_coroutine_threadsafe(abroadcast(msg), loop).result()
|
||||
|
||||
|
||||
def update(relpath: PurePosixPath, loop):
|
||||
"""Called by inotify updates, check the filesystem and broadcast any changes."""
|
||||
if rootpath is None or relpath is None:
|
||||
print("ERROR", rootpath, relpath)
|
||||
new = walk(rootpath / relpath)
|
||||
with tree_lock:
|
||||
update = update_internal(relpath, new)
|
||||
if not update:
|
||||
return # No changes
|
||||
msg = msgspec.json.encode({"update": update}).decode()
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
|
||||
|
||||
def update_internal(
|
||||
relpath: PurePosixPath,
|
||||
new: DirEntry | FileEntry | None,
|
||||
) -> list[UpdateEntry]:
|
||||
path = "", *relpath.parts
|
||||
old = tree
|
||||
elems = []
|
||||
for name in path:
|
||||
if name not in old:
|
||||
# File or folder created
|
||||
old = None
|
||||
elems.append((name, None))
|
||||
if len(elems) < len(path):
|
||||
# We got a notify for an item whose parent is not in tree
|
||||
print("Tree out of sync DEBUG", relpath)
|
||||
print(elems)
|
||||
print("Current tree:")
|
||||
print(tree[""])
|
||||
print("Walking all:")
|
||||
print(walk(rootpath))
|
||||
raise ValueError("Tree out of sync")
|
||||
break
|
||||
old = old[name]
|
||||
elems.append((name, old))
|
||||
if old == new:
|
||||
return []
|
||||
mt = new.mtime if new else 0
|
||||
szdiff = (new.size if new else 0) - (old.size if old else 0)
|
||||
# Update parents
|
||||
update = []
|
||||
for name, entry in elems[:-1]:
|
||||
u = UpdateEntry(name, entry.key)
|
||||
if szdiff:
|
||||
entry.size += szdiff
|
||||
u.size = entry.size
|
||||
if mt > entry.mtime:
|
||||
u.mtime = entry.mtime = mt
|
||||
update.append(u)
|
||||
# The last element is the one that changed
|
||||
name, entry = elems[-1]
|
||||
parent = elems[-2][1] if len(elems) > 1 else tree
|
||||
u = UpdateEntry(name, new.key if new else entry.key)
|
||||
if new:
|
||||
parent[name] = new
|
||||
if u.size != new.size:
|
||||
u.size = new.size
|
||||
if u.mtime != new.mtime:
|
||||
u.mtime = new.mtime
|
||||
if isinstance(new, DirEntry) and u.dir != new.dir:
|
||||
u.dir = new.dir
|
||||
else:
|
||||
del parent[name]
|
||||
u.deleted = True
|
||||
update.append(u)
|
||||
return update
|
||||
|
||||
|
||||
async def broadcast(msg):
|
||||
async def abroadcast(msg):
|
||||
try:
|
||||
for queue in pubsub.values():
|
||||
queue.put_nowait(msg)
|
||||
except Exception:
|
||||
# Log because asyncio would silently eat the error
|
||||
logging.exception("Broadcast error")
|
||||
logger.exception("Broadcast error")
|
||||
|
||||
|
||||
async def start(app, loop):
|
||||
## Watcher thread
|
||||
|
||||
|
||||
def watcher_inotify(loop):
|
||||
"""Inotify watcher thread (Linux only)"""
|
||||
import inotify.adapters
|
||||
|
||||
modified_flags = (
|
||||
"IN_CREATE",
|
||||
"IN_DELETE",
|
||||
"IN_DELETE_SELF",
|
||||
"IN_MODIFY",
|
||||
"IN_MOVE_SELF",
|
||||
"IN_MOVED_FROM",
|
||||
"IN_MOVED_TO",
|
||||
)
|
||||
while not quit.is_set():
|
||||
i = inotify.adapters.InotifyTree(rootpath.as_posix())
|
||||
# Initialize the tree from filesystem
|
||||
update_root(loop)
|
||||
trefresh = time.monotonic() + 300.0
|
||||
tspace = time.monotonic() + 5.0
|
||||
# Watch for changes (frequent wakeups needed for quitting)
|
||||
while not quit.is_set():
|
||||
t = time.monotonic()
|
||||
# The watching is not entirely reliable, so do a full refresh every 30 seconds
|
||||
if t >= trefresh:
|
||||
break
|
||||
# Disk usage update
|
||||
if t >= tspace:
|
||||
tspace = time.monotonic() + 5.0
|
||||
update_space(loop)
|
||||
# Inotify events, update the tree
|
||||
dirty = False
|
||||
rootmod = state.root[:]
|
||||
for event in i.event_gen(yield_nones=False, timeout_s=0.1):
|
||||
assert event
|
||||
if quit.is_set():
|
||||
return
|
||||
interesting = any(f in modified_flags for f in event[1])
|
||||
if interesting:
|
||||
# Update modified path
|
||||
path = PurePosixPath(event[2]) / event[3]
|
||||
try:
|
||||
rel_path = path.relative_to(rootpath)
|
||||
update_path(rootmod, rel_path, loop)
|
||||
except Exception as e:
|
||||
logger.error(
|
||||
f"Error processing inotify event for path {path}: {e}"
|
||||
)
|
||||
raise
|
||||
if not dirty:
|
||||
t = time.monotonic()
|
||||
dirty = True
|
||||
# Wait a maximum of 0.2s to push the updates
|
||||
if dirty and time.monotonic() >= t + 0.2:
|
||||
break
|
||||
if dirty and state.root != rootmod:
|
||||
try:
|
||||
update = format_update(state.root, rootmod)
|
||||
with state.lock:
|
||||
broadcast(update, loop)
|
||||
state.root = rootmod
|
||||
except Exception:
|
||||
logger.exception(
|
||||
"format_update failed; falling back to full rescan"
|
||||
)
|
||||
# Fallback: full rescan and try diff again; last resort send full root
|
||||
try:
|
||||
fresh = walk(PurePosixPath())
|
||||
try:
|
||||
update = format_update(state.root, fresh)
|
||||
with state.lock:
|
||||
broadcast(update, loop)
|
||||
state.root = fresh
|
||||
except Exception:
|
||||
logger.exception(
|
||||
"Fallback diff failed; sending full root snapshot"
|
||||
)
|
||||
with state.lock:
|
||||
broadcast(format_root(fresh), loop)
|
||||
state.root = fresh
|
||||
except Exception:
|
||||
logger.exception(
|
||||
"Full rescan failed; dropping this batch of updates"
|
||||
)
|
||||
|
||||
del i # Free the inotify object
|
||||
|
||||
|
||||
def watcher_poll(loop):
|
||||
"""Polling version of the watcher thread."""
|
||||
while not quit.is_set():
|
||||
t0 = time.perf_counter()
|
||||
update_root(loop)
|
||||
update_space(loop)
|
||||
dur = time.perf_counter() - t0
|
||||
if dur > 1.0:
|
||||
logger.debug(f"Reading the full file list took {dur:.1f}s")
|
||||
quit.wait(0.1 + 8 * dur)
|
||||
|
||||
|
||||
def start(app, loop):
|
||||
global rootpath
|
||||
config.load_config()
|
||||
rootpath = config.config.path
|
||||
use_inotify = sys.platform == "linux"
|
||||
app.ctx.watcher = threading.Thread(
|
||||
target=watcher_thread if sys.platform == "linux" else watcher_thread_poll,
|
||||
target=watcher_inotify if use_inotify else watcher_poll,
|
||||
args=[loop],
|
||||
# Descriptive name for system monitoring
|
||||
name=f"cista-watcher {rootpath}",
|
||||
)
|
||||
app.ctx.watcher.start()
|
||||
|
||||
|
||||
async def stop(app, loop):
|
||||
global quit
|
||||
quit = True
|
||||
def stop(app):
|
||||
quit.set()
|
||||
app.ctx.watcher.join()
|
||||
|
||||
BIN
docs/cista.webp
Normal file
|
After Width: | Height: | Size: 40 KiB |
2
frontend/.npmrc
Normal file
@@ -0,0 +1,2 @@
|
||||
audit=false
|
||||
fund=false
|
||||
@@ -1,40 +1,46 @@
|
||||
# cista-front
|
||||
|
||||
This template should help get you started developing with Vue 3 in Vite.
|
||||
|
||||
## Recommended IDE Setup
|
||||
|
||||
[VSCode](https://code.visualstudio.com/) + [Volar](https://marketplace.visualstudio.com/items?itemName=Vue.volar) (and disable Vetur) + [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin).
|
||||
|
||||
## Type Support for `.vue` Imports in TS
|
||||
|
||||
TypeScript cannot handle type information for `.vue` imports by default, so we replace the `tsc` CLI with `vue-tsc` for type checking. In editors, we need [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin) to make the TypeScript language service aware of `.vue` types.
|
||||
|
||||
If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has also implemented a [Take Over Mode](https://github.com/johnsoncodehk/volar/discussions/471#discussioncomment-1361669) that is more performant. You can enable it by the following steps:
|
||||
|
||||
1. Disable the built-in TypeScript Extension
|
||||
1) Run `Extensions: Show Built-in Extensions` from VSCode's command palette
|
||||
2) Find `TypeScript and JavaScript Language Features`, right click and select `Disable (Workspace)`
|
||||
2. Reload the VSCode window by running `Developer: Reload Window` from the command palette.
|
||||
|
||||
## Customize configuration
|
||||
|
||||
See [Vite Configuration Reference](https://vitejs.dev/config/).
|
||||
|
||||
## Project Setup
|
||||
|
||||
```sh
|
||||
npm install
|
||||
```
|
||||
|
||||
### Compile and Hot-Reload for Development
|
||||
|
||||
```sh
|
||||
npm run dev
|
||||
```
|
||||
|
||||
### Type-Check, Compile and Minify for Production
|
||||
|
||||
```sh
|
||||
npm run build
|
||||
```
|
||||
# Cista Vue Frontend

The frontend is a Single-Page App implemented with Vue 3. Development uses the Vite server together with the main Python backend, but in production the latter also serves the prebuilt frontend files.

## Recommended IDE Setup

[VSCode](https://code.visualstudio.com/) + [Volar](https://marketplace.visualstudio.com/items?itemName=Vue.volar) (and disable Vetur) + [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin).

## Type Support for `.vue` Imports in TS

TypeScript cannot handle type information for `.vue` imports by default, so we replace the `tsc` CLI with `vue-tsc` for type checking. In editors, we need [TypeScript Vue Plugin (Volar)](https://marketplace.visualstudio.com/items?itemName=Vue.vscode-typescript-vue-plugin) to make the TypeScript language service aware of `.vue` types.

If the standalone TypeScript plugin doesn't feel fast enough to you, Volar has also implemented a [Take Over Mode](https://github.com/johnsoncodehk/volar/discussions/471#discussioncomment-1361669) that is more performant. You can enable it by the following steps:

1. Disable the built-in TypeScript Extension
   1) Run `Extensions: Show Built-in Extensions` from VSCode's command palette
   2) Find `TypeScript and JavaScript Language Features`, right click and select `Disable (Workspace)`
2. Reload the VSCode window by running `Developer: Reload Window` from the command palette.

## Hot-Reload for Development

### Run the backend

```fish
uv sync --dev
uv run cista --dev -l :8000
```

### And the Vite server (in another terminal)

```fish
cd frontend
bun install
bun run dev
```
Browse to the Vite dev server, which proxies API requests to the backend on port 8000. Both servers live-reload on changes.


### Type-Check, Compile and Minify for Production

This is also called by `uv build` during Python packaging:

```fish
bun run build
```
|
||||
|
||||
@@ -1,12 +1,11 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang=en>
|
||||
<meta charset=UTF-8>
|
||||
<title>Cista</title>
|
||||
<title>Cista Storage</title>
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
|
||||
<link rel="icon" href="/favicon.ico">
|
||||
<link rel="icon" href="/src/assets/logo.svg">
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=Roboto+Mono&family=Roboto:wght@400;700&display=swap" rel="stylesheet">
|
||||
<script type="module" src="/src/main.ts"></script>
|
||||
|
||||
<div id="app"></div>
|
||||
<body id="app">
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"name": "front",
|
||||
"name": "cista-frontend",
|
||||
"version": "0.0.0",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
@@ -12,6 +12,9 @@
|
||||
"lint": "eslint . --ext .vue,.js,.jsx,.cjs,.mjs,.ts,.tsx,.cts,.mts --fix --ignore-path .gitignore",
|
||||
"format": "prettier --write src/"
|
||||
},
|
||||
"engines": {
|
||||
"node": ">=18.0.0"
|
||||
},
|
||||
"dependencies": {
|
||||
"@imengyu/vue3-context-menu": "^1.3.3",
|
||||
"@vueuse/core": "^10.4.1",
|
||||
@@ -21,7 +24,6 @@
|
||||
"pinia": "^2.1.6",
|
||||
"pinia-plugin-persistedstate": "^3.2.0",
|
||||
"unplugin-vue-components": "^0.25.2",
|
||||
"vite-plugin-rewrite-all": "^1.0.1",
|
||||
"vite-svg-loader": "^4.0.0",
|
||||
"vue": "^3.3.4",
|
||||
"vue-router": "^4.2.4"
|
||||
2
frontend/public/robots.txt
Normal file
@@ -0,0 +1,2 @@
|
||||
User-agent: *
|
||||
Disallow: /
|
||||
158
frontend/src/App.vue
Normal file
@@ -0,0 +1,158 @@
|
||||
<template>
|
||||
<LoginModal />
|
||||
<SettingsModal />
|
||||
<header>
|
||||
<HeaderMain ref="headerMain" :path="path.pathList" :query="path.query">
|
||||
<HeaderSelected :path="path.pathList" />
|
||||
</HeaderMain>
|
||||
<BreadCrumb :path="path.pathList" primary />
|
||||
</header>
|
||||
<main>
|
||||
<RouterView :path="path.pathList" :query="path.query" />
|
||||
</main>
|
||||
<footer>
|
||||
<TransferBar :status=store.uprogress @cancel=store.cancelUploads class=upload />
|
||||
<TransferBar :status=store.dprogress @cancel=store.cancelDownloads class=download />
|
||||
</footer>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { RouterView } from 'vue-router'
|
||||
import type { ComputedRef } from 'vue'
|
||||
import type HeaderMain from '@/components/HeaderMain.vue'
|
||||
import { onMounted, onUnmounted, ref, watchEffect } from 'vue'
|
||||
import { loadSession, watchConnect, watchDisconnect } from '@/repositories/WS'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
|
||||
import { computed } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
import type { SortOrder } from './utils/docsort'
|
||||
import type SettingsModalVue from './components/SettingsModal.vue'
|
||||
|
||||
interface Path {
|
||||
path: string
|
||||
pathList: string[]
|
||||
query: string
|
||||
}
|
||||
const store = useMainStore()
|
||||
const path: ComputedRef<Path> = computed(() => {
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path).split('//')
|
||||
const pathList = p[0].split('/').filter(value => value !== '')
|
||||
const query = p.slice(1).join('//')
|
||||
return {
|
||||
path: p[0],
|
||||
pathList,
|
||||
query
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
document.title = path.value.path.replace(/\/$/, '').split('/').pop() || store.server.name || 'Cista Storage'
|
||||
})
|
||||
onMounted(loadSession)
|
||||
onMounted(watchConnect)
|
||||
onUnmounted(watchDisconnect)
|
||||
const headerMain = ref<typeof HeaderMain | null>(null)
|
||||
let vert = 0
|
||||
let timer: any = null
|
||||
const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
if (store.dialog) {
|
||||
if (timer) {
|
||||
clearTimeout(timer)
|
||||
timer = null
|
||||
}
|
||||
return
|
||||
}
|
||||
const fileExplorer = store.fileExplorer as any
|
||||
if (!fileExplorer) return
|
||||
const c = fileExplorer.isCursor()
|
||||
const input = (event.target as HTMLElement).tagName === 'INPUT'
|
||||
const keyup = event.type === 'keyup'
|
||||
if (event.repeat) {
|
||||
if (
|
||||
event.key === 'ArrowUp' ||
|
||||
event.key === 'ArrowDown' ||
|
||||
event.key === 'ArrowLeft' ||
|
||||
event.key === 'ArrowRight' ||
|
||||
(c && event.code === 'Space')
|
||||
) {
|
||||
if (!input) event.preventDefault()
|
||||
}
|
||||
return
|
||||
}
|
||||
//console.log("key pressed", event)
|
||||
/// Long if-else machina for all keys we handle here
|
||||
let arrow = ''
|
||||
if (!input && event.key.startsWith("Arrow")) arrow = event.key.slice(5).toLowerCase()
|
||||
// Find: process on keydown so that we can bypass the built-in search hotkey
|
||||
else if (!keyup && event.key === 'f' && (event.ctrlKey || event.metaKey)) {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Search also on / (UNIX style)
|
||||
else if (!input && keyup && event.key === '/') {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Globally close search, clear errors on Escape
|
||||
else if (keyup && event.key === 'Escape') {
|
||||
store.error = ''
|
||||
headerMain.value!.closeSearch(event)
|
||||
store.focusBreadcrumb()
|
||||
}
|
||||
else if (!input && keyup && event.key === 'Backspace') {
|
||||
Router.back()
|
||||
}
|
||||
// Select all (toggle); keydown to precede and prevent builtin
|
||||
else if (!input && !keyup && event.key === 'a' && (event.ctrlKey || event.metaKey)) {
|
||||
fileExplorer.toggleSelectAll()
|
||||
}
|
||||
// G toggles Gallery
|
||||
else if (!input && keyup && event.key === 'g') {
|
||||
store.prefs.gallery = !store.prefs.gallery
|
||||
}
|
||||
// Keys Backquote-1-2-3 to sort columns
|
||||
else if (
|
||||
!input &&
|
||||
keyup &&
|
||||
(event.code === 'Backquote' || event.key === '1' || event.key === '2' || event.key === '3')
|
||||
) {
|
||||
store.sort(['', 'name', 'modified', 'size'][+event.key || 0] as SortOrder)
|
||||
}
|
||||
// Rename
|
||||
else if (!input && c && keyup && !event.ctrlKey && (event.key === 'F2' || event.key === 'r')) {
|
||||
fileExplorer.cursorRename()
|
||||
}
|
||||
// Toggle selections on file explorer; ignore all spaces to prevent scrolling built-in hotkey
|
||||
else if (!input && c && event.code === 'Space') {
|
||||
if (keyup && !event.altKey && !event.ctrlKey)
|
||||
fileExplorer.cursorSelect()
|
||||
}
|
||||
else return
|
||||
/// We are handling this!
|
||||
event.preventDefault()
|
||||
if (timer) {
|
||||
clearTimeout(timer) // Good for either timeout or interval
|
||||
timer = null
|
||||
}
|
||||
let f: any
|
||||
switch (arrow) {
|
||||
case 'up': f = () => fileExplorer.up(event); break
|
||||
case 'down': f = () => fileExplorer.down(event); break
|
||||
case 'left': f = () => fileExplorer.left(event); break
|
||||
case 'right': f = () => fileExplorer.right(event); break
|
||||
}
|
||||
if (f && !keyup) {
|
||||
// Initial move, then t0 delay until repeats at tr intervals
|
||||
const t0 = 200, tr = event.altKey ? 20 : 100
|
||||
f()
|
||||
timer = setTimeout(() => { timer = setInterval(f, tr) }, t0 - tr)
|
||||
}
|
||||
}
|
||||
onMounted(() => {
|
||||
window.addEventListener('keydown', globalShortcutHandler)
|
||||
window.addEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
onUnmounted(() => {
|
||||
window.removeEventListener('keydown', globalShortcutHandler)
|
||||
window.removeEventListener('keyup', globalShortcutHandler)
|
||||
})
|
||||
export type { Path }
|
||||
</script>
|
||||
1
frontend/src/assets/logo.svg
Normal file
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="512" height="512" viewBox="0 0 512 512"><rect width="512" height="512" fill="#f80" rx="64" ry="64"/><path fill="#fff" d="M381 298h-84V167h-66L339 35l108 132h-66zm-168-84h-84v131H63l108 132 108-132h-66z"/></svg>
|
||||
|
After Width: | Height: | Size: 258 B |
@@ -14,7 +14,7 @@
|
||||
/* The following are overridden by responsive layouts */
|
||||
--root-font-size: 1rem;
|
||||
--header-font-size: 1rem;
|
||||
--header-height: calc(6.5 * var(--header-font-size));
|
||||
--header-height: 4rem;
|
||||
}
|
||||
@media (prefers-color-scheme: dark) {
|
||||
:root {
|
||||
@@ -37,12 +37,6 @@
|
||||
:root {
|
||||
--root-font-size: calc(8px + 8 * 100vw / 1000);
|
||||
}
|
||||
header .buttons:has(input[type='search']) > div {
|
||||
display: none;
|
||||
}
|
||||
header .buttons > div:has(input[type='search']) {
|
||||
display: inherit;
|
||||
}
|
||||
}
|
||||
@media screen and (min-width: 2000px) {
|
||||
:root {
|
||||
@@ -54,6 +48,7 @@
|
||||
:root {
|
||||
--header-font-size: calc(10px + 10 * 100vh / 600); /* 20px at 600px height */
|
||||
--root-font-size: 0.8rem;
|
||||
--header-height: 2rem;
|
||||
}
|
||||
header .breadcrumb > * {
|
||||
padding-top: calc(8 + 8 * 100vh / 600) !important;
|
||||
@@ -78,17 +73,13 @@
|
||||
}
|
||||
header {
|
||||
display: flex;
|
||||
flex-direction: row-reverse;
|
||||
justify-content: space-between;
|
||||
align-items: end;
|
||||
}
|
||||
header .breadcrumb {
|
||||
flex-shrink: 1;
|
||||
}
|
||||
header .breadcrumb > * {
|
||||
flex-shrink: 1;
|
||||
padding-top: 1rem !important;
|
||||
padding-bottom: 1rem !important;
|
||||
header .headermain { order: 1; }
|
||||
header .breadcrumb { align-self: stretch; }
|
||||
header .action-button {
|
||||
width: 2em;
|
||||
height: 2em;
|
||||
}
|
||||
}
|
||||
@media print {
|
||||
@@ -98,10 +89,10 @@
|
||||
--header-background: none;
|
||||
--header-color: black;
|
||||
}
|
||||
nav,
|
||||
.headermain,
|
||||
.menu,
|
||||
.rename-button {
|
||||
display: none;
|
||||
display: none !important;
|
||||
}
|
||||
.breadcrumb > a {
|
||||
color: black !important;
|
||||
@@ -116,16 +107,31 @@
|
||||
}
|
||||
.breadcrumb svg {
|
||||
fill: black !important;
|
||||
margin: 0 .5rem 0 1rem !important;
|
||||
}
|
||||
body#app {
|
||||
height: auto !important;
|
||||
}
|
||||
main {
|
||||
height: auto !important;
|
||||
padding-bottom: 0 !important;
|
||||
}
|
||||
thead tr {
|
||||
font-size: 1rem !important;
|
||||
position: static !important;
|
||||
background: none !important;
|
||||
border-bottom: 1pt solid black !important;
|
||||
}
|
||||
audio::-webkit-media-controls-timeline,
|
||||
video::-webkit-media-controls-timeline {
|
||||
display: none;
|
||||
}
|
||||
audio::-webkit-media-controls,
|
||||
video::-webkit-media-controls {
|
||||
display: none;
|
||||
}
|
||||
tr, figure {
|
||||
page-break-inside: avoid;
|
||||
}
|
||||
.selection {
|
||||
min-width: 0 !important;
|
||||
padding: 0 !important;
|
||||
@@ -142,14 +148,13 @@
|
||||
left: 0;
|
||||
}
|
||||
}
|
||||
* {
|
||||
box-sizing: border-box;
|
||||
}
|
||||
html {
|
||||
font-size: var(--root-font-size);
|
||||
overflow: hidden;
|
||||
}
|
||||
/* Hide scrollbar for all browsers */
|
||||
main::-webkit-scrollbar {
|
||||
display: none;
|
||||
}
|
||||
main {
|
||||
-ms-overflow-style: none; /* IE and Edge */
|
||||
scrollbar-width: none; /* Firefox */
|
||||
@@ -166,6 +171,7 @@ tbody .modified {
|
||||
font-family: 'Roboto Mono';
|
||||
}
|
||||
header {
|
||||
flex: 0 0 auto;
|
||||
background-color: var(--header-background);
|
||||
color: var(--header-color);
|
||||
font-size: var(--header-font-size);
|
||||
@@ -207,21 +213,23 @@ table {
|
||||
border: 0;
|
||||
gap: 0;
|
||||
}
|
||||
#app {
|
||||
height: 100%;
|
||||
body#app {
|
||||
height: 100vh;
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
}
|
||||
main {
|
||||
flex: 1 1 auto;
|
||||
padding-bottom: 3em; /* convenience space on the bottom */
|
||||
overflow-y: scroll;
|
||||
text-align: center;
|
||||
}
|
||||
|
||||
header nav.headermain {
|
||||
/* Position so that tooltips can appear on top of other positioned elements */
|
||||
position: relative;
|
||||
z-index: 100;
|
||||
}
|
||||
main {
|
||||
height: calc(100svh - var(--header-height));
|
||||
padding-bottom: 3em; /* convenience space on the bottom */
|
||||
overflow-y: scroll;
|
||||
}
|
||||
.spacer { flex-grow: 1 }
|
||||
.smallgap { flex-shrink: 1; width: 2em }
|
||||
|
||||
|
[Dozens of binary image assets, 91 B – 1.1 KiB each, listed with identical before/after sizes; no filenames are shown in this view.]
@@ -2,14 +2,19 @@
|
||||
<nav
|
||||
class="breadcrumb"
|
||||
aria-label="Breadcrumb"
|
||||
@keyup.left.stop="move(-1)"
|
||||
@keyup.right.stop="move(1)"
|
||||
@focus="move(0)"
|
||||
@keydown.left.stop="move(-1)"
|
||||
@keydown.right.stop="move(1)"
|
||||
@keyup.enter="move(0)"
|
||||
@focus=focusCurrent
|
||||
tabindex=0
|
||||
>
|
||||
<a href="#/"
|
||||
:ref="el => setLinkRef(0, el)"
|
||||
class="home"
|
||||
:class="{ current: !!isCurrent(0) }"
|
||||
:aria-current="isCurrent(0)"
|
||||
@click.prevent="navigate(0)"
|
||||
title="/"
|
||||
>
|
||||
<component :is="home" />
|
||||
</a>
|
||||
@@ -19,6 +24,7 @@
|
||||
:aria-current="isCurrent(index + 1)"
|
||||
@click.prevent="navigate(index + 1)"
|
||||
:ref="el => setLinkRef(index + 1, el)"
|
||||
:title="`/${longest.slice(0, index + 1).join('/')}`"
|
||||
>{{ location }}</a>
|
||||
</template>
|
||||
</nav>
|
||||
@@ -26,8 +32,9 @@
|
||||
|
||||
<script setup lang="ts">
|
||||
import home from '@/assets/svg/home.svg'
|
||||
import { onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { nextTick, onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { useRouter } from 'vue-router'
|
||||
import { exists } from '@/utils/fileutil'
|
||||
|
||||
const router = useRouter()
|
||||
|
||||
@@ -37,17 +44,33 @@ onBeforeUpdate(() => { links.length = 1 }) // 1 to keep home
|
||||
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
primary?: boolean
|
||||
}>()
|
||||
|
||||
const longest = ref<Array<string>>([])
|
||||
|
||||
const isCurrent = (index: number) => index == props.path.length ? 'location' : undefined
|
||||
|
||||
const focusCurrent = () => {
|
||||
nextTick(() => {
|
||||
const index = props.path.length
|
||||
if (index < links.length) links[index].focus()
|
||||
})
|
||||
}
|
||||
|
||||
const navigate = (index: number) => {
|
||||
const link = links[index]
|
||||
if (!link) throw Error(`No link at index ${index} (path: ${props.path})`)
|
||||
link.focus()
|
||||
router.replace(`/${longest.value.slice(0, index).join('/')}`)
|
||||
const url = index ? `/${longest.value.slice(0, index).join('/')}/` : '/'
|
||||
const long = longest.value.length ? `/${longest.value.join('/')}/` : '/'
|
||||
const browser = decodeURIComponent(location.hash.slice(1).split('//')[0])
|
||||
const u = url.replaceAll('?', '%3F').replaceAll('#', '%23')
|
||||
// Clicking on current link clears the rest of the path and adds new history
|
||||
if (isCurrent(index)) { longest.value.splice(index); router.push(u) }
|
||||
// Moving along breadcrumbs doesn't create new history
|
||||
else if (long.startsWith(browser)) router.replace(u)
|
||||
// Normal navigation from elsewhere (e.g. search result breadcrumbs)
|
||||
else router.push(u)
|
||||
}
|
||||
|
||||
const move = (dir: number) => {
|
||||
@@ -59,13 +82,25 @@ const move = (dir: number) => {
|
||||
watchEffect(() => {
|
||||
const longcut = longest.value.slice(0, props.path.length)
|
||||
const same = longcut.every((value, index) => value === props.path[index])
|
||||
// Navigated out of previous path, reset longest to current
|
||||
if (!same) longest.value = props.path
|
||||
else if (props.path.length > longcut.length) {
|
||||
longest.value = longcut.concat(props.path.slice(longcut.length))
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
if (links.length) navigate(props.path.length)
|
||||
else {
|
||||
// Prune deleted folders from longest
|
||||
for (let i = props.path.length; i < longest.value.length; ++i) {
|
||||
if (!exists(longest.value.slice(0, i + 1))) {
|
||||
longest.value = longest.value.slice(0, i)
|
||||
break
|
||||
}
|
||||
}
|
||||
}
|
||||
// If needed, focus primary navigation to new location
|
||||
if (props.primary) nextTick(() => {
|
||||
const act = document.activeElement as HTMLElement
|
||||
if (!act || [...links, document.body].includes(act)) focusCurrent()
|
||||
})
|
||||
})
|
||||
</script>
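The `navigate()` function above decides between `router.push` and `router.replace` based on whether the clicked crumb is the current one and whether the target lies on the already-visited path. A minimal sketch of that rule outside the component (the argument names and example paths are illustrative, not the component's actual state):

```ts
// Sketch of the push/replace rule used by navigate(), simplified.
// `current` marks a click on the crumb that is already active,
// `longPath` is the deepest path visited, `browserPath` the path from location.hash.
function historyAction(current: boolean, longPath: string, browserPath: string): 'push' | 'replace' {
  if (current) return 'push'                              // re-click: truncate and add a history entry
  if (longPath.startsWith(browserPath)) return 'replace'  // moving along known breadcrumbs
  return 'push'                                           // navigation from elsewhere (e.g. search results)
}

// Walking within the visited path replaces history, so Back still
// returns to wherever the user originally came from.
console.log(historyAction(false, '/photos/2023/', '/photos/2023/')) // 'replace'
console.log(historyAction(false, '/photos/2023/', '/docs/'))        // 'push'
```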
|
||||
|
||||
@@ -80,31 +115,36 @@ watchEffect(() => {
|
||||
--breadcrumb-transtime: 0.3s;
|
||||
}
|
||||
.breadcrumb {
|
||||
flex: 1 1 auto;
|
||||
display: flex;
|
||||
list-style: none;
|
||||
min-width: 20%;
|
||||
max-width: 100%;
|
||||
min-height: 2em;
|
||||
margin: 0;
|
||||
padding: 0 1em 0 0;
|
||||
overflow: hidden;
|
||||
}
|
||||
.breadcrumb > a {
|
||||
flex: 0 4 auto;
|
||||
display: flex;
|
||||
align-items: center;
|
||||
margin: 0 -0.5em 0 -0.5em;
|
||||
padding: 0;
|
||||
max-width: 8em;
|
||||
white-space: nowrap;
|
||||
text-overflow: ellipsis;
|
||||
overflow: hidden;
|
||||
height: 1.5em;
|
||||
color: var(--breadcrumb-color);
|
||||
padding: 0.3em 1.5em;
|
||||
clip-path: polygon(0 0, 1em 50%, 0 100%, 100% 100%, 100% 0, 0 0);
|
||||
transition: all var(--breadcrumb-transtime);
|
||||
}
|
||||
.breadcrumb a:first-child {
|
||||
margin-left: 0;
|
||||
padding-left: .2em;
|
||||
.breadcrumb > a:first-child {
|
||||
flex: 0 0 auto;
|
||||
padding-left: 1.5em;
|
||||
padding-right: 1.7em;
|
||||
clip-path: none;
|
||||
}
|
||||
.breadcrumb a:last-child {
|
||||
max-width: none;
|
||||
.breadcrumb > a:last-child {
|
||||
clip-path: polygon(
|
||||
0 0,
|
||||
calc(100% - 1em) 0,
|
||||
@@ -115,7 +155,7 @@ watchEffect(() => {
|
||||
0 0
|
||||
);
|
||||
}
|
||||
.breadcrumb a:only-child {
|
||||
.breadcrumb > a:only-child {
|
||||
clip-path: polygon(
|
||||
0 0,
|
||||
calc(100% - 1em) 0,
|
||||
@@ -127,9 +167,9 @@ watchEffect(() => {
|
||||
}
|
||||
.breadcrumb svg {
|
||||
/* FIXME: Custom positioning to align it well; needs proper solution */
|
||||
padding-left: 0.8em;
|
||||
width: 1.3em;
|
||||
height: 1.3em;
|
||||
margin: -.5em;
|
||||
fill: var(--breadcrumb-color);
|
||||
transition: fill var(--breadcrumb-transtime);
|
||||
}
|
||||
@@ -149,6 +189,6 @@ watchEffect(() => {
|
||||
}
|
||||
.breadcrumb a:hover { color: var(--breadcrumb-hover-color) }
|
||||
.breadcrumb a:hover svg { fill: var(--breadcrumb-hover-color) }
|
||||
.breadcrumb a.current { color: var(--accent-color) }
|
||||
.breadcrumb a.current { color: var(--accent-color); max-width: none; flex: 0 1 auto; }
|
||||
.breadcrumb a.current svg { fill: var(--accent-color) }
|
||||
</style>
|
||||
@@ -1,57 +1,51 @@
|
||||
<template>
|
||||
<template v-if="documentStore.selected.size">
|
||||
<div class="smallgap"></div>
|
||||
<p class="select-text">{{ documentStore.selected.size }} selected ➤</p>
|
||||
<SvgButton name="download" data-tooltip="Download" @click="download" />
|
||||
<SvgButton name="copy" data-tooltip="Copy here" @click="op('cp', dst)" />
|
||||
<SvgButton name="paste" data-tooltip="Move here" @click="op('mv', dst)" />
|
||||
<SvgButton name="trash" data-tooltip="Delete ⚠️" @click="op('rm')" />
|
||||
<button class="action-button unselect" data-tooltip="Unselect all" @click="documentStore.selected.clear()">❌</button>
|
||||
</template>
|
||||
<SvgButton name="download" data-tooltip="Download" @click="download" />
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import {connect, controlUrl} from '@/repositories/WS'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { computed } from 'vue'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import type { SelectedItems } from '@/repositories/Document'
|
||||
import { reactive } from 'vue';
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const props = defineProps({
|
||||
path: Array<string>
|
||||
})
|
||||
const store = useMainStore()
|
||||
|
||||
const dst = computed(() => props.path!.join('/'))
|
||||
const op = (op: string, dst?: string) => {
|
||||
const sel = documentStore.selectedFiles
|
||||
const msg = {
|
||||
op,
|
||||
sel: sel.keys.map(key => {
|
||||
const doc = sel.docs[key]
|
||||
return doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
})
|
||||
}
|
||||
// @ts-ignore
|
||||
if (dst !== undefined) msg.dst = dst
|
||||
const control = connect(controlUrl, {
|
||||
message(ev: MessageEvent) {
|
||||
const res = JSON.parse(ev.data)
|
||||
if ('error' in res) {
|
||||
console.error('Control socket error', msg, res.error)
|
||||
documentStore.error = res.error.message
|
||||
return
|
||||
} else if (res.status === 'ack') {
|
||||
console.log('Control ack OK', res)
|
||||
control.close()
|
||||
documentStore.selected.clear()
|
||||
return
|
||||
} else console.log('Unknown control response', msg, res)
|
||||
}
|
||||
})
|
||||
control.onopen = () => {
|
||||
control.send(JSON.stringify(msg))
|
||||
}
|
||||
const status_init = {
|
||||
total: 0,
|
||||
xfer: 0,
|
||||
t0: 0,
|
||||
tlast: 0,
|
||||
statbytes: 0,
|
||||
statdur: 0,
|
||||
files: [] as string[],
|
||||
filestart: 0,
|
||||
fileidx: 0,
|
||||
filecount: 0,
|
||||
filename: '',
|
||||
filesize: 0,
|
||||
filepos: 0,
|
||||
status: 'idle',
|
||||
}
|
||||
store.dprogress = {...status_init}
|
||||
setInterval(() => {
|
||||
if (Date.now() - store.dprogress.tlast > 3000) {
|
||||
// Reset
|
||||
store.dprogress.statbytes = 0
|
||||
store.dprogress.statdur = 1
|
||||
} else {
|
||||
// Running average by decay
|
||||
store.dprogress.statbytes *= .9
|
||||
store.dprogress.statdur *= .9
|
||||
}
|
||||
}, 100)
|
||||
const statReset = () => {
|
||||
Object.assign(store.dprogress, status_init)
|
||||
store.dprogress.t0 = Date.now()
|
||||
store.dprogress.tlast = store.dprogress.t0 + 1
|
||||
}
|
||||
const cancelDownloads = () => {
|
||||
location.reload() // FIXME
|
||||
}
|
||||
|
||||
|
||||
const linkdl = (href: string) => {
|
||||
const a = document.createElement('a')
|
||||
@@ -64,6 +58,12 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
|
||||
let hdir = ''
|
||||
let h = handle
|
||||
console.log('Downloading to filesystem', sel.recursive)
|
||||
for (const [rel, full, doc] of sel.recursive) {
|
||||
if (doc.dir) continue
|
||||
store.dprogress.files.push(rel)
|
||||
++store.dprogress.filecount
|
||||
store.dprogress.total += doc.size
|
||||
}
|
||||
for (const [rel, full, doc] of sel.recursive) {
|
||||
// Create any missing directories
|
||||
if (hdir && !rel.startsWith(hdir + '/')) {
|
||||
@@ -72,6 +72,7 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
|
||||
}
|
||||
const r = rel.slice(hdir.length)
|
||||
for (const dir of r.split('/').slice(0, doc.dir ? undefined : -1)) {
|
||||
if (!dir) continue
|
||||
hdir += `${dir}/`
|
||||
try {
|
||||
h = await h.getDirectoryHandle(dir.normalize('NFC'), { create: true })
|
||||
@@ -95,30 +96,49 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
|
||||
const url = `/files/${rel}`
|
||||
console.log('Fetching', url)
|
||||
const res = await fetch(url)
|
||||
if (!res.ok)
|
||||
if (!res.ok) {
|
||||
store.error = `Failed to download ${url}: ${res.status} ${res.statusText}`
|
||||
throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
|
||||
if (res.body) await res.body.pipeTo(writable)
|
||||
else {
|
||||
// Zero-sized files don't have a body, so we need to create an empty file
|
||||
await writable.truncate(0)
|
||||
await writable.close()
|
||||
}
|
||||
if (res.body) {
|
||||
++store.dprogress.fileidx
|
||||
const reader = res.body.getReader()
|
||||
await writable.truncate(0)
|
||||
store.error = "Direct download."
|
||||
store.dprogress.tlast = Date.now()
|
||||
while (true) {
|
||||
const { value, done } = await reader.read()
|
||||
if (done) break
|
||||
await writable.write(value)
|
||||
const now = Date.now()
|
||||
const size = value.byteLength
|
||||
store.dprogress.xfer += size
|
||||
store.dprogress.filepos += size
|
||||
store.dprogress.statbytes += size
|
||||
store.dprogress.statdur += now - store.dprogress.tlast
|
||||
store.dprogress.tlast = now
|
||||
}
|
||||
}
|
||||
await writable.close()
|
||||
console.log('Saved', hdir + name)
|
||||
}
|
||||
statReset()
|
||||
}
|
||||
|
||||
const download = async () => {
|
||||
const sel = documentStore.selectedFiles
|
||||
const sel = store.selectedFiles
|
||||
console.log('Download', sel)
|
||||
if (sel.keys.length === 0) {
|
||||
console.warn('Attempted download but no files found. Missing selected keys:', sel.missing)
|
||||
documentStore.selected.clear()
|
||||
store.error = 'No existing files selected'
|
||||
store.selected.clear()
|
||||
return
|
||||
}
|
||||
// Plain old a href download if only one file (ignoring any folders)
|
||||
const files = sel.recursive.filter(([rel, full, doc]) => !doc.dir)
|
||||
if (files.length === 1) {
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
store.error = "Single file via browser downloads"
|
||||
return linkdl(`/files/${files[0][1]}`)
|
||||
}
|
||||
// Use FileSystem API if multiple files and the browser supports it
|
||||
@@ -129,26 +149,23 @@ const download = async () => {
|
||||
startIn: 'downloads',
|
||||
mode: 'readwrite'
|
||||
})
|
||||
filesystemdl(sel, handle).then(() => {
|
||||
documentStore.selected.clear()
|
||||
})
|
||||
await filesystemdl(sel, handle)
|
||||
store.selected.clear()
|
||||
return
|
||||
} catch (e) {
|
||||
console.error('Download to folder aborted', e)
|
||||
}
|
||||
}
|
||||
// Otherwise, zip and download
|
||||
console.log("Falling back to zip download")
|
||||
const name = sel.keys.length === 1 ? sel.docs[sel.keys[0]].name : 'download'
|
||||
linkdl(`/zip/${Array.from(sel.keys).join('+')}/${name}.zip`)
|
||||
documentStore.selected.clear()
|
||||
store.error = "Downloading as ZIP via browser downloads"
|
||||
store.selected.clear()
|
||||
}
|
||||
|
||||
</script>
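For reference, the control-socket exchange driven by `op()` is a single JSON request answered by either an acknowledgement or an error. A sketch of the payload shape, using the field names from the code above; the example paths are made up:

```ts
// Hypothetical payload for moving two selected entries into another folder.
const request = {
  op: 'mv',                                        // 'cp', 'mv' or 'rm'
  sel: ['docs/report.pdf', 'docs/old/notes.txt'],  // paths built from doc.loc and doc.name
  dst: 'archive',                                  // omitted entirely for 'rm'
}

// The server replies with an acknowledgement or an error object.
type ControlReply =
  | { status: 'ack' }
  | { error: { message: string } }

function handleReply(res: ControlReply) {
  if ('error' in res) console.error('Control socket error', res.error.message)
  else console.log('Control ack OK')
}

handleReply(JSON.parse('{"status": "ack"}'))
```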
|
||||
|
||||
<style>
|
||||
.select-text {
|
||||
color: var(--accent-color);
|
||||
white-space: nowrap;
|
||||
overflow: hidden;
|
||||
text-overflow: ellipsis;
|
||||
}
|
||||
<style scoped>
|
||||
|
||||
</style>
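`filesystemdl` streams each `fetch` response into a `FileSystemWritableFileStream` chunk by chunk so that byte counts can feed the progress store. A stripped-down sketch of that pattern, omitting directory traversal and progress bookkeeping (browser support for the File System Access API is assumed):

```ts
// Minimal sketch: stream one URL into a file inside a user-picked directory.
async function saveUrlTo(dir: FileSystemDirectoryHandle, url: string, name: string) {
  const res = await fetch(url)
  if (!res.ok) throw new Error(`Failed to download ${url}: ${res.status} ${res.statusText}`)
  const file = await dir.getFileHandle(name, { create: true })
  const writable = await file.createWritable()
  await writable.truncate(0)                // zero-sized responses have no body
  if (res.body) {
    const reader = res.body.getReader()
    while (true) {
      const { value, done } = await reader.read()
      if (done) break
      await writable.write(value)           // progress counters would be updated here
    }
  }
  await writable.close()
}
```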
|
||||
38
frontend/src/components/EmptyFolder.vue
Normal file
@@ -0,0 +1,38 @@
<template>
  <div v-if="!props.path || documents.length === 0" class="empty-container">
    <component :is="cog" class="cog"/>
    <p v-if="!store.connected">No Connection</p>
    <p v-else-if="store.document.length === 0">Waiting for File List</p>
    <p v-else-if="store.query">No matches!</p>
    <p v-else-if="!exists(props.path)">Folder not found</p>
    <p v-else>Empty folder</p>
  </div>
</template>

<script setup lang="ts">
import { defineProps } from 'vue'
import { useMainStore } from '@/stores/main'
import cog from '@/assets/svg/cog.svg'
import { exists } from '@/utils/fileutil'

const store = useMainStore()
const props = defineProps<{
  path: string[],
  documents: Document[],
}>()
</script>

<style scoped>
@keyframes rotate {
  0% { transform: rotate(0deg); }
  100% { transform: rotate(360deg); }
}
svg.cog {
  width: 10rem;
  height: 10rem;
  margin: 0 auto;
  animation: rotate 10s linear infinite;
  filter: drop-shadow(0 0 1rem black);
  fill: #888;
}
</style>
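The template resolves its message in a fixed priority order: connection first, then file list, active search, folder existence, and finally a genuinely empty folder. The same order as a plain function, for clarity (the boolean flags stand in for the store fields used above):

```ts
// Sketch of the message priority used by EmptyFolder.vue's template.
function emptyMessage(connected: boolean, haveList: boolean, query: string, folderExists: boolean): string {
  if (!connected) return 'No Connection'
  if (!haveList) return 'Waiting for File List'
  if (query) return 'No matches!'
  if (!folderExists) return 'Folder not found'
  return 'Empty folder'
}
```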
@@ -3,34 +3,11 @@
|
||||
<thead>
|
||||
<tr>
|
||||
<th class="selection">
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
v-model="allSelected"
|
||||
:indeterminate="selectionIndeterminate"
|
||||
/>
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn"
|
||||
:class="{ sortactive: sort === 'name' }"
|
||||
@click="toggleSort('name')"
|
||||
>
|
||||
Name
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn modified right"
|
||||
:class="{ sortactive: sort === 'modified' }"
|
||||
@click="toggleSort('modified')"
|
||||
>
|
||||
Modified
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn size right"
|
||||
:class="{ sortactive: sort === 'size' }"
|
||||
@click="toggleSort('size')"
|
||||
>
|
||||
Size
|
||||
<input type="checkbox" tabindex="-1" v-model="allSelected" :indeterminate="selectionIndeterminate">
|
||||
</th>
|
||||
<th class="sortcolumn" :class="{ sortactive: store.sortOrder === 'name' }" @click="store.toggleSort('name')">Name</th>
|
||||
<th class="sortcolumn modified right" :class="{ sortactive: store.sortOrder === 'modified' }" @click="store.toggleSort('modified')">Modified</th>
|
||||
<th class="sortcolumn size right" :class="{ sortactive: store.sortOrder === 'size' }" @click="store.toggleSort('size')">Size</th>
|
||||
<th class="menu"></th>
|
||||
</tr>
|
||||
</thead>
|
||||
@@ -38,94 +15,50 @@
|
||||
<tr v-if="editing?.key === 'new'" class="folder">
|
||||
<td class="selection"></td>
|
||||
<td class="name">
|
||||
<FileRenameInput
|
||||
:doc="editing"
|
||||
:rename="mkdir"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/>
|
||||
<FileRenameInput :doc="editing" :rename="mkdir" :exit="() => {editing = null}" />
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time :datetime="new Date(editing.mtime).toISOString().replace('.000', '')">{{
|
||||
editing.modified
|
||||
}}</time>
|
||||
</td>
|
||||
<td class="size right">{{ editing.sizedisp }}</td>
|
||||
<FileModified :doc=editing :key=nowkey />
|
||||
<FileSize :doc=editing />
|
||||
<td class="menu"></td>
|
||||
</tr>
|
||||
<template
|
||||
v-for="(doc, index) in sortedDocuments"
|
||||
:key="doc.key">
|
||||
<template v-for="(doc, index) in documents" :key="doc.key">
|
||||
<tr class="folder-change" v-if="showFolderBreadcrumb(index)">
|
||||
<th colspan="5"><BreadCrumb :path="doc.loc ? doc.loc.split('/') : []" /></th>
|
||||
</tr>
|
||||
|
||||
<tr
|
||||
:id="`file-${doc.key}`"
|
||||
:class="{ file: !doc.dir, folder: doc.dir, cursor: cursor === doc }"
|
||||
@click="cursor = cursor === doc ? null : doc"
|
||||
:class="{ file: !doc.dir, folder: doc.dir, cursor: store.cursor === doc.key }"
|
||||
@click="store.cursor = store.cursor === doc.key ? '' : doc.key"
|
||||
@contextmenu.prevent="contextMenu($event, doc)"
|
||||
>
|
||||
<td class="selection" @click.up.stop="cursor = cursor === doc ? doc : null">
|
||||
<td class="selection" @click.up.stop="store.cursor = store.cursor === doc.key ? doc.key : ''">
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
:checked="documentStore.selected.has(doc.key)"
|
||||
:checked="store.selected.has(doc.key)"
|
||||
@change="
|
||||
($event.target as HTMLInputElement).checked
|
||||
? documentStore.selected.add(doc.key)
|
||||
: documentStore.selected.delete(doc.key)
|
||||
? store.selected.add(doc.key)
|
||||
: store.selected.delete(doc.key)
|
||||
"
|
||||
/>
|
||||
</td>
|
||||
<td class="name">
|
||||
<template v-if="editing === doc"
|
||||
><FileRenameInput
|
||||
:doc="doc"
|
||||
:rename="rename"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/></template>
|
||||
<template v-if="editing === doc">
|
||||
<FileRenameInput :doc="doc" :rename="rename" :exit="() => {editing = null}" />
|
||||
</template>
|
||||
<template v-else>
|
||||
<a
|
||||
:href="url_for(doc)"
|
||||
tabindex="-1"
|
||||
@contextmenu.prevent
|
||||
@focus.stop="cursor = doc"
|
||||
@blur="ev => { if (!editing) cursor = null }"
|
||||
@keyup.left="router.back()"
|
||||
@keyup.right.stop="ev => { if (doc.dir) (ev.target as HTMLElement).click() }"
|
||||
>{{ doc.name }}</a
|
||||
>
|
||||
<button
|
||||
v-if="cursor == doc"
|
||||
class="rename-button"
|
||||
@click="() => (editing = doc)"
|
||||
>
|
||||
🖊️
|
||||
</button>
|
||||
<a :href=doc.url tabindex=-1 @contextmenu.stop @focus.stop="store.cursor = doc.key">
|
||||
{{ doc.name }}
|
||||
</a>
|
||||
<button tabindex=-1 v-if="store.cursor == doc.key" class="rename-button" @click="() => (editing = doc)">🖊️</button>
|
||||
</template>
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time
|
||||
:data-tooltip="new Date(1000 * doc.mtime).toISOString().replace('T', '\n').replace('.000Z', ' UTC')"
|
||||
>{{ doc.modified }}</time
|
||||
>
|
||||
</td>
|
||||
<td class="size right">{{ doc.sizedisp }}</td>
|
||||
<FileModified :doc=doc :key=nowkey />
|
||||
<FileSize :doc=doc />
|
||||
<td class="menu">
|
||||
<button
|
||||
tabindex="-1"
|
||||
@click.stop="contextMenu($event, doc)"
|
||||
>
|
||||
⋮
|
||||
</button>
|
||||
<button tabindex=-1 @click.stop="contextMenu($event, doc)">⋮</button>
|
||||
</td>
|
||||
</tr>
|
||||
</template>
|
||||
@@ -136,35 +69,27 @@
|
||||
</tr>
|
||||
</tbody>
|
||||
</table>
|
||||
<div v-else class="empty-container">Nothing to see here</div>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { ref, computed, watchEffect } from 'vue'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import type { Document } from '@/repositories/Document'
|
||||
import { ref, computed, watchEffect, shallowRef, onMounted, onUnmounted } from 'vue'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import { Doc } from '@/repositories/Document'
|
||||
import FileRenameInput from './FileRenameInput.vue'
|
||||
import { connect, controlUrl } from '@/repositories/WS'
|
||||
import { collator, formatSize, formatUnixDate } from '@/utils'
|
||||
import { formatSize } from '@/utils'
|
||||
import { useRouter } from 'vue-router'
|
||||
import ContextMenu from '@imengyu/vue3-context-menu'
|
||||
|
||||
const props = withDefaults(
|
||||
defineProps<{
|
||||
path: Array<string>
|
||||
documents: Document[]
|
||||
}>(),
|
||||
{}
|
||||
)
|
||||
const documentStore = useDocumentStore()
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
documents: Doc[]
|
||||
}>()
|
||||
const store = useMainStore()
|
||||
const router = useRouter()
|
||||
const url_for = (doc: Document) => {
|
||||
const p = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
return doc.dir ? `#/${p}/` : `/files/${p}`
|
||||
}
|
||||
const cursor = ref<Document | null>(null)
|
||||
// File rename
|
||||
const editing = ref<Document | null>(null)
|
||||
const rename = (doc: Document, newName: string) => {
|
||||
const editing = shallowRef<Doc | null>(null)
|
||||
const rename = (doc: Doc, newName: string) => {
|
||||
const oldName = doc.name
|
||||
const control = connect(controlUrl, {
|
||||
message(ev: MessageEvent) {
|
||||
@@ -188,75 +113,71 @@ const rename = (doc: Document, newName: string) => {
|
||||
}
|
||||
doc.name = newName // We should get an update from watch but this is quicker
|
||||
}
|
||||
const sortedDocuments = computed(() => sorted(props.documents as Document[]))
|
||||
const showFolderBreadcrumb = (i: number) => {
|
||||
const docs = sortedDocuments.value
|
||||
const docloc = docs[i].loc
|
||||
return i === 0 ? docloc !== loc.value : docloc !== docs[i - 1].loc
|
||||
}
|
||||
defineExpose({
|
||||
newFolder() {
|
||||
const now = Date.now() / 1000
|
||||
editing.value = {
|
||||
console.log("New folder")
|
||||
const now = Math.floor(Date.now() / 1000)
|
||||
editing.value = new Doc({
|
||||
loc: loc.value,
|
||||
key: 'new',
|
||||
name: 'New Folder',
|
||||
type: 'folder',
|
||||
dir: true,
|
||||
mtime: now,
|
||||
size: 0,
|
||||
sizedisp: formatSize(0),
|
||||
modified: formatUnixDate(now),
|
||||
haystack: '',
|
||||
}
|
||||
console.log("New")
|
||||
})
|
||||
store.cursor = editing.value.key
|
||||
},
|
||||
toggleSelectAll() {
|
||||
console.log('Select')
|
||||
allSelected.value = !allSelected.value
|
||||
},
|
||||
toggleSortColumn(column: number) {
|
||||
const columns = ['', 'name', 'modified', 'size', '']
|
||||
toggleSort(columns[column])
|
||||
},
|
||||
isCursor() {
|
||||
return cursor.value !== null && editing.value === null
|
||||
return store.cursor && editing.value === null
|
||||
},
|
||||
cursorRename() {
|
||||
editing.value = cursor.value
|
||||
editing.value = props.documents.find(doc => doc.key === store.cursor) ?? null
|
||||
},
|
||||
cursorSelect() {
|
||||
const doc = cursor.value
|
||||
if (!doc) return
|
||||
if (documentStore.selected.has(doc.key)) {
|
||||
documentStore.selected.delete(doc.key)
|
||||
const key = store.cursor
|
||||
if (!key) return
|
||||
if (store.selected.has(key)) {
|
||||
store.selected.delete(key)
|
||||
} else {
|
||||
documentStore.selected.add(doc.key)
|
||||
store.selected.add(key)
|
||||
}
|
||||
this.cursorMove(1)
|
||||
this.cursorMove(1, null)
|
||||
},
|
||||
cursorMove(d: number, select = false) {
|
||||
up(ev: KeyboardEvent) { this.cursorMove(-1, ev) },
|
||||
down(ev: KeyboardEvent) { this.cursorMove(1, ev) },
|
||||
left(ev: KeyboardEvent) { router.back() },
|
||||
right(ev: KeyboardEvent) {
|
||||
const a = document.querySelector(`#file-${store.cursor} a`) as HTMLAnchorElement | null
|
||||
if (a) a.click()
|
||||
},
|
||||
cursorMove(d: number, ev: KeyboardEvent | null) {
|
||||
const select = !!ev?.shiftKey
|
||||
// Move cursor up or down (keyboard navigation)
|
||||
const documents = sortedDocuments.value
|
||||
if (documents.length === 0) {
|
||||
cursor.value = null
|
||||
const docs = props.documents
|
||||
if (docs.length === 0) {
|
||||
store.cursor = ''
|
||||
return
|
||||
}
|
||||
const N = documents.length
|
||||
const N = docs.length
|
||||
const mod = (a: number, b: number) => ((a % b) + b) % b
|
||||
const increment = (i: number, d: number) => mod(i + d, N + 1)
|
||||
const index =
|
||||
cursor.value !== null ? documents.indexOf(cursor.value) : documents.length
|
||||
store.cursor ? docs.findIndex(doc => doc.key === store.cursor) : docs.length
|
||||
const moveto = increment(index, d)
|
||||
cursor.value = documents[moveto] ?? null
|
||||
const tr = cursor.value ? document.getElementById(`file-${cursor.value.key}`) : null
|
||||
store.cursor = docs[moveto]?.key ?? ''
|
||||
const tr = store.cursor ? document.getElementById(`file-${store.cursor}`) : ''
|
||||
if (select) {
|
||||
// Go forwards, possibly wrapping over the end; the last entry is not toggled
|
||||
let [begin, end] = d > 0 ? [index, moveto] : [moveto, index]
|
||||
for (let p = begin; p !== end; p = increment(p, 1)) {
|
||||
if (p === N) continue
|
||||
const key = documents[p].key
|
||||
if (documentStore.selected.has(key)) documentStore.selected.delete(key)
|
||||
else documentStore.selected.add(key)
|
||||
const key = docs[p].key
|
||||
if (store.selected.has(key)) store.selected.delete(key)
|
||||
else store.selected.add(key)
|
||||
}
|
||||
}
|
||||
// @ts-ignore
|
||||
@@ -278,22 +199,36 @@ const focusBreadcrumb = () => {
|
||||
let scrolltimer: any = null
|
||||
let scrolltr: any = null
|
||||
watchEffect(() => {
|
||||
if (cursor.value && cursor.value !== editing.value) editing.value = null
|
||||
if (editing.value) cursor.value = editing.value
|
||||
if (cursor.value) {
|
||||
if (store.cursor && store.cursor !== editing.value?.key) editing.value = null
|
||||
if (editing.value) store.cursor = editing.value?.key
|
||||
if (store.cursor) {
|
||||
const a = document.querySelector(
|
||||
`#file-${cursor.value.key} .name a`
|
||||
`#file-${store.cursor} .name a`
|
||||
) as HTMLAnchorElement | null
|
||||
if (a) a.focus()
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
if (!props.documents.length && cursor.value) {
|
||||
cursor.value = null
|
||||
if (!props.documents.length && store.cursor) {
|
||||
store.cursor = ''
|
||||
focusBreadcrumb()
|
||||
}
|
||||
})
|
||||
const mkdir = (doc: Document, name: string) => {
|
||||
let nowkey = ref(0)
|
||||
let modifiedTimer: any = null
|
||||
const updateModified = () => {
|
||||
nowkey.value = Math.floor(Date.now() / 1000)
|
||||
}
|
||||
onMounted(() => {
|
||||
updateModified(); modifiedTimer = setInterval(updateModified, 1000)
|
||||
const active = document.querySelector('.cursor') as HTMLElement | null
|
||||
if (active) {
|
||||
active.scrollIntoView({ block: 'center', behavior: 'instant' })
|
||||
active.focus()
|
||||
}
|
||||
})
|
||||
onUnmounted(() => { clearInterval(modifiedTimer) })
|
||||
const mkdir = (doc: Doc, name: string) => {
|
||||
const control = connect(controlUrl, {
|
||||
open() {
|
||||
control.send(
|
||||
@@ -310,34 +245,24 @@ const mkdir = (doc: Document, name: string) => {
|
||||
editing.value = null
|
||||
} else {
|
||||
console.log('mkdir', msg)
|
||||
router.push(`/${doc.loc}/${name}/`)
|
||||
router.push(doc.urlrouter)
|
||||
}
|
||||
}
|
||||
})
|
||||
doc.name = name // We should get an update from watch but this is quicker
|
||||
// We should get an update from watch but this is quicker
|
||||
doc.name = name
|
||||
doc.key = crypto.randomUUID()
|
||||
}
|
||||
|
||||
// Column sort
|
||||
const toggleSort = (name: string) => {
|
||||
sort.value = sort.value === name ? '' : name
|
||||
}
|
||||
const sort = ref<string>('')
|
||||
const sortCompare = {
|
||||
name: (a: Document, b: Document) => collator.compare(a.name, b.name),
|
||||
modified: (a: Document, b: Document) => b.mtime - a.mtime,
|
||||
size: (a: Document, b: Document) => b.size - a.size
|
||||
}
|
||||
const sorted = (documents: Document[]) => {
|
||||
const cmp = sortCompare[sort.value as keyof typeof sortCompare]
|
||||
const sorted = [...documents]
|
||||
if (cmp) sorted.sort(cmp)
|
||||
return sorted
|
||||
const showFolderBreadcrumb = (i: number) => {
|
||||
const docs = props.documents
|
||||
const docloc = docs[i].loc
|
||||
return i === 0 ? docloc !== loc.value : docloc !== docs[i - 1].loc
|
||||
}
|
||||
const selectionIndeterminate = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.some((doc: Document) => documentStore.selected.has(doc.key)) &&
|
||||
props.documents.some((doc: Doc) => store.selected.has(doc.key)) &&
|
||||
!allSelected.value
|
||||
)
|
||||
},
|
||||
@@ -348,16 +273,16 @@ const allSelected = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.every((doc: Document) => documentStore.selected.has(doc.key))
|
||||
props.documents.every((doc: Doc) => store.selected.has(doc.key))
|
||||
)
|
||||
},
|
||||
set: (value: boolean) => {
|
||||
console.log('Setting allSelected', value)
|
||||
for (const doc of props.documents) {
|
||||
if (value) {
|
||||
documentStore.selected.add(doc.key)
|
||||
store.selected.add(doc.key)
|
||||
} else {
|
||||
documentStore.selected.delete(doc.key)
|
||||
store.selected.delete(doc.key)
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -365,9 +290,13 @@ const allSelected = computed({
|
||||
|
||||
const loc = computed(() => props.path.join('/'))
|
||||
|
||||
const contextMenu = (ev: Event, doc: Document) => {
|
||||
cursor.value = doc
|
||||
console.log('Context menu', ev, doc)
|
||||
const contextMenu = (ev: MouseEvent, doc: Doc) => {
|
||||
store.cursor = doc.key
|
||||
ContextMenu.showContextMenu({
|
||||
x: ev.x, y: ev.y, items: [
|
||||
{ label: 'Rename', onClick: () => { editing.value = doc } },
|
||||
],
|
||||
})
|
||||
}
|
||||
</script>
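`cursorMove` above works over N + 1 positions, the extra slot meaning 'no cursor', and wraps with a true modulo so stepping past either end is well defined. A small sketch of just that arithmetic, assuming only a list of row keys:

```ts
// Sketch of the wrap-around used by cursorMove: N documents plus one "no cursor" slot.
const mod = (a: number, b: number) => ((a % b) + b) % b

function nextCursor(keys: string[], cursor: string, d: number): string {
  const N = keys.length
  if (N === 0) return ''
  const index = cursor ? keys.findIndex(k => k === cursor) : N  // slot N = no cursor
  const moveto = mod(index + d, N + 1)
  return keys[moveto] ?? ''           // landing on slot N clears the cursor again
}

// Stepping down from the last row clears the cursor; one more step wraps to the top.
console.log(nextCursor(['a', 'b'], 'b', 1)) // ''
console.log(nextCursor(['a', 'b'], '', 1))  // 'a'
```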
|
||||
|
||||
@@ -385,29 +314,36 @@ tbody tr {
|
||||
position: relative;
|
||||
z-index: auto;
|
||||
}
|
||||
table thead input[type='checkbox'] {
|
||||
table thead .selection input[type='checkbox'] {
|
||||
position: inherit;
|
||||
width: 1em;
|
||||
height: 1em;
|
||||
padding: 0.5rem 0.5em;
|
||||
width: 1rem;
|
||||
height: 1rem;
|
||||
padding: 0;
|
||||
margin: auto;
|
||||
}
|
||||
table tbody input[type='checkbox'] {
|
||||
table tbody .selection input[type='checkbox'] {
|
||||
width: 2rem;
|
||||
height: 2rem;
|
||||
}
|
||||
table .selection {
|
||||
width: 2rem;
|
||||
width: 3rem;
|
||||
text-align: center;
|
||||
text-overflow: clip;
|
||||
padding: 0;
|
||||
}
|
||||
table .selection input {
|
||||
margin: auto;
|
||||
}
|
||||
table .modified {
|
||||
width: 8em;
|
||||
width: 10rem;
|
||||
text-overflow: clip;
|
||||
}
|
||||
table .size {
|
||||
width: 5em;
|
||||
width: 7rem;
|
||||
text-overflow: clip;
|
||||
}
|
||||
table .menu {
|
||||
width: 1rem;
|
||||
width: 2rem;
|
||||
}
|
||||
tbody td {
|
||||
font-size: 1.2rem;
|
||||
@@ -442,7 +378,7 @@ table td {
|
||||
}
|
||||
}
|
||||
thead tr {
|
||||
font-size: var(--header-font-size);
|
||||
font-size: 0.8rem;
|
||||
background: linear-gradient(to bottom, #eee, #fff 30%, #ddd);
|
||||
color: #000;
|
||||
box-shadow: 0 0 .2rem black;
|
||||
@@ -463,9 +399,11 @@ tbody tr.cursor {
|
||||
padding-right: 1.5rem;
|
||||
}
|
||||
.sortcolumn::after {
|
||||
font-size: 1rem;
|
||||
content: '▸';
|
||||
color: #888;
|
||||
margin-left: 0.5em;
|
||||
margin-left: 0.5rem;
|
||||
margin-top: -.2rem;
|
||||
position: absolute;
|
||||
transition: all var(--transition-time) linear;
|
||||
}
|
||||
@@ -515,3 +453,4 @@ tbody .selection input {
|
||||
color: #888;
|
||||
}
|
||||
</style>
|
||||
22
frontend/src/components/FileModified.vue
Normal file
@@ -0,0 +1,22 @@
<template>
  <td class="modified right">
    <time :data-tooltip=tooltip :datetime=datetime>{{ doc.modified }}</time>
  </td>
</template>

<script setup lang="ts">
import { Doc } from '@/repositories/Document'
import { computed } from 'vue'

const datetime = computed(() =>
  new Date(1000 * props.doc.mtime).toISOString().replace('.000Z', 'Z')
)

const tooltip = computed(() =>
  datetime.value.replace('T', '\n').replace('Z', ' UTC')
)

const props = defineProps<{
  doc: Doc
}>()
</script>
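The parent passes `:key=nowkey` and bumps `nowkey` every second (the `updateModified` timer in FileExplorer), so this cell re-renders as time passes. The same mechanism extracted into a hypothetical composable, as a sketch only:

```ts
// Sketch: bumping a reactive key forces Vue to re-create the keyed component,
// refreshing anything time-dependent that it renders.
import { ref, onMounted, onUnmounted } from 'vue'

export function useNowKey(intervalMs = 1000) {
  const nowkey = ref(Math.floor(Date.now() / 1000))
  let timer: ReturnType<typeof setInterval> | undefined
  onMounted(() => {
    timer = setInterval(() => { nowkey.value = Math.floor(Date.now() / 1000) }, intervalMs)
  })
  onUnmounted(() => { if (timer) clearInterval(timer) })
  return nowkey
}
// Usage in a template: <FileModified :doc="doc" :key="nowkey" />
```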
@@ -12,7 +12,7 @@
</template>

<script setup lang="ts">
import type { Document } from '@/repositories/Document'
import { Doc } from '@/repositories/Document'
import { ref, onMounted, nextTick } from 'vue'

const input = ref<HTMLInputElement | null>(null)
@@ -28,8 +28,8 @@ onMounted(() => {
})

const props = defineProps<{
  doc: Document
  rename: (doc: Document, newName: string) => void
  doc: Doc
  rename: (doc: Doc, newName: string) => void
  exit: () => void
}>()

@@ -46,8 +46,8 @@ const apply = () => {

<style>
input#FileRenameInput {
  color: var(--primary-color);
  background: var(--primary-background);
  color: var(--input-color);
  background: var(--input-background);
  border: 0;
  border-radius: 0.3rem;
  padding: 0.4rem;
@@ -56,4 +56,10 @@ input#FileRenameInput {
  outline: none;
  font: inherit;
}
.gallery input#FileRenameInput {
  padding: .75em;
  font-weight: 600;
  width: auto;
}

</style>
43
frontend/src/components/FileSize.vue
Normal file
@@ -0,0 +1,43 @@
<template>
  <td class="size right" :class=sizeClass>{{ doc.sizedisp }}</td>
</template>

<script setup lang="ts">
import { Doc } from '@/repositories/Document'
import { computed } from 'vue'

const sizeClass = computed(() => {
  const unit = props.doc.sizedisp.split('\u202F').slice(-1)[0]
  return +unit ? "bytes" : unit
})

const props = defineProps<{
  doc: Doc
}>()
</script>

<style scoped>
.size.empty { color: #555 }
.size.bytes { color: #77a }
.size.kB { color: #474 }
.size.MB { color: #a80 }
.size.GB { color: #f83 }
.size.TB, .size.PB, .size.EB, .size.huge {
  color: #f44;
  text-shadow: 0 0 .2em;
}

@media (prefers-color-scheme: dark) {
  .size.empty { color: #bbb }
  .size.bytes { color: #99d }
  .size.kB { color: #aea }
  .size.MB { color: #ff4 }
  .size.GB { color: #f86 }
  .size.TB, .size.PB, .size.EB, .size.huge { color: #f55 }
}

.cursor .size {
  color: inherit;
  text-shadow: none;
}
</style>
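`sizeClass` keys the colour off the unit suffix of `sizedisp`, which is assumed to separate the number and unit with a narrow no-break space (U+202F); a plain byte count has no suffix and maps to the `bytes` class. A few worked cases using the same logic:

```ts
// How sizeClass resolves for a few representative sizedisp strings.
const sizeClass = (sizedisp: string) => {
  const unit = sizedisp.split('\u202F').slice(-1)[0]
  return +unit ? 'bytes' : unit
}

console.log(sizeClass('432'))         // 'bytes' (no unit suffix, plain byte count)
console.log(sizeClass('1.9\u202FkB')) // 'kB'
console.log(sizeClass('12\u202FGB'))  // 'GB'
```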