Compare commits
37167a41a6...v0.5.0 (29 commits)
| SHA1 |
|---|
| c40c245ce6 |
| 1fdd00b833 |
| 520a9dff47 |
| c5c65d136a |
| 61f9026e23 |
| 3e50149d4d |
| 7077b21159 |
| 938c5ca657 |
| e0aef07783 |
| 36826a83c1 |
| 6880f82c19 |
| 5dd1bd9bdc |
| 41e8c78ecd |
| dc4bb494f3 |
| 9b58b887b4 |
| 07848907f3 |
| 7a08f7cbe2 |
| dd37238510 |
| c8d5f335b1 |
| bb80b3ee54 |
| 06d860c601 |
| c321de13fd |
| 278e8303c4 |
| 9854dd01cc |
| fb03fa5430 |
| e26cb8f70a |
| 9bbbc829a1 |
| 876d76bc1f |
| 4a53d0b8e2 |
README.md (119 lines changed)
@@ -1,25 +1,116 @@

# Web File Storage

# Cista Web Storage

Run directly from the repository with Hatch (or use pip install as usual):

```sh
hatch run cista -l :3000 /path/to/files
```

<img src="https://git.zi.fi/Vasanko/cista-storage/raw/branch/main/docs/cista.jpg" align=right width=250>

Cista takes its name from the ancient cistae, metal containers used by Greeks and Egyptians to safeguard valuable items. This modern application provides a browser interface for secure and accessible file storage, echoing the trust and reliability of its historical namesake.

This is a cutting-edge **file and document server** designed for speed, efficiency, and unparalleled ease of use. Experience **lightning-fast browsing**, thanks to the file list maintained directly in your browser and updated from server filesystem events, coupled with our highly optimized code. Fully **keyboard-navigable** and with a responsive layout, Cista flawlessly adapts to your devices, providing a seamless experience wherever you are. Our powerful **instant search** means you're always just a few keystrokes away from finding exactly what you need. Press **1/2/3** to switch ordering, navigate with all four arrow keys (+Shift to select), or click your way around on **breadcrumbs that remember where you were**.

The Cista project started as an inevitable remake of [Droppy](https://github.com/droppyjs/droppy), which we used and loved despite its numerous bugs. Cista Storage stands out in handling even the most exotic filenames, ensuring a smooth experience where others falter.

All of this is wrapped in an intuitive interface with automatic light and dark themes, making Cista Storage the ideal choice for anyone seeking a reliable, versatile, and quick file storage solution. Quickly set up your own Cista where your files are just a click away, safe, and always accessible.

Experience Cista by visiting [Cista Demo](https://drop.zi.fi) for a test run and perhaps upload something...

## Getting Started

### Installation

To install the cista application, use:

```fish
pip install cista
```

Settings, including these command-line arguments, are stored in the config file on the first startup, so later a plain `hatch run cista` is sufficient. If the `cista` script is missing, consider `pip install -e .` (within `hatch shell`) or some other trickery (a known issue with installs made prior to adding the startup script).

Note: Some Linux distributions might need `--break-system-packages` to install Python packages, which are safely installed in the user's home folder. As an alternative that avoids installation, run it with `pipx run cista`.

Create your user account:

```sh
hatch run cista --user admin --privileged
```

### Running the Server

Create an account (or run a public server without authentication):

```fish
cista --user yourname --privileged
```

## Build frontend

Serve your files at http://localhost:8000:

```fish
cista -l :8000 /path/to/files
```

Prebuilt frontend is provided in the repository, but for any changes it will need to be manually rebuilt:

The server remembers its settings in the config folder (default `~/.local/share/cista/`), including the listen port and directory, for future runs without arguments.

```sh
cd cista-front
```
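The settings mentioned above are written by the config helpers that show up in the `cista/__main__.py` and `cista/config.py` diffs further down. A minimal sketch of how they persist settings, hedged because these are internal helpers seen in this diff rather than a documented API:

```python
# Sketch only: config helpers as they appear in the backend diff below.
from pathlib import Path

from cista import config

if config.conffile.exists():
    config.load_config()                # pick up previously stored settings
operation = config.update_config({      # the settings persisted on first startup
    "listen": ":8000",
    "path": Path.home() / "Downloads",
    "public": True,
})
print(f"Config {operation}: {config.conffile}")  # e.g. "Config created: .../db.toml"
```

Normal use never needs this; the `cista` command-line flags shown above store the same settings.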

### Internet Access

To use your own TLS certificates, place them in the config folder and run:

```fish
cista -l cista.example.com
```

Most admins instead find the [Caddy](https://caddyserver.com/) web server convenient for its automatic TLS certificates and more. A proxy also allows running multiple web services or Cista instances on the same IP address. The Caddy configuration **/etc/caddy/Caddyfile** is dead simple:

```Caddyfile
cista.example.com {
    reverse_proxy :8000
}
```

## Development setup

For rapid development, we use the Vite development server for the Vue frontend, while the backend runs on port 8000 and Vite proxies API requests to it. Each server live-reloads whenever its code or configuration is modified.

```fish
cd frontend
npm install
npm run build
npm run dev
```

This will place the frontend in `cista/wwwroot`, from where the backend server delivers it, and that also gets included in the Python package built via `hatch build`.

Concurrently, start the backend in another terminal:

```fish
hatch shell
pip install -e '.[dev]'
cista --dev -l :8000 /path/to/files
```

We use `hatch shell` for installing into a virtual environment, so as not to disturb the rest of the system with our hacking.

Vue is used to build the files in `cista/wwwroot`, included prebuilt in the Python package. Running `hatch build` builds the frontend and creates a NodeJS-independent Python package.

## System Deployment

This setup allows easy addition of storages, each with its own domain, configuration, and files.

It assumes a restricted user account **storage** for serving files, with cista installed system-wide or on that account (check with `sudo -u storage -s`). Alternatively, use `pipx run cista` or `hatch run cista` as the ExecStart command.

Create **/etc/systemd/system/cista@.service**:

```ini
[Unit]
Description=Cista storage %i

[Service]
User=storage
ExecStart=cista -c /srv/cista/%i -l /srv/cista/%i/socket /media/storage/@%i/
Restart=always

[Install]
WantedBy=multi-user.target
```

This setup supports multiple storages, each under `/media/storage/<domain>` for files and `/srv/cista/<domain>/` for configuration. UNIX sockets are used instead of numeric ports for convenience.

```fish
systemctl daemon-reload
systemctl enable --now cista@foo.example.com
systemctl enable --now cista@bar.example.com
```

Public exposure is easiest using the Caddy web server, but Nginx or others also work. Run the server with `-l domain.example.com` if you have TLS certificates in the config folder.

**/etc/caddy/Caddyfile**:

```Caddyfile
foo.example.com, bar.example.com {
    reverse_proxy unix//srv/cista/{host}/socket
}
```
Binary image (before: 4.2 KiB)
@@ -1,241 +0,0 @@
|
||||
<!DOCTYPE html>
|
||||
<title>Storage</title>
|
||||
<style>
|
||||
body {
|
||||
font-family: sans-serif;
|
||||
max-width: 100ch;
|
||||
margin: 0 auto;
|
||||
padding: 1em;
|
||||
background-color: #333;
|
||||
color: #eee;
|
||||
}
|
||||
td {
|
||||
text-align: right;
|
||||
padding: .5em;
|
||||
}
|
||||
td:first-child {
|
||||
text-align: left;
|
||||
}
|
||||
a {
|
||||
color: inherit;
|
||||
text-decoration: none;
|
||||
}
|
||||
</style>
|
||||
<div>
|
||||
<h2>Quick file upload</h2>
|
||||
<p>Uses parallel WebSocket connections for increased bandwidth /api/upload</p>
|
||||
<input type=file id=fileInput>
|
||||
<progress id=progressBar value=0 max=1></progress>
|
||||
</div>
|
||||
|
||||
<div>
|
||||
<h2>Files</h2>
|
||||
<ul id=file_list></ul>
|
||||
</div>
|
||||
|
||||
<script>
|
||||
let files = {}
|
||||
let flatfiles = {}
|
||||
|
||||
function createWatchSocket() {
|
||||
const wsurl = new URL("/api/watch", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.update) {
|
||||
tree_update(msg.update)
|
||||
file_list(files)
|
||||
} else {
|
||||
console.log("Unkonwn message from watch socket", msg)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
createWatchSocket()
|
||||
|
||||
function tree_update(msg) {
|
||||
console.log("Tree update", msg)
|
||||
let node = files
|
||||
for (const elem of msg) {
|
||||
if (elem.deleted) {
|
||||
const p = node.dir[elem.name].path
|
||||
delete node.dir[elem.name]
|
||||
delete flatfiles[p]
|
||||
break
|
||||
}
|
||||
if (elem.name !== undefined) node = node.dir[elem.name] ||= {}
|
||||
if (elem.size !== undefined) node.size = elem.size
|
||||
if (elem.mtime !== undefined) node.mtime = elem.mtime
|
||||
if (elem.dir !== undefined) node.dir = elem.dir
|
||||
}
|
||||
// Update paths and flatfiles
|
||||
files.path = "/"
|
||||
const nodes = [files]
|
||||
flatfiles = {}
|
||||
while (node = nodes.pop()) {
|
||||
flatfiles[node.path] = node
|
||||
if (node.dir === undefined) continue
|
||||
for (const name of Object.keys(node.dir)) {
|
||||
const child = node.dir[name]
|
||||
child.path = node.path + name + (child.dir === undefined ? "" : "/")
|
||||
nodes.push(child)
|
||||
}
|
||||
}
|
||||
}
|
||||
|
||||
var collator = new Intl.Collator(undefined, {numeric: true, sensitivity: 'base'});
|
||||
|
||||
const compare_path = (a, b) => collator.compare(a.path, b.path)
|
||||
const compare_time = (a, b) => a.mtime > b.mtime
|
||||
|
||||
function file_list(files) {
|
||||
const table = document.getElementById("file_list")
|
||||
const sorted = Object.values(flatfiles).sort(compare_time)
|
||||
table.innerHTML = ""
|
||||
for (const f of sorted) {
|
||||
const {path, size, mtime} = f
|
||||
const tr = document.createElement("tr")
|
||||
const name_td = document.createElement("td")
|
||||
const size_td = document.createElement("td")
|
||||
const mtime_td = document.createElement("td")
|
||||
const a = document.createElement("a")
|
||||
table.appendChild(tr)
|
||||
tr.appendChild(name_td)
|
||||
tr.appendChild(size_td)
|
||||
tr.appendChild(mtime_td)
|
||||
name_td.appendChild(a)
|
||||
size_td.textContent = size
|
||||
mtime_td.textContent = formatUnixDate(mtime)
|
||||
a.textContent = path
|
||||
a.href = `/files${path}`
|
||||
/*a.onclick = event => {
|
||||
if (window.showSaveFilePicker) {
|
||||
event.preventDefault()
|
||||
download_ws(name, size)
|
||||
}
|
||||
}
|
||||
a.download = ""*/
|
||||
}
|
||||
}
|
||||
|
||||
function formatUnixDate(t) {
|
||||
const date = new Date(t * 1000)
|
||||
const now = new Date()
|
||||
const diff = date - now
|
||||
const formatter = new Intl.RelativeTimeFormat('en', { numeric: 'auto' })
|
||||
|
||||
if (Math.abs(diff) <= 60000) {
|
||||
return formatter.format(Math.round(diff / 1000), 'second')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 3600000) {
|
||||
return formatter.format(Math.round(diff / 60000), 'minute')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 86400000) {
|
||||
return formatter.format(Math.round(diff / 3600000), 'hour')
|
||||
}
|
||||
|
||||
if (Math.abs(diff) <= 604800000) {
|
||||
return formatter.format(Math.round(diff / 86400000), 'day')
|
||||
}
|
||||
|
||||
return date.toLocaleDateString()
|
||||
}
|
||||
|
||||
async function download_ws(name, size) {
|
||||
const fh = await window.showSaveFilePicker({
|
||||
suggestedName: name,
|
||||
})
|
||||
const writer = await fh.createWritable()
|
||||
writer.truncate(size)
|
||||
const wsurl = new URL("/api/download", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
let pos = 0
|
||||
ws.onopen = () => {
|
||||
console.log("Downloading over WebSocket", name, size)
|
||||
ws.send(JSON.stringify({name, start: 0, end: size, size}))
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
if (typeof event.data === 'string') {
|
||||
const msg = JSON.parse(event.data)
|
||||
console.log("Download finished", msg)
|
||||
ws.close()
|
||||
return
|
||||
}
|
||||
console.log("Received chunk", name, pos, pos + event.data.size)
|
||||
pos += event.data.size
|
||||
writer.write(event.data)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
if (pos < size) {
|
||||
console.log("Download aborted", name, pos)
|
||||
writer.truncate(pos)
|
||||
}
|
||||
writer.close()
|
||||
}
|
||||
}
|
||||
|
||||
const fileInput = document.getElementById("fileInput")
|
||||
const progress = document.getElementById("progressBar")
|
||||
const numConnections = 2
|
||||
const chunkSize = 1<<20
|
||||
const wsConnections = new Set()
|
||||
|
||||
//for (let i = 0; i < numConnections; i++) createUploadWS()
|
||||
|
||||
function createUploadWS() {
|
||||
const wsurl = new URL("/api/upload", location.href.replace(/^http/, 'ws'))
|
||||
const ws = new WebSocket(wsurl)
|
||||
ws.binaryType = 'arraybuffer'
|
||||
ws.onopen = () => {
|
||||
wsConnections.add(ws)
|
||||
console.log("Upload socket connected")
|
||||
}
|
||||
ws.onmessage = event => {
|
||||
msg = JSON.parse(event.data)
|
||||
if (msg.written) progress.value += +msg.written
|
||||
else console.log(`Error: ${msg.error}`)
|
||||
}
|
||||
ws.onclose = () => {
|
||||
wsConnections.delete(ws)
|
||||
console.log("Upload socket disconnected, reconnecting...")
|
||||
setTimeout(createUploadWS, 1000)
|
||||
}
|
||||
}
|
||||
|
||||
async function load(file, start, end) {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise(resolve => reader.onload = resolve)
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
return event.target.result
|
||||
}
|
||||
|
||||
async function sendChunk(file, start, end, ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
ws.send(JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
}))
|
||||
ws.send(chunk)
|
||||
}
|
||||
|
||||
fileInput.addEventListener("change", async function() {
|
||||
const file = this.files[0]
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
progress.value = 0
|
||||
progress.max = file.size
|
||||
|
||||
console.log(wsConnections)
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const ws = Array.from(wsConnections)[i % wsConnections.size]
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end, ws)
|
||||
}
|
||||
})
|
||||
|
||||
</script>
|
||||
@@ -1,52 +0,0 @@
|
||||
<template>
|
||||
<object
|
||||
v-if="props.type === 'pdf'"
|
||||
:data="dataURL"
|
||||
type="application/pdf"
|
||||
width="100%"
|
||||
height="100%"
|
||||
></object>
|
||||
<a-image
|
||||
v-else-if="props.type === 'image'"
|
||||
width="50%"
|
||||
:src="dataURL"
|
||||
@click="() => setVisible(true)"
|
||||
:previewMask="false"
|
||||
:preview="{
|
||||
visibleImg,
|
||||
onVisibleChange: setVisible
|
||||
}"
|
||||
/>
|
||||
<!-- Unknown case -->
|
||||
<h1 v-else>Unsupported file type</h1>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { watchEffect, ref } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
import { url_document_get } from '@/repositories/Document'
|
||||
|
||||
const dataURL = ref('')
|
||||
watchEffect(() => {
|
||||
dataURL.value = new URL(
|
||||
url_document_get + Router.currentRoute.value.path,
|
||||
location.origin
|
||||
).toString()
|
||||
})
|
||||
const emit = defineEmits({
|
||||
visibleImg(value: boolean) {
|
||||
return value
|
||||
}
|
||||
})
|
||||
|
||||
function setVisible(value: boolean) {
|
||||
emit('visibleImg', value)
|
||||
}
|
||||
|
||||
const props = defineProps<{
|
||||
type?: string
|
||||
visibleImg: boolean
|
||||
}>()
|
||||
</script>
|
||||
|
||||
<style></style>
|
||||
@@ -1,27 +0,0 @@
|
||||
<template>
|
||||
<template v-for="upload in documentStore.uploadingDocuments" :key="upload.key">
|
||||
<span>{{ upload.name }}</span>
|
||||
<div class="progress-container">
|
||||
<a-progress :percent="upload.progress" />
|
||||
<CloseCircleOutlined class="close-button" @click="dismissUpload(upload.key)" />
|
||||
</div>
|
||||
</template>
|
||||
</template>
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
const documentStore = useDocumentStore()
|
||||
|
||||
function dismissUpload(key: number) {
|
||||
documentStore.deleteUploadingDocument(key)
|
||||
}
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
.progress-container {
|
||||
display: flex;
|
||||
align-items: center;
|
||||
}
|
||||
.close-button:hover {
|
||||
color: #b81414;
|
||||
}
|
||||
</style>
|
||||
@@ -1,101 +0,0 @@
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { h, ref } from 'vue'
|
||||
|
||||
const fileUploadButton = ref()
|
||||
const folderUploadButton = ref()
|
||||
const documentStore = useDocumentStore()
|
||||
const open = (placement: any) => openNotification(placement)
|
||||
|
||||
const isNotificationOpen = ref(false)
|
||||
const openNotification = (placement: any) => {
|
||||
if (!isNotificationOpen.value) {
|
||||
/*
|
||||
api.open({
|
||||
message: `Uploading documents`,
|
||||
description: h(NotificationLoading),
|
||||
placement,
|
||||
duration: 0,
|
||||
onClose: () => { isNotificationOpen.value = false }
|
||||
});*/
|
||||
isNotificationOpen.value = true
|
||||
}
|
||||
}
|
||||
|
||||
function uploadFileHandler() {
|
||||
fileUploadButton.value.click()
|
||||
}
|
||||
|
||||
async function load(file: File, start: number, end: number): Promise<ArrayBuffer> {
|
||||
const reader = new FileReader()
|
||||
const load = new Promise<Event>(resolve => (reader.onload = resolve))
|
||||
reader.readAsArrayBuffer(file.slice(start, end))
|
||||
const event = await load
|
||||
if (event.target && event.target instanceof FileReader) {
|
||||
return event.target.result as ArrayBuffer
|
||||
} else {
|
||||
throw new Error('Error loading file')
|
||||
}
|
||||
}
|
||||
|
||||
async function sendChunk(file: File, start: number, end: number) {
|
||||
const ws = documentStore.wsUpload
|
||||
if (ws) {
|
||||
const chunk = await load(file, start, end)
|
||||
|
||||
ws.send(
|
||||
JSON.stringify({
|
||||
name: file.name,
|
||||
size: file.size,
|
||||
start: start,
|
||||
end: end
|
||||
})
|
||||
)
|
||||
ws.send(chunk)
|
||||
}
|
||||
}
|
||||
|
||||
async function uploadHandler(event: Event) {
|
||||
const target = event.target as HTMLInputElement
|
||||
const chunkSize = 1 << 20
|
||||
if (!target?.files?.length) {
|
||||
documentStore.error = 'No files selected'
|
||||
return
|
||||
}
|
||||
for (const idx in target.files) {
|
||||
const file = target.files[idx]
|
||||
console.log('Uploading', file)
|
||||
const numChunks = Math.ceil(file.size / chunkSize)
|
||||
const document = documentStore.pushUploadingDocuments(file.name)
|
||||
open('bottomRight')
|
||||
for (let i = 0; i < numChunks; i++) {
|
||||
const start = i * chunkSize
|
||||
const end = Math.min(file.size, start + chunkSize)
|
||||
const res = await sendChunk(file, start, end)
|
||||
console.log('progress: ' + (100 * (i + 1)) / numChunks)
|
||||
console.log('Num Chunks: ' + numChunks)
|
||||
documentStore.updateUploadingDocuments(document.key, (100 * (i + 1)) / numChunks)
|
||||
}
|
||||
}
|
||||
}
|
||||
</script>
|
||||
<template>
|
||||
<template>
|
||||
<input
|
||||
ref="fileUploadButton"
|
||||
@change="uploadHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
multiple
|
||||
/>
|
||||
<input
|
||||
ref="folderUploadButton"
|
||||
@change="uploadHandler"
|
||||
class="upload-input"
|
||||
type="file"
|
||||
webkitdirectory
|
||||
/>
|
||||
</template>
|
||||
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileUploadButton.click()" />
|
||||
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderUploadButton.click()" />
|
||||
</template>
|
||||
@@ -1,55 +0,0 @@
|
||||
export type FUID = string
|
||||
|
||||
export type Document = {
|
||||
loc: string
|
||||
name: string
|
||||
key: FUID
|
||||
size: number
|
||||
sizedisp: string
|
||||
mtime: number
|
||||
modified: string
|
||||
haystack: string
|
||||
dir: boolean
|
||||
}
|
||||
|
||||
export type errorEvent = {
|
||||
error: {
|
||||
code: number
|
||||
message: string
|
||||
redirect: string
|
||||
}
|
||||
}
|
||||
|
||||
// Raw types the backend /api/watch sends us
|
||||
|
||||
export type FileEntry = {
|
||||
key: FUID
|
||||
size: number
|
||||
mtime: number
|
||||
}
|
||||
|
||||
export type DirEntry = {
|
||||
key: FUID
|
||||
size: number
|
||||
mtime: number
|
||||
dir: DirList
|
||||
}
|
||||
|
||||
export type DirList = Record<string, FileEntry | DirEntry>
|
||||
|
||||
export type UpdateEntry = {
|
||||
name: string
|
||||
deleted?: boolean
|
||||
key?: FUID
|
||||
size?: number
|
||||
mtime?: number
|
||||
dir?: DirList
|
||||
}
|
||||
|
||||
// Helper structure for selections
|
||||
export interface SelectedItems {
|
||||
keys: FUID[]
|
||||
docs: Record<FUID, Document>
|
||||
recursive: Array<[string, string, Document]>
|
||||
missing: Set<FUID>
|
||||
}
|
||||
@@ -1,183 +0,0 @@
|
||||
import type {
|
||||
Document,
|
||||
DirEntry,
|
||||
FileEntry,
|
||||
FUID,
|
||||
SelectedItems
|
||||
} from '@/repositories/Document'
|
||||
import { formatSize, formatUnixDate, haystackFormat } from '@/utils'
|
||||
import { defineStore } from 'pinia'
|
||||
import { collator } from '@/utils'
|
||||
import { logoutUser } from '@/repositories/User'
|
||||
import { watchConnect } from '@/repositories/WS'
|
||||
|
||||
type FileData = { id: string; mtime: number; size: number; dir: DirectoryData }
|
||||
type DirectoryData = {
|
||||
[filename: string]: FileData
|
||||
}
|
||||
type User = {
|
||||
username: string
|
||||
privileged: boolean
|
||||
isOpenLoginModal: boolean
|
||||
isLoggedIn: boolean
|
||||
}
|
||||
|
||||
export const useDocumentStore = defineStore({
|
||||
id: 'documents',
|
||||
state: () => ({
|
||||
document: [] as Document[],
|
||||
search: "" as string,
|
||||
selected: new Set<FUID>(),
|
||||
uploadingDocuments: [],
|
||||
uploadCount: 0 as number,
|
||||
fileExplorer: null,
|
||||
error: '' as string,
|
||||
connected: false,
|
||||
user: {
|
||||
username: '',
|
||||
privileged: false,
|
||||
isLoggedIn: false,
|
||||
isOpenLoginModal: false
|
||||
} as User
|
||||
}),
|
||||
persist: {
|
||||
storage: sessionStorage,
|
||||
paths: ['document'],
|
||||
},
|
||||
actions: {
|
||||
updateRoot(root: DirEntry | null = null) {
|
||||
if (!root) {
|
||||
this.document = []
|
||||
return
|
||||
}
|
||||
// Transform tree data to flat documents array
|
||||
let loc = ""
|
||||
const mapper = ([name, attr]: [string, FileEntry | DirEntry]) => ({
|
||||
...attr,
|
||||
loc,
|
||||
name,
|
||||
sizedisp: formatSize(attr.size),
|
||||
modified: formatUnixDate(attr.mtime),
|
||||
haystack: haystackFormat(name),
|
||||
})
|
||||
const queue = [...Object.entries(root.dir ?? {}).map(mapper)]
|
||||
const docs = []
|
||||
for (let doc; (doc = queue.shift()) !== undefined;) {
|
||||
docs.push(doc)
|
||||
if ("dir" in doc) {
|
||||
// Recurse but replace recursive structure with boolean
|
||||
loc = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
queue.push(...Object.entries(doc.dir).map(mapper))
|
||||
// @ts-ignore
|
||||
doc.dir = true
|
||||
}
|
||||
// @ts-ignore
|
||||
else doc.dir = false
|
||||
}
|
||||
// Pre sort directory entries folders first then files, names in natural ordering
|
||||
docs.sort((a, b) =>
|
||||
// @ts-ignore
|
||||
b.dir - a.dir ||
|
||||
collator.compare(a.name, b.name)
|
||||
)
|
||||
this.document = docs as Document[]
|
||||
},
|
||||
updateUploadingDocuments(key: number, progress: number) {
|
||||
for (const d of this.uploadingDocuments) {
|
||||
if (d.key === key) d.progress = progress
|
||||
}
|
||||
},
|
||||
pushUploadingDocuments(name: string) {
|
||||
this.uploadCount++
|
||||
const document = {
|
||||
key: this.uploadCount,
|
||||
name: name,
|
||||
progress: 0
|
||||
}
|
||||
this.uploadingDocuments.push(document)
|
||||
return document
|
||||
},
|
||||
deleteUploadingDocument(key: number) {
|
||||
this.uploadingDocuments = this.uploadingDocuments.filter(e => e.key !== key)
|
||||
},
|
||||
updateModified() {
|
||||
for (const d of this.document) {
|
||||
if ('mtime' in d) d.modified = formatUnixDate(d.mtime)
|
||||
}
|
||||
},
|
||||
login(username: string, privileged: boolean) {
|
||||
this.user.username = username
|
||||
this.user.privileged = privileged
|
||||
this.user.isLoggedIn = true
|
||||
this.user.isOpenLoginModal = false
|
||||
if (!this.connected) watchConnect()
|
||||
},
|
||||
loginDialog() {
|
||||
this.user.isOpenLoginModal = true
|
||||
},
|
||||
async logout() {
|
||||
console.log("Logout")
|
||||
await logoutUser()
|
||||
this.$reset()
|
||||
history.go() // Reload page
|
||||
}
|
||||
},
|
||||
getters: {
|
||||
isUserLogged(): boolean {
|
||||
return this.user.isLoggedIn
|
||||
},
|
||||
recentDocuments(): Document[] {
|
||||
const ret = [...this.document]
|
||||
ret.sort((a, b) => b.mtime - a.mtime)
|
||||
return ret
|
||||
},
|
||||
largeDocuments(): Document[] {
|
||||
const ret = [...this.document]
|
||||
ret.sort((a, b) => b.size - a.size)
|
||||
return ret
|
||||
},
|
||||
selectedFiles(): SelectedItems {
|
||||
const selected = this.selected
|
||||
const found = new Set<FUID>()
|
||||
const ret: SelectedItems = {
|
||||
missing: new Set(),
|
||||
docs: {},
|
||||
keys: [],
|
||||
recursive: [],
|
||||
}
|
||||
for (const doc of this.document) {
|
||||
if (selected.has(doc.key)) {
|
||||
found.add(doc.key)
|
||||
ret.keys.push(doc.key)
|
||||
ret.docs[doc.key] = doc
|
||||
}
|
||||
}
|
||||
// What did we not select?
|
||||
for (const key of selected) if (!found.has(key)) ret.missing.add(key)
|
||||
// Build a flat list including contents recursively
|
||||
const relnames = new Set<string>()
|
||||
function add(rel: string, full: string, doc: Document) {
|
||||
if (!doc.dir && relnames.has(rel)) throw Error(`Multiple selections conflict for: ${rel}`)
|
||||
relnames.add(rel)
|
||||
ret.recursive.push([rel, full, doc])
|
||||
}
|
||||
for (const key of ret.keys) {
|
||||
const base = ret.docs[key]
|
||||
const basepath = base.loc ? `${base.loc}/${base.name}` : base.name
|
||||
const nremove = base.loc.length
|
||||
add(base.name, basepath, base)
|
||||
for (const doc of this.document) {
|
||||
if (doc.loc === basepath || doc.loc.startsWith(basepath) && doc.loc[basepath.length] === '/') {
|
||||
const full = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
const rel = full.slice(nremove)
|
||||
add(rel, full, doc)
|
||||
}
|
||||
}
|
||||
}
|
||||
// Sort by rel (name stored as on download)
|
||||
ret.recursive.sort((a, b) => collator.compare(a[0], b[0]))
|
||||
|
||||
return ret
|
||||
}
|
||||
}
|
||||
})
|
||||
@@ -62,12 +62,15 @@ def _main():
|
||||
_confdir(args)
|
||||
exists = config.conffile.exists()
|
||||
import_droppy = args["--import-droppy"]
|
||||
necessary_opts = exists or import_droppy or path and listen
|
||||
necessary_opts = exists or import_droppy or path
|
||||
if not necessary_opts:
|
||||
# Maybe run without arguments
|
||||
print(doc)
|
||||
print(
|
||||
"No config file found! Get started with:\n cista -l :8000 /path/to/files, or\n cista -l example.com --import-droppy # Uses Droppy files\n",
|
||||
"No config file found! Get started with one of:\n"
|
||||
" cista --user yourname --privileged\n"
|
||||
" cista --import-droppy\n"
|
||||
" cista -l :8000 /path/to/files\n"
|
||||
)
|
||||
return 1
|
||||
settings = {}
|
||||
@@ -79,8 +82,15 @@ def _main():
|
||||
settings = droppy.readconf()
|
||||
if path:
|
||||
settings["path"] = path
|
||||
elif not exists:
|
||||
settings["path"] = Path.home() / "Downloads"
|
||||
if listen:
|
||||
settings["listen"] = listen
|
||||
elif not exists:
|
||||
settings["listen"] = ":8000"
|
||||
if not exists and not import_droppy:
|
||||
# We have no users, so make it public
|
||||
settings["public"] = True
|
||||
operation = config.update_config(settings)
|
||||
print(f"Config {operation}: {config.conffile}")
|
||||
# Prepare to serve
|
||||
@@ -105,18 +115,30 @@ def _confdir(args):
|
||||
if confdir.exists() and not confdir.is_dir():
|
||||
if confdir.name != config.conffile.name:
|
||||
raise ValueError("Config path is not a directory")
|
||||
# Accidentally pointed to the cista.toml, use parent
|
||||
# Accidentally pointed to the db.toml, use parent
|
||||
confdir = confdir.parent
|
||||
config.conffile = config.conffile.with_parent(confdir)
|
||||
config.conffile = confdir / config.conffile.name
|
||||
|
||||
|
||||
def _user(args):
|
||||
_confdir(args)
|
||||
config.load_config()
|
||||
if config.conffile.exists():
|
||||
config.load_config()
|
||||
operation = False
|
||||
else:
|
||||
# Defaults for new config when user is created
|
||||
operation = config.update_config(
|
||||
{
|
||||
"listen": ":8000",
|
||||
"path": Path.home() / "Downloads",
|
||||
"public": False,
|
||||
}
|
||||
)
|
||||
print(f"Config {operation}: {config.conffile}\n")
|
||||
|
||||
name = args["--user"]
|
||||
if not name or not name.isidentifier():
|
||||
raise ValueError("Invalid username")
|
||||
config.load_config()
|
||||
u = config.config.users.get(name)
|
||||
info = f"User {name}" if u else f"New user {name}"
|
||||
changes = {}
|
||||
@@ -128,12 +150,17 @@ def _user(args):
|
||||
info += " (admin)" if oldadmin else ""
|
||||
if args["--password"] or not u:
|
||||
changes["password"] = pw = pwgen.generate()
|
||||
info += f"\n Password: {pw}"
|
||||
res = config.update_user(args["--user"], changes)
|
||||
info += f"\n Password: {pw}\n"
|
||||
res = config.update_user(name, changes)
|
||||
print(info)
|
||||
if res == "read":
|
||||
print(" No changes")
|
||||
|
||||
if operation == "created":
|
||||
print(
|
||||
"Now you can run the server:\n cista # defaults set: -l :8000 ~/Downloads\n"
|
||||
)
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
sys.exit(main())
|
||||
|
||||
cista/api.py (20 lines changed)
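The first hunk below reworks how `/api/upload` consumes the WebSocket stream: the client sends a JSON `FileRange` header (`name`, `size`, `start`, `end`) and then the raw bytes for that range, repeating until the file is done. For orientation, a minimal client sketch of that protocol, hedged because it assumes the third-party `websockets` package and a server on localhost, neither of which this diff specifies:

```python
# Sketch of the /api/upload chunk protocol, assuming the `websockets` package.
import asyncio
import json
from pathlib import Path

import websockets  # pip install websockets (assumption, not part of cista)


async def upload(path: str, url: str = "ws://localhost:8000/api/upload") -> None:
    data = Path(path).read_bytes()
    chunk = 1 << 20  # 1 MiB ranges, matching the frontend uploaders above
    async with websockets.connect(url) as ws:
        for start in range(0, len(data), chunk):
            end = min(len(data), start + chunk)
            # JSON FileRange header first, then the raw bytes for that range
            await ws.send(json.dumps({
                "name": Path(path).name,
                "size": len(data),
                "start": start,
                "end": end,
            }))
            await ws.send(data[start:end])
            print(await ws.recv())  # server reports progress/errors as JSON


asyncio.run(upload("example.txt"))
```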
@@ -37,10 +37,18 @@ async def upload(req, ws):
    )
    req = msgspec.json.decode(text, type=FileRange)
    pos = req.start
    data = None
    while pos < req.end and (data := await ws.recv()) and isinstance(data, bytes):
    while True:
        data = await ws.recv()
        if not isinstance(data, bytes):
            break
        if len(data) > req.end - pos:
            raise ValueError(
                f"Expected up to {req.end - pos} bytes, got {len(data)} bytes"
            )
        sentsize = await alink(("upload", req.name, pos, data, req.size))
        pos += typing.cast(int, sentsize)
        if pos >= req.end:
            break
    if pos != req.end:
        d = f"{len(data)} bytes" if isinstance(data, bytes) else data
        raise ValueError(f"Expected {req.end - pos} more bytes, got {d}")
@@ -88,7 +96,7 @@ async def watch(req, ws):
        msgspec.json.encode(
            {
                "server": {
                    "name": "Cista", # Should be configurable
                    "name": config.config.name or config.config.path.name,
                    "version": __version__,
                    "public": config.config.public,
                },
@@ -103,11 +111,11 @@ async def watch(req, ws):
    )
    uuid = token_bytes(16)
    try:
        with watching.tree_lock:
        with watching.state.lock:
            q = watching.pubsub[uuid] = asyncio.Queue()
            # Init with disk usage and full tree
            await ws.send(watching.format_du())
            await ws.send(watching.format_tree())
            await ws.send(watching.format_space(watching.state.space))
            await ws.send(watching.format_root(watching.state.root))
            # Send updates
            while True:
                await ws.send(await q.get())
cista/app.py (129 lines changed)
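One pattern worth noting in the `cista/app.py` diff below: the ZIP download worker thread now pushes each chunk into an asyncio queue with `run_coroutine_threadsafe(...).result()`, so the thread blocks until the event loop has accepted the chunk, which gives natural backpressure. A stripped-down sketch of that thread-to-loop bridge, independent of Sanic and with made-up chunk data:

```python
# Minimal sketch of the thread -> asyncio queue bridge used for ZIP streaming.
import asyncio


def worker(loop: asyncio.AbstractEventLoop, queue: asyncio.Queue) -> None:
    # Runs in a thread pool; the blocking .result() call provides backpressure.
    for chunk in (b"chunk1", b"chunk2", b"chunk3"):  # stand-in for stream_zip output
        asyncio.run_coroutine_threadsafe(queue.put(chunk), loop).result()
    asyncio.run_coroutine_threadsafe(queue.put(None), loop).result()  # end marker


async def main() -> None:
    loop = asyncio.get_running_loop()
    queue: asyncio.Queue = asyncio.Queue(maxsize=2)
    task = loop.run_in_executor(None, worker, loop, queue)
    while (chunk := await queue.get()) is not None:
        print("send", chunk)  # in cista this is `await res.send(chunk)`
    await task


asyncio.run(main())
```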
@@ -1,10 +1,8 @@
|
||||
import asyncio
|
||||
import datetime
|
||||
import mimetypes
|
||||
from collections import deque
|
||||
from concurrent.futures import ThreadPoolExecutor
|
||||
from importlib.resources import files
|
||||
from pathlib import Path
|
||||
from pathlib import Path, PurePath, PurePosixPath
|
||||
from stat import S_IFDIR, S_IFREG
|
||||
from urllib.parse import unquote
|
||||
from wsgiref.handlers import format_date_time
|
||||
@@ -12,7 +10,6 @@ from wsgiref.handlers import format_date_time
|
||||
import brotli
|
||||
import sanic.helpers
|
||||
from blake3 import blake3
|
||||
from natsort import natsorted, ns
|
||||
from sanic import Blueprint, Sanic, empty, raw
|
||||
from sanic.exceptions import Forbidden, NotFound
|
||||
from sanic.log import logging
|
||||
@@ -20,7 +17,6 @@ from stream_zip import ZIP_AUTO, stream_zip
|
||||
|
||||
from cista import auth, config, session, watching
|
||||
from cista.api import bp
|
||||
from cista.protocol import DirEntry
|
||||
from cista.util.apphelpers import handle_sanic_exception
|
||||
|
||||
# Workaround until Sanic PR #2824 is merged
|
||||
@@ -36,7 +32,9 @@ app.exception(Exception)(handle_sanic_exception)
|
||||
async def main_start(app, loop):
|
||||
config.load_config()
|
||||
await watching.start(app, loop)
|
||||
app.ctx.threadexec = ThreadPoolExecutor(max_workers=8)
|
||||
app.ctx.threadexec = ThreadPoolExecutor(
|
||||
max_workers=8, thread_name_prefix="cista-ioworker"
|
||||
)
|
||||
|
||||
|
||||
@app.after_server_stop
|
||||
@@ -49,8 +47,8 @@ async def main_stop(app, loop):
|
||||
async def use_session(req):
|
||||
req.ctx.session = session.get(req)
|
||||
try:
|
||||
req.ctx.username = req.ctx.session["username"]
|
||||
req.ctx.user = config.config.users[req.ctx.session["username"]] # type: ignore
|
||||
req.ctx.username = req.ctx.session["username"] # type: ignore
|
||||
req.ctx.user = config.config.users[req.ctx.username]
|
||||
except (AttributeError, KeyError, TypeError):
|
||||
req.ctx.username = None
|
||||
req.ctx.user = None
|
||||
@@ -81,22 +79,16 @@ def http_fileserver(app, _):
|
||||
www = {}
|
||||
|
||||
|
||||
@app.before_server_start
|
||||
async def load_wwwroot(*_ignored):
|
||||
global www
|
||||
www = await asyncio.get_event_loop().run_in_executor(None, _load_wwwroot, www)
|
||||
|
||||
|
||||
def _load_wwwroot(www):
|
||||
wwwnew = {}
|
||||
base = files("cista") / "wwwroot"
|
||||
paths = ["."]
|
||||
base = Path(__file__).with_name("wwwroot")
|
||||
paths = [PurePath()]
|
||||
while paths:
|
||||
path = paths.pop(0)
|
||||
current = base / path
|
||||
for p in current.iterdir():
|
||||
if p.is_dir():
|
||||
paths.append(current / p.parts[-1])
|
||||
paths.append(p.relative_to(base))
|
||||
continue
|
||||
name = p.relative_to(base).as_posix()
|
||||
mime = mimetypes.guess_type(name)[0] or "application/octet-stream"
|
||||
@@ -127,15 +119,44 @@ def _load_wwwroot(www):
|
||||
if len(br) >= len(data):
|
||||
br = False
|
||||
wwwnew[name] = data, br, headers
|
||||
if not wwwnew:
|
||||
msg = f"Web frontend missing from {base}\n Did you forget: hatch build\n"
|
||||
if not www:
|
||||
logging.warning(msg)
|
||||
if not app.debug:
|
||||
msg = "Web frontend missing. Cista installation is broken.\n"
|
||||
wwwnew[""] = (
|
||||
msg.encode(),
|
||||
False,
|
||||
{
|
||||
"etag": "error",
|
||||
"content-type": "text/plain",
|
||||
"cache-control": "no-store",
|
||||
},
|
||||
)
|
||||
return wwwnew
|
||||
|
||||
|
||||
@app.add_task
|
||||
@app.before_server_start
|
||||
async def start(app):
|
||||
await load_wwwroot(app)
|
||||
if app.debug:
|
||||
app.add_task(refresh_wwwroot())
|
||||
|
||||
|
||||
async def load_wwwroot(app):
|
||||
global www
|
||||
www = await asyncio.get_event_loop().run_in_executor(
|
||||
app.ctx.threadexec, _load_wwwroot, www
|
||||
)
|
||||
|
||||
|
||||
async def refresh_wwwroot():
|
||||
while True:
|
||||
await asyncio.sleep(0.5)
|
||||
try:
|
||||
wwwold = www
|
||||
await load_wwwroot()
|
||||
await load_wwwroot(app)
|
||||
changes = ""
|
||||
for name in sorted(www):
|
||||
attr = www[name]
|
||||
@@ -151,7 +172,6 @@ async def refresh_wwwroot():
|
||||
print("Error loading wwwroot", e)
|
||||
if not app.debug:
|
||||
return
|
||||
await asyncio.sleep(0.5)
|
||||
|
||||
|
||||
@app.route("/<path:path>", methods=["GET", "HEAD"])
|
||||
@@ -166,66 +186,70 @@ async def wwwroot(req, path=""):
|
||||
return empty(304, headers=headers)
|
||||
# Brotli compressed?
|
||||
if br and "br" in req.headers.accept_encoding.split(", "):
|
||||
headers = {
|
||||
**headers,
|
||||
"content-encoding": "br",
|
||||
}
|
||||
headers = {**headers, "content-encoding": "br"}
|
||||
data = br
|
||||
return raw(data, headers=headers)
|
||||
|
||||
|
||||
def get_files(wanted: set) -> list[tuple[PurePosixPath, Path]]:
|
||||
loc = PurePosixPath()
|
||||
idx = 0
|
||||
ret = []
|
||||
level: int | None = None
|
||||
parent: PurePosixPath | None = None
|
||||
with watching.state.lock:
|
||||
root = watching.state.root
|
||||
while idx < len(root):
|
||||
f = root[idx]
|
||||
loc = PurePosixPath(*loc.parts[: f.level - 1]) / f.name
|
||||
if parent is not None and f.level <= level:
|
||||
level = parent = None
|
||||
if f.key in wanted:
|
||||
level, parent = f.level, loc.parent
|
||||
if parent is not None:
|
||||
wanted.discard(f.key)
|
||||
ret.append((loc.relative_to(parent), watching.rootpath / loc))
|
||||
idx += 1
|
||||
return ret
|
||||
|
||||
|
||||
@app.get("/zip/<keys>/<zipfile:ext=zip>")
|
||||
async def zip_download(req, keys, zipfile, ext):
|
||||
"""Download a zip archive of the given keys"""
|
||||
|
||||
wanted = set(keys.split("+"))
|
||||
with watching.tree_lock:
|
||||
q = deque([([], None, watching.tree[""].dir)])
|
||||
files = []
|
||||
while q:
|
||||
locpar, relpar, d = q.pop()
|
||||
for name, attr in d.items():
|
||||
loc = [*locpar, name]
|
||||
rel = None
|
||||
if relpar or attr.key in wanted:
|
||||
rel = [*relpar, name] if relpar else [name]
|
||||
wanted.discard(attr.key)
|
||||
isdir = isinstance(attr, DirEntry)
|
||||
if isdir:
|
||||
q.append((loc, rel, attr.dir))
|
||||
if rel:
|
||||
files.append(
|
||||
("/".join(rel), Path(watching.rootpath.joinpath(*loc)))
|
||||
)
|
||||
files = get_files(wanted)
|
||||
|
||||
if not files:
|
||||
raise NotFound(
|
||||
"No files found",
|
||||
context={"keys": keys, "zipfile": zipfile, "wanted": wanted},
|
||||
context={"keys": keys, "zipfile": f"{zipfile}.{ext}", "wanted": wanted},
|
||||
)
|
||||
if wanted:
|
||||
raise NotFound("Files not found", context={"missing": wanted})
|
||||
|
||||
files = natsorted(files, key=lambda f: f[0], alg=ns.IGNORECASE)
|
||||
|
||||
def local_files(files):
|
||||
for rel, p in files:
|
||||
s = p.stat()
|
||||
size = s.st_size
|
||||
modified = datetime.datetime.fromtimestamp(s.st_mtime, datetime.UTC)
|
||||
name = rel.as_posix()
|
||||
if p.is_dir():
|
||||
yield rel, modified, S_IFDIR | 0o755, ZIP_AUTO(size), b""
|
||||
yield f"{name}/", modified, S_IFDIR | 0o755, ZIP_AUTO(size), iter(b"")
|
||||
else:
|
||||
yield rel, modified, S_IFREG | 0o644, ZIP_AUTO(size), contents(p)
|
||||
yield name, modified, S_IFREG | 0o644, ZIP_AUTO(size), contents(p, size)
|
||||
|
||||
def contents(name):
|
||||
def contents(name, size):
|
||||
with name.open("rb") as f:
|
||||
while chunk := f.read(65536):
|
||||
while size > 0 and (chunk := f.read(min(size, 1 << 20))):
|
||||
size -= len(chunk)
|
||||
yield chunk
|
||||
assert size == 0
|
||||
|
||||
def worker():
|
||||
try:
|
||||
for chunk in stream_zip(local_files(files)):
|
||||
asyncio.run_coroutine_threadsafe(queue.put(chunk), loop)
|
||||
asyncio.run_coroutine_threadsafe(queue.put(chunk), loop).result()
|
||||
except Exception:
|
||||
logging.exception("Error streaming ZIP")
|
||||
raise
|
||||
@@ -238,7 +262,10 @@ async def zip_download(req, keys, zipfile, ext):
|
||||
thread = loop.run_in_executor(app.ctx.threadexec, worker)
|
||||
|
||||
# Stream the response
|
||||
res = await req.respond(content_type="application/zip")
|
||||
res = await req.respond(
|
||||
content_type="application/zip",
|
||||
headers={"cache-control": "no-store"},
|
||||
)
|
||||
while chunk := await queue.get():
|
||||
await res.send(chunk)
|
||||
|
||||
|
||||
@@ -68,10 +68,10 @@ def verify(request, *, privileged=False):
|
||||
if request.ctx.user:
|
||||
if request.ctx.user.privileged:
|
||||
return
|
||||
raise Forbidden("Access Forbidden: Only for privileged users")
|
||||
raise Forbidden("Access Forbidden: Only for privileged users", quiet=True)
|
||||
elif config.config.public or request.ctx.user:
|
||||
return
|
||||
raise Unauthorized("Login required", "cookie")
|
||||
raise Unauthorized("Login required", "cookie", quiet=True)
|
||||
|
||||
|
||||
bp = Blueprint("auth")
|
||||
|
||||
@@ -1,6 +1,7 @@
|
||||
from __future__ import annotations
|
||||
|
||||
import secrets
|
||||
import sys
|
||||
from functools import wraps
|
||||
from hashlib import sha256
|
||||
from pathlib import Path, PurePath
|
||||
@@ -14,6 +15,7 @@ class Config(msgspec.Struct):
|
||||
listen: str
|
||||
secret: str = secrets.token_hex(12)
|
||||
public: bool = False
|
||||
name: str = ""
|
||||
users: dict[str, User] = {}
|
||||
links: dict[str, Link] = {}
|
||||
|
||||
@@ -89,6 +91,8 @@ def config_update(modify):
|
||||
return "read"
|
||||
f.write(new)
|
||||
f.close()
|
||||
if sys.platform == "win32":
|
||||
conffile.unlink() # Windows doesn't support atomic replace
|
||||
tmpname.rename(conffile) # Atomic replace
|
||||
except:
|
||||
f.close()
|
||||
@@ -134,7 +138,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
|
||||
# Encode into dict, update values with new, convert to Config
|
||||
try:
|
||||
u = conf.users[name].__copy__()
|
||||
except KeyError:
|
||||
except (KeyError, AttributeError):
|
||||
u = User()
|
||||
if "password" in changes:
|
||||
from . import auth
|
||||
@@ -143,7 +147,7 @@ def update_user(conf: Config, name: str, changes: dict) -> Config:
|
||||
del changes["password"]
|
||||
udict = msgspec.to_builtins(u, enc_hook=enc_hook)
|
||||
udict.update(changes)
|
||||
settings = msgspec.to_builtins(conf, enc_hook=enc_hook)
|
||||
settings = msgspec.to_builtins(conf, enc_hook=enc_hook) if conf else {"users": {}}
|
||||
settings["users"][name] = msgspec.convert(udict, User, dec_hook=dec_hook)
|
||||
return msgspec.convert(settings, Config, dec_hook=dec_hook)
|
||||
|
||||
|
||||
@@ -34,9 +34,11 @@ class File:
|
||||
self.open_rw()
|
||||
assert self.fd is not None
|
||||
if file_size is not None:
|
||||
assert pos + len(buffer) <= file_size
|
||||
os.ftruncate(self.fd, file_size)
|
||||
os.lseek(self.fd, pos, os.SEEK_SET)
|
||||
os.write(self.fd, buffer)
|
||||
if buffer:
|
||||
os.lseek(self.fd, pos, os.SEEK_SET)
|
||||
os.write(self.fd, buffer)
|
||||
|
||||
def __getitem__(self, slice):
|
||||
if self.fd is None:
|
||||
|
||||
@@ -22,7 +22,7 @@ class MkDir(ControlBase):
|
||||
|
||||
def __call__(self):
|
||||
path = config.config.path / filename.sanitize(self.path)
|
||||
path.mkdir(parents=False, exist_ok=False)
|
||||
path.mkdir(parents=True, exist_ok=False)
|
||||
|
||||
|
||||
class Rename(ControlBase):
|
||||
@@ -112,56 +112,40 @@ class ErrorMsg(msgspec.Struct):
|
||||
## Directory listings
|
||||
|
||||
|
||||
class FileEntry(msgspec.Struct):
|
||||
key: str
|
||||
size: int
|
||||
mtime: int
|
||||
|
||||
|
||||
class DirEntry(msgspec.Struct):
|
||||
key: str
|
||||
size: int
|
||||
mtime: int
|
||||
dir: DirList
|
||||
|
||||
def __getitem__(self, name):
|
||||
return self.dir[name]
|
||||
|
||||
def __setitem__(self, name, value):
|
||||
self.dir[name] = value
|
||||
|
||||
def __contains__(self, name):
|
||||
return name in self.dir
|
||||
|
||||
def __delitem__(self, name):
|
||||
del self.dir[name]
|
||||
|
||||
@property
|
||||
def props(self):
|
||||
return {k: v for k, v in self.__struct_fields__ if k != "dir"}
|
||||
|
||||
|
||||
DirList = dict[str, FileEntry | DirEntry]
|
||||
|
||||
|
||||
class UpdateEntry(msgspec.Struct, omit_defaults=True):
|
||||
"""Updates the named entry in the tree. Fields that are set replace old values. A list of entries recurses directories."""
|
||||
|
||||
class FileEntry(msgspec.Struct, array_like=True):
|
||||
level: int
|
||||
name: str
|
||||
key: str
|
||||
deleted: bool = False
|
||||
size: int | None = None
|
||||
mtime: int | None = None
|
||||
dir: DirList | None = None
|
||||
mtime: int
|
||||
size: int
|
||||
isfile: int
|
||||
|
||||
def __repr__(self):
|
||||
return self.key or "FileEntry()"
|
||||
|
||||
|
||||
def make_dir_data(root):
|
||||
if len(root) == 3:
|
||||
return FileEntry(*root)
|
||||
id_, size, mtime, listing = root
|
||||
converted = {}
|
||||
for name, data in listing.items():
|
||||
converted[name] = make_dir_data(data)
|
||||
sz = sum(x.size for x in converted.values())
|
||||
mt = max(x.mtime for x in converted.values())
|
||||
return DirEntry(id_, sz, max(mt, mtime), converted)
|
||||
class Update(msgspec.Struct, array_like=True):
|
||||
...
|
||||
|
||||
|
||||
class UpdKeep(Update, tag="k"):
|
||||
count: int
|
||||
|
||||
|
||||
class UpdDel(Update, tag="d"):
|
||||
count: int
|
||||
|
||||
|
||||
class UpdIns(Update, tag="i"):
|
||||
items: list[FileEntry]
|
||||
|
||||
|
||||
class UpdateMessage(msgspec.Struct):
|
||||
update: list[UpdKeep | UpdDel | UpdIns]
|
||||
|
||||
|
||||
class Space(msgspec.Struct):
|
||||
disk: int
|
||||
free: int
|
||||
usage: int
|
||||
storage: int
|
||||
|
||||
@@ -1,6 +1,6 @@
|
||||
import os
|
||||
import re
|
||||
from pathlib import Path, PurePath
|
||||
from pathlib import Path
|
||||
|
||||
from sanic import Sanic
|
||||
|
||||
@@ -15,7 +15,6 @@ def run(*, dev=False):
|
||||
# Silence Sanic's warning about running in production rather than debug
|
||||
os.environ["SANIC_IGNORE_PRODUCTION_WARNING"] = "1"
|
||||
confdir = config.conffile.parent
|
||||
wwwroot = PurePath(__file__).parent / "wwwroot"
|
||||
if opts.get("ssl"):
|
||||
# Run plain HTTP redirect/acme server on port 80
|
||||
server80.app.prepare(port=80, motd=False)
|
||||
@@ -27,7 +26,7 @@ def run(*, dev=False):
|
||||
motd=False,
|
||||
dev=dev,
|
||||
auto_reload=dev,
|
||||
reload_dir={confdir, wwwroot},
|
||||
reload_dir={confdir},
|
||||
access_log=True,
|
||||
) # type: ignore
|
||||
if dev:
|
||||
|
||||
@@ -1,20 +1,117 @@
|
||||
import asyncio
|
||||
import shutil
|
||||
import stat
|
||||
import sys
|
||||
import threading
|
||||
import time
|
||||
from os import stat_result
|
||||
from pathlib import Path, PurePosixPath
|
||||
|
||||
import msgspec
|
||||
from natsort import humansorted, natsort_keygen, ns
|
||||
from sanic.log import logging
|
||||
|
||||
from cista import config
|
||||
from cista.fileio import fuid
|
||||
from cista.protocol import DirEntry, FileEntry, UpdateEntry
|
||||
from cista.protocol import FileEntry, Space, UpdDel, UpdIns, UpdKeep
|
||||
|
||||
pubsub = {}
|
||||
tree = {"": None}
|
||||
tree_lock = threading.Lock()
|
||||
sortkey = natsort_keygen(alg=ns.LOCALE)
|
||||
|
||||
|
||||
class State:
|
||||
def __init__(self):
|
||||
self.lock = threading.RLock()
|
||||
self._space = Space(0, 0, 0, 0)
|
||||
self._listing: list[FileEntry] = []
|
||||
|
||||
@property
|
||||
def space(self):
|
||||
with self.lock:
|
||||
return self._space
|
||||
|
||||
@space.setter
|
||||
def space(self, space):
|
||||
with self.lock:
|
||||
self._space = space
|
||||
|
||||
@property
|
||||
def root(self) -> list[FileEntry]:
|
||||
with self.lock:
|
||||
return self._listing[:]
|
||||
|
||||
@root.setter
|
||||
def root(self, listing: list[FileEntry]):
|
||||
with self.lock:
|
||||
self._listing = listing
|
||||
|
||||
def _slice(self, idx: PurePosixPath | tuple[PurePosixPath, int]):
|
||||
relpath, relfile = idx if isinstance(idx, tuple) else (idx, 0)
|
||||
begin, end = 0, len(self._listing)
|
||||
level = 0
|
||||
isfile = 0
|
||||
|
||||
# Special case for root
|
||||
if not relpath.parts:
|
||||
return slice(begin, end)
|
||||
|
||||
begin += 1
|
||||
for part in relpath.parts:
|
||||
level += 1
|
||||
found = False
|
||||
|
||||
while begin < end:
|
||||
entry = self._listing[begin]
|
||||
|
||||
if entry.level < level:
|
||||
break
|
||||
|
||||
if entry.level == level:
|
||||
if entry.name == part:
|
||||
found = True
|
||||
if level == len(relpath.parts):
|
||||
isfile = relfile
|
||||
else:
|
||||
begin += 1
|
||||
break
|
||||
cmp = entry.isfile - isfile or sortkey(entry.name) > sortkey(part)
|
||||
if cmp > 0:
|
||||
break
|
||||
|
||||
begin += 1
|
||||
|
||||
if not found:
|
||||
return slice(begin, begin)
|
||||
|
||||
# Found the starting point, now find the end of the slice
|
||||
for end in range(begin + 1, len(self._listing) + 1):
|
||||
if end == len(self._listing) or self._listing[end].level <= level:
|
||||
break
|
||||
return slice(begin, end)
|
||||
|
||||
def __getitem__(self, index: PurePosixPath | tuple[PurePosixPath, int]):
|
||||
with self.lock:
|
||||
return self._listing[self._slice(index)]
|
||||
|
||||
def __setitem__(
|
||||
self, index: tuple[PurePosixPath, int], value: list[FileEntry]
|
||||
) -> None:
|
||||
rel, isfile = index
|
||||
with self.lock:
|
||||
if rel.parts:
|
||||
parent = self._slice(rel.parent)
|
||||
if parent.start == parent.stop:
|
||||
raise ValueError(
|
||||
f"Parent folder {rel.as_posix()} is missing for {rel.name}"
|
||||
)
|
||||
self._listing[self._slice(index)] = value
|
||||
|
||||
def __delitem__(self, relpath: PurePosixPath):
|
||||
with self.lock:
|
||||
del self._listing[self._slice(relpath)]
|
||||
|
||||
|
||||
state = State()
|
||||
rootpath: Path = None # type: ignore
|
||||
quit = False
|
||||
modified_flags = (
|
||||
@@ -26,35 +123,35 @@ modified_flags = (
|
||||
"IN_MOVED_FROM",
|
||||
"IN_MOVED_TO",
|
||||
)
|
||||
disk_usage = None
|
||||
|
||||
|
||||
def watcher_thread(loop):
|
||||
global disk_usage, rootpath
|
||||
global rootpath
|
||||
import inotify.adapters
|
||||
|
||||
while True:
|
||||
while not quit:
|
||||
rootpath = config.config.path
|
||||
i = inotify.adapters.InotifyTree(rootpath.as_posix())
|
||||
old = format_tree() if tree[""] else None
|
||||
with tree_lock:
|
||||
# Initialize the tree from filesystem
|
||||
tree[""] = walk(rootpath)
|
||||
msg = format_tree()
|
||||
if msg != old:
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
# Initialize the tree from filesystem
|
||||
new = walk()
|
||||
with state.lock:
|
||||
old = state.root
|
||||
if old != new:
|
||||
state.root = new
|
||||
broadcast(format_update(old, new), loop)
|
||||
|
||||
# The watching is not entirely reliable, so do a full refresh every minute
|
||||
refreshdl = time.monotonic() + 60.0
|
||||
# The watching is not entirely reliable, so do a full refresh every 30 seconds
|
||||
refreshdl = time.monotonic() + 30.0
|
||||
|
||||
for event in i.event_gen():
|
||||
if quit:
|
||||
return
|
||||
# Disk usage update
|
||||
du = shutil.disk_usage(rootpath)
|
||||
if du != disk_usage:
|
||||
disk_usage = du
|
||||
asyncio.run_coroutine_threadsafe(broadcast(format_du()), loop)
|
||||
space = Space(*du, storage=state.root[0].size)
|
||||
if space != state.space:
|
||||
state.space = space
|
||||
broadcast(format_space(space), loop)
|
||||
break
|
||||
# Do a full refresh?
|
||||
if time.monotonic() > refreshdl:
|
||||
@@ -75,144 +172,145 @@ def watcher_thread(loop):
|
||||
|
||||
|
||||
def watcher_thread_poll(loop):
|
||||
global disk_usage, rootpath
|
||||
global rootpath
|
||||
|
||||
while not quit:
|
||||
rootpath = config.config.path
|
||||
old = format_tree() if tree[""] else None
|
||||
with tree_lock:
|
||||
# Initialize the tree from filesystem
|
||||
tree[""] = walk(rootpath)
|
||||
msg = format_tree()
|
||||
if msg != old:
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
new = walk()
|
||||
with state.lock:
|
||||
old = state.root
|
||||
if old != new:
|
||||
state.root = new
|
||||
broadcast(format_update(old, new), loop)
|
||||
|
||||
# Disk usage update
|
||||
du = shutil.disk_usage(rootpath)
|
||||
if du != disk_usage:
|
||||
disk_usage = du
|
||||
asyncio.run_coroutine_threadsafe(broadcast(format_du()), loop)
|
||||
space = Space(*du, storage=state.root[0].size)
|
||||
if space != state.space:
|
||||
state.space = space
|
||||
broadcast(format_space(space), loop)
|
||||
|
||||
time.sleep(1.0)
|
||||
time.sleep(2.0)
|
||||
|
||||
|
||||
def format_du():
|
||||
return msgspec.json.encode(
|
||||
{
|
||||
"space": {
|
||||
"disk": disk_usage.total,
|
||||
"used": disk_usage.used,
|
||||
"free": disk_usage.free,
|
||||
"storage": tree[""].size,
|
||||
},
|
||||
},
|
||||
).decode()
|
||||
|
||||
|
||||
def format_tree():
|
||||
root = tree[""]
|
||||
return msgspec.json.encode({"root": root}).decode()
|
||||
|
||||
|
||||
def walk(path: Path) -> DirEntry | FileEntry | None:
|
||||
def walk(rel=PurePosixPath()) -> list[FileEntry]: # noqa: B008
|
||||
path = rootpath / rel
|
||||
try:
|
||||
s = path.stat()
|
||||
key = fuid(s)
|
||||
assert key, repr(key)
|
||||
mtime = int(s.st_mtime)
|
||||
if path.is_file():
|
||||
return FileEntry(key, s.st_size, mtime)
|
||||
st = path.stat()
|
||||
except OSError:
|
||||
return []
|
||||
return _walk(rel, int(not stat.S_ISDIR(st.st_mode)), st)
|
||||
|
||||
tree = {
|
||||
p.name: v
|
||||
for p in path.iterdir()
|
||||
if not p.name.startswith(".")
|
||||
if (v := walk(p)) is not None
|
||||
}
|
||||
if tree:
|
||||
size = sum(v.size for v in tree.values())
|
||||
mtime = max(mtime, *(v.mtime for v in tree.values()))
|
||||
else:
|
||||
size = 0
|
||||
return DirEntry(key, size, mtime, tree)
|
||||
|
||||
def _walk(rel: PurePosixPath, isfile: int, st: stat_result) -> list[FileEntry]:
|
||||
entry = FileEntry(
|
||||
level=len(rel.parts),
|
||||
name=rel.name,
|
||||
key=fuid(st),
|
||||
mtime=int(st.st_mtime),
|
||||
size=st.st_size if isfile else 0,
|
||||
isfile=isfile,
|
||||
)
|
||||
if isfile:
|
||||
return [entry]
|
||||
ret = [entry]
|
||||
path = rootpath / rel
|
||||
try:
|
||||
li = []
|
||||
for f in path.iterdir():
|
||||
if quit:
|
||||
raise SystemExit("quit")
|
||||
if f.name.startswith("."):
|
||||
continue # No dotfiles
|
||||
s = f.stat()
|
||||
li.append((int(not stat.S_ISDIR(s.st_mode)), f.name, s))
|
||||
for [isfile, name, s] in humansorted(li):
|
||||
if quit:
|
||||
raise SystemExit("quit")
|
||||
subtree = _walk(rel / name, isfile, s)
|
||||
child = subtree[0]
|
||||
entry.mtime = max(entry.mtime, child.mtime)
|
||||
entry.size += child.size
|
||||
ret.extend(subtree)
|
||||
except FileNotFoundError:
|
||||
return None
|
||||
pass # Things may be rapidly in motion
|
||||
except OSError as e:
|
||||
print("OS error walking path", path, e)
|
||||
return None
|
||||
return ret
|
||||
|
||||
|
||||
def update(relpath: PurePosixPath, loop):
|
||||
"""Called by inotify updates, check the filesystem and broadcast any changes."""
|
||||
if rootpath is None or relpath is None:
|
||||
print("ERROR", rootpath, relpath)
|
||||
new = walk(rootpath / relpath)
|
||||
with tree_lock:
|
||||
update = update_internal(relpath, new)
|
||||
if not update:
|
||||
return # No changes
|
||||
msg = msgspec.json.encode({"update": update}).decode()
|
||||
asyncio.run_coroutine_threadsafe(broadcast(msg), loop)
|
||||
new = walk(relpath)
|
||||
with state.lock:
|
||||
old = state[relpath]
|
||||
if old == new:
|
||||
return
|
||||
old = state.root
|
||||
if new:
|
||||
state[relpath, new[0].isfile] = new
|
||||
else:
|
||||
del state[relpath]
|
||||
broadcast(format_update(old, state.root), loop)
|
||||
|
||||
|
||||
def update_internal(
|
||||
relpath: PurePosixPath,
|
||||
new: DirEntry | FileEntry | None,
|
||||
) -> list[UpdateEntry]:
|
||||
path = "", *relpath.parts
|
||||
old = tree
|
||||
elems = []
|
||||
for name in path:
|
||||
if name not in old:
|
||||
# File or folder created
|
||||
old = None
|
||||
elems.append((name, None))
|
||||
if len(elems) < len(path):
|
||||
# We got a notify for an item whose parent is not in tree
|
||||
print("Tree out of sync DEBUG", relpath)
|
||||
print(elems)
|
||||
print("Current tree:")
|
||||
print(tree[""])
|
||||
print("Walking all:")
|
||||
print(walk(rootpath))
|
||||
raise ValueError("Tree out of sync")
|
||||
break
|
||||
old = old[name]
|
||||
elems.append((name, old))
|
||||
if old == new:
|
||||
return []
|
||||
mt = new.mtime if new else 0
|
||||
szdiff = (new.size if new else 0) - (old.size if old else 0)
|
||||
# Update parents
|
||||
def format_update(old, new):
|
||||
# Make keep/del/insert diff until one of the lists ends
|
||||
oidx, nidx = 0, 0
|
||||
update = []
|
||||
for name, entry in elems[:-1]:
|
||||
u = UpdateEntry(name, entry.key)
|
||||
if szdiff:
|
||||
entry.size += szdiff
|
||||
u.size = entry.size
|
||||
if mt > entry.mtime:
|
||||
u.mtime = entry.mtime = mt
|
||||
update.append(u)
|
||||
# The last element is the one that changed
|
||||
name, entry = elems[-1]
|
||||
parent = elems[-2][1] if len(elems) > 1 else tree
|
||||
u = UpdateEntry(name, new.key if new else entry.key)
|
||||
if new:
|
||||
parent[name] = new
|
||||
if u.size != new.size:
|
||||
u.size = new.size
|
||||
if u.mtime != new.mtime:
|
||||
u.mtime = new.mtime
|
||||
if isinstance(new, DirEntry) and u.dir != new.dir:
|
||||
u.dir = new.dir
|
||||
else:
|
||||
del parent[name]
|
||||
u.deleted = True
|
||||
update.append(u)
|
||||
return update
|
||||
keep_count = 0
|
||||
while oidx < len(old) and nidx < len(new):
|
||||
if old[oidx] == new[nidx]:
|
||||
keep_count += 1
|
||||
oidx += 1
|
||||
nidx += 1
|
||||
continue
|
||||
if keep_count > 0:
|
||||
update.append(UpdKeep(keep_count))
|
||||
keep_count = 0
|
||||
|
||||
del_count = 0
|
||||
rest = new[nidx:]
|
||||
while oidx < len(old) and old[oidx] not in rest:
|
||||
del_count += 1
|
||||
oidx += 1
|
||||
if del_count:
|
||||
update.append(UpdDel(del_count))
|
||||
continue
|
||||
|
||||
insert_items = []
|
||||
rest = old[oidx:]
|
||||
while nidx < len(new) and new[nidx] not in rest:
|
||||
insert_items.append(new[nidx])
|
||||
nidx += 1
|
||||
update.append(UpdIns(insert_items))
|
||||
|
||||
# Diff any remaining
|
||||
if keep_count > 0:
|
||||
update.append(UpdKeep(keep_count))
|
||||
if oidx < len(old):
|
||||
update.append(UpdDel(len(old) - oidx))
|
||||
elif nidx < len(new):
|
||||
update.append(UpdIns(new[nidx:]))
|
||||
|
||||
return msgspec.json.encode({"update": update}).decode()
|
||||
|
||||
|
||||
async def broadcast(msg):
|
||||
def format_space(usage):
|
||||
return msgspec.json.encode({"space": usage}).decode()
|
||||
|
||||
|
||||
def format_root(root):
|
||||
return msgspec.json.encode({"root": root}).decode()
|
||||
|
||||
|
||||
def broadcast(msg, loop):
|
||||
return asyncio.run_coroutine_threadsafe(abroadcast(msg), loop).result()
|
||||
|
||||
|
||||
async def abroadcast(msg):
|
||||
try:
|
||||
for queue in pubsub.values():
|
||||
queue.put_nowait(msg)
|
||||
@@ -223,8 +321,9 @@ async def broadcast(msg):
|
||||
|
||||
async def start(app, loop):
|
||||
config.load_config()
|
||||
use_inotify = sys.platform == "linux"
|
||||
app.ctx.watcher = threading.Thread(
|
||||
target=watcher_thread if sys.platform == "linux" else watcher_thread_poll,
|
||||
target=watcher_thread if use_inotify else watcher_thread_poll,
|
||||
args=[loop],
|
||||
)
|
||||
app.ctx.watcher.start()
|
||||
|
||||
BIN
docs/cista.jpg
Normal file
|
After Width: | Height: | Size: 39 KiB |
@@ -1,9 +1,9 @@
|
||||
<!DOCTYPE html>
|
||||
<html lang=en>
|
||||
<meta charset=UTF-8>
|
||||
<title>Cista</title>
|
||||
<title>Cista Storage</title>
|
||||
<meta name="viewport" content="width=device-width, initial-scale=1, shrink-to-fit=no">
|
||||
<link rel="icon" href="/favicon.ico">
|
||||
<link rel="icon" href="/src/assets/logo.svg">
|
||||
<link rel="preconnect" href="https://fonts.googleapis.com">
|
||||
<link rel="preconnect" href="https://fonts.gstatic.com" crossorigin>
|
||||
<link href="https://fonts.googleapis.com/css2?family=Roboto+Mono&family=Roboto:wght@400;700&display=swap" rel="stylesheet">
|
||||
@@ -1,5 +1,5 @@
|
||||
{
|
||||
"name": "front",
|
||||
"name": "cista-frontend",
|
||||
"version": "0.0.0",
|
||||
"private": true,
|
||||
"scripts": {
|
||||
2
frontend/public/robots.txt
Normal file
@@ -0,0 +1,2 @@
|
||||
User-agent: *
|
||||
Disallow: /
|
||||
@@ -1,13 +1,13 @@
|
||||
<template>
|
||||
<LoginModal />
|
||||
<header>
|
||||
<HeaderMain ref="headerMain">
|
||||
<HeaderMain ref="headerMain" :path="path.pathList" :query="path.query">
|
||||
<HeaderSelected :path="path.pathList" />
|
||||
</HeaderMain>
|
||||
<BreadCrumb :path="path.pathList" tabindex="-1"/>
|
||||
<BreadCrumb :path="path.pathList" primary />
|
||||
</header>
|
||||
<main>
|
||||
<RouterView :path="path.pathList" />
|
||||
<RouterView :path="path.pathList" :query="path.query" />
|
||||
</main>
|
||||
</template>
|
||||
|
||||
@@ -15,9 +15,9 @@
|
||||
import { RouterView } from 'vue-router'
|
||||
import type { ComputedRef } from 'vue'
|
||||
import type HeaderMain from '@/components/HeaderMain.vue'
|
||||
import { onMounted, onUnmounted, ref } from 'vue'
|
||||
import { watchConnect, watchDisconnect } from '@/repositories/WS'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { onMounted, onUnmounted, ref, watchEffect } from 'vue'
|
||||
import { loadSession, watchConnect, watchDisconnect } from '@/repositories/WS'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
|
||||
import { computed } from 'vue'
|
||||
import Router from '@/router/index'
|
||||
@@ -25,27 +25,33 @@ import Router from '@/router/index'
|
||||
interface Path {
|
||||
path: string
|
||||
pathList: string[]
|
||||
query: string
|
||||
}
|
||||
const documentStore = useDocumentStore()
|
||||
const store = useMainStore()
|
||||
const path: ComputedRef<Path> = computed(() => {
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path)
|
||||
const pathList = p.split('/').filter(value => value !== '')
|
||||
const p = decodeURIComponent(Router.currentRoute.value.path).split('//')
|
||||
const pathList = p[0].split('/').filter(value => value !== '')
|
||||
const query = p.slice(1).join('//')
|
||||
return {
|
||||
path: p,
|
||||
pathList
|
||||
path: p[0],
|
||||
pathList,
|
||||
query
|
||||
}
|
||||
})
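
Here a double slash in the hash separates the browsing path from a search query, so both survive reloads and history navigation. A minimal sketch of the same parsing for a hypothetical route:

```ts
// '#/Pictures//kittens'  →  path '/Pictures', query 'kittens'
const p = decodeURIComponent('/Pictures//kittens').split('//')
const pathList = p[0].split('/').filter(v => v !== '')  // ['Pictures']
const query = p.slice(1).join('//')                     // 'kittens'
```
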
|
||||
watchEffect(() => {
|
||||
document.title = path.value.path.replace(/\/$/, '').split('/').pop() || store.server.name || 'Cista Storage'
|
||||
})
|
||||
onMounted(loadSession)
|
||||
onMounted(watchConnect)
|
||||
onUnmounted(watchDisconnect)
|
||||
// Update human-readable x seconds ago messages from mtimes
|
||||
setInterval(documentStore.updateModified, 1000)
|
||||
const headerMain = ref<typeof HeaderMain | null>(null)
|
||||
let vert = 0
|
||||
let timer: any = null
|
||||
const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
const fileExplorer = documentStore.fileExplorer as any
|
||||
const fileExplorer = store.fileExplorer as any
|
||||
if (!fileExplorer) return
|
||||
const c = fileExplorer.isCursor()
|
||||
const input = (event.target as HTMLElement).tagName === 'INPUT'
|
||||
const keyup = event.type === 'keyup'
|
||||
if (event.repeat) {
|
||||
if (
|
||||
@@ -65,13 +71,21 @@ const globalShortcutHandler = (event: KeyboardEvent) => {
|
||||
else if (!keyup && event.key === 'f' && (event.ctrlKey || event.metaKey)) {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Search also on / (UNIX style)
|
||||
else if (keyup && !input && event.key === '/') {
|
||||
headerMain.value!.toggleSearchInput()
|
||||
}
|
||||
// Globally close search on Escape
|
||||
else if (keyup && event.key === 'Escape') {
|
||||
headerMain.value!.closeSearch(event)
|
||||
}
|
||||
// Select all (toggle); keydown to prevent builtin
|
||||
else if (!keyup && event.key === 'a' && (event.ctrlKey || event.metaKey)) {
|
||||
fileExplorer.toggleSelectAll()
|
||||
}
|
||||
// Keys 1-3 to sort columns
|
||||
else if (
|
||||
c &&
|
||||
!input &&
|
||||
keyup &&
|
||||
(event.key === '1' || event.key === '2' || event.key === '3')
|
||||
) {
|
||||
@@ -119,3 +133,4 @@ onUnmounted(() => {
|
||||
})
|
||||
export type { Path }
|
||||
</script>
|
||||
@/stores/main
|
||||
1
frontend/src/assets/logo.svg
Normal file
@@ -0,0 +1 @@
|
||||
<svg xmlns="http://www.w3.org/2000/svg" width="512" height="512" viewBox="0 0 512 512"><rect width="512" height="512" fill="#f80" rx="64" ry="64"/><path fill="#fff" d="M381 298h-84V167h-66L339 35l108 132h-66zm-168-84h-84v131H63l108 132 108-132h-66z"/></svg>
|
||||
|
After Width: | Height: | Size: 258 B |
|
[Binary image rows omitted: several dozen SVG icon assets, each listed with identical Before/After sizes (91 B – 1.1 KiB).]
@@ -4,7 +4,9 @@
|
||||
aria-label="Breadcrumb"
|
||||
@keyup.left.stop="move(-1)"
|
||||
@keyup.right.stop="move(1)"
|
||||
@focus="move(0)"
|
||||
@keyup.enter="move(0)"
|
||||
@focus=focusCurrent
|
||||
tabindex=0
|
||||
>
|
||||
<a href="#/"
|
||||
:ref="el => setLinkRef(0, el)"
|
||||
@@ -26,7 +28,7 @@
|
||||
|
||||
<script setup lang="ts">
|
||||
import home from '@/assets/svg/home.svg'
|
||||
import { onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { nextTick, onBeforeUpdate, ref, watchEffect } from 'vue'
|
||||
import { useRouter } from 'vue-router'
|
||||
|
||||
const router = useRouter()
|
||||
@@ -37,17 +39,33 @@ onBeforeUpdate(() => { links.length = 1 }) // 1 to keep home
|
||||
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
primary?: boolean
|
||||
}>()
|
||||
|
||||
const longest = ref<Array<string>>([])
|
||||
|
||||
const isCurrent = (index: number) => index == props.path.length ? 'location' : undefined
|
||||
|
||||
const focusCurrent = () => {
|
||||
nextTick(() => {
|
||||
const index = props.path.length
|
||||
if (index < links.length) links[index].focus()
|
||||
})
|
||||
}
|
||||
|
||||
const navigate = (index: number) => {
|
||||
const link = links[index]
|
||||
if (!link) throw Error(`No link at index ${index} (path: ${props.path})`)
|
||||
link.focus()
|
||||
router.replace(`/${longest.value.slice(0, index).join('/')}`)
|
||||
const url = index ? `/${longest.value.slice(0, index).join('/')}/` : '/'
|
||||
const long = longest.value.length ? `/${longest.value.join('/')}/` : '/'
|
||||
const browser = decodeURIComponent(location.hash.slice(1).split('//')[0])
|
||||
const u = url.replaceAll('?', '%3F').replaceAll('#', '%23')
|
||||
// Clicking on current link clears the rest of the path and adds new history
|
||||
if (isCurrent(index)) { longest.value.splice(index); router.push(u) }
|
||||
// Moving along breadcrumbs doesn't create new history
|
||||
else if (long.startsWith(browser)) router.replace(u)
|
||||
// Normal navigation from elsewhere (e.g. search result breadcrumbs)
|
||||
else router.push(u)
|
||||
}
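
The manual escaping matters because `?` and `#` in folder names would otherwise be interpreted as query or fragment delimiters in the hash URL. A quick sketch with a hypothetical folder name:

```ts
const crumbs = ['Pictures', 'Summer #1?']
const url = `/${crumbs.join('/')}/`.replaceAll('?', '%3F').replaceAll('#', '%23')
// → '/Pictures/Summer %231%3F/'
```
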
|
||||
|
||||
const move = (dir: number) => {
|
||||
@@ -59,13 +77,16 @@ const move = (dir: number) => {
|
||||
watchEffect(() => {
|
||||
const longcut = longest.value.slice(0, props.path.length)
|
||||
const same = longcut.every((value, index) => value === props.path[index])
|
||||
// Navigated out of previous path, reset longest to current
|
||||
if (!same) longest.value = props.path
|
||||
else if (props.path.length > longcut.length) {
|
||||
longest.value = longcut.concat(props.path.slice(longcut.length))
|
||||
}
|
||||
})
|
||||
watchEffect(() => {
|
||||
if (links.length) navigate(props.path.length)
|
||||
// If needed, focus primary navigation to new location
|
||||
if (props.primary) nextTick(() => {
|
||||
const act = document.activeElement as HTMLElement
|
||||
if (!act || [...links, document.body].includes(act)) focusCurrent()
|
||||
})
|
||||
})
|
||||
</script>
|
||||
|
||||
@@ -3,34 +3,11 @@
|
||||
<thead>
|
||||
<tr>
|
||||
<th class="selection">
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
v-model="allSelected"
|
||||
:indeterminate="selectionIndeterminate"
|
||||
/>
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn"
|
||||
:class="{ sortactive: sort === 'name' }"
|
||||
@click="toggleSort('name')"
|
||||
>
|
||||
Name
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn modified right"
|
||||
:class="{ sortactive: sort === 'modified' }"
|
||||
@click="toggleSort('modified')"
|
||||
>
|
||||
Modified
|
||||
</th>
|
||||
<th
|
||||
class="sortcolumn size right"
|
||||
:class="{ sortactive: sort === 'size' }"
|
||||
@click="toggleSort('size')"
|
||||
>
|
||||
Size
|
||||
<input type="checkbox" tabindex="-1" v-model="allSelected" :indeterminate="selectionIndeterminate">
|
||||
</th>
|
||||
<th class="sortcolumn" :class="{ sortactive: store.sortOrder === 'name' }" @click="store.toggleSort('name')">Name</th>
|
||||
<th class="sortcolumn modified right" :class="{ sortactive: store.sortOrder === 'modified' }" @click="store.toggleSort('modified')">Modified</th>
|
||||
<th class="sortcolumn size right" :class="{ sortactive: store.sortOrder === 'size' }" @click="store.toggleSort('size')">Size</th>
|
||||
<th class="menu"></th>
|
||||
</tr>
|
||||
</thead>
|
||||
@@ -38,27 +15,13 @@
|
||||
<tr v-if="editing?.key === 'new'" class="folder">
|
||||
<td class="selection"></td>
|
||||
<td class="name">
|
||||
<FileRenameInput
|
||||
:doc="editing"
|
||||
:rename="mkdir"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/>
|
||||
<FileRenameInput :doc="editing" :rename="mkdir" :exit="() => {editing = null}" />
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time :datetime="new Date(editing.mtime).toISOString().replace('.000', '')">{{
|
||||
editing.modified
|
||||
}}</time>
|
||||
</td>
|
||||
<td class="size right">{{ editing.sizedisp }}</td>
|
||||
<FileModified :doc=editing :key=nowkey />
|
||||
<FileSize :doc=editing />
|
||||
<td class="menu"></td>
|
||||
</tr>
|
||||
<template
|
||||
v-for="(doc, index) in sortedDocuments"
|
||||
:key="doc.key">
|
||||
<template v-for="(doc, index) in documents" :key="doc.key">
|
||||
<tr class="folder-change" v-if="showFolderBreadcrumb(index)">
|
||||
<th colspan="5"><BreadCrumb :path="doc.loc ? doc.loc.split('/') : []" /></th>
|
||||
</tr>
|
||||
@@ -73,59 +36,35 @@
|
||||
<input
|
||||
type="checkbox"
|
||||
tabindex="-1"
|
||||
:checked="documentStore.selected.has(doc.key)"
|
||||
:checked="store.selected.has(doc.key)"
|
||||
@change="
|
||||
($event.target as HTMLInputElement).checked
|
||||
? documentStore.selected.add(doc.key)
|
||||
: documentStore.selected.delete(doc.key)
|
||||
? store.selected.add(doc.key)
|
||||
: store.selected.delete(doc.key)
|
||||
"
|
||||
/>
|
||||
</td>
|
||||
<td class="name">
|
||||
<template v-if="editing === doc"
|
||||
><FileRenameInput
|
||||
:doc="doc"
|
||||
:rename="rename"
|
||||
:exit="
|
||||
() => {
|
||||
editing = null
|
||||
}
|
||||
"
|
||||
/></template>
|
||||
<template v-if="editing === doc">
|
||||
<FileRenameInput :doc="doc" :rename="rename" :exit="() => {editing = null}" />
|
||||
</template>
|
||||
<template v-else>
|
||||
<a
|
||||
:href="url_for(doc)"
|
||||
:href="doc.url"
|
||||
tabindex="-1"
|
||||
@contextmenu.prevent
|
||||
@focus.stop="cursor = doc"
|
||||
@blur="ev => { if (!editing) cursor = null }"
|
||||
@keyup.left="router.back()"
|
||||
@keyup.right.stop="ev => { if (doc.dir) (ev.target as HTMLElement).click() }"
|
||||
>{{ doc.name }}</a
|
||||
>
|
||||
<button
|
||||
v-if="cursor == doc"
|
||||
class="rename-button"
|
||||
@click="() => (editing = doc)"
|
||||
>
|
||||
🖊️
|
||||
</button>
|
||||
<button tabindex=-1 v-if="cursor == doc" class="rename-button" @click="() => (editing = doc)">🖊️</button>
|
||||
</template>
|
||||
</td>
|
||||
<td class="modified right">
|
||||
<time
|
||||
:data-tooltip="new Date(1000 * doc.mtime).toISOString().replace('T', '\n').replace('.000Z', ' UTC')"
|
||||
>{{ doc.modified }}</time
|
||||
>
|
||||
</td>
|
||||
<td class="size right">{{ doc.sizedisp }}</td>
|
||||
<FileModified :doc=doc :key=nowkey />
|
||||
<FileSize :doc=doc />
|
||||
<td class="menu">
|
||||
<button
|
||||
tabindex="-1"
|
||||
@click.stop="contextMenu($event, doc)"
|
||||
>
|
||||
⋮
|
||||
</button>
|
||||
<button tabindex=-1 @click.stop="contextMenu($event, doc)">⋮</button>
|
||||
</td>
|
||||
</tr>
|
||||
</template>
|
||||
@@ -140,31 +79,26 @@
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { ref, computed, watchEffect } from 'vue'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import type { Document } from '@/repositories/Document'
|
||||
import { ref, computed, watchEffect, shallowRef, onMounted, onUnmounted } from 'vue'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import { Doc } from '@/repositories/Document'
|
||||
import FileRenameInput from './FileRenameInput.vue'
|
||||
import { connect, controlUrl } from '@/repositories/WS'
|
||||
import { collator, formatSize, formatUnixDate } from '@/utils'
|
||||
import { formatSize } from '@/utils'
|
||||
import { useRouter } from 'vue-router'
|
||||
import ContextMenu from '@imengyu/vue3-context-menu'
|
||||
import type { SortOrder } from '@/utils/docsort'
|
||||
|
||||
const props = withDefaults(
|
||||
defineProps<{
|
||||
path: Array<string>
|
||||
documents: Document[]
|
||||
}>(),
|
||||
{}
|
||||
)
|
||||
const documentStore = useDocumentStore()
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
documents: Doc[]
|
||||
}>()
|
||||
const store = useMainStore()
|
||||
const router = useRouter()
|
||||
const url_for = (doc: Document) => {
|
||||
const p = doc.loc ? `${doc.loc}/${doc.name}` : doc.name
|
||||
return doc.dir ? `#/${p}/` : `/files/${p}`
|
||||
}
|
||||
const cursor = ref<Document | null>(null)
|
||||
const cursor = shallowRef<Doc | null>(null)
|
||||
// File rename
|
||||
const editing = ref<Document | null>(null)
|
||||
const rename = (doc: Document, newName: string) => {
|
||||
const editing = shallowRef<Doc | null>(null)
|
||||
const rename = (doc: Doc, newName: string) => {
|
||||
const oldName = doc.name
|
||||
const control = connect(controlUrl, {
|
||||
message(ev: MessageEvent) {
|
||||
@@ -188,35 +122,25 @@ const rename = (doc: Document, newName: string) => {
|
||||
}
|
||||
doc.name = newName // We should get an update from watch but this is quicker
|
||||
}
|
||||
const sortedDocuments = computed(() => sorted(props.documents as Document[]))
|
||||
const showFolderBreadcrumb = (i: number) => {
|
||||
const docs = sortedDocuments.value
|
||||
const docloc = docs[i].loc
|
||||
return i === 0 ? docloc !== loc.value : docloc !== docs[i - 1].loc
|
||||
}
|
||||
defineExpose({
|
||||
newFolder() {
|
||||
const now = Date.now() / 1000
|
||||
editing.value = {
|
||||
const now = Math.floor(Date.now() / 1000)
|
||||
editing.value = new Doc({
|
||||
loc: loc.value,
|
||||
key: 'new',
|
||||
name: 'New Folder',
|
||||
type: 'folder',
|
||||
dir: true,
|
||||
mtime: now,
|
||||
size: 0,
|
||||
sizedisp: formatSize(0),
|
||||
modified: formatUnixDate(now),
|
||||
haystack: '',
|
||||
}
|
||||
console.log("New")
|
||||
})
|
||||
},
|
||||
toggleSelectAll() {
|
||||
console.log('Select')
|
||||
allSelected.value = !allSelected.value
|
||||
},
|
||||
toggleSortColumn(column: number) {
|
||||
const columns = ['', 'name', 'modified', 'size', '']
|
||||
toggleSort(columns[column])
|
||||
const order = ['', 'name', 'modified', 'size', ''][column]
|
||||
if (order) store.toggleSort(order as SortOrder)
|
||||
},
|
||||
isCursor() {
|
||||
return cursor.value !== null && editing.value === null
|
||||
@@ -227,36 +151,36 @@ defineExpose({
|
||||
cursorSelect() {
|
||||
const doc = cursor.value
|
||||
if (!doc) return
|
||||
if (documentStore.selected.has(doc.key)) {
|
||||
documentStore.selected.delete(doc.key)
|
||||
if (store.selected.has(doc.key)) {
|
||||
store.selected.delete(doc.key)
|
||||
} else {
|
||||
documentStore.selected.add(doc.key)
|
||||
store.selected.add(doc.key)
|
||||
}
|
||||
this.cursorMove(1)
|
||||
},
|
||||
cursorMove(d: number, select = false) {
|
||||
// Move cursor up or down (keyboard navigation)
|
||||
const documents = sortedDocuments.value
|
||||
if (documents.length === 0) {
|
||||
const docs = props.documents
|
||||
if (docs.length === 0) {
|
||||
cursor.value = null
|
||||
return
|
||||
}
|
||||
const N = documents.length
|
||||
const N = docs.length
|
||||
const mod = (a: number, b: number) => ((a % b) + b) % b
|
||||
const increment = (i: number, d: number) => mod(i + d, N + 1)
|
||||
const index =
|
||||
cursor.value !== null ? documents.indexOf(cursor.value) : documents.length
|
||||
cursor.value !== null ? docs.indexOf(cursor.value) : docs.length
|
||||
const moveto = increment(index, d)
|
||||
cursor.value = documents[moveto] ?? null
|
||||
cursor.value = docs[moveto] ?? null
|
||||
const tr = cursor.value ? document.getElementById(`file-${cursor.value.key}`) : null
|
||||
if (select) {
|
||||
// Go forwards, possibly wrapping over the end; the last entry is not toggled
|
||||
let [begin, end] = d > 0 ? [index, moveto] : [moveto, index]
|
||||
for (let p = begin; p !== end; p = increment(p, 1)) {
|
||||
if (p === N) continue
|
||||
const key = documents[p].key
|
||||
if (documentStore.selected.has(key)) documentStore.selected.delete(key)
|
||||
else documentStore.selected.add(key)
|
||||
const key = docs[p].key
|
||||
if (store.selected.has(key)) store.selected.delete(key)
|
||||
else store.selected.add(key)
|
||||
}
|
||||
}
|
||||
// @ts-ignore
|
||||
@@ -293,7 +217,14 @@ watchEffect(() => {
|
||||
focusBreadcrumb()
|
||||
}
|
||||
})
|
||||
const mkdir = (doc: Document, name: string) => {
|
||||
let nowkey = ref(0)
|
||||
let modifiedTimer: any = null
|
||||
const updateModified = () => {
|
||||
nowkey.value = Math.floor(Date.now() / 1000)
|
||||
}
|
||||
onMounted(() => { updateModified(); modifiedTimer = setInterval(updateModified, 1000) })
|
||||
onUnmounted(() => { clearInterval(modifiedTimer) })
|
||||
const mkdir = (doc: Doc, name: string) => {
|
||||
const control = connect(controlUrl, {
|
||||
open() {
|
||||
control.send(
|
||||
@@ -310,34 +241,24 @@ const mkdir = (doc: Document, name: string) => {
|
||||
editing.value = null
|
||||
} else {
|
||||
console.log('mkdir', msg)
|
||||
router.push(`/${doc.loc}/${name}/`)
|
||||
router.push(doc.urlrouter)
|
||||
}
|
||||
}
|
||||
})
|
||||
doc.name = name // We should get an update from watch but this is quicker
|
||||
// We should get an update from watch but this is quicker
|
||||
doc.name = name
|
||||
doc.key = crypto.randomUUID()
|
||||
}
|
||||
|
||||
// Column sort
|
||||
const toggleSort = (name: string) => {
|
||||
sort.value = sort.value === name ? '' : name
|
||||
}
|
||||
const sort = ref<string>('')
|
||||
const sortCompare = {
|
||||
name: (a: Document, b: Document) => collator.compare(a.name, b.name),
|
||||
modified: (a: Document, b: Document) => b.mtime - a.mtime,
|
||||
size: (a: Document, b: Document) => b.size - a.size
|
||||
}
|
||||
const sorted = (documents: Document[]) => {
|
||||
const cmp = sortCompare[sort.value as keyof typeof sortCompare]
|
||||
const sorted = [...documents]
|
||||
if (cmp) sorted.sort(cmp)
|
||||
return sorted
|
||||
const showFolderBreadcrumb = (i: number) => {
|
||||
const docs = props.documents
|
||||
const docloc = docs[i].loc
|
||||
return i === 0 ? docloc !== loc.value : docloc !== docs[i - 1].loc
|
||||
}
|
||||
const selectionIndeterminate = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.some((doc: Document) => documentStore.selected.has(doc.key)) &&
|
||||
props.documents.some((doc: Doc) => store.selected.has(doc.key)) &&
|
||||
!allSelected.value
|
||||
)
|
||||
},
|
||||
@@ -348,16 +269,16 @@ const allSelected = computed({
|
||||
get: () => {
|
||||
return (
|
||||
props.documents.length > 0 &&
|
||||
props.documents.every((doc: Document) => documentStore.selected.has(doc.key))
|
||||
props.documents.every((doc: Doc) => store.selected.has(doc.key))
|
||||
)
|
||||
},
|
||||
set: (value: boolean) => {
|
||||
console.log('Setting allSelected', value)
|
||||
for (const doc of props.documents) {
|
||||
if (value) {
|
||||
documentStore.selected.add(doc.key)
|
||||
store.selected.add(doc.key)
|
||||
} else {
|
||||
documentStore.selected.delete(doc.key)
|
||||
store.selected.delete(doc.key)
|
||||
}
|
||||
}
|
||||
}
|
||||
@@ -365,9 +286,13 @@ const allSelected = computed({
|
||||
|
||||
const loc = computed(() => props.path.join('/'))
|
||||
|
||||
const contextMenu = (ev: Event, doc: Document) => {
|
||||
const contextMenu = (ev: MouseEvent, doc: Doc) => {
|
||||
cursor.value = doc
|
||||
console.log('Context menu', ev, doc)
|
||||
ContextMenu.showContextMenu({
|
||||
x: ev.x, y: ev.y, items: [
|
||||
{ label: 'Rename', onClick: () => { editing.value = doc } },
|
||||
],
|
||||
})
|
||||
}
|
||||
</script>
|
||||
|
||||
@@ -401,7 +326,7 @@ table .selection {
|
||||
text-overflow: clip;
|
||||
}
|
||||
table .modified {
|
||||
width: 8em;
|
||||
width: 9em;
|
||||
}
|
||||
table .size {
|
||||
width: 5em;
|
||||
@@ -515,3 +440,4 @@ tbody .selection input {
|
||||
color: #888;
|
||||
}
|
||||
</style>
|
||||
@/stores/main
|
||||
22
frontend/src/components/FileModified.vue
Normal file
@@ -0,0 +1,22 @@
|
||||
<template>
|
||||
<td class="modified right">
|
||||
<time :data-tooltip=tooltip :datetime=datetime>{{ doc.modified }}</time>
|
||||
</td>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { Doc } from '@/repositories/Document'
|
||||
import { computed } from 'vue'
|
||||
|
||||
const datetime = computed(() =>
|
||||
new Date(1000 * props.doc.mtime).toISOString().replace('.000Z', 'Z')
|
||||
)
|
||||
|
||||
const tooltip = computed(() =>
|
||||
datetime.value.replace('T', '\n').replace('Z', ' UTC')
|
||||
)
|
||||
|
||||
const props = defineProps<{
|
||||
doc: Doc
|
||||
}>()
|
||||
</script>
|
||||
@@ -12,7 +12,7 @@
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import type { Document } from '@/repositories/Document'
|
||||
import { Doc } from '@/repositories/Document'
|
||||
import { ref, onMounted, nextTick } from 'vue'
|
||||
|
||||
const input = ref<HTMLInputElement | null>(null)
|
||||
@@ -28,8 +28,8 @@ onMounted(() => {
|
||||
})
|
||||
|
||||
const props = defineProps<{
|
||||
doc: Document
|
||||
rename: (doc: Document, newName: string) => void
|
||||
doc: Doc
|
||||
rename: (doc: Doc, newName: string) => void
|
||||
exit: () => void
|
||||
}>()
|
||||
|
||||
@@ -46,8 +46,8 @@ const apply = () => {
|
||||
|
||||
<style>
|
||||
input#FileRenameInput {
|
||||
color: var(--primary-color);
|
||||
background: var(--primary-background);
|
||||
color: var(--input-color);
|
||||
background: var(--input-background);
|
||||
border: 0;
|
||||
border-radius: 0.3rem;
|
||||
padding: 0.4rem;
|
||||
43
frontend/src/components/FileSize.vue
Normal file
@@ -0,0 +1,43 @@
|
||||
<template>
|
||||
<td class="size right" :class=sizeClass>{{ doc.sizedisp }}</td>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { Doc } from '@/repositories/Document'
|
||||
import { computed } from 'vue'
|
||||
|
||||
const sizeClass = computed(() => {
|
||||
const unit = props.doc.sizedisp.split('\u202F').slice(-1)[0]
|
||||
return +unit ? "bytes" : unit
|
||||
})
|
||||
|
||||
const props = defineProps<{
|
||||
doc: Doc
|
||||
}>()
|
||||
</script>
|
||||
|
||||
<style scoped>
|
||||
.size.empty { color: #555 }
|
||||
.size.bytes { color: #77a }
|
||||
.size.kB { color: #474 }
|
||||
.size.MB { color: #a80 }
|
||||
.size.GB { color: #f83 }
|
||||
.size.TB, .size.PB, .size.EB, .size.huge {
|
||||
color: #f44;
|
||||
text-shadow: 0 0 .2em;
|
||||
}
|
||||
|
||||
@media (prefers-color-scheme: dark) {
|
||||
.size.empty { color: #bbb }
|
||||
.size.bytes { color: #99d }
|
||||
.size.kB { color: #aea }
|
||||
.size.MB { color: #ff4 }
|
||||
.size.GB { color: #f86 }
|
||||
.size.TB, .size.PB, .size.EB, .size.huge { color: #f55 }
|
||||
}
|
||||
|
||||
.cursor .size {
|
||||
color: inherit;
|
||||
text-shadow: none;
|
||||
}
|
||||
</style>
|
||||
@@ -1,15 +1,15 @@
|
||||
<template>
|
||||
<nav class="headermain">
|
||||
<div class="buttons">
|
||||
<template v-if="documentStore.error">
|
||||
<div class="error-message" @click="documentStore.error = ''">{{ documentStore.error }}</div>
|
||||
<template v-if="store.error">
|
||||
<div class="error-message" @click="store.error = ''">{{ store.error }}</div>
|
||||
<div class="smallgap"></div>
|
||||
</template>
|
||||
<UploadButton />
|
||||
<UploadButton :path="props.path" />
|
||||
<SvgButton
|
||||
name="create-folder"
|
||||
data-tooltip="New folder"
|
||||
@click="() => documentStore.fileExplorer.newFolder()"
|
||||
@click="() => store.fileExplorer!.newFolder()"
|
||||
/>
|
||||
<slot></slot>
|
||||
<div class="spacer smallgap"></div>
|
||||
@@ -17,10 +17,10 @@
|
||||
<input
|
||||
ref="search"
|
||||
type="search"
|
||||
v-model="documentStore.search"
|
||||
:value="query"
|
||||
@input="updateSearch"
|
||||
placeholder="Search words"
|
||||
class="margin-input"
|
||||
@keyup.escape="closeSearch"
|
||||
/>
|
||||
</template>
|
||||
<SvgButton ref="searchButton" name="find" @click.prevent="toggleSearchInput" />
|
||||
@@ -30,38 +30,54 @@
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { ref, nextTick } from 'vue'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import { ref, nextTick, watchEffect } from 'vue'
|
||||
import ContextMenu from '@imengyu/vue3-context-menu'
|
||||
import router from '@/router';
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const store = useMainStore()
|
||||
const showSearchInput = ref<boolean>(false)
|
||||
const search = ref<HTMLInputElement | null>()
|
||||
const searchButton = ref<HTMLButtonElement | null>()
|
||||
const props = defineProps<{
|
||||
path: Array<string>
|
||||
query: string
|
||||
}>()
|
||||
|
||||
const closeSearch = () => {
|
||||
const closeSearch = (ev: Event) => {
|
||||
if (!showSearchInput.value) return // Already closing
|
||||
showSearchInput.value = false
|
||||
documentStore.search = ''
|
||||
const breadcrumb = document.querySelector('.breadcrumb') as HTMLElement
|
||||
breadcrumb.focus()
|
||||
updateSearch(ev)
|
||||
}
|
||||
const toggleSearchInput = () => {
|
||||
const updateSearch = (ev: Event) => {
|
||||
const q = (ev.target as HTMLInputElement).value
|
||||
let p = props.path.join('/')
|
||||
p = p ? `/${p}` : ''
|
||||
const url = q ? `${p}//${q}` : (p || '/')
|
||||
const u = url.replaceAll('?', '%3F').replaceAll('#', '%23')
|
||||
if (!props.query && q) router.push(u)
|
||||
else router.replace(u)
|
||||
}
|
||||
const toggleSearchInput = (ev: Event) => {
|
||||
showSearchInput.value = !showSearchInput.value
|
||||
if (!showSearchInput.value) return closeSearch()
|
||||
if (!showSearchInput.value) return closeSearch(ev)
|
||||
nextTick(() => {
|
||||
const input = search.value
|
||||
if (input) input.focus()
|
||||
})
|
||||
}
|
||||
|
||||
watchEffect(() => {
|
||||
if (props.query) showSearchInput.value = true
|
||||
})
|
||||
const settingsMenu = (e: Event) => {
|
||||
// show the context menu
|
||||
const items = []
|
||||
if (documentStore.user.isLoggedIn) {
|
||||
items.push({ label: `Logout ${documentStore.user.username ?? ''}`, onClick: () => documentStore.logout() })
|
||||
if (store.user.isLoggedIn) {
|
||||
items.push({ label: `Logout ${store.user.username ?? ''}`, onClick: () => store.logout() })
|
||||
} else {
|
||||
items.push({ label: 'Login', onClick: () => documentStore.loginDialog() })
|
||||
items.push({ label: 'Login', onClick: () => store.loginDialog() })
|
||||
}
|
||||
ContextMenu.showContextMenu({
|
||||
// @ts-ignore
|
||||
@@ -69,7 +85,6 @@ const settingsMenu = (e: Event) => {
|
||||
items,
|
||||
})
|
||||
}
|
||||
|
||||
defineExpose({
|
||||
toggleSearchInput,
|
||||
closeSearch,
|
||||
@@ -98,3 +113,4 @@ input[type='search'] {
|
||||
max-width: 30vw;
|
||||
}
|
||||
</style>
|
||||
@/stores/main
|
||||
@@ -1,29 +1,29 @@
|
||||
<template>
|
||||
<template v-if="documentStore.selected.size">
|
||||
<template v-if="store.selected.size">
|
||||
<div class="smallgap"></div>
|
||||
<p class="select-text">{{ documentStore.selected.size }} selected ➤</p>
|
||||
<p class="select-text">{{ store.selected.size }} selected ➤</p>
|
||||
<SvgButton name="download" data-tooltip="Download" @click="download" />
|
||||
<SvgButton name="copy" data-tooltip="Copy here" @click="op('cp', dst)" />
|
||||
<SvgButton name="paste" data-tooltip="Move here" @click="op('mv', dst)" />
|
||||
<SvgButton name="trash" data-tooltip="Delete ⚠️" @click="op('rm')" />
|
||||
<button class="action-button unselect" data-tooltip="Unselect all" @click="documentStore.selected.clear()">❌</button>
|
||||
<button class="action-button unselect" data-tooltip="Unselect all" @click="store.selected.clear()">❌</button>
|
||||
</template>
|
||||
</template>
|
||||
|
||||
<script setup lang="ts">
|
||||
import {connect, controlUrl} from '@/repositories/WS'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import { computed } from 'vue'
|
||||
import type { SelectedItems } from '@/repositories/Document'
|
||||
|
||||
const documentStore = useDocumentStore()
|
||||
const store = useMainStore()
|
||||
const props = defineProps({
|
||||
path: Array<string>
|
||||
})
|
||||
|
||||
const dst = computed(() => props.path!.join('/'))
|
||||
const op = (op: string, dst?: string) => {
|
||||
const sel = documentStore.selectedFiles
|
||||
const sel = store.selectedFiles
|
||||
const msg = {
|
||||
op,
|
||||
sel: sel.keys.map(key => {
|
||||
@@ -34,16 +34,16 @@ const op = (op: string, dst?: string) => {
|
||||
// @ts-ignore
|
||||
if (dst !== undefined) msg.dst = dst
|
||||
const control = connect(controlUrl, {
|
||||
message(ev: WebSocmetMessageEvent) {
|
||||
message(ev: MessageEvent) {
|
||||
const res = JSON.parse(ev.data)
|
||||
if ('error' in res) {
|
||||
console.error('Control socket error', msg, res.error)
|
||||
documentStore.error = res.error.message
|
||||
store.error = res.error.message
|
||||
return
|
||||
} else if (res.status === 'ack') {
|
||||
console.log('Control ack OK', res)
|
||||
control.close()
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
return
|
||||
} else console.log('Unknown control response', msg, res)
|
||||
}
|
||||
@@ -108,17 +108,17 @@ const filesystemdl = async (sel: SelectedItems, handle: FileSystemDirectoryHandl
|
||||
}
|
||||
|
||||
const download = async () => {
|
||||
const sel = documentStore.selectedFiles
|
||||
const sel = store.selectedFiles
|
||||
console.log('Download', sel)
|
||||
if (sel.keys.length === 0) {
|
||||
console.warn('Attempted download but no files found. Missing selected keys:', sel.missing)
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
return
|
||||
}
|
||||
// Plain old a href download if only one file (ignoring any folders)
|
||||
const files = sel.recursive.filter(([rel, full, doc]) => !doc.dir)
|
||||
if (files.length === 1) {
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
return linkdl(`/files/${files[0][1]}`)
|
||||
}
|
||||
// Use FileSystem API if multiple files and the browser supports it
|
||||
@@ -130,7 +130,7 @@ const download = async () => {
|
||||
mode: 'readwrite'
|
||||
})
|
||||
filesystemdl(sel, handle).then(() => {
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
})
|
||||
return
|
||||
} catch (e) {
|
||||
@@ -140,7 +140,7 @@ const download = async () => {
|
||||
// Otherwise, zip and download
|
||||
const name = sel.keys.length === 1 ? sel.docs[sel.keys[0]].name : 'download'
|
||||
linkdl(`/zip/${Array.from(sel.keys).join('+')}/${name}.zip`)
|
||||
documentStore.selected.clear()
|
||||
store.selected.clear()
|
||||
}
|
||||
</script>
|
||||
|
||||
@@ -152,3 +152,4 @@ const download = async () => {
|
||||
text-overflow: ellipsis;
|
||||
}
|
||||
</style>
|
||||
@/stores/main
|
||||
@@ -39,10 +39,10 @@
|
||||
import { reactive, ref } from 'vue'
|
||||
import { loginUser } from '@/repositories/User'
|
||||
import type { ISimpleError } from '@/repositories/Client'
|
||||
import { useDocumentStore } from '@/stores/documents'
|
||||
import { useMainStore } from '@/stores/main'
|
||||
|
||||
const confirmLoading = ref<boolean>(false)
|
||||
const store = useDocumentStore()
|
||||
const store = useMainStore()
|
||||
|
||||
const loginForm = reactive({
|
||||
username: '',
|
||||
@@ -99,3 +99,4 @@ const login = async () => {
|
||||
height: 1em;
|
||||
}
|
||||
</style>
|
||||
@/stores/main
|
||||
305
frontend/src/components/UploadButton.vue
Normal file
@@ -0,0 +1,305 @@
|
||||
<script setup lang="ts">
|
||||
import { connect, uploadUrl } from '@/repositories/WS';
|
||||
import { useMainStore } from '@/stores/main'
|
||||
import { collator } from '@/utils';
|
||||
import { computed, onMounted, onUnmounted, reactive, ref } from 'vue'
|
||||
|
||||
const fileInput = ref()
|
||||
const folderInput = ref()
|
||||
const store = useMainStore()
|
||||
const props = defineProps({
|
||||
path: Array<string>
|
||||
})
|
||||
|
||||
type CloudFile = {
|
||||
file: File
|
||||
cloudName: string
|
||||
cloudPos: number
|
||||
}
|
||||
function pasteHandler(event: ClipboardEvent) {
|
||||
const items = Array.from(event.clipboardData?.items ?? [])
|
||||
const infiles = [] as File[]
|
||||
const dirs = [] as FileSystemDirectoryEntry[]
|
||||
for (const item of items) {
|
||||
if (item.kind !== 'file') continue
|
||||
const entry = item.webkitGetAsEntry()
|
||||
if (entry?.isFile) {
|
||||
const file = item.getAsFile()
|
||||
if (file) infiles.push(file)
|
||||
} else if (entry?.isDirectory) {
|
||||
dirs.push(entry as FileSystemDirectoryEntry)
|
||||
}
|
||||
}
|
||||
if (infiles.length || dirs.length) {
|
||||
event.preventDefault()
|
||||
uploadFiles(infiles)
|
||||
for (const entry of dirs) pasteDirectory(entry, `${props.path!.join('/')}/${entry.name}`)
|
||||
}
|
||||
}
|
||||
const pasteDirectory = async (entry: FileSystemDirectoryEntry, loc: string) => {
|
||||
const reader = entry.createReader()
|
||||
const entries = await new Promise<any[]>(resolve => reader.readEntries(resolve))
|
||||
const cloudfiles = [] as CloudFile[]
|
||||
for (const entry of entries) {
|
||||
const cloudName = `${loc}/${entry.name}`
|
||||
if (entry.isFile) {
|
||||
const file = await new Promise(resolve => entry.file(resolve)) as File
|
||||
cloudfiles.push({file, cloudName, cloudPos: 0})
|
||||
} else if (entry.isDirectory) {
|
||||
await pasteDirectory(entry, cloudName)
|
||||
}
|
||||
}
|
||||
if (cloudfiles.length) uploadCloudFiles(cloudfiles)
|
||||
}
|
||||
function uploadHandler(event: Event) {
|
||||
event.preventDefault()
|
||||
// @ts-ignore
|
||||
const input = event.target as HTMLInputElement | null
|
||||
const infiles = Array.from((input ?? (event as DragEvent).dataTransfer)?.files ?? []) as File[]
|
||||
if (input) input.value = ''
|
||||
if (infiles.length) uploadFiles(infiles)
|
||||
}
|
||||
|
||||
const uploadFiles = (infiles: File[]) => {
|
||||
const loc = props.path!.join('/')
|
||||
let files = []
|
||||
for (const file of infiles) {
|
||||
files.push({
|
||||
file,
|
||||
cloudName: loc + '/' + (file.webkitRelativePath || file.name),
|
||||
cloudPos: 0,
|
||||
})
|
||||
}
|
||||
uploadCloudFiles(files)
|
||||
}
|
||||
const uploadCloudFiles = (files: CloudFile[]) => {
|
||||
const dotfiles = files.filter(f => f.cloudName.includes('/.'))
|
||||
if (dotfiles.length) {
|
||||
store.error = "Won't upload dotfiles"
|
||||
console.log("Dotfiles omitted", dotfiles)
|
||||
files = files.filter(f => !f.cloudName.includes('/.'))
|
||||
}
|
||||
if (!files.length) return
|
||||
files.sort((a, b) => collator.compare(a.cloudName, b.cloudName))
|
||||
// @ts-ignore
|
||||
upqueue = [...upqueue, ...files]
|
||||
statsAdd(files)
|
||||
startWorker()
|
||||
}
|
||||
|
||||
const cancelUploads = () => {
|
||||
upqueue = []
|
||||
statReset()
|
||||
}
|
||||
|
||||
const uprogress_init = {
|
||||
total: 0,
|
||||
uploaded: 0,
|
||||
t0: 0,
|
||||
tlast: 0,
|
||||
statbytes: 0,
|
||||
statdur: 0,
|
||||
files: [] as CloudFile[],
|
||||
filestart: 0,
|
||||
fileidx: 0,
|
||||
filecount: 0,
|
||||
filename: '',
|
||||
filesize: 0,
|
||||
filepos: 0,
|
||||
status: 'idle',
|
||||
}
|
||||
const uprogress = reactive({...uprogress_init})
|
||||
const percent = computed(() => uprogress.uploaded / uprogress.total * 100)
|
||||
const speed = computed(() => {
|
||||
let s = uprogress.statbytes / uprogress.statdur / 1e3
|
||||
const tsince = (Date.now() - uprogress.tlast) / 1e3
|
||||
if (tsince > 5 / s) return 0 // Less than fifth of previous speed => stalled
|
||||
if (tsince > 1 / s) return 1 / tsince // Next block is late or not coming, decay
|
||||
return s // "Current speed"
|
||||
})
|
||||
const speeddisp = computed(() => speed.value ? speed.value.toFixed(speed.value < 10 ? 1 : 0) + '\u202FMB/s': 'stalled')
|
||||
setInterval(() => {
|
||||
if (Date.now() - uprogress.tlast > 3000) {
|
||||
// Reset
|
||||
uprogress.statbytes = 0
|
||||
uprogress.statdur = 1
|
||||
} else {
|
||||
// Running average by decay
|
||||
uprogress.statbytes *= .9
|
||||
uprogress.statdur *= .9
|
||||
}
|
||||
}, 100)
|
||||
const statUpdate = ({name, size, start, end}: {name: string, size: number, start: number, end: number}) => {
|
||||
if (name !== uprogress.filename) return // If stats have been reset
|
||||
const now = Date.now()
|
||||
uprogress.uploaded = uprogress.filestart + end
|
||||
uprogress.filepos = end
|
||||
uprogress.statbytes += end - start
|
||||
uprogress.statdur += now - uprogress.tlast
|
||||
uprogress.tlast = now
|
||||
// File finished?
|
||||
if (end === size) {
|
||||
uprogress.filestart += size
|
||||
statNextFile()
|
||||
if (++uprogress.fileidx >= uprogress.filecount) statReset()
|
||||
}
|
||||
}
|
||||
const statNextFile = () => {
|
||||
const f = uprogress.files.shift()
|
||||
if (!f) return statReset()
|
||||
uprogress.filepos = 0
|
||||
uprogress.filesize = f.file.size
|
||||
uprogress.filename = f.cloudName
|
||||
}
|
||||
const statReset = () => {
|
||||
Object.assign(uprogress, uprogress_init)
|
||||
uprogress.t0 = Date.now()
|
||||
uprogress.tlast = uprogress.t0 + 1
|
||||
}
|
||||
const statsAdd = (f: CloudFile[]) => {
|
||||
if (uprogress.files.length === 0) statReset()
|
||||
uprogress.total += f.reduce((a, b) => a + b.file.size, 0)
|
||||
uprogress.filecount += f.length
|
||||
uprogress.files = [...uprogress.files, ...f]
|
||||
statNextFile()
|
||||
}
|
||||
let upqueue = [] as CloudFile[]
|
||||
|
||||
// TODO: Rewrite as WebSocket class
|
||||
const WSCreate = async () => await new Promise<WebSocket>(resolve => {
|
||||
const ws = connect(uploadUrl, {
|
||||
open(ev: Event) { resolve(ws) },
|
||||
error(ev: Event) {
|
||||
console.error('Upload socket error', ev)
|
||||
store.error = 'Upload socket error'
|
||||
},
|
||||
message(ev: MessageEvent) {
|
||||
const res = JSON.parse(ev!.data)
|
||||
if ('error' in res) {
|
||||
console.error('Upload socket error', res.error)
|
||||
store.error = res.error.message
|
||||
return
|
||||
}
|
||||
if (res.status === 'ack') {
|
||||
statUpdate(res.req)
|
||||
} else console.log('Unknown upload response', res)
|
||||
},
|
||||
})
|
||||
// @ts-ignore
|
||||
ws.sendMsg = (msg: any) => ws.send(JSON.stringify(msg))
|
||||
// @ts-ignore
|
||||
ws.sendData = async (data: any) => {
|
||||
// Wait until the WS is ready to send another message
|
||||
uprogress.status = "uploading"
|
||||
await new Promise(resolve => {
|
||||
const t = setInterval(() => {
|
||||
if (ws.bufferedAmount > 1<<20) return
|
||||
resolve(undefined)
|
||||
clearInterval(t)
|
||||
}, 1)
|
||||
})
|
||||
uprogress.status = "processing"
|
||||
ws.send(data)
|
||||
}
|
||||
})
|
||||
const worker = async () => {
|
||||
const ws = await WSCreate()
|
||||
while (upqueue.length) {
|
||||
const f = upqueue[0]
|
||||
const start = f.cloudPos
|
||||
const end = Math.min(f.file.size, start + (1<<20))
|
||||
const control = { name: f.cloudName, size: f.file.size, start, end }
|
||||
const data = f.file.slice(start, end)
|
||||
f.cloudPos = end
|
||||
// Note: files may get modified during I/O
|
||||
// @ts-ignore FIXME proper WebSocket class, avoid attaching functions to WebSocket object
|
||||
ws.sendMsg(control)
|
||||
// @ts-ignore
|
||||
await ws.sendData(data)
|
||||
if (f.cloudPos === f.file.size) upqueue.shift()
|
||||
}
|
||||
if (upqueue.length) startWorker()
|
||||
uprogress.status = "idle"
|
||||
workerRunning = false
|
||||
}
|
||||
let workerRunning: any = false
|
||||
const startWorker = () => {
|
||||
if (workerRunning === false) workerRunning = setTimeout(() => {
|
||||
workerRunning = true
|
||||
worker()
|
||||
}, 0)
|
||||
}
|
||||
|
||||
onMounted(() => {
|
||||
// Both dragover and drop must be prevented to stop the browser from opening the file
|
||||
addEventListener('dragover', uploadHandler)
|
||||
addEventListener('drop', uploadHandler)
|
||||
addEventListener('paste', pasteHandler)
|
||||
})
|
||||
onUnmounted(() => {
|
||||
removeEventListener('paste', pasteHandler)
|
||||
removeEventListener('dragover', uploadHandler)
|
||||
removeEventListener('drop', uploadHandler)
|
||||
})
|
||||
</script>
|
||||
<template>
|
||||
<template>
|
||||
<input ref="fileInput" @change="uploadHandler" type="file" multiple>
|
||||
<input ref="folderInput" @change="uploadHandler" type="file" webkitdirectory>
|
||||
</template>
|
||||
<SvgButton name="add-file" data-tooltip="Upload files" @click="fileInput.click()" />
|
||||
<SvgButton name="add-folder" data-tooltip="Upload folder" @click="folderInput.click()" />
|
||||
<div class="uploadprogress" v-if="uprogress.total" :style="`background: linear-gradient(to right, var(--bar) 0, var(--bar) ${percent}%, var(--nobar) ${percent}%, var(--nobar) 100%);`">
|
||||
<div class="statustext">
|
||||
<span v-if="uprogress.filecount > 1" class="index">
|
||||
[{{ uprogress.fileidx }}/{{ uprogress.filecount }}]
|
||||
</span>
|
||||
<span class="filename">{{ uprogress.filename.split('/').pop() }}
|
||||
<span v-if="uprogress.filesize > 1e7" class="percent">
|
||||
{{ (uprogress.filepos / uprogress.filesize * 100).toFixed(0) + '\u202F%' }}
|
||||
</span>
|
||||
</span>
|
||||
<span class="position" v-if="uprogress.total > 1e7">
|
||||
{{ (uprogress.uploaded / 1e6).toFixed(0) + '\u202F/\u202F' + (uprogress.total / 1e6).toFixed(0) + '\u202FMB' }}
|
||||
</span>
|
||||
<span class="speed">{{ speeddisp }}</span>
|
||||
<button class="close" @click="cancelUploads">❌</button>
|
||||
</div>
|
||||
</div>
|
||||
</template>
|
||||
|
||||
<style scoped>
|
||||
.uploadprogress {
|
||||
--bar: var(--accent-color);
|
||||
--nobar: var(--header-background);
|
||||
display: flex;
|
||||
flex-direction: column;
|
||||
color: var(--primary-color);
|
||||
position: fixed;
|
||||
left: 0;
|
||||
bottom: 0;
|
||||
width: 100vw;
|
||||
}
|
||||
.statustext {
|
||||
display: flex;
|
||||
padding: 0.5rem 0;
|
||||
}
|
||||
span {
|
||||
color: #ccc;
|
||||
white-space: nowrap;
|
||||
text-align: right;
|
||||
padding: 0 0.5em;
|
||||
}
|
||||
.filename {
|
||||
color: #fff;
|
||||
flex: 1 1;
|
||||
white-space: nowrap;
|
||||
overflow: hidden;
|
||||
text-overflow: ellipsis;
|
||||
text-align: left;
|
||||
}
|
||||
.index { min-width: 3.5em }
|
||||
.position { min-width: 4em }
|
||||
.speed { min-width: 4em }
|
||||
</style>
|
||||
@/stores/main
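
For orientation, the upload socket protocol used above is an alternating stream of JSON control frames and the binary slices they describe; the server acks each range, which `statUpdate` folds into the progress bar. A minimal standalone sketch of the same wire format (ack handling and error reporting omitted; assumes an already connected WebSocket to `uploadUrl`):

```ts
async function uploadOne(ws: WebSocket, file: File, cloudName: string) {
  const CHUNK = 1 << 20  // 1 MiB slices, as in the worker above
  for (let start = 0; start < file.size; start += CHUNK) {
    const end = Math.min(file.size, start + CHUNK)
    // JSON control frame first, then the raw bytes it describes
    ws.send(JSON.stringify({ name: cloudName, size: file.size, start, end }))
    ws.send(await file.slice(start, end).arrayBuffer())
    // Crude backpressure: wait for the socket buffer to drain
    while (ws.bufferedAmount > CHUNK) await new Promise(r => setTimeout(r, 1))
  }
}
```
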
|
||||
67
frontend/src/repositories/Document.ts
Normal file
@@ -0,0 +1,67 @@
|
||||
import { formatSize, formatUnixDate, haystackFormat } from "@/utils"
|
||||
|
||||
export type FUID = string
|
||||
|
||||
export type DocProps = {
|
||||
loc: string
|
||||
name: string
|
||||
key: FUID
|
||||
size: number
|
||||
mtime: number
|
||||
dir: boolean
|
||||
}
|
||||
|
||||
export class Doc {
|
||||
private _name: string = ""
|
||||
public loc: string = ""
|
||||
public key: FUID = ""
|
||||
public size: number = 0
|
||||
public mtime: number = 0
|
||||
public haystack: string = ""
|
||||
public dir: boolean = false
|
||||
|
||||
constructor(props: Partial<DocProps> = {}) { Object.assign(this, props) }
|
||||
get name() { return this._name }
|
||||
set name(name: string) {
|
||||
if (name.includes('/') || name.startsWith('.')) throw Error(`Invalid name: ${name}`)
|
||||
this._name = name
|
||||
this.haystack = haystackFormat(name)
|
||||
}
|
||||
get sizedisp(): string { return formatSize(this.size) }
|
||||
get modified(): string { return formatUnixDate(this.mtime) }
|
||||
get url(): string {
|
||||
const p = this.loc ? `${this.loc}/${this.name}` : this.name
|
||||
return this.dir ? '/#/' + `${p}/`.replaceAll('#', '%23') : `/files/${p}`.replaceAll('?', '%3F').replaceAll('#', '%23')
|
||||
}
|
||||
get urlrouter(): string {
|
||||
return this.url.replace(/^\/#/, '')
|
||||
}
|
||||
}
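
A hypothetical usage sketch of `Doc` (all values invented; `formatSize` and `formatUnixDate` come from `@/utils`):

```ts
const doc = new Doc({ loc: 'Pictures', name: 'kitten.jpg', key: 'f00ba4', size: 123456, mtime: 1700000000, dir: false })
doc.sizedisp      // e.g. '123 kB' (whatever formatSize produces)
doc.modified      // human-readable time from formatUnixDate
doc.url           // '/files/Pictures/kitten.jpg'; directories get a '/#/…/' hash URL instead
doc.name = 'a/b'  // would throw: names may not contain '/' or start with '.'
```
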
|
||||
export type errorEvent = {
|
||||
error: {
|
||||
code: number
|
||||
message: string
|
||||
redirect: string
|
||||
}
|
||||
}
|
||||
|
||||
// Raw types the backend /api/watch sends us
|
||||
|
||||
export type FileEntry = [
|
||||
number, // level
|
||||
string, // name
|
||||
FUID,
|
||||
number, //mtime
|
||||
number, // size
|
||||
number, // isfile
|
||||
]
|
||||
|
||||
export type UpdateEntry = ['k', number] | ['d', number] | ['i', Array<FileEntry>]
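
Putting these together: `/api/watch` first sends `{"root": FileEntry[]}` and thereafter `{"update": UpdateEntry[]}` (plus `{"space": …}` for disk usage). A client can fold an update into its flat listing roughly like this (a sketch using the types above, not the store's actual implementation):

```ts
function applyUpdate(list: FileEntry[], update: UpdateEntry[]): FileEntry[] {
  const out: FileEntry[] = []
  let i = 0
  for (const [op, arg] of update) {
    if (op === 'k') { out.push(...list.slice(i, i + (arg as number))); i += arg as number }
    else if (op === 'd') i += arg as number        // drop the next n old entries
    else out.push(...(arg as FileEntry[]))         // 'i': splice in new entries
  }
  return out
}
```
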
|
||||
|
||||
// Helper structure for selections
|
||||
export interface SelectedItems {
|
||||
keys: FUID[]
|
||||
docs: Record<FUID, Doc>
|
||||
recursive: Array<[string, string, Doc]>
|
||||
missing: Set<FUID>
|
||||
}
|
||||