76 Commits

Author SHA1 Message Date
ffa1194ed6 Release v1.1.0
All checks were successful
Build & Publish / build_publish (push) Successful in 3m14s
2026-01-25 11:12:54 +01:00
bb8bbf8a24 feat: enhance logging and error handling, update HTML templates, and add footer 2026-01-25 11:07:44 +01:00
f403b6549d Release v1.0.1 - Bug fixes for logging and upload UI
All checks were successful
Build & Publish / build_publish (push) Successful in 1m29s
2026-01-16 16:38:12 +01:00
cfbd9ff4d3 feat: release v0.3.1 with bug fixes for upload workflow and logging improvements 2026-01-16 16:33:41 +01:00
c7c5c5f135 feat: enhance upload handling and logging improvements 2026-01-16 15:17:57 +01:00
e0d1f263dd chore: release v1.0.0
All checks were successful
Build & Publish / build_publish (push) Successful in 1m30s
2026-01-16 11:29:22 +01:00
1d75df2d41 feat: implement upload error handling and rate limiting improvements 2026-01-16 11:23:14 +01:00
e90c4576a5 feat: update dependencies, enhance upload rate limiting, and improve UI elements 2026-01-16 08:54:14 +01:00
7a01525ca5 chore: release v0.3.0
All checks were successful
Build & Publish / build_publish (push) Successful in 46s
2026-01-13 10:52:46 +01:00
6ac669f8c7 wip 2026-01-13 10:48:51 +01:00
48acf723de Add favicon images and web manifest for site branding
- Added favicon-16x16.png, favicon-32x32.png, and favicon.ico to enhance site identity across platforms.
- Introduced site.webmanifest to define application metadata, including icons and color themes for a better user experience on mobile devices.
2026-01-13 09:58:43 +01:00
650352b103 fix: enhance log file rotation to include timestamp and improve error handling 2026-01-12 17:22:06 +01:00
9e567ae760 fix: update log file handling to include timestamp in renamed log files 2026-01-12 16:28:09 +01:00
0685de8ffa fix: enhance code content display with syntax highlighting and improve logging structure 2026-01-12 16:14:47 +01:00
1b295aa843 Release 0.2.0
All checks were successful
Build & Publish / build_publish (push) Successful in 42s
2026-01-11 10:00:04 +01:00
28b7860c6c fix: update asset logging to use serialized values and enhance asset struct with default implementation 2026-01-11 09:54:31 +01:00
62f3c49e8a fix: refactor logging events to use owned asset instances and simplify log event structures 2026-01-11 09:36:40 +01:00
7d02443e67 fix: enhance logging structure by adding missing log event types and improving error handling in API 2026-01-11 08:46:14 +01:00
2ef2b827b7 fix: update Rust installation check to use executable path for cargo 2026-01-11 08:31:01 +01:00
8441dbd74e fix: update Gitea token environment variable for package upload 2026-01-11 08:25:28 +01:00
1fa4c50191 fix: update environment variable names for Gitea package upload 2026-01-11 08:24:29 +01:00
d831bbe85f fix: add python3 to dependencies and improve package name extraction logic 2026-01-11 08:18:42 +01:00
e24630c4a9 refactor: update caching key to use rust-toolchain.toml and improve package name extraction 2026-01-11 08:14:51 +01:00
b8e209bd03 refactor: improve caching and installation steps in build workflow 2026-01-11 08:08:12 +01:00
8145f1c7e4 refactor: separate Rust installation step in build workflow 2026-01-11 08:03:49 +01:00
840cf0ba99 refactor: enhance build workflow with debugging and caching steps 2026-01-11 07:59:39 +01:00
d47e73f47b fix: update build workflow to use Ubuntu for dependency installation 2026-01-11 07:56:34 +01:00
cde83139b1 Refactor statistics page and enhance logging
- Updated the layout and styling of the statistics page for better responsiveness and visual appeal.
- Introduced a new error page for 404 errors with user-friendly messaging and navigation options.
- Enhanced logging functionality to capture detailed events related to asset uploads, deletions, and HTTP requests.
- Implemented an AssetTracker to manage assets in memory, allowing for efficient tracking and retrieval.
- Improved the API for uploading and retrieving assets, ensuring better error handling and response formatting.
- Added auto-refresh functionality to the statistics page to keep data up-to-date.
2026-01-11 07:51:47 +01:00
81656ec0da test
Some checks failed
Build & Publish / check (push) Successful in 3s
Build & Publish / build_publish (push) Failing after 54s
2026-01-09 21:35:07 +01:00
70d7b08b7d refactor: update CI workflow and Dockerfile for improved build process 2026-01-09 21:25:23 +01:00
d6c465466a feat: add statistics API and dashboard for asset metrics
All checks were successful
Rust CI / build-test (push) Successful in 1m22s
- Implemented `/api/stats` endpoint to return JSON metrics including active assets, total uploads, storage usage, and recent activity.
- Created `stats.html` page to display real-time statistics with auto-refresh functionality.
- Enhanced asset logging to include uploader IP and detailed event information for uploads and deletions.
- Updated asset model to store uploader IP for audit purposes.
- Improved logging functionality to ensure log directory exists before writing.
- Refactored asset creation and management to support new features and logging.
2026-01-09 20:59:24 +01:00
954a5be8cb OK
All checks were successful
Rust CI / build-test (push) Successful in 42s
2026-01-08 17:49:12 +01:00
f3b5ae677d okok 2026-01-08 17:48:35 +01:00
099e628418 ok
Some checks failed
Rust CI / build-test (push) Failing after 1m12s
2026-01-08 17:41:38 +01:00
92d3ba1929 ok
Some checks failed
Rust CI / build-test (push) Has been cancelled
2026-01-08 17:40:34 +01:00
747fec0749 test
Some checks failed
Rust CI / build-test (push) Failing after 2m5s
2026-01-08 17:32:02 +01:00
86c96bf9b2 wip
All checks were successful
Rust CI / build-test (push) Successful in 1m26s
2026-01-08 16:31:39 +01:00
d375b233ef wip
Some checks failed
Rust CI / build-test (push) Failing after 2m9s
2026-01-08 15:45:36 +01:00
1d0ba36d85 wip
All checks were successful
Rust CI / build-test (push) Successful in 2m8s
2026-01-08 14:52:33 +01:00
909518cec6 ok
All checks were successful
Rust CI / build-test (push) Successful in 1m13s
2026-01-08 14:48:53 +01:00
a630415818 ok test
Some checks failed
Rust CI / build-test (push) Failing after 1m7s
2026-01-08 14:47:14 +01:00
d2ca118eb8 wip
Some checks failed
Rust CI / build-test (push) Failing after 33s
2026-01-08 14:41:59 +01:00
b90df5bfed test build
Some checks failed
Rust CI / build-test (push) Failing after 33s
2026-01-08 14:35:13 +01:00
d95b4a8fb5 Add Rust CI workflow for build and test processes
Some checks failed
Rust CI / build-test (push) Failing after 1m22s
2026-01-08 14:32:37 +01:00
dd63e94140 Refactor CI workflow to include Rust build and test steps
Some checks failed
Rust CI / build-test (push) Failing after 1m17s
2026-01-08 14:28:56 +01:00
715ae5c971 wip
All checks were successful
runner-test / test (push) Successful in 1s
runner-test / test2 (push) Successful in 1s
2026-01-06 19:21:11 +01:00
ccb38db7f5 wip
All checks were successful
runner-test / test (push) Successful in 1s
runner-test / test2 (push) Successful in 1s
2026-01-06 19:19:51 +01:00
c13960750c wip
All checks were successful
runner-test / test (push) Successful in 1s
2026-01-06 19:18:58 +01:00
c6285f18e8 wip
All checks were successful
runner-test / test (push) Successful in 13m13s
2026-01-06 19:04:21 +01:00
2380417f24 wip
Some checks failed
runner-test / test (push) Has been cancelled
2026-01-06 19:03:32 +01:00
d2b6f80aee wip
Some checks failed
runner-test / test (push) Has been cancelled
2026-01-06 19:02:14 +01:00
2aa2bd2c23 wip 2026-01-06 18:10:35 +01:00
28f2dc7787 wip 2026-01-06 17:35:17 +01:00
37d17dc8b8 Refactor environment variables to use LazyLock for dynamic binding address and port 2026-01-06 17:18:13 +01:00
f5ed10b822 Update docker-compose configuration for improved service management 2026-01-06 17:00:47 +01:00
a84f6209f2 Update upload instructions to include text data in file selection 2026-01-06 16:57:39 +01:00
d7d8e4ebbf ss 2026-01-06 16:50:04 +01:00
7e21dc213a wip 2026-01-06 16:49:20 +01:00
c150d8005f wip 2026-01-06 16:46:30 +01:00
62eea535e4 wip 2026-01-06 16:45:37 +01:00
8f29b335a5 wip 2026-01-06 16:43:18 +01:00
fc46d0952a wip 2026-01-06 16:41:11 +01:00
a288859edb Update WORKDIR in Dockerfile for consistency 2026-01-06 16:35:15 +01:00
0aff6caee7 wip 2026-01-06 16:34:13 +01:00
63b780ac11 Refactor Dockerfile and docker-compose for improved build process and clarity 2026-01-06 16:33:23 +01:00
292a081e9d wip 2026-01-06 15:49:25 +01:00
323e28760b Add Traefik labels for improved routing configuration in docker-compose 2026-01-06 15:24:22 +01:00
abac91df4e Remove git clone command from Dockerfile 2026-01-06 14:43:32 +01:00
7faae610f9 wip 2026-01-06 14:40:13 +01:00
c6eba691a8 Refactor asset and logging directory creation for improved clarity 2026-01-06 14:34:31 +01:00
10384d15e5 Fix directory creation in Asset save method 2026-01-06 13:42:42 +01:00
1147d9b3f0 wip 2026-01-06 13:25:01 +01:00
46cb35e14e wip 2026-01-06 13:24:29 +01:00
b0e7d6a40a wip 2026-01-06 13:22:46 +01:00
4ddf4656a1 WIP 2026-01-06 12:57:53 +01:00
301f6d6202 Update Dockerfile and docker-compose.yaml for build process and volume configuration 2026-01-06 12:55:54 +01:00
28 changed files with 1834 additions and 324 deletions

198
.gitea/workflows/build.yaml Normal file

@@ -0,0 +1,198 @@
name: Build & Publish
on:
push:
tags: ["v*"]
workflow_dispatch: {}
jobs:
build_publish:
runs-on: ubuntu-latest
steps:
- name: Install deps (Ubuntu)
run: |
sudo apt-get update
sudo apt-get install -y \
build-essential git ca-certificates curl tar gzip python3
- name: Checkout
uses: actions/checkout@v4
- name: Cache Rust toolchain
uses: actions/cache@v4
with:
path: |
~/.cargo/bin
~/.rustup
~/.cargo/registry
~/.cargo/git
key: ${{ runner.os }}-rustup-${{ hashFiles('rust-toolchain.toml') }}
restore-keys: |
${{ runner.os }}-rustup-
- name: Cache Cargo build
uses: actions/cache@v4
with:
path: |
target
key: ${{ runner.os }}-cargo-target-${{ hashFiles('**/Cargo.lock') }}
restore-keys: |
${{ runner.os }}-cargo-target-
- name: Install Rust (stable)
shell: bash
run: |
set -e
if [ ! -x "$HOME/.cargo/bin/cargo" ]; then
curl -fsSL https://sh.rustup.rs | sh -s -- -y --default-toolchain stable
fi
echo "$HOME/.cargo/bin" >> "$GITHUB_PATH"
- name: Debug workspace
shell: bash
run: |
set -e
pwd
ls -la
- name: Read package name
id: pkg_meta
shell: bash
run: |
set -e
if [ -f Cargo.toml ]; then
PKG_NAME="$(cargo metadata --no-deps --format-version=1 2>/dev/null | python3 -c 'import json,sys; data=json.load(sys.stdin); names=[t.get("name","") for p in data.get("packages", []) for t in p.get("targets", []) if "bin" in t.get("kind", [])]; print(names[0] if names else "")')"
if [ -z "${PKG_NAME:-}" ]; then
PKG_NAME="$(sed -n 's/^name = \"\\(.*\\)\"/\\1/p' Cargo.toml | head -n 1)"
fi
fi
if [ -z "${PKG_NAME:-}" ]; then
FULL="${GITHUB_REPOSITORY:-}"
if [ -z "$FULL" ]; then
echo "Could not read Cargo.toml and GITHUB_REPOSITORY is empty"
exit 1
fi
PKG_NAME="${FULL##*/}"
echo "Cargo.toml missing or unreadable. Falling back to repo name: $PKG_NAME"
fi
echo "pkg_name=$PKG_NAME" >> "$GITHUB_OUTPUT"
- name: Compute versions
id: version_meta
shell: bash
run: |
set -euo pipefail
CARGO_VER="$(python3 - << 'PY'
import re
txt = open("Cargo.toml", "r", encoding="utf-8").read()
m = re.search(r'(?m)^\s*version\s*=\s*"([^"]+)"\s*$', txt)
print(m.group(1) if m else "")
PY
)"
if [ -z "$CARGO_VER" ]; then
echo "Could not read version from Cargo.toml"
exit 1
fi
REF="${GITHUB_REF_NAME:-}"
SHA="${GITHUB_SHA:-}"
SHORT_SHA="${SHA:0:8}"
if [[ "$REF" == v* ]]; then
PKG_VERSION="${REF#v}"
else
PKG_VERSION="${CARGO_VER}+g${SHORT_SHA}"
fi
echo "cargo_version=$CARGO_VER" >> "$GITHUB_OUTPUT"
echo "pkg_version=$PKG_VERSION" >> "$GITHUB_OUTPUT"
- name: Create source tarball (code)
shell: bash
run: |
set -e
FULL="${GITHUB_REPOSITORY:-}"
if [ -z "$FULL" ]; then
echo "GITHUB_REPOSITORY is empty. Set it in runner env or switch to explicit OWNER/REPO vars."
exit 1
fi
OWNER="${FULL%%/*}"
REPO="${FULL##*/}"
PKG_VERSION="${{ steps.version_meta.outputs.pkg_version }}"
BIN_NAME="${{ steps.pkg_meta.outputs.pkg_name }}"
mkdir -p dist
# Clean source snapshot of the repository at current commit
git archive --format=tar.gz \
--prefix="${BIN_NAME}-${PKG_VERSION}/" \
-o "dist/${BIN_NAME}-${PKG_VERSION}-source.tar.gz" \
HEAD
ls -lh dist
# OPTIONAL: build binary and package it too
- name: Build (release)
shell: bash
run: |
set -e
cargo build --release
- name: Collect binary
shell: bash
run: |
set -e
FULL="${GITHUB_REPOSITORY:-}"
if [ -z "$FULL" ]; then
echo "GITHUB_REPOSITORY is empty. Set it in runner env or switch to explicit OWNER/REPO vars."
exit 1
fi
REPO="${FULL##*/}"
PKG_VERSION="${{ steps.version_meta.outputs.pkg_version }}"
BIN_NAME="${{ steps.pkg_meta.outputs.pkg_name }}"
mkdir -p dist
cp "target/release/${BIN_NAME}" "dist/${BIN_NAME}-${PKG_VERSION}-linux-x86_64"
chmod +x "dist/${BIN_NAME}-${PKG_VERSION}-linux-x86_64"
ls -lh dist
- name: Upload to Gitea Generic Packages
shell: bash
env:
BASE_URL: ${{ vars.BASE_URL }}
GITEA: ${{ secrets.GITEA }}
run: |
set -e
FULL="${GITHUB_REPOSITORY:-}"
if [ -z "$FULL" ]; then
echo "GITHUB_REPOSITORY is empty. Set it in runner env or switch to explicit OWNER/REPO vars."
exit 1
fi
OWNER="${FULL%%/*}"
REPO="${FULL##*/}"
PKG_VERSION="${{ steps.version_meta.outputs.pkg_version }}"
BIN_NAME="${{ steps.pkg_meta.outputs.pkg_name }}"
if [ -z "${BASE_URL:-}" ]; then
echo "Missing vars.BASE_URL (example: https://gitea.example.com)"
exit 1
fi
if [ -z "${GITEA:-}" ]; then
echo "Missing secrets.GITEA"
exit 1
fi
# Choose a package name (keep stable). Here: cargo package name.
PACKAGE_NAME="$BIN_NAME"
for FILE in dist/*; do
FILENAME="$(basename "$FILE")"
URL="${BASE_URL}/api/packages/${OWNER}/generic/${PACKAGE_NAME}/${PKG_VERSION}/${FILENAME}"
echo "Uploading $FILENAME -> $URL"
curl -fsS -X PUT \
-H "Authorization: token ${GITEA}" \
--upload-file "$FILE" \
"$URL"
done
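The tag-to-version rule in the "Compute versions" step above boils down to: a `v*` tag publishes as the tag minus its prefix, and any other ref falls back to the Cargo.toml version plus a short-SHA build suffix. A standalone sketch of that rule (the sample SHA and ref values below are made up for illustration):

```shell
CARGO_VER="1.1.0"              # as read from Cargo.toml
SHA="ffa1194ed6abcdef0123"     # sample commit SHA (illustrative)
SHORT_SHA="${SHA:0:8}"
REF="v1.1.0"                   # GITHUB_REF_NAME on a tag push
if [[ "$REF" == v* ]]; then
  PKG_VERSION="${REF#v}"                   # tag push: strip the leading v
else
  PKG_VERSION="${CARGO_VER}+g${SHORT_SHA}" # other refs: version + short SHA
fi
echo "$PKG_VERSION"
```

On a `v1.1.0` tag this yields `1.1.0`; on a branch push it would yield something like `1.1.0+gffa1194e`.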

1
.gitignore vendored

@@ -1,4 +1,5 @@
.cargo/
.codex/
/target
/data/storage/*
/data/logs/*

114
CHANGELOG.md Normal file

@@ -0,0 +1,114 @@
# Changelog
All notable changes to this project will be documented in this file.
The format is based on [Keep a Changelog](https://keepachangelog.com/en/1.1.0/),
and this project adheres to [Semantic Versioning](https://semver.org/spec/v2.0.0.html).
## [1.1.0] - 2026-01-25
### Added
- Shared HTML footer template injected into all pages.
- Structured log file handler with rotation, append-only writes, and stats parsing support.
- Upload size guard using `MAX_ASSET_SIZE_BYTES`.
- Unit tests for core storage and rate limiting behavior.
### Changed
- Asset storage cleanup and capacity checks are now race-safe.
- Stats aggregation now reads from the structured log file helper.
## [1.0.1] - 2026-01-16
### Fixed
- Asset addition logging now displays immediately instead of being buffered (changed `print!` to `println!`).
- Upload functionality is now blocked after the first successful upload; users can only upload once per page session.
- Paste, drag & drop, and file selection disabled after successful upload to prevent confusion.
- JavaScript syntax errors in event listener registration that prevented copy/paste functionality.
- Removed nested and duplicated event listeners that caused unexpected behavior.
### Changed
- Added `uploadCompleted` flag to track upload state and prevent multiple uploads per session.
- Reset button now properly clears the `uploadCompleted` flag to allow new uploads.
## [1.0.0] - 2026-01-14
### Added
- UI error banner for failed uploads, including retry timing.
- `retry_after_seconds` in the upload error response to inform clients when to retry.
- Server-side duration clamping for uploads (1-60 minutes).
### Changed
- Upload throttling now tracks active assets per user using asset expiration times.
- Upload error responses are consistently JSON.
### Removed
- `Retry-After` response header on upload limit errors.
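The server-side duration clamping added in 1.0.0 forces any requested TTL into the 1-60 minute range. A minimal sketch of that rule (the project's real implementation is in Rust; this shell function only illustrates the clamp):

```shell
# Clamp a requested upload duration (minutes) into the 1-60 range.
clamp_duration() {
  d="$1"
  if [ "$d" -lt 1 ]; then d=1; fi
  if [ "$d" -gt 60 ]; then d=60; fi
  echo "$d"
}

clamp_duration 90   # out-of-range values are pulled back to the bound
```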
## [0.3.0] - 2026-01-13
### Added
- Favicon set and web manifest for site branding.
- Syntax-highlighted rendering for code-like text content in the viewer.
- Startup log rotation that archives the previous log with a timestamp.
### Changed
- Access logs now write to `data/logs/log.txt` instead of `access.log`.
## [0.2.0] - 2026-01-11
### Added
- Default implementation for the asset model to simplify log parsing fallbacks.
- Basic UI polish for the stats page (background glow and hover highlight on recent activity).
### Changed
- Asset logging now records serialized values without cloning asset content.
- Release workflow uses tag-based versioning and caches Rust/toolchain artifacts.
## [0.1.1] - 2026-01-09
## [0.1.0] - 2026-01-09
### Added
- **Statistics Dashboard** (`/stats.html`) with real-time metrics:
- Active assets count
- Total uploads and deletions
- Storage usage
- Image vs text breakdown
- Average server response time
- Total request count
- Recent activity feed (last 20 events)
- Auto-refresh every 30 seconds
- **Statistics API** (`GET /api/stats`) returning JSON metrics
- **Enhanced logging** for asset events:
- Upload events with uploader IP, MIME type, size, duration, timestamps
- Delete events with full asset metadata
- Request timing (`dur_ms`) in access logs
- **Uploader IP tracking** stored with each asset for audit purposes
- Stats link in index page footer
- Ephemeral image and text sharing with configurable TTL (1-60 minutes)
- Drag/drop, paste, and file picker upload support
- Base64 encoding for images, raw text for plain text
- UUID-based asset storage as JSON files
- Background cleanup task (every 60 seconds)
- Dark theme UI with zoom overlay
- View page for shared content
- Access logging with timing, IPs, and user agent
- Docker and docker-compose support with Traefik labels
- Environment variables for bind address and port

85
Cargo.lock generated

@@ -273,7 +273,7 @@ checksum = "812e12b5285cc515a9c72a5c1d3b6d46a19dac5acfef5265968c166106e31dd3"
[[package]]
name = "black_hole_share"
version = "0.1.0"
version = "1.1.0"
dependencies = [
"actix-files",
"actix-web",
@@ -281,6 +281,7 @@ dependencies = [
"base64",
"chrono",
"futures",
"mime_guess",
"serde",
"serde_json",
"tokio",
@@ -340,9 +341,9 @@ dependencies = [
[[package]]
name = "cc"
version = "1.2.51"
version = "1.2.52"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "7a0aeaff4ff1a90589618835a598e545176939b97874f7abc7851caa0618f203"
checksum = "cd4932aefd12402b36c60956a4fe0035421f544799057659ff86f923657aada3"
dependencies = [
"find-msvc-tools",
"jobserver",
@@ -503,15 +504,15 @@ dependencies = [
[[package]]
name = "find-msvc-tools"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "645cbb3a84e60b7531617d5ae4e57f7e27308f6445f5abf653209ea76dec8dff"
checksum = "f449e6c6c08c865631d4890cfacf252b3d396c9bcc83adb6623cdb02a8336c41"
[[package]]
name = "flate2"
version = "1.1.5"
version = "1.1.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "bfe33edd8e85a12a67454e37f8c75e730830d83e313556ab9ebf9ee7fbeb3bfb"
checksum = "b375d6465b98090a5f25b1c7703f3859783755aa9a80433b36e0379a3ec2f369"
dependencies = [
"crc32fast",
"miniz_oxide",
@@ -837,9 +838,9 @@ checksum = "e8a5a9a0ff0086c7a148acb942baaabeadf9504d10400b5a05645853729b9cd2"
[[package]]
name = "indexmap"
version = "2.12.1"
version = "2.13.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "0ad4bb2b565bca0645f4d68c5c9af97fba094e9791da685bf83cb5f3ce74acf2"
checksum = "7714e70437a7dc3ac8eb7e6f8df75fd8eb422675fc7678aff7364301092b1017"
dependencies = [
"equivalent",
"hashbrown",
@@ -879,9 +880,9 @@ checksum = "d4345964bb142484797b161f473a503a434de77149dd8c7427788c6e13379388"
[[package]]
name = "libc"
version = "0.2.178"
version = "0.2.180"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "37c93d8daa9d8a012fd8ab92f088405fb202ea0b6ab73ee2482ae66af4f42091"
checksum = "bcc35a38544a891a5f7c865aca548a982ccb3b8650a5b06d0fd33a10283c56fc"
[[package]]
name = "litemap"
@@ -1059,18 +1060,18 @@ dependencies = [
[[package]]
name = "proc-macro2"
version = "1.0.104"
version = "1.0.105"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "9695f8df41bb4f3d222c95a67532365f569318332d03d5f3f67f37b20e6ebdf0"
checksum = "535d180e0ecab6268a3e718bb9fd44db66bbbc256257165fc699dadf70d16fe7"
dependencies = [
"unicode-ident",
]
[[package]]
name = "quote"
version = "1.0.42"
version = "1.0.43"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "a338cc41d27e6cc6dce6cefc13a0729dfbb81c262b1f519331575dd80ef3067f"
checksum = "dc74d9a594b72ae6656596548f56f667211f8a97b3d4c3d467150794690dc40a"
dependencies = [
"proc-macro2",
]
@@ -1103,9 +1104,9 @@ dependencies = [
[[package]]
name = "rand_core"
version = "0.9.3"
version = "0.9.5"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "99d9a13982dcf210057a8a78572b2217b667c3beacbf3a0d8b454f6f82837d38"
checksum = "76afc826de14238e6e8c374ddcc1fa19e374fd8dd986b0d2af0d02377261d83c"
dependencies = [
"getrandom",
]
@@ -1219,9 +1220,9 @@ dependencies = [
[[package]]
name = "serde_json"
version = "1.0.148"
version = "1.0.149"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "3084b546a1dd6289475996f182a22aba973866ea8e8b02c51d9f46b1336a22da"
checksum = "83fc039473c5595ace860d8c4fafa220ff474b3fc6bfdb4293327f1a37e94d86"
dependencies = [
"itoa",
"memchr",
@@ -1315,9 +1316,9 @@ checksum = "6ce2be8dc25455e1f91df71bfa12ad37d7af1092ae736f3a6cd0e37bc7810596"
[[package]]
name = "syn"
version = "2.0.112"
version = "2.0.114"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "21f182278bf2d2bcb3c88b1b08a37df029d71ce3d3ae26168e3c653b213b99d4"
checksum = "d4d107df263a3013ef9b1879b0df87d706ff80f65a86ea879bd9c31f9b307c2a"
dependencies = [
"proc-macro2",
"quote",
@@ -1337,30 +1338,30 @@ dependencies = [
[[package]]
name = "time"
version = "0.3.44"
version = "0.3.45"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "91e7d9e3bb61134e77bde20dd4825b97c010155709965fedf0f49bb138e52a9d"
checksum = "f9e442fc33d7fdb45aa9bfeb312c095964abdf596f7567261062b2a7107aaabd"
dependencies = [
"deranged",
"itoa",
"num-conv",
"powerfmt",
"serde",
"serde_core",
"time-core",
"time-macros",
]
[[package]]
name = "time-core"
version = "0.1.6"
version = "0.1.7"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "40868e7c1d2f0b8d73e4a8c7f0ff63af4f6d19be117e90bd73eb1d62cf831c6b"
checksum = "8b36ee98fd31ec7426d599183e8fe26932a8dc1fb76ddb6214d05493377d34ca"
[[package]]
name = "time-macros"
version = "0.2.24"
version = "0.2.25"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "30cfb0125f12d9c277f35663a0a33f8c30190f4e4574868a330595412d34ebf3"
checksum = "71e552d1249bf61ac2a52db88179fd0673def1e1ad8243a00d9ec9ed71fee3dd"
dependencies = [
"num-conv",
"time-core",
@@ -1378,9 +1379,9 @@ dependencies = [
[[package]]
name = "tokio"
version = "1.48.0"
version = "1.49.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "ff360e02eab121e0bc37a2d3b4d4dc622e6eda3a8e5253d5435ecf5bd4c68408"
checksum = "72a2903cd7736441aac9df9d7688bd0ce48edccaadf181c3b90be801e81d3d86"
dependencies = [
"bytes",
"libc",
@@ -1406,9 +1407,9 @@ dependencies = [
[[package]]
name = "tokio-util"
version = "0.7.17"
version = "0.7.18"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "2efa149fe76073d6e8fd97ef4f4eca7b67f599660115591483572e406e165594"
checksum = "9ae9cec805b01e8fc3fd2fe289f89149a9b66dd16786abd8b19cfa7b48cb0098"
dependencies = [
"bytes",
"futures-core",
@@ -1457,9 +1458,9 @@ checksum = "562d481066bde0658276a35467c4af00bdc6ee726305698a55b86e61d7ad82bb"
[[package]]
name = "unicase"
version = "2.8.1"
version = "2.9.0"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "75b844d17643ee918803943289730bec8aac480150456169e647ed0b576ba539"
checksum = "dbc4bc3a9f746d862c45cb89d705aa10f187bb96c76001afab07a0d35ce60142"
[[package]]
name = "unicode-ident"
@@ -1481,9 +1482,9 @@ checksum = "ebc1c04c71510c7f702b52b7c350734c9ff1295c464a03335b00bb84fc54f853"
[[package]]
name = "url"
version = "2.5.7"
version = "2.5.8"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "08bc136a29a3d1758e07a9cca267be308aeebf5cfd5a10f3f67ab2097683ef5b"
checksum = "ff67a8a4397373c3ef660812acab3268222035010ab8680ec4215f38ba3d0eed"
dependencies = [
"form_urlencoded",
"idna",
@@ -1832,18 +1833,18 @@ dependencies = [
[[package]]
name = "zerocopy"
version = "0.8.31"
version = "0.8.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "fd74ec98b9250adb3ca554bdde269adf631549f51d8a8f8f0a10b50f1cb298c3"
checksum = "668f5168d10b9ee831de31933dc111a459c97ec93225beb307aed970d1372dfd"
dependencies = [
"zerocopy-derive",
]
[[package]]
name = "zerocopy-derive"
version = "0.8.31"
version = "0.8.33"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "d8a8d209fdf45cf5138cbb5a506f6b52522a25afccc534d1475dad8e31105c6a"
checksum = "2c7962b26b0a8685668b671ee4b54d007a67d4eaf05fda79ac0ecf41e32270f1"
dependencies = [
"proc-macro2",
"quote",
@@ -1906,9 +1907,9 @@ dependencies = [
[[package]]
name = "zmij"
version = "1.0.7"
version = "1.0.14"
source = "registry+https://github.com/rust-lang/crates.io-index"
checksum = "de9211a9f64b825911bdf0240f58b7a8dac217fe260fc61f080a07f61372fbd5"
checksum = "bd8f3f50b848df28f887acb68e41201b5aea6bc8a8dacc00fb40635ff9a72fea"
[[package]]
name = "zstd"

Cargo.toml

@@ -1,6 +1,6 @@
[package]
name = "black_hole_share"
version = "0.1.0"
version = "1.1.0"
edition = "2024"
[dependencies]
@@ -11,11 +11,7 @@ chrono = "0.4"
futures = "0.3.31"
serde = { version = "1.0.228", features = ["derive"] }
serde_json = "1.0.148"
tokio = { version = "1.48.0", features = [
"macros",
"rt-multi-thread",
"signal",
"time",
] }
tokio = { version = "1.48.0", features = ["fs", "macros", "rt-multi-thread", "signal", "time"] }
uuid = { version = "1.19.0", features = ["v4"] }
base64 = "0.22.1"
mime_guess = "2.0.5"

Dockerfile

@@ -20,5 +20,14 @@ RUN pacman -Syu --noconfirm --needed \
RUN curl https://sh.rustup.rs -sSf | sh -s -- -y
ENV PATH="/root/.cargo/bin:${PATH}"
WORKDIR /data
COPY src /opt/bhs/src
COPY Cargo.toml /opt/bhs/Cargo.toml
COPY Cargo.lock /opt/bhs/Cargo.lock
WORKDIR /opt/bhs
RUN ls -al ./
RUN cargo build --release
RUN cp ./target/release/black_hole_share /usr/local/bin/black_hole_share
WORKDIR /
CMD [ "black_hole_share" ]

21
LICENSE Normal file

@@ -0,0 +1,21 @@
MIT License
Copyright (c) 2026 Black Hole Share Contributors
Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:
The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.
THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.

README.md Normal file

@@ -0,0 +1,45 @@
# Black Hole Share
A lightweight, ephemeral file sharing service built with Rust and Actix-Web. Upload images or text with a configurable TTL (1-60 minutes) and share via a unique link. Content is automatically purged after expiration.
## Usage
### Run locally
```bash
cargo run --release
```
Server starts at `http://0.0.0.0:8080` by default.
### Run with Docker
```bash
docker-compose up --build
```
Exposes port `8080` mapped to container port `80`. Volume mounts `./data:/data`.
### Configuration
| Environment Variable | Default | Description |
| -------------------- | --------- | --------------- |
| `BIND_ADDR` | `0.0.0.0` | Address to bind |
| `BIND_PORT` | `8080` | Port to bind |
### Web
- `GET /` - Upload page
- `GET /stats` - Stats dashboard
- `GET /bhs/{id}` - View shared content
### API
- `POST /api/upload` with JSON `{ duration, content_type, content }`
- `GET /api/content/{id}`
- `GET /api/stats`
### Logging
- Logs are written to `data/logs/log.txt`.
- On startup, the previous log file is rotated with a timestamped name.
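The upload endpoint above takes a JSON body with `duration`, `content_type`, and `content`. A hedged sketch of calling it from the shell; the concrete values (5 minutes, `text/plain`) are illustrative assumptions, not a confirmed schema:

```shell
# Build the JSON body documented above; field values are examples only.
PAYLOAD=$(printf '{"duration": %d, "content_type": "%s", "content": "%s"}' \
  5 "text/plain" "hello")
echo "$PAYLOAD"

# With a server running locally (default bind 0.0.0.0:8080):
# curl -fsS -X POST -H 'Content-Type: application/json' \
#   -d "$PAYLOAD" http://localhost:8080/api/upload
```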

Binary file not shown (63 KiB)

Binary file not shown (347 KiB)

Binary file not shown (56 KiB)

30
data/html/error.html Normal file

@@ -0,0 +1,30 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Black Hole Share - Error</title>
<link rel="stylesheet" href="/style.css" />
</head>
<body class="view-page error-page">
<h1><a href="/" class="home-link">Black Hole Share</a> - Error</h1>
<div class="view-container">
<div class="content-area">
<div class="error-content">
<div class="error-code">404</div>
<p class="error-message">The page you're looking for vanished into the black hole.</p>
<div class="error-actions">
<a class="upload-btn action-btn" href="/">Go Home</a>
<a class="reset-btn action-btn" href="/stats">View Stats</a>
</div>
</div>
</div>
</div>
{{FOOTER}}
</body>
</html>

BIN
data/html/favicon-16x16.png Normal file
Binary file not shown (844 B)

BIN
data/html/favicon-32x32.png Normal file
Binary file not shown (2.6 KiB)

BIN
data/html/favicon.ico Normal file
Binary file not shown (866 B)

16
data/html/footer.html Normal file

@@ -0,0 +1,16 @@
<footer class="powered-by" style="display: flex; align-items: center">
<span style="flex: 1; text-align: left">
<span style="
color: var(--text-secondary);
font-size: 0.8em;
">{{VERSION}}</span>
</span>
<span>Powered by: <img src="/logo.png" alt="ICSBox" class="footer-logo" /></span>
<span style="flex: 1; text-align: right">
<a href="/stats" style="
color: var(--text-secondary);
font-size: 0.8em;
text-decoration: none;
">📊 Stats</a>
</span>
</footer>

data/html/index.html

@@ -2,73 +2,100 @@
<html lang="en">
<head>
<meta charset="UTF-8">
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Image Upload</title>
<link rel="stylesheet" href="style.css">
<link rel="stylesheet" href="style.css" />
<link rel="icon" href="/favicon.ico" sizes="any">
<link rel="icon" type="image/png" sizes="16x16" href="/favicon-16x16.png">
<link rel="icon" type="image/png" sizes="32x32" href="/favicon-32x32.png">
<link rel="apple-touch-icon" sizes="180x180" href="/apple-touch-icon.png">
<link rel="manifest" href="/site.webmanifest">
<meta name="theme-color" content="#000000">
</head>
<body>
<h1>Black Hole Share</h1>
<h1><a href="/" class="home-link">Black Hole Share</a></h1>
<div class="upload-container">
<div class="upload-area">
<input type="file" id="fileInput" accept="image/*" style="display: none;">
<input type="file" id="fileInput" accept="image/*" style="display: none" />
<div id="uploadZone" class="upload-zone">
<p>Click to select file, paste image data, or drag & drop</p>
<p>Click to select file, paste image, text data, or drag & drop</p>
</div>
</div>
</div>
<div id="uploadError" class="upload-error" style="display: none" role="status" aria-live="polite"></div>
<div class="duration-container">
<label for="durationSlider">Duration: <span id="durationValue">5</span> min</label>
<input type="range" id="durationSlider" min="1" max="60" value="5" step="1">
<input type="range" id="durationSlider" min="1" max="60" value="5" step="1" />
<div class="button-row">
<button id="resetBtn" class="reset-btn" style="display: none;">Reset</button>
<button id="uploadBtn" class="upload-btn" style="display: none;">Upload</button>
<button id="resetBtn" class="reset-btn" style="display: none">
Reset
</button>
<button id="uploadBtn" class="upload-btn" style="display: none">
Upload
</button>
</div>
<div id="linkContainer" class="link-container" style="display: none;">
<div id="linkContainer" class="link-container" style="display: none">
<p>Link:</p>
<a id="uploadedLink" href="#" target="_blank"></a>
<p id="clipboardMessage" class="clipboard-message" style="display: none;"></p>
<p id="clipboardMessage" class="clipboard-message" style="display: none"></p>
</div>
</div>
<footer class="powered-by">
<span>Powered by: <img src="logo.png" alt="ICSBox" class="footer-logo"></span>
</footer>
{{FOOTER}}
<!-- Zoom overlay -->
<div id="zoomOverlay" class="zoom-overlay" style="display: none;">
</div>
<div id="zoomOverlay" class="zoom-overlay" style="display: none"></div>
<script>
let currentContentData = null;
const fileInput = document.getElementById('fileInput');
const uploadZone = document.getElementById('uploadZone');
const uploadContainer = document.querySelector('.upload-container');
const durationSlider = document.getElementById('durationSlider');
const durationValue = document.getElementById('durationValue');
const uploadBtn = document.getElementById('uploadBtn');
const resetBtn = document.getElementById('resetBtn');
const zoomOverlay = document.getElementById('zoomOverlay');
const linkContainer = document.getElementById('linkContainer');
const uploadedLink = document.getElementById('uploadedLink');
const clipboardMessage = document.getElementById('clipboardMessage');
let uploadCompleted = false;
const fileInput = document.getElementById("fileInput");
const uploadZone = document.getElementById("uploadZone");
const uploadContainer = document.querySelector(".upload-container");
const durationSlider = document.getElementById("durationSlider");
const durationValue = document.getElementById("durationValue");
const uploadBtn = document.getElementById("uploadBtn");
const resetBtn = document.getElementById("resetBtn");
const zoomOverlay = document.getElementById("zoomOverlay");
const linkContainer = document.getElementById("linkContainer");
const uploadedLink = document.getElementById("uploadedLink");
const clipboardMessage = document.getElementById("clipboardMessage");
const uploadError = document.getElementById("uploadError");
function formatRetryAfter(seconds) {
const safeSeconds = Math.max(0, Math.floor(seconds));
const minutes = Math.floor(safeSeconds / 60);
const remainder = safeSeconds % 60;
if (minutes > 0) {
return `${minutes}m ${remainder}s`;
}
return `${remainder}s`;
}
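Editor's note: the `formatRetryAfter` helper introduced above can be exercised standalone; the outputs below follow directly from its floor/modulo arithmetic and the clamp on negative input.

```javascript
// Standalone copy of formatRetryAfter from the diff above.
function formatRetryAfter(seconds) {
  const safeSeconds = Math.max(0, Math.floor(seconds));
  const minutes = Math.floor(safeSeconds / 60);
  const remainder = safeSeconds % 60;
  if (minutes > 0) {
    return `${minutes}m ${remainder}s`;
  }
  return `${remainder}s`;
}

console.log(formatRetryAfter(125)); // "2m 5s"
console.log(formatRetryAfter(45));  // "45s"
console.log(formatRetryAfter(-3));  // "0s" (negative input is clamped to zero)
```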
// Update duration display
durationSlider.addEventListener('input', function () {
durationSlider.addEventListener("input", function () {
durationValue.textContent = this.value;
});
uploadBtn.addEventListener('click', async () => {
// fischi20 thanks!!!
durationSlider.addEventListener("wheel", (e) => {
e.preventDefault();
durationSlider.valueAsNumber += e.deltaY < 0 ? -1 : 1;
durationValue.textContent = durationSlider.value;
});
uploadBtn.addEventListener("click", async () => {
const duration = durationSlider.value;
const isText = uploadZone.querySelector('.text-content') !== null;
const mimeType = isText ? 'text/plain' : 'image/png';
const isText = uploadZone.querySelector(".text-content") !== null;
const mimeType = isText ? "text/plain" : "image/png";
const contentData = currentContentData;
if (!contentData) {
console.log('❌ No content to upload!');
console.log("❌ No content to upload!");
return;
}
@@ -76,104 +103,139 @@
});
async function sendUpload(duration, mimeType, contentData) {
const isText = mimeType === 'text/plain';
const isText = mimeType === "text/plain";
let content = contentData;
// For images, remove data URL prefix to send only base64 string
if (!isText && contentData.includes('base64,')) {
content = contentData.split('base64,')[1];
if (!isText && contentData.includes("base64,")) {
content = contentData.split("base64,")[1];
}
const payload = {
duration: parseInt(duration),
content_type: mimeType,
content: content
content: content,
};
try {
const response = await fetch('/api/upload', {
method: 'POST',
uploadError.style.display = "none";
uploadError.textContent = "";
const response = await fetch("/api/upload", {
method: "POST",
headers: {
'Content-Type': 'application/json',
"Content-Type": "application/json",
},
body: JSON.stringify(payload)
body: JSON.stringify(payload),
});
const result = await response.json();
console.log(`✅ Upload received!\n${JSON.stringify(result, null, 2)}`);
let result = null;
try {
result = await response.json();
} catch (parseError) {
result = null;
}
if (!response.ok) {
const retryAfterSeconds = result && Number.isFinite(Number(result.retry_after_seconds))
? Number(result.retry_after_seconds)
: null;
let errorMessage =
(result && result.error) ||
`Upload failed (${response.status})`;
if (retryAfterSeconds !== null) {
errorMessage += ` Try again in ${formatRetryAfter(retryAfterSeconds)}.`;
}
uploadError.textContent = errorMessage;
uploadError.style.display = "block";
return;
}
if (!result || !result.link) {
uploadError.textContent = "Upload failed (invalid response)";
uploadError.style.display = "block";
return;
}
// Mark upload as completed to prevent further pastes
uploadCompleted = true;
// Hide duration controls and buttons
document.querySelector('label[for="durationSlider"]').style.display = 'none';
durationSlider.style.display = 'none';
uploadBtn.style.display = 'none';
resetBtn.style.display = 'none';
document.querySelector('label[for="durationSlider"]').style.display =
"none";
durationSlider.style.display = "none";
uploadBtn.style.display = "none";
resetBtn.style.display = "none";
// Show link
const fullLink = window.location.origin + result.link;
uploadedLink.href = fullLink;
uploadedLink.textContent = fullLink;
linkContainer.style.display = 'block';
linkContainer.style.display = "block";
// Copy to clipboard
try {
await navigator.clipboard.writeText(fullLink);
clipboardMessage.textContent = '✓ Copied to clipboard!';
clipboardMessage.style.color = 'var(--accent-green)';
clipboardMessage.style.cursor = 'default';
clipboardMessage.style.display = 'block';
clipboardMessage.textContent = "✓ Copied to clipboard!";
clipboardMessage.style.color = "var(--accent-green)";
clipboardMessage.style.cursor = "default";
clipboardMessage.style.display = "block";
clipboardMessage.onclick = null;
} catch (error) {
clipboardMessage.textContent = '⚠ Click here to copy link';
clipboardMessage.style.color = 'var(--accent-cyan)';
clipboardMessage.style.cursor = 'pointer';
clipboardMessage.style.display = 'block';
clipboardMessage.textContent = "⚠ Click here to copy link";
clipboardMessage.style.color = "var(--accent-cyan)";
clipboardMessage.style.cursor = "pointer";
clipboardMessage.style.display = "block";
clipboardMessage.onclick = function () {
const textArea = document.createElement('textarea');
const textArea = document.createElement("textarea");
textArea.value = fullLink;
textArea.style.position = 'fixed';
textArea.style.left = '-999999px';
textArea.style.position = "fixed";
textArea.style.left = "-999999px";
document.body.appendChild(textArea);
textArea.select();
try {
document.execCommand('copy');
clipboardMessage.textContent = '✓ Copied to clipboard!';
clipboardMessage.style.color = 'var(--accent-green)';
clipboardMessage.style.cursor = 'default';
document.execCommand("copy");
clipboardMessage.textContent = "✓ Copied to clipboard!";
clipboardMessage.style.color = "var(--accent-green)";
clipboardMessage.style.cursor = "default";
clipboardMessage.onclick = null;
} catch (e) {
clipboardMessage.textContent = '✗ Copy failed';
clipboardMessage.style.color = '#ff6666';
clipboardMessage.textContent = "✗ Copy failed";
clipboardMessage.style.color = "#ff6666";
}
document.body.removeChild(textArea);
};
}
} catch (error) {
console.log(`❌ Error: ${error.message}`);
uploadError.textContent = `Upload failed (${error.message})`;
uploadError.style.display = "block";
}
}
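Editor's note: the error branch of `sendUpload` above can be summarised as a pure function, given an HTTP status and the (possibly unparseable) JSON body. This is an illustrative sketch only; `buildUploadErrorMessage` is a hypothetical name, not a function in the repository, and it inlines a simplified seconds format instead of `formatRetryAfter`.

```javascript
// Hypothetical sketch of the error-message logic in sendUpload: given an
// HTTP status and a parsed (possibly null) JSON body, build the message
// shown in the #uploadError box.
function buildUploadErrorMessage(status, result) {
  const retryAfterSeconds =
    result && Number.isFinite(Number(result.retry_after_seconds))
      ? Number(result.retry_after_seconds)
      : null;
  let message = (result && result.error) || `Upload failed (${status})`;
  if (retryAfterSeconds !== null) {
    message += ` Try again in ${retryAfterSeconds}s.`;
  }
  return message;
}

console.log(buildUploadErrorMessage(429, { error: "Upload limit exceeded", retry_after_seconds: 30 }));
// "Upload limit exceeded Try again in 30s."
console.log(buildUploadErrorMessage(500, null)); // "Upload failed (500)"
```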
// Reset to initial state
resetBtn.addEventListener('click', function () {
resetBtn.addEventListener("click", function () {
currentContentData = null;
uploadZone.innerHTML = '<p>Click to select file, paste image data, or drag & drop</p>';
uploadContainer.style.height = '180px';
uploadContainer.style.pointerEvents = '';
uploadContainer.style.overflow = '';
uploadZone.style.pointerEvents = '';
uploadZone.style.alignItems = '';
uploadZone.style.justifyContent = '';
uploadZone.style.padding = '';
fileInput.value = '';
durationSlider.value = '5';
durationValue.textContent = '5';
document.querySelector('label[for="durationSlider"]').style.display = '';
durationSlider.style.display = '';
uploadBtn.style.display = 'none';
resetBtn.style.display = 'none';
linkContainer.style.display = 'none';
clipboardMessage.style.display = 'none';
uploadCompleted = false;
uploadZone.innerHTML =
"<p>Click to select file, paste image data, or drag & drop</p>";
uploadContainer.style.height = "180px";
uploadContainer.style.pointerEvents = "";
uploadContainer.style.overflow = "";
uploadZone.style.pointerEvents = "";
uploadZone.style.alignItems = "";
uploadZone.style.justifyContent = "";
uploadZone.style.padding = "";
fileInput.value = "";
durationSlider.value = "5";
durationValue.textContent = "5";
document.querySelector('label[for="durationSlider"]').style.display =
"";
durationSlider.style.display = "";
uploadBtn.style.display = "none";
resetBtn.style.display = "none";
linkContainer.style.display = "none";
clipboardMessage.style.display = "none";
uploadError.style.display = "none";
uploadError.textContent = "";
uploadZone.focus();
});
@@ -183,42 +245,47 @@
if (isText) {
// Display text content - ZOOM ENABLED
uploadZone.innerHTML = `<div class="text-content" style="cursor: zoom-in;">${content.replace(/</g, '&lt;').replace(/>/g, '&gt;')}</div>`;
uploadContainer.style.height = '500px';
uploadContainer.style.overflow = 'hidden';
uploadZone.style.pointerEvents = 'auto';
uploadZone.style.alignItems = 'stretch';
uploadZone.style.justifyContent = 'stretch';
uploadZone.style.padding = '0';
uploadBtn.style.display = 'block';
resetBtn.style.display = 'block';
uploadZone.innerHTML = `<div class="text-content" style="cursor: zoom-in;">${content
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")}</div>`;
uploadContainer.style.height = "500px";
uploadContainer.style.overflow = "hidden";
uploadZone.style.pointerEvents = "auto";
uploadZone.style.alignItems = "stretch";
uploadZone.style.justifyContent = "stretch";
uploadZone.style.padding = "0";
uploadBtn.style.display = "block";
resetBtn.style.display = "block";
// ZOOM FOR TEXT
const textContent = uploadZone.querySelector('.text-content');
textContent.addEventListener('click', function (e) {
const textContent = uploadZone.querySelector(".text-content");
textContent.addEventListener("click", function (e) {
e.stopPropagation();
showZoom(content, true);
});
} else {
// Display image
const img = new Image();
img.onload = function () {
const maxWidth = 620;
const maxHeight = 800;
const scale = Math.min(maxWidth / img.width, maxHeight / img.height, 1);
const scale = Math.min(
maxWidth / img.width,
maxHeight / img.height,
1
);
const displayHeight = Math.floor(img.height * scale);
const displayWidth = Math.floor(img.width * scale);
uploadZone.innerHTML = `<img src="${content}" alt="Uploaded Image" style="width: ${displayWidth}px; height: ${displayHeight}px; object-fit: contain; cursor: zoom-in;">`;
uploadContainer.style.height = `${displayHeight + 20}px`;
uploadContainer.style.pointerEvents = 'none';
uploadZone.style.pointerEvents = 'auto';
uploadBtn.style.display = 'block';
resetBtn.style.display = 'block';
uploadContainer.style.pointerEvents = "none";
uploadZone.style.pointerEvents = "auto";
uploadBtn.style.display = "block";
resetBtn.style.display = "block";
const uploadedImg = uploadZone.querySelector('img');
uploadedImg.addEventListener('click', function (e) {
const uploadedImg = uploadZone.querySelector("img");
uploadedImg.addEventListener("click", function (e) {
e.stopPropagation();
showZoom(content, false);
});
@@ -228,13 +295,19 @@
}
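Editor's note: the preview sizing in `img.onload` above reduces to a fit-to-box scale factor. A minimal sketch (`fitScale` is an illustrative name, not a function in the repository), using the same 620×800 bounds as the diff:

```javascript
// Sketch of the fit-to-box scaling used when previewing an image:
// scale down (never up) so the image fits within maxWidth x maxHeight,
// preserving aspect ratio.
function fitScale(width, height, maxWidth = 620, maxHeight = 800) {
  return Math.min(maxWidth / width, maxHeight / height, 1);
}

console.log(fitScale(1240, 800)); // 0.5 (width-limited)
console.log(fitScale(310, 400));  // 1   (already fits; never upscaled)
```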
// Open file picker on container click (ONLY IF EMPTY)
uploadContainer.addEventListener('click', function (e) {
if (uploadContainer.style.pointerEvents !== 'none' && !uploadZone.querySelector('.text-content') && !uploadZone.querySelector('img')) {
uploadContainer.addEventListener("click", function (e) {
if (
!uploadCompleted &&
uploadContainer.style.pointerEvents !== "none" &&
!uploadZone.querySelector(".text-content") &&
!uploadZone.querySelector("img")
) {
fileInput.click();
}
});
fileInput.addEventListener('change', function (e) {
fileInput.addEventListener("change", function (e) {
if (uploadCompleted) return;
const file = e.target.files[0];
if (file) {
const reader = new FileReader();
@@ -246,12 +319,16 @@
});
// Handle paste from clipboard
uploadZone.addEventListener('paste', function (e) {
uploadZone.addEventListener("paste", function (e) {
if (uploadCompleted) {
e.preventDefault();
return;
}
e.preventDefault();
const items = e.clipboardData.items;
for (let item of items) {
if (item.type.startsWith('image/')) {
if (item.type.startsWith("image/")) {
const file = item.getAsFile();
const reader = new FileReader();
reader.onload = function (event) {
@@ -262,20 +339,21 @@
}
}
const text = e.clipboardData.getData('text');
const text = e.clipboardData.getData("text");
if (text) {
displayContent(text, true);
}
});
// Handle drag and drop
uploadZone.addEventListener('drop', handleDrop);
uploadContainer.addEventListener('drop', handleDrop);
uploadZone.addEventListener("drop", handleDrop);
uploadContainer.addEventListener("drop", handleDrop);
function handleDrop(e) {
e.preventDefault();
if (uploadCompleted) return;
const file = e.dataTransfer.files[0];
if (file && file.type.startsWith('image/')) {
if (file && file.type.startsWith("image/")) {
const reader = new FileReader();
reader.onload = function (event) {
displayContent(event.target.result);
@@ -284,22 +362,22 @@
}
}
uploadZone.addEventListener('dragover', function (e) {
uploadZone.addEventListener("dragover", function (e) {
e.preventDefault();
uploadZone.style.borderColor = 'var(--border-hover)';
uploadZone.style.borderColor = "var(--border-hover)";
});
uploadZone.addEventListener('dragleave', function (e) {
uploadZone.style.borderColor = '';
uploadZone.addEventListener("dragleave", function (e) {
uploadZone.style.borderColor = "";
});
uploadContainer.addEventListener('dragover', function (e) {
uploadContainer.addEventListener("dragover", function (e) {
e.preventDefault();
});
uploadZone.setAttribute('tabindex', '0');
uploadZone.setAttribute("tabindex", "0");
window.addEventListener('focus', function () {
window.addEventListener("focus", function () {
uploadZone.focus();
});
@@ -309,28 +387,50 @@
function showZoom(content, isText = false) {
if (isText) {
zoomOverlay.innerHTML = `
<div class="zoom-text-content">${content.replace(/</g, '&lt;').replace(/>/g, '&gt;')}</div>
<div class="zoom-text-content">${content
.replace(/</g, "&lt;")
.replace(/>/g, "&gt;")}</div>
`;
} else {
zoomOverlay.innerHTML = `<img id="zoomImage" src="${content}" alt="Zoomed Image" style="max-width: 95vw; max-height: 95vh; object-fit: contain; box-shadow: 0 0 50px rgba(51, 204, 255, 0.5);">`;
}
zoomOverlay.style.display = 'flex';
zoomOverlay.style.display = "flex";
}
function hideZoom() {
zoomOverlay.style.display = 'none';
zoomOverlay.style.display = "none";
}
zoomOverlay.addEventListener('click', hideZoom);
zoomOverlay.addEventListener("click", hideZoom);
// ESC TO EXIT ZOOM
document.addEventListener('keydown', function (e) {
if (e.key === 'Escape' || e.key === 'Esc') {
document.addEventListener("keydown", function (e) {
if (e.key === "Escape" || e.key === "Esc") {
hideZoom();
}
});
window.addEventListener('resize', function () {
function canTriggerUpload() {
return (
currentContentData &&
window.getComputedStyle(uploadBtn).display !== "none" &&
zoomOverlay.style.display !== "flex"
);
}
// ENTER TO UPLOAD (when content is ready)
document.addEventListener(
"keydown",
function (e) {
if ((e.key === "Enter" || e.code === "NumpadEnter") && canTriggerUpload()) {
e.preventDefault();
uploadBtn.click();
}
},
true
);
window.addEventListener("resize", function () {
if (currentContentData) {
displayContent(currentContentData);
}

View File

@@ -0,0 +1,19 @@
{
"name": "Black Hole Share",
"short_name": "BHS",
"icons": [
{
"src": "/android-chrome-192x192.png",
"sizes": "192x192",
"type": "image/png"
},
{
"src": "/android-chrome-512x512.png",
"sizes": "512x512",
"type": "image/png"
}
],
"theme_color": "#000000",
"background_color": "#000000",
"display": "standalone"
}

285
data/html/stats.html Normal file
View File

@@ -0,0 +1,285 @@
<!DOCTYPE html>
<html lang="en">
<head>
<meta charset="UTF-8" />
<meta name="viewport" content="width=device-width, initial-scale=1.0" />
<title>Black Hole Share - Statistics</title>
<link rel="stylesheet" href="/style.css" />
<style>
.stats-layout {
display: grid;
grid-template-columns: minmax(0, 1fr) minmax(140px, 170px);
gap: 20px;
margin-top: 20px;
align-items: stretch;
}
.stats-grid {
display: grid;
grid-template-columns: repeat(3, minmax(160px, 1fr));
gap: 20px;
}
.stats-request-card {
height: 100%;
display: flex;
flex-direction: column;
justify-content: center;
align-items: center;
}
.stat-card {
background-color: var(--bg-secondary);
border: 2px solid var(--border-color);
border-radius: 12px;
padding: 20px;
text-align: center;
transition: all 0.3s ease;
}
.stat-card:hover {
border-color: var(--border-hover);
box-shadow: 0 4px 15px rgba(0, 255, 153, 0.2);
}
.stat-value {
font-size: 1.5em;
font-weight: bold;
color: var(--accent-cyan);
margin: 10px 0;
}
.stat-label {
color: var(--text-secondary);
font-size: 0.9em;
text-transform: uppercase;
letter-spacing: 1px;
}
.stat-card.highlight .stat-value {
color: var(--accent-green);
}
.recent-activity {
margin-top: 30px;
background-color: var(--bg-secondary);
border: 2px solid var(--border-color);
border-radius: 12px;
padding: 20px;
transition: all 0.3s ease;
}
.recent-activity h2 {
color: var(--accent-cyan);
margin: 0 0 15px 0;
font-size: 1.2em;
}
.recent-activity:hover {
border-color: var(--border-hover);
box-shadow: 0 4px 15px rgba(0, 255, 153, 0.2);
}
.activity-list {
max-height: 260px;
overflow-y: auto;
font-family: "JetBrains Mono", monospace;
font-size: 0.85em;
}
.activity-item {
padding: 8px 0;
border-bottom: 1px solid var(--inactive-gray);
display: grid;
grid-template-columns: 90px minmax(120px, 1fr) minmax(90px, 1fr) minmax(180px, 1fr);
align-items: center;
gap: 10px;
white-space: nowrap;
}
.activity-item:last-child {
border-bottom: none;
}
.activity-action {
padding: 2px 8px;
border-radius: 4px;
font-size: 0.8em;
font-weight: bold;
}
.activity-action.upload {
background-color: rgba(0, 255, 153, 0.2);
color: var(--accent-green);
}
.activity-action.delete {
background-color: rgba(255, 102, 102, 0.2);
color: #ff6666;
}
.activity-time {
color: var(--text-secondary);
white-space: nowrap;
}
.activity-details {
color: var(--text-primary);
display: contents;
}
.activity-mime {
text-align: left;
}
.activity-duration {
text-align: left;
}
.activity-time {
text-align: left;
}
.refresh-btn {
background-color: var(--border-color);
color: var(--bg-tertiary);
border: none;
padding: 10px 20px;
border-radius: 8px;
cursor: pointer;
font-weight: bold;
margin-top: 20px;
transition: all 0.2s ease;
}
.refresh-btn:hover {
background-color: var(--border-hover);
}
.loading {
text-align: center;
color: var(--text-secondary);
padding: 40px;
}
</style>
</head>
<body class="view-page">
<h1><a href="/" class="home-link">Black Hole Share</a> - Statistics</h1>
<div id="statsContent" class="loading">
<p>Loading statistics...</p>
</div>
{{FOOTER}}
<script>
async function loadStats() {
try {
const response = await fetch("/api/stats");
if (!response.ok) {
throw new Error("Failed to load stats");
}
const stats = await response.json();
renderStats(stats);
} catch (error) {
document.getElementById("statsContent").innerHTML = `
<p class="error">Failed to load statistics: ${error.message}</p>
`;
}
}
function formatBytes(bytes) {
if (bytes === 0) return "0 B";
const k = 1024;
const sizes = ["B", "KB", "MB", "GB"];
const i = Math.floor(Math.log(bytes) / Math.log(k));
return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + " " + sizes[i];
}
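Editor's note: the `formatBytes` helper above can be sanity-checked in isolation. Note that its `sizes` array stops at GB, so inputs of 1 TiB or more would index past the array; that appears acceptable given the app's asset size limits.

```javascript
// Standalone copy of formatBytes from stats.html above.
function formatBytes(bytes) {
  if (bytes === 0) return "0 B";
  const k = 1024;
  const sizes = ["B", "KB", "MB", "GB"];
  const i = Math.floor(Math.log(bytes) / Math.log(k));
  return parseFloat((bytes / Math.pow(k, i)).toFixed(2)) + " " + sizes[i];
}

console.log(formatBytes(0));    // "0 B"
console.log(formatBytes(1536)); // "1.5 KB"
console.log(formatBytes(2048)); // "2 KB"
```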
function formatTime(timestamp) {
const date = new Date(timestamp);
return date.toLocaleString("en-GB", {
year: "numeric",
month: "2-digit",
day: "2-digit",
hour: "2-digit",
minute: "2-digit",
second: "2-digit",
hour12: false,
});
}
function renderStats(stats) {
const html = `
<div class="stats-layout">
<div class="stats-grid">
<div class="stat-card highlight">
<div class="stat-label">Active Assets</div>
<div class="stat-value">${stats.active_assets}</div>
</div>
<div class="stat-card">
<div class="stat-label">Total Uploads</div>
<div class="stat-value">${stats.total_uploads}</div>
</div>
<div class="stat-card">
<div class="stat-label">Total Deleted</div>
<div class="stat-value">${stats.total_deleted}</div>
</div>
<div class="stat-card">
<div class="stat-label">Storage Used</div>
<div class="stat-value">${formatBytes(stats.storage_bytes)}</div>
</div>
<div class="stat-card">
<div class="stat-label">Images</div>
<div class="stat-value">${stats.image_count}</div>
</div>
<div class="stat-card">
<div class="stat-label">Text</div>
<div class="stat-value">${stats.text_count}</div>
</div>
</div>
<div class="stat-card stats-request-card">
<div class="stat-label">Total Server Requests</div>
<div class="stat-value">${stats.total_requests}</div>
</div>
</div>
<div class="recent-activity">
<h2>Recent Activity</h2>
<div class="activity-list">
${stats.recent_activity.length === 0
? '<p style="color: var(--text-secondary);">No recent activity</p>'
: stats.recent_activity
.map(
(item) => `
<div class="activity-item">
<span class="activity-action ${item.action}">${item.action
}</span>
<span class="activity-details">
<span class="activity-mime">${item.mime}</span>
<span class="activity-duration">${item.share_duration} min</span>
</span>
<span class="activity-time">${formatTime(item.timestamp)}</span>
</div>
`
)
.join("")
}
</div>
</div>
<button class="refresh-btn" onclick="loadStats()">Refresh</button>
`;
document.getElementById("statsContent").innerHTML = html;
}
loadStats();
// Auto-refresh every 30 seconds
setInterval(loadStats, 30000);
</script>
</body>
</html>

View File

@@ -3,6 +3,8 @@
--bg-primary: #1e1e2e;
--bg-secondary: #1a1a1a;
--bg-tertiary: #1a1a1a;
--bg-glow: rgba(51, 204, 255, 0.08);
--bg-glow-strong: rgba(0, 255, 153, 0.07);
--active-cyan: #33ccff;
--active-green: #00ff99;
--inactive-gray: #595959;
@@ -27,8 +29,13 @@ body {
height: 100vh;
margin: 0 auto;
padding: 20px;
padding-bottom: 140px;
padding-bottom: 80px;
background-color: var(--bg-tertiary);
background-image:
radial-gradient(1200px 800px at 10% -20%, var(--bg-glow), transparent 60%),
radial-gradient(900px 700px at 110% 0%, var(--bg-glow-strong), transparent 55%),
linear-gradient(180deg, rgba(30, 30, 46, 0.35), rgba(26, 26, 26, 0.85));
background-attachment: fixed;
color: var(--text-primary);
display: flex;
flex-direction: column;
@@ -84,6 +91,18 @@ h1 .home-link:hover {
transition: all 0.3s ease;
}
.upload-error {
margin: 12px 0 0 0;
padding: 10px 12px;
border: 1px solid #ff6666;
border-radius: 10px;
background-color: rgba(255, 102, 102, 0.12);
color: #ff6666;
font-size: 0.9em;
text-align: center;
box-shadow: 0 4px 12px rgba(255, 102, 102, 0.15);
}
.duration-container .button-row {
display: flex;
flex-direction: row;
@@ -356,6 +375,18 @@ h1 .home-link:hover {
scrollbar-width: thin;
}
.zoom-text-content.code-content {
background-color: var(--bg-primary);
border-color: var(--border-color);
overflow-x: auto;
white-space: pre;
}
.zoom-text-content.code-content code {
display: block;
white-space: pre;
}
.zoom-text-content::-webkit-scrollbar {
width: 8px;
}
@@ -386,7 +417,7 @@ h1 .home-link:hover {
align-items: center;
justify-content: center;
z-index: 9999;
cursor: zoom-out;
cursor: default;
padding: 20px;
box-sizing: border-box;
}
@@ -434,7 +465,7 @@ h1 .home-link:hover {
/* View page styles */
body.view-page {
width: 860px;
padding-bottom: 140px;
padding-bottom: 80px;
}
.view-container {
@@ -457,6 +488,8 @@ body.view-page {
border-top: 1px solid var(--border-color);
font-size: 0.9em;
color: var(--text-secondary);
width: 100%;
z-index: 10;
}
.powered-by .home-link {
@@ -512,6 +545,55 @@ body.view-page {
text-align: center;
}
/* Error page styles */
.error-page .content-area {
min-height: 320px;
}
.error-content {
display: flex;
flex-direction: column;
align-items: center;
gap: 12px;
text-align: center;
padding: 10px;
}
.error-code {
font-size: 3.2em;
font-weight: bold;
color: var(--accent-cyan);
text-shadow: 0 0 12px rgba(51, 204, 255, 0.4);
}
.error-message {
color: var(--text-secondary);
font-size: 1.05em;
margin: 0;
}
.error-actions {
display: flex;
gap: 12px;
flex-wrap: wrap;
justify-content: center;
align-items: center;
}
.action-btn {
text-decoration: none;
display: inline-flex;
align-items: center;
justify-content: center;
text-align: center;
min-width: 140px;
}
.error-actions .upload-btn,
.error-actions .reset-btn {
flex: 0 0 auto;
}
@keyframes pulse {
0%,
@@ -544,6 +626,18 @@ body.view-page {
border: none;
}
.text-content-view.code-content {
background-color: var(--bg-secondary);
border: 1px solid var(--border-color);
overflow-x: auto;
white-space: pre;
}
.text-content-view.code-content code {
display: block;
white-space: pre;
}
.text-content-view::-webkit-scrollbar {
width: 8px;
}

0
data/html/test/test.txt Normal file
View File

View File

@@ -6,6 +6,8 @@
<meta name="viewport" content="width=device-width, initial-scale=1.0">
<title>Black Hole Share - View</title>
<link rel="stylesheet" href="/style.css">
<link rel="stylesheet"
href="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/styles/atom-one-dark.min.css">
</head>
<body class="view-page">
@@ -17,13 +19,12 @@
</div>
</div>
<footer class="powered-by">
<span>Powered by: <img src="/logo.png" alt="ICSBox" class="footer-logo"></span>
</footer>
{{FOOTER}}
<!-- Zoom overlay -->
<div id="zoomOverlay" class="zoom-overlay" style="display: none;"></div>
<script src="https://cdnjs.cloudflare.com/ajax/libs/highlight.js/11.9.0/highlight.min.js"></script>
<script>
const contentArea = document.getElementById('contentArea');
const zoomOverlay = document.getElementById('zoomOverlay');
@@ -32,6 +33,28 @@
const pathParts = window.location.pathname.split('/');
const assetId = pathParts[pathParts.length - 1];
function escapeHtml(text) {
return text.replace(/&/g, '&amp;')
.replace(/</g, '&lt;')
.replace(/>/g, '&gt;');
}
function isCodeLike(text) {
const lines = text.split('\n');
if (lines.length < 2) {
return false;
}
const indicators = [
/;\s*$/,
/^\s*(fn|function|class|def|public|private|struct|enum|pub\s+struct)\b/,
/^\s*#\[/,
/=>|::|#include|import\s+\w+/,
/\{|\}|\(|\)|\[|\]/,
];
const indicatorHits = indicators.reduce((count, re) => count + (re.test(text) ? 1 : 0), 0);
return indicatorHits >= 2;
}
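Editor's note: the `isCodeLike` heuristic above requires at least two lines and at least two of its five regex indicators to match. Copied verbatim for a quick check:

```javascript
// Standalone copy of the isCodeLike heuristic from view.html above:
// at least two lines, and at least two of the five indicator patterns
// must match somewhere in the text.
function isCodeLike(text) {
  const lines = text.split('\n');
  if (lines.length < 2) {
    return false;
  }
  const indicators = [
    /;\s*$/,
    /^\s*(fn|function|class|def|public|private|struct|enum|pub\s+struct)\b/,
    /^\s*#\[/,
    /=>|::|#include|import\s+\w+/,
    /\{|\}|\(|\)|\[|\]/,
  ];
  const indicatorHits = indicators.reduce((count, re) => count + (re.test(text) ? 1 : 0), 0);
  return indicatorHits >= 2;
}

// A Rust snippet hits the keyword and bracket indicators (two hits):
console.log(isCodeLike('fn main() {\n    println!("hi");\n}')); // true
console.log(isCodeLike("just two lines\nof plain prose"));      // false
```

One observation: since the patterns lack the `m` flag, the `^`- and `$`-anchored indicators only ever test the start and end of the whole text, not each line.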
async function loadContent() {
try {
const response = await fetch(`/api/content/${assetId}`);
@@ -74,12 +97,23 @@
} else if (contentType.startsWith('text/')) {
// Display text
const text = await response.text();
contentArea.innerHTML = `<div class="text-content-view" style="cursor: zoom-in;">${text.replace(/</g, '&lt;').replace(/>/g, '&gt;')}</div>`;
const safeText = escapeHtml(text);
const isCode = isCodeLike(text);
const textHtml = isCode
? `<pre class="text-content-view code-content"><code>${safeText}</code></pre>`
: `<div class="text-content-view">${safeText}</div>`;
contentArea.innerHTML = textHtml;
if (isCode && window.hljs) {
contentArea.querySelectorAll('pre code').forEach((block) => {
window.hljs.highlightElement(block);
});
}
const textContent = contentArea.querySelector('.text-content-view');
textContent.addEventListener('click', function (e) {
e.stopPropagation();
showZoom(text, true);
showZoom(text, true, isCode);
});
} else {
contentArea.innerHTML = '<p class="error">Unsupported content type</p>';
@@ -91,11 +125,19 @@
}
}
function showZoom(content, isText = false) {
function showZoom(content, isText = false, isCode = false) {
if (isText) {
zoomOverlay.innerHTML = `
<div class="zoom-text-content">${content.replace(/</g, '&lt;').replace(/>/g, '&gt;')}</div>
`;
const safeText = escapeHtml(content);
const zoomClass = isCode ? 'zoom-text-content code-content' : 'zoom-text-content';
const zoomHtml = isCode
? `<pre class="${zoomClass}"><code>${safeText}</code></pre>`
: `<div class="${zoomClass}">${safeText}</div>`;
zoomOverlay.innerHTML = zoomHtml;
if (isCode && window.hljs) {
zoomOverlay.querySelectorAll('pre code').forEach((block) => {
window.hljs.highlightElement(block);
});
}
} else {
zoomOverlay.innerHTML = `<img id="zoomImage" src="${content}" alt="Zoomed Content"
style="max-width: 95vw; max-height: 95vh; object-fit: contain; box-shadow: 0 0 50px rgba(51, 204, 255, 0.5);">`;
@@ -107,8 +149,6 @@
zoomOverlay.style.display = 'none';
}
zoomOverlay.addEventListener('click', hideZoom);
document.addEventListener('keydown', function (e) {
if (e.key === 'Escape' || e.key === 'Esc') {
hideZoom();

View File

@@ -7,9 +7,27 @@ services:
volumes:
- ./data:/data
- /etc/localtime:/etc/localtime:ro
labels:
- "traefik.enable=true"
- "traefik.docker.network=vlan250"
- "traefik.http.routers.bhs.rule=Host(`bhs.qosnet.it`)"
- "traefik.http.routers.bhs.entrypoints=websecure"
- "traefik.http.routers.bhs.tls=true"
- "traefik.http.routers.bhs.tls.certresolver=le"
- "traefik.http.services.bhs.loadbalancer.server.port=80"
environment:
- TZ="Europe/Rome"
- TZ=Europe/Rome
- BIND_ADDR=0.0.0.0
- BIND_PORT=80
tty: true
stdin_open: true
ports:
- "8080:80"
networks:
- vlan250
networks:
vlan250:
external: true

View File

@@ -1,9 +1,16 @@
use actix_web::{HttpRequest, HttpResponse, get, post, web};
use base64::{Engine, engine::general_purpose};
use chrono::Utc;
use serde::Deserialize;
use serde_json::json;
use crate::{DATA_STORAGE, logs::log_to_file};
use crate::{MAX_ASSET_DURATION, MIN_ASSET_DURATION};
use crate::{
MAX_ASSET_SIZE_BYTES,
data_mgt::{AppState, Asset},
logs::LogEventType,
};
#[derive(Deserialize, Debug)]
pub struct UploadRequest {
@@ -13,38 +20,181 @@ pub struct UploadRequest {
}
#[post("/api/upload")]
async fn api_upload(req: web::Json<UploadRequest>) -> Result<HttpResponse, actix_web::Error> {
async fn api_upload(
req: HttpRequest,
body: web::Json<UploadRequest>,
app_state: web::Data<AppState>,
) -> Result<HttpResponse, actix_web::Error> {
// Check for rate limiting
let now = Utc::now().timestamp_millis();
let connection_info = req.connection_info();
let uploader_ip = connection_info
.realip_remote_addr()
.map(|s| s.to_string())
.or_else(|| connection_info.peer_addr().map(|value| value.to_string()))
.ok_or_else(|| actix_web::error::ErrorBadRequest("Cannot determine client ip"))?;
// Convert to bytes
let content_bytes = if req.content_type == "text/plain" {
req.content.as_bytes().to_vec() // UTF-8 bytes
let content_bytes = if body.content_type == "text/plain" {
body.content.as_bytes().to_vec()
} else {
// Decode base64 → bytes
general_purpose::STANDARD.decode(&req.content).unwrap()
match general_purpose::STANDARD.decode(&body.content) {
Ok(bytes) => bytes,
Err(_) => return Ok(HttpResponse::BadRequest().json(json!({ "error": "Invalid base64 content" }))),
}
};
let asset = crate::data_mgt::Asset::new(req.duration, req.content_type.clone(), content_bytes);
let id = asset
.save()
.map_err(|e| actix_web::error::ErrorInternalServerError(format!("Failed to save asset: {}", e)))?;
if content_bytes.len() > MAX_ASSET_SIZE_BYTES {
let error = json!({"error": "Asset too large"});
app_state
.log_file
.write_event(LogEventType::Error(error.clone()))
.await
.unwrap_or_else(|e| println!("Failed to log error event: {}", e));
return Ok(HttpResponse::PayloadTooLarge().json(json!({
"error": "Asset too large"
})));
}
let clamped_duration = body.duration.clamp(MIN_ASSET_DURATION, MAX_ASSET_DURATION);
let asset_expiration_time = now + (clamped_duration as i64 * 60 * 1000);
let (allowed, retry_after_ms) = app_state
.connection_tracker
.check(&uploader_ip, asset_expiration_time)
.await;
if !allowed {
let retry_after_seconds = retry_after_ms.map(|ms| ((ms + 999) / 1000).max(1));
let response_body = match retry_after_seconds {
Some(seconds) => json!({ "error": "Upload limit exceeded", "retry_after_seconds": seconds }),
None => json!({ "error": "Upload limit exceeded" }),
};
return Ok(HttpResponse::TooManyRequests().json(response_body));
}
let asset = crate::data_mgt::Asset::new(
clamped_duration,
body.content_type.clone(),
content_bytes,
Some(uploader_ip.clone()),
);
let id = asset.id();
match app_state.assets.add_asset(asset.clone(), &app_state.log_file).await {
Ok(_) => {
app_state
.log_file
.write_event(LogEventType::AssetUploaded(asset.to_value()))
.await
.unwrap_or_else(|e| println!("Failed to log upload event: {}", e));
let response_body = json!({ "link": format!("/bhs/{}", id) });
Ok(HttpResponse::Ok().json(response_body))
}
Err(e) => {
let error = json!({"error": format!("Failed to store asset: {}", e)});
app_state
.log_file
.write_event(LogEventType::Error(error.clone()))
.await
.unwrap_or_else(|e| println!("Failed to log error event: {}", e));
Ok(HttpResponse::InternalServerError().json(error))
}
}
}
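The `Retry-After` arithmetic above converts a wait in milliseconds to whole seconds, rounding up and never reporting less than one second. A minimal standalone sketch (helper name hypothetical, logic copied from the handler):

```rust
// Hypothetical helper mirroring the rounding used in api_upload:
// round milliseconds up to whole seconds, with a floor of 1 second.
fn retry_after_seconds(retry_after_ms: i64) -> i64 {
    ((retry_after_ms + 999) / 1000).max(1)
}

fn main() {
    assert_eq!(retry_after_seconds(1), 1);    // sub-second waits still report 1 s
    assert_eq!(retry_after_seconds(1000), 1); // exact seconds stay as-is
    assert_eq!(retry_after_seconds(1001), 2); // any remainder rounds up
    println!("ok");
}
```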
#[get("/api/content/{id}")]
async fn api_get_asset(
req: HttpRequest,
path: web::Path<String>,
app_state: web::Data<AppState>,
) -> Result<HttpResponse, actix_web::Error> {
app_state
.log_file
.write_event(LogEventType::HttpRequest(req.into()))
.await
.unwrap_or_else(|e| println!("Failed to log HTTP request: {}", e));
match app_state.assets.get_asset(&path.into_inner()).await {
None => Ok(HttpResponse::NotFound().body("Asset not found")),
Some(asset) => Ok(HttpResponse::Ok().content_type(asset.mime()).body(asset.content())),
}
}
#[derive(serde::Serialize)]
struct StatsResponse {
active_assets: usize,
total_uploads: usize,
total_deleted: usize,
storage_bytes: u64,
image_count: usize,
text_count: usize,
total_requests: usize,
recent_activity: Vec<ActivityItem>,
}
#[derive(serde::Serialize)]
struct ActivityItem {
action: String,
mime: String,
share_duration: u32,
timestamp: String,
}
#[get("/api/stats")]
async fn api_stats(app_state: web::Data<AppState>) -> Result<HttpResponse, actix_web::Error> {
let (active_assets, storage_bytes, image_count, text_count) = app_state.assets.stats_summary().await;
let mut total_uploads = 0;
let mut total_deleted = 0;
let mut recent_activity: Vec<ActivityItem> = Vec::new();
let mut request_count: usize = 0;
let log_events = app_state.log_file.read_events().await.unwrap_or_default();
for line in log_events {
match line.event {
LogEventType::HttpRequest(_req) => {
request_count += 1;
}
LogEventType::AssetUploaded(asset) => {
let asset = serde_json::from_value::<Asset>(asset).unwrap_or_default();
total_uploads += 1;
recent_activity.push(ActivityItem {
action: "upload".to_string(),
mime: asset.mime(),
share_duration: asset.share_duration(),
timestamp: line.time,
});
}
LogEventType::AssetDeleted(asset) => {
let asset = serde_json::from_value::<Asset>(asset).unwrap_or_default();
total_deleted += 1;
recent_activity.push(ActivityItem {
action: "delete".to_string(),
mime: asset.mime(),
share_duration: asset.share_duration(),
timestamp: line.time,
});
}
LogEventType::Error(_event) => {}
}
}
// Keep only last 20, most recent first
recent_activity.reverse();
recent_activity.truncate(20);
let response = StatsResponse {
active_assets,
total_uploads,
total_deleted,
storage_bytes,
image_count,
text_count,
total_requests: request_count,
recent_activity,
};
Ok(HttpResponse::Ok().json(response))
}
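Stripped of serde and the web types, the stats aggregation above is a single fold over the event log with a match on the event kind. A std-only sketch with hypothetical names:

```rust
// Simplified stand-in for LogEventType (names hypothetical).
enum Event { Request, Uploaded, Deleted, Error }

// Fold a slice of events into (request, upload, delete) counters,
// mirroring the loop in api_stats; Error events are counted nowhere.
fn summarize(events: &[Event]) -> (usize, usize, usize) {
    let (mut reqs, mut ups, mut dels) = (0, 0, 0);
    for e in events {
        match e {
            Event::Request => reqs += 1,
            Event::Uploaded => ups += 1,
            Event::Deleted => dels += 1,
            Event::Error => {}
        }
    }
    (reqs, ups, dels)
}

fn main() {
    use Event::*;
    assert_eq!(summarize(&[Request, Uploaded, Request, Deleted, Error]), (2, 1, 1));
    println!("ok");
}
```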

View File

@@ -1,21 +1,31 @@
use std::sync::Arc;
use std::{collections::HashMap, fmt::Debug};
use anyhow::Result;
use chrono::{Duration, Utc};
use futures::lock::Mutex;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use crate::MAX_ASSETS;
use crate::logs::LogFile;
use crate::{MAX_UPLOADS_PER_USER, logs::LogEventType};
#[derive(Debug, Serialize, Deserialize, Clone, Default)]
pub struct Asset {
id: String,
share_duration: u32,
created_at: i64,
expires_at: i64,
mime: String,
#[serde(skip)]
content: Vec<u8>,
uploader_ip: Option<String>,
}
#[allow(dead_code)]
impl Asset {
pub fn new(share_duration: u32, mime: String, content: Vec<u8>, uploader_ip: Option<String>) -> Self {
let id = uuid::Uuid::new_v4().to_string();
let created_at = Utc::now().timestamp_millis();
let expires_at = created_at + Duration::minutes(share_duration as i64).num_milliseconds();
@@ -26,51 +36,215 @@ impl Asset {
expires_at,
mime,
content,
uploader_ip,
}
}
pub fn is_expired(&self) -> bool {
Utc::now().timestamp_millis() > self.expires_at
}
pub fn id(&self) -> String {
self.id.clone()
}
pub fn mime(&self) -> String {
self.mime.clone()
}
pub fn content(&self) -> Vec<u8> {
self.content.clone()
}
pub fn share_duration(&self) -> u32 {
self.share_duration
}
pub fn created_at(&self) -> i64 {
self.created_at
}
pub fn expires_at(&self) -> i64 {
self.expires_at
}
pub fn mime_type(&self) -> &str {
&self.mime
}
pub fn size_bytes(&self) -> usize {
self.content.len()
}
pub fn uploader_ip(&self) -> Option<&str> {
self.uploader_ip.as_deref()
}
pub fn to_bytes(&self) -> Result<Vec<u8>> {
let bytes = serde_json::to_vec(self)?;
Ok(bytes)
}
pub fn to_value(&self) -> Value {
serde_json::to_value(self).unwrap_or(Value::Null)
}
}
#[derive(Clone, Debug)]
pub struct AppState {
pub assets: AssetStorage,
pub connection_tracker: RateLimiter,
pub log_file: LogFile,
}
#[derive(Clone, Debug, Default)]
pub struct AssetStorage {
assets: Arc<Mutex<Vec<Asset>>>,
}
#[allow(dead_code)]
impl AssetStorage {
pub fn new() -> Self {
Self {
assets: Arc::new(Mutex::new(Vec::with_capacity(MAX_ASSETS))),
}
}
pub async fn add_asset(&self, asset: Asset, log: &LogFile) -> Result<()> {
let now = chrono::Utc::now();
let mut removed: Vec<Asset> = Vec::new();
{
let mut assets = self.assets.lock().await;
let removed_iter = assets.extract_if(.., |a| a.is_expired());
removed.extend(removed_iter);
if assets.len() >= MAX_ASSETS {
return Err(anyhow::anyhow!("Asset storage full"));
}
println!("[{}] Adding asset: {}", now.to_rfc3339(), asset.id());
assets.push(asset);
}
for asset in removed {
println!("[{}] Removing asset: {}", now.to_rfc3339(), asset.id());
log.write_event(LogEventType::AssetDeleted(asset.to_value())).await?;
}
Ok(())
}
async fn push_asset(&self, asset: Asset) {
let mut assets = self.assets.lock().await;
assets.push(asset);
}
pub async fn remove_expired(&self, log: &LogFile) {
let mut assets = self.assets.lock().await;
let removed_assets = assets.extract_if(.., |asset| asset.is_expired());
for asset in removed_assets {
println!("[{}] Removing asset: {}", chrono::Local::now().to_rfc3339(), asset.id());
log.write_event(LogEventType::AssetDeleted(asset.to_value()))
.await
.unwrap_or_else(|e| println!("Failed to log delete event: {}", e));
}
}
pub async fn active_assets(&self) -> usize {
self.assets.lock().await.len()
}
pub async fn stats_summary(&self) -> (usize, u64, usize, usize) {
let assets = self.assets.lock().await;
let mut active_assets = 0;
let mut storage_bytes: u64 = 0;
let mut image_count = 0;
let mut text_count = 0;
for asset in assets.iter() {
if asset.is_expired() {
continue;
}
active_assets += 1;
storage_bytes += asset.size_bytes() as u64;
if asset.mime().starts_with("image/") {
image_count += 1;
} else if asset.mime().starts_with("text/") {
text_count += 1;
}
}
(active_assets, storage_bytes, image_count, text_count)
}
pub async fn show_assets(&self) {
for asset in self.assets.lock().await.iter() {
println!(
"[{}] Asset ID: {}, Expires At: {}, Mime: {}, Size: {} bytes",
chrono::Local::now().to_rfc3339(),
asset.id(),
asset.expires_at(),
asset.mime(),
asset.size_bytes()
);
}
}
pub async fn get_asset(&self, id: &str) -> Option<Asset> {
let assets = self.assets.lock().await;
for asset in assets.iter().cloned() {
if asset.id() == id {
return Some(asset);
}
}
None
}
pub async fn assets_len(&self) -> usize {
self.assets.lock().await.len()
}
}
#[derive(Clone, Debug, Default)]
pub struct RateLimiter {
pub clients: Arc<Mutex<HashMap<String, Vec<i64>>>>,
}
impl RateLimiter {
pub async fn check(&self, client_ip: &str, asset_exp_time: i64) -> (bool, Option<i64>) {
self.clear_expired().await;
let now = Utc::now().timestamp_millis();
let mut clients = self.clients.lock().await;
let entry = clients.entry(client_ip.to_string()).or_insert_with(Vec::new);
let ret_val = if entry.len() < MAX_UPLOADS_PER_USER {
entry.push(asset_exp_time);
(true, None)
} else {
println!(
"[{}] Rate limit exceeded for IP: {}",
chrono::Local::now().to_rfc3339(),
client_ip
);
let first_to_expire = entry.iter().min().copied().unwrap();
let retry_after_ms = (first_to_expire - now).max(1);
(false, Some(retry_after_ms))
};
ret_val
}
pub async fn clear_expired(&self) {
let mut clients = self.clients.lock().await;
let now = Utc::now().timestamp_millis();
for timestamps in clients.values_mut() {
timestamps.retain(|&timestamp| timestamp > now);
}
}
}
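The policy `RateLimiter::check` implements can be sketched synchronously with only the standard library (names and the constant value are illustrative): each client keeps the expiry timestamps of its live uploads, expired slots are pruned on every check, and a new upload is admitted only while fewer than the per-user cap remain.

```rust
use std::collections::HashMap;

const MAX_UPLOADS_PER_USER: usize = 10; // mirrors the crate-level constant

// Synchronous sketch of RateLimiter (no Arc/Mutex, time passed in explicitly).
struct Limiter {
    clients: HashMap<String, Vec<i64>>, // ip -> expiry timestamps (ms) of live uploads
}

impl Limiter {
    fn new() -> Self {
        Limiter { clients: HashMap::new() }
    }

    // Returns true if the upload is admitted; `now` and `asset_exp` are epoch ms.
    fn check(&mut self, ip: &str, now: i64, asset_exp: i64) -> bool {
        // Prune slots whose asset has already expired.
        for slots in self.clients.values_mut() {
            slots.retain(|&t| t > now);
        }
        let slots = self.clients.entry(ip.to_string()).or_default();
        if slots.len() < MAX_UPLOADS_PER_USER {
            slots.push(asset_exp);
            true
        } else {
            false
        }
    }
}

fn main() {
    let mut l = Limiter::new();
    // Ten uploads, all expiring at t = 100_000, fill the window...
    for _ in 0..10 {
        assert!(l.check("1.2.3.4", 0, 100_000));
    }
    assert!(!l.check("1.2.3.4", 0, 100_000)); // ...so the 11th is rejected,
    assert!(l.check("1.2.3.4", 100_001, 200_000)); // but slots free up once assets expire.
    println!("ok");
}
```

The real implementation differs only in plumbing: the map sits behind an `Arc<Mutex<..>>` and the rejection path also computes the earliest slot expiry to derive `retry_after_ms`.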
pub async fn clear_app_data(app_state: &AppState) -> Result<()> {
app_state.assets.remove_expired(&app_state.log_file).await;
app_state.connection_tracker.clear_expired().await;
Ok(())
}

View File

@@ -1,47 +1,132 @@
use anyhow::Result;
use std::{path::PathBuf, sync::Arc};
use actix_web::HttpRequest;
use serde::{Deserialize, Serialize};
use serde_json::Value;
use tokio::{
fs::{File, OpenOptions, rename},
io::{AsyncReadExt, AsyncWriteExt},
sync::Mutex,
};
#[derive(Debug, Clone)]
pub struct LogFile {
_path: PathBuf,
handle: Arc<Mutex<File>>,
}
impl LogFile {
pub async fn new(path: impl Into<PathBuf>) -> Result<Self> {
let path = path.into();
if LogFile::log_file_exist(&path).await? {
LogFile::log_file_rotate(&path).await?;
}
let handle = OpenOptions::new().create(true).append(true).open(&path).await?;
println!("Log file created at: {}", path.display());
Ok(Self {
_path: path,
handle: Arc::new(Mutex::new(handle)),
})
}
pub async fn read_events(&self) -> Result<Vec<LogEvent>> {
let mut file = File::open(&self._path).await?;
let mut contents = String::new();
file.read_to_string(&mut contents).await?;
let mut events: Vec<LogEvent> = Vec::new();
for line in contents.lines() {
match serde_json::from_str::<LogEvent>(line) {
Ok(event) => events.push(event),
Err(e) => println!("Failed to parse log line: {}: {}", e, line),
}
}
Ok(events)
}
pub async fn write_event(&self, event: LogEventType) -> Result<()> {
let log_event: LogEvent = event.into();
let line = serde_json::to_string(&log_event)? + "\n";
self.handle.lock().await.write_all(line.as_bytes()).await?;
Ok(())
}
async fn log_file_exist(path: impl Into<PathBuf>) -> Result<bool> {
Ok(tokio::fs::metadata(path.into()).await.is_ok())
}
async fn log_file_rotate(path: impl Into<PathBuf>) -> Result<()> {
let path: PathBuf = path.into();
let now = chrono::Utc::now().format("%Y_%m_%d-%H%M%S").to_string();
let Some(dir) = path.parent() else {
return Err(anyhow::anyhow!("Failed to get parent directory for log rotation"));
};
let filename = path.file_name().unwrap_or_else(|| std::ffi::OsStr::new("log.txt"));
let rotated = dir.join(format!("{}_{}", now, filename.to_string_lossy()));
rename(path, rotated).await?;
Ok(())
}
}
#[derive(Debug, Serialize, Deserialize)]
pub struct LogHttpRequest {
pub method: String,
pub path: String,
pub query_string: String,
pub scheme: String,
pub ip: String,
pub real_ip: String,
pub user_agent: String,
}
impl From<HttpRequest> for LogHttpRequest {
fn from(req: HttpRequest) -> Self {
let method = req.method().as_str().to_string();
let uri = req.uri();
let path = uri.path().to_string();
let query_string = uri.query().unwrap_or("-").to_string();
let connection_info = req.connection_info();
let scheme = connection_info.scheme().to_string();
let ip = connection_info.peer_addr().unwrap_or("-").to_string();
let real_ip = connection_info.realip_remote_addr().unwrap_or("-").to_string();
let user_agent = req
.headers()
.get("user-agent")
.and_then(|v| v.to_str().ok())
.unwrap_or("-")
.to_string();
LogHttpRequest {
method,
path,
query_string,
scheme,
ip,
real_ip,
user_agent,
}
}
}
#[derive(Debug, Serialize, Deserialize)]
pub enum LogEventType {
AssetUploaded(Value),
AssetDeleted(Value),
HttpRequest(LogHttpRequest),
Error(Value),
}
#[derive(Debug, Serialize, Deserialize)]
pub struct LogEvent {
pub time: String,
pub event: LogEventType,
}
impl From<LogEventType> for LogEvent {
fn from(event: LogEventType) -> Self {
let time = chrono::Utc::now().to_rfc3339();
LogEvent { time, event }
}
}
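The rotation step in `LogFile::log_file_rotate` renames the existing file to `<timestamp>_<name>` inside the same directory before a fresh log is opened. A std-only sketch of that naming (helper name hypothetical, timestamp assumed to use the `%Y_%m_%d-%H%M%S` format):

```rust
use std::path::{Path, PathBuf};

// Build the rotated path: "<dir>/<timestamp>_<original file name>".
// Returns None if the path has no parent directory or no file name.
fn rotated_name(path: &Path, timestamp: &str) -> Option<PathBuf> {
    let dir = path.parent()?;
    let file = path.file_name()?;
    Some(dir.join(format!("{}_{}", timestamp, file.to_string_lossy())))
}

fn main() {
    let p = Path::new("data/logs/log.txt");
    let r = rotated_name(p, "2026_01_25-111254").unwrap();
    assert_eq!(r, PathBuf::from("data/logs/2026_01_25-111254_log.txt"));
    println!("ok");
}
```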

View File

@@ -1,81 +1,194 @@
mod api;
mod data_mgt;
mod logs;
#[cfg(test)]
mod tests;
use actix_web::{
App, HttpRequest, HttpResponse, HttpServer, get, mime, route,
web::{self},
};
use anyhow::Result;
use mime_guess::from_path;
use serde_json::Value;
use std::{
env, fs,
path::{Path, PathBuf},
sync::LazyLock,
};
pub static HTML_DIR: &str = "data/html/";
pub static LOG_DIR: &str = "data/logs/";
pub static LOG_FILE_NAME: &str = "log.txt";
pub static MIN_ASSET_DURATION: u32 = 1; // in minutes
pub static MAX_ASSET_DURATION: u32 = 60; // in minutes
pub static MAX_ASSETS: usize = 1000;
pub static MAX_ASSET_SIZE_BYTES: usize = 3 * 1024 * 1024; // 3 MB
pub static MAX_UPLOADS_PER_USER: usize = 10;
pub static FOOTER_HTML: LazyLock<String> =
LazyLock::new(|| fs::read_to_string(Path::new(HTML_DIR).join("footer.html")).unwrap_or_default());
pub static HTML_VARS: LazyLock<Vec<(&str, &str)>> = LazyLock::new(|| {
vec![
("{{FOOTER}}", (*FOOTER_HTML).as_str()),
("{{VERSION}}", env!("CARGO_PKG_VERSION")),
]
});
pub static BIND_ADDR: LazyLock<String> = LazyLock::new(|| match env::var("BIND_ADDR") {
Ok(addr) => {
println!("Binding to address: {}", addr);
addr
}
Err(_) => {
println!("Binding to default address: 0.0.0.0");
"0.0.0.0".to_string()
}
});
pub static BIND_PORT: LazyLock<u16> = LazyLock::new(|| match env::var("BIND_PORT") {
Ok(port_str) => {
println!("Binding to port: {}", port_str);
port_str.parse().unwrap_or(8080)
}
Err(_) => {
println!("Binding to default port: 8080");
8080
}
});
pub static STATIC_PAGES: LazyLock<Vec<String>> = LazyLock::new(|| {
fs::read_dir(HTML_DIR)
.map(|entries| {
entries
.filter_map(|entry| entry.ok().and_then(|e| e.file_name().to_str().map(String::from)))
.collect()
})
.unwrap_or_default()
});
use crate::{
api::{api_get_asset, api_stats, api_upload},
data_mgt::AppState,
logs::{LogEventType, LogFile},
};
#[get("/")]
async fn index(req: HttpRequest, app_state: web::Data<AppState>) -> actix_web::Result<HttpResponse> {
let path: PathBuf = PathBuf::from(HTML_DIR.to_string() + "index.html");
app_state
.log_file
.write_event(LogEventType::HttpRequest(req.into()))
.await
.unwrap_or_else(|e| println!("Failed to log HTTP request: {}", e));
get_static_file(path).await
}
#[get("/stats")]
async fn stats(req: HttpRequest, app_state: web::Data<AppState>) -> actix_web::Result<HttpResponse> {
let path: PathBuf = PathBuf::from(HTML_DIR.to_string() + "stats.html");
app_state
.log_file
.write_event(LogEventType::HttpRequest(req.into()))
.await
.unwrap_or_else(|e| println!("Failed to log HTTP request: {}", e));
get_static_file(path).await
}
#[get("/bhs/{id}")]
async fn view_asset(req: HttpRequest, app_state: web::Data<AppState>) -> actix_web::Result<HttpResponse> {
let path: PathBuf = PathBuf::from(HTML_DIR.to_string() + "view.html");
app_state
.log_file
.write_event(LogEventType::HttpRequest(req.into()))
.await
.unwrap_or_else(|e| println!("Failed to log HTTP request: {}", e));
get_static_file(path).await
}
#[route("/{tail:.*}", method = "GET", method = "POST")]
async fn catch_all(
req: HttpRequest,
_payload: Option<web::Json<Value>>,
app_state: web::Data<AppState>,
) -> actix_web::Result<HttpResponse> {
let response = match req.uri().path() {
path if STATIC_PAGES.contains(&path[1..].into()) => {
let file_path = HTML_DIR.to_string() + path;
get_static_file(file_path).await
}
_ => {
let file_path = PathBuf::from(HTML_DIR.to_string() + "error.html");
get_static_file(file_path).await
}
};
app_state
.log_file
.write_event(LogEventType::HttpRequest(req.into()))
.await
.unwrap_or_else(|e| println!("Failed to log HTTP request: {}", e));
response
}
#[tokio::main]
async fn main() -> Result<()> {
let log_file = LogFile::new(format!("{}{}", LOG_DIR, LOG_FILE_NAME))
.await
.expect("Failed to create or open log file");
let app_state = data_mgt::AppState {
assets: data_mgt::AssetStorage::new(),
connection_tracker: data_mgt::RateLimiter::default(),
log_file,
};
println!("Starting server at http://{}:{}/", *BIND_ADDR, *BIND_PORT);
let inner_app_state = app_state.clone();
tokio::spawn(async move {
let mut interval = tokio::time::interval(tokio::time::Duration::from_secs(1));
loop {
interval.tick().await;
if let Err(e) = data_mgt::clear_app_data(&inner_app_state).await {
eprintln!("Error clearing assets: {}", e);
}
}
});
HttpServer::new(move || {
App::new()
.app_data(web::JsonConfig::default().limit(1024 * 1024 * 3)) // 3MB limit
.app_data(web::Data::new(app_state.clone()))
.service(index)
.service(stats)
.service(view_asset)
.service(api_get_asset)
.service(api_upload)
.service(api_stats)
.service(catch_all)
})
.bind((BIND_ADDR.clone(), *BIND_PORT))?
.run()
.await?;
Ok(())
}
pub async fn get_static_file<P: AsRef<Path>>(path: P) -> actix_web::Result<HttpResponse> {
let path = path.as_ref();
let mime = from_path(path).first_or_octet_stream();
// HTML → text + replace
if mime.type_() == mime::TEXT && mime.subtype() == mime::HTML {
let mut html = tokio::fs::read_to_string(path)
.await
.map_err(actix_web::error::ErrorInternalServerError)?;
for (k, v) in HTML_VARS.iter() {
html = html.replace(k, v);
}
return Ok(HttpResponse::Ok().content_type("text/html; charset=utf-8").body(html));
}
let bytes = tokio::fs::read(path)
.await
.map_err(actix_web::error::ErrorInternalServerError)?;
Ok(HttpResponse::Ok().content_type(mime.as_ref()).body(bytes))
}
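`get_static_file` treats HTML specially: the page is read as text and each `{{VAR}}` placeholder from `HTML_VARS` is substituted before the response is built. A minimal sketch of that substitution step (helper name hypothetical):

```rust
// Replace every (placeholder, value) pair in the page, as get_static_file
// does with HTML_VARS before serving an HTML file.
fn render(mut html: String, vars: &[(&str, &str)]) -> String {
    for &(k, v) in vars {
        html = html.replace(k, v);
    }
    html
}

fn main() {
    let page = "<footer>{{FOOTER}} v{{VERSION}}</footer>".to_string();
    let out = render(page, &[("{{FOOTER}}", "bhs"), ("{{VERSION}}", "1.1.0")]);
    assert_eq!(out, "<footer>bhs v1.1.0</footer>");
    println!("ok");
}
```

Because the substitution is plain `str::replace`, a placeholder that appears several times in the page is expanded everywhere, and unknown placeholders pass through untouched.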

src/tests.rs Normal file
View File

@@ -0,0 +1 @@