Comparing 4e53f23e79...dep-hell (10 commits):
4b8693fe68, e0a383042e, eb9070966a, 00adbb2c4a, 3ea9b83ed9, 0c363dd8ae, 4514f578ae, 2dc38b6c95, 6deab0b87e, 81f4654b50

.agents/skills/caveman-compress/README.md (new file, 163 lines)
@@ -0,0 +1,163 @@
<p align="center">
  <img src="https://em-content.zobj.net/source/apple/391/rock_1faa8.png" width="80" />
</p>

<h1 align="center">caveman-compress</h1>

<p align="center">
  <strong>shrink memory file. save token every session.</strong>
</p>

---

A Claude Code skill that compresses your project memory files (`CLAUDE.md`, todos, preferences) into caveman format — so every session loads fewer tokens automatically.

Claude read `CLAUDE.md` on every session start. If file big, cost big. Caveman make file small. Cost go down forever.

## What It Do

```
/caveman:compress CLAUDE.md
```

```
CLAUDE.md          ← compressed (Claude reads this — fewer tokens every session)
CLAUDE.original.md ← human-readable backup (you edit this)
```

Original never lost. You can read and edit `.original.md`. Run skill again to re-compress after edits.

## Benchmarks

Real results on real project files:

| File | Original | Compressed | Saved |
|------|---------:|-----------:|------:|
| `claude-md-preferences.md` | 706 | 285 | **59.6%** |
| `project-notes.md` | 1145 | 535 | **53.3%** |
| `claude-md-project.md` | 1122 | 636 | **43.3%** |
| `todo-list.md` | 627 | 388 | **38.1%** |
| `mixed-with-code.md` | 888 | 560 | **36.9%** |
| **Average** | **898** | **481** | **46%** |

All validations passed ✅ — headings, code blocks, URLs, file paths preserved exactly.

## Before / After

<table>
<tr>
<td width="50%">

### 📄 Original (706 tokens)

> "I strongly prefer TypeScript with strict mode enabled for all new code. Please don't use `any` type unless there's genuinely no way around it, and if you do, leave a comment explaining the reasoning. I find that taking the time to properly type things catches a lot of bugs before they ever make it to runtime."

</td>
<td width="50%">

### 🪨 Caveman (285 tokens)

> "Prefer TypeScript strict mode always. No `any` unless unavoidable — comment why if used. Proper types catch bugs early."

</td>
</tr>
</table>

**Same instructions. 60% fewer tokens. Every. Single. Session.**

## Security

`caveman-compress` is flagged as Snyk High Risk due to subprocess and file I/O patterns detected by static analysis. This is a false positive — see [SECURITY.md](./SECURITY.md) for a full explanation of what the skill does and does not do.

## Install

The compress skill ships with the `caveman` plugin. Install `caveman` once, then use `/caveman:compress`.

If you need local files, the compress skill lives at:

```bash
caveman-compress/
```

**Requires:** Python 3.10+

## Usage

```
/caveman:compress <filepath>
```

Examples:

```
/caveman:compress CLAUDE.md
/caveman:compress docs/preferences.md
/caveman:compress todos.md
```

### What files work

| Type | Compress? |
|------|-----------|
| `.md`, `.txt`, `.rst` | ✅ Yes |
| Extensionless natural language | ✅ Yes |
| `.py`, `.js`, `.ts`, `.json`, `.yaml` | ❌ Skip (code/config) |
| `*.original.md` | ❌ Skip (backup files) |

## How It Work

```
/caveman:compress CLAUDE.md
        ↓
detect file type                  (no tokens)
        ↓
Claude compresses                 (tokens — one call)
        ↓
validate output                   (no tokens)
  checks: headings, code blocks, URLs, file paths, bullets
        ↓
if errors: Claude fixes cherry-picked issues only (tokens — targeted fix)
  does NOT recompress — only patches broken parts
        ↓
retry up to 2 times
        ↓
write compressed → CLAUDE.md
write original   → CLAUDE.original.md
```

Only two things use tokens: initial compression + targeted fix if validation fails. Everything else is local Python.
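The loop above can be sketched in a few lines. The function names here are hypothetical stand-ins for illustration only; the real implementation lives in `scripts/compress.py`:

```python
def pipeline(compress, validate, fix, text, max_retries=2):
    """One compression call, then validate/fix cycles; never a full recompress."""
    out = compress(text)
    for _ in range(max_retries):
        errors = validate(text, out)
        if not errors:
            return out
        out = fix(text, out, errors)  # targeted patch of the listed errors only
    raise RuntimeError("validation still failing after retries")

# Toy stand-ins: "compression" drops the word "the"; validation always passes.
result = pipeline(
    compress=lambda t: t.replace("the ", ""),
    validate=lambda orig, comp: [],
    fix=lambda orig, comp, errs: comp,
    text="run the tests before push",
)
print(result)  # → "run tests before push"
```

The key property is that `fix` only patches reported errors; the expensive `compress` call happens exactly once.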
## What Is Preserved

Caveman compress natural language. It never touch:

- Code blocks (` ``` ` fenced or indented)
- Inline code (`` `backtick content` ``)
- URLs and links
- File paths (`/src/components/...`)
- Commands (`npm install`, `git commit`)
- Technical terms, library names, API names
- Headings (exact text preserved)
- Tables (structure preserved, cell text compressed)
- Dates, version numbers, numeric values

## Why This Matter

`CLAUDE.md` loads on **every session start**. A 1000-token project memory file costs tokens every single time you open a project. Over 100 sessions that's 100,000 tokens of overhead — just for context you already wrote.
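The overhead math is simple enough to check directly, using the numbers from the paragraph above:

```python
tokens_per_load = 1000   # size of the memory file
sessions = 100
overhead = tokens_per_load * sessions
print(overhead)          # → 100000 tokens loaded before compression

compression = 0.46       # average savings measured in the benchmarks
saved = int(overhead * compression)
print(saved)             # → 46000 tokens saved over those 100 sessions
```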
Caveman cut that by ~46% on average. Same instructions. Same accuracy. Less waste.

```
┌────────────────────────────────────────────┐
│ TOKEN SAVINGS PER FILE   █████        46%  │
│ SESSIONS THAT BENEFIT    ██████████  100%  │
│ INFORMATION PRESERVED    ██████████  100%  │
│ SETUP TIME               █             1x  │
└────────────────────────────────────────────┘
```

## Part of Caveman

This skill is part of the [caveman](https://github.com/JuliusBrussee/caveman) toolkit — making Claude use fewer tokens without losing accuracy.

- **caveman** — make Claude *speak* like caveman (cuts response tokens ~65%)
- **caveman-compress** — make Claude *read* less (cuts context tokens ~46%)
.agents/skills/caveman-compress/SECURITY.md (new file, 31 lines)
@@ -0,0 +1,31 @@
# Security

## Snyk High Risk Rating

`caveman-compress` receives a Snyk High Risk rating due to static analysis heuristics. This document explains what the skill does and does not do.

### What triggers the rating

1. **subprocess usage**: The skill calls the `claude` CLI via `subprocess.run()` as a fallback when `ANTHROPIC_API_KEY` is not set. The subprocess call uses a fixed argument list — no shell interpolation occurs. User file content is passed via stdin, not as a shell argument.

2. **File read/write**: The skill reads the file the user explicitly points it at, compresses it, and writes the result back to the same path. A `.original.md` backup is saved alongside it. No files outside the user-specified path are read or written.
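A minimal sketch of that fixed-argument pattern, with `cat` standing in for the `claude` binary so it runs anywhere:

```python
import subprocess

def call_cli(prompt: str, argv: tuple = ("cat",)) -> str:
    # Fixed argument list and shell=False (the default): the prompt travels
    # over stdin, so file content is never interpolated into a command string.
    result = subprocess.run(
        list(argv),
        input=prompt,
        text=True,
        capture_output=True,
        check=True,
    )
    return result.stdout

print(call_cli("hello; rm -rf /"))  # printed back verbatim, never executed
```

Because there is no shell in the loop, metacharacters in the file content are inert bytes on stdin.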
### What the skill does NOT do

- Does not execute user file content as code
- Does not make network requests except to Anthropic's API (via SDK or CLI)
- Does not access files outside the path the user provides
- Does not use `shell=True` or string interpolation in subprocess calls
- Does not collect or transmit any data beyond the file being compressed

### Auth behavior

If `ANTHROPIC_API_KEY` is set, the skill uses the Anthropic Python SDK directly (no subprocess). If not set, it falls back to the `claude` CLI, which uses the user's existing Claude desktop authentication.

### File size limit

Files larger than 500KB are rejected before any API call is made.

### Reporting a vulnerability

If you believe you've found a genuine security issue, please open a GitHub issue with the label `security`.
.agents/skills/caveman-compress/SKILL.md (new file, 111 lines)
@@ -0,0 +1,111 @@
---
name: caveman-compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. The compression scripts live in `caveman-compress/scripts/` (adjacent to this SKILL.md). If the path is not immediately available, search for `caveman-compress/scripts/__main__.py`.

2. Run:

       cd caveman-compress && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user

## Compression Rules

### Remove
- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"
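A toy regex illustration of the removal rules above. Note this is not the skill's actual mechanism (the skill applies these rules through a Claude prompt); it only shows the intent:

```python
import re

FILLER = ("just", "really", "basically", "actually", "simply", "essentially", "generally")
ARTICLES = ("a", "an", "the")

def toy_remove(text: str) -> str:
    # Drop articles and filler words, then collapse leftover whitespace.
    pattern = r"\b(" + "|".join(FILLER + ARTICLES) + r")\b"
    text = re.sub(pattern, "", text, flags=re.IGNORECASE)
    return re.sub(r"\s{2,}", " ", text).strip()

print(toy_remove("Just make sure to really run the tests"))  # → "make sure to run tests"
```

A real LLM pass also rewrites phrasing and merges redundant bullets, which word-level deletion cannot do.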
### Preserve EXACTLY (never modify)
- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure
- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress
- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern

CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code

## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
.agents/skills/caveman-compress/scripts/__init__.py (new file, 9 lines)
@@ -0,0 +1,9 @@
"""Caveman compress scripts.

This package provides tools to compress natural language markdown files
into caveman format to save input tokens.
"""

__all__ = ["cli", "compress", "detect", "validate"]

__version__ = "1.0.0"
.agents/skills/caveman-compress/scripts/__main__.py (new file, 3 lines)
@@ -0,0 +1,3 @@
from .cli import main

main()
.agents/skills/caveman-compress/scripts/benchmark.py (new file, 78 lines)
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken
    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
.agents/skills/caveman-compress/scripts/cli.py (new file, 73 lines)
@@ -0,0 +1,73 @@
#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)
    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original:   {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
.agents/skills/caveman-compress/scripts/compress.py (new file, 227 lines)
@@ -0,0 +1,227 @@
#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

from .detect import should_compress
from .validate import validate

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)

MAX_RETRIES = 2


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. Override is
    # intentional: the user must rename the file if the heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️ Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
.agents/skills/caveman-compress/scripts/detect.py (new file, 121 lines)
@@ -0,0 +1,121 @@
#!/usr/bin/env python3
"""Detect whether a file is natural language (compressible) or code/config (skip)."""

import json
import re
from pathlib import Path

# Extensions that are natural language and compressible
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}

# Extensions that are code/config and should be skipped
SKIP_EXTENSIONS = {
    ".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
    ".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
    ".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
    ".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
    ".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
}

# Patterns that indicate a line is code
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
    re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
    re.compile(r"^\s*[\}\]\);]+\s*$"),  # closing braces/brackets
    re.compile(r"^\s*@\w+"),  # decorators/annotations
    re.compile(r'^\s*"[^"]+"\s*:\s*'),  # JSON-like key-value
    re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"),  # assignment with literal
]


def _is_code_line(line: str) -> bool:
    """Check if a line looks like code."""
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (like TODO) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]

        if _is_json_content(text[:10000]):
            return "config"
        if _is_yaml_content(lines):
            return "config"

        code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
        non_empty = sum(1 for l in lines if l.strip())
        if non_empty > 0 and code_lines / non_empty > 0.4:
            return "code"

        return "natural_language"

    return "unknown"


def should_compress(filepath: Path) -> bool:
    """Return True if the file is natural language and should be compressed."""
    if not filepath.is_file():
        return False
    # Skip backup files
    if filepath.name.endswith(".original.md"):
        return False
    return detect_file_type(filepath) == "natural_language"


if __name__ == "__main__":
    import sys

    if len(sys.argv) < 2:
        print("Usage: python detect.py <file1> [file2] ...")
        sys.exit(1)

    for path_str in sys.argv[1:]:
        p = Path(path_str).resolve()
        file_type = detect_file_type(p)
        compress = should_compress(p)
        print(f"  {p.name:30s} type={file_type:20s} compress={compress}")
.agents/skills/caveman-compress/scripts/validate.py (new file, 189 lines)
@@ -0,0 +1,189 @@
|
||||
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# crude but effective path detection
# Requires either a path prefix (./ ../ / or drive letter) or a slash/backslash within the match
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: closing
    fence must use same char and be at least as long as opening). Supports
    nested fences (e.g. an outer 4-backtick block wrapping inner 3-backtick
    content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
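The code-block extractor above follows the CommonMark close-fence rule: same fence character, at least the opening fence's length, and no trailing info string. A standalone check of just that rule (the `closes` helper is illustrative, not part of validate.py):

```python
import re

# Same fence pattern as validate.py's FENCE_OPEN_REGEX.
FENCE = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")


def closes(open_line: str, candidate: str) -> bool:
    """True when `candidate` is a valid closing fence for `open_line`."""
    o, c = FENCE.match(open_line), FENCE.match(candidate)
    if not (o and c):
        return False
    return (
        c.group(2)[0] == o.group(2)[0]          # same fence character
        and len(c.group(2)) >= len(o.group(2))  # at least as long as the opener
        and c.group(3).strip() == ""            # no trailing info string
    )
```

This is why a 4-backtick fence can wrap 3-backtick content: the inner ``` never satisfies the length condition against the outer opener.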
59
.agents/skills/caveman-help/SKILL.md
Normal file
@@ -0,0 +1,59 @@
---
name: caveman-help
description: >
  Quick-reference card for all caveman modes, skills, and commands.
  One-shot display, not a persistent mode. Trigger: /caveman-help,
  "caveman help", "what caveman commands", "how do I use caveman".
---

# Caveman Help

Display this reference card when invoked. One-shot — do NOT change mode, write flag files, or persist anything. Output in caveman style.

## Modes

| Mode | Trigger | What change |
|------|---------|-------------|
| **Lite** | `/caveman lite` | Drop filler. Keep sentence structure. |
| **Full** | `/caveman` | Drop articles, filler, pleasantries, hedging. Fragments OK. Default. |
| **Ultra** | `/caveman ultra` | Extreme compression. Bare fragments. Tables over prose. |
| **Wenyan-Lite** | `/caveman wenyan-lite` | Classical Chinese style, light compression. |
| **Wenyan-Full** | `/caveman wenyan` | Full 文言文. Maximum classical terseness. |
| **Wenyan-Ultra** | `/caveman wenyan-ultra` | Extreme. Ancient scholar on a budget. |

Mode stick until changed or session end.

## Skills

| Skill | Trigger | What it do |
|-------|---------|-----------|
| **caveman-commit** | `/caveman-commit` | Terse commit messages. Conventional Commits. ≤50 char subject. |
| **caveman-review** | `/caveman-review` | One-line PR comments: `L42: bug: user null. Add guard.` |
| **caveman-compress** | `/caveman:compress <file>` | Compress .md files to caveman prose. Saves ~46% input tokens. |
| **caveman-help** | `/caveman-help` | This card. |

## Deactivate

Say "stop caveman" or "normal mode". Resume anytime with `/caveman`.

## Configure Default Mode

Default mode = `full`. Change it:

**Environment variable** (highest priority):
```bash
export CAVEMAN_DEFAULT_MODE=ultra
```

**Config file** (`~/.config/caveman/config.json`):
```json
{ "defaultMode": "lite" }
```

Set `"off"` to disable auto-activation on session start. User can still activate manually with `/caveman`.

Resolution: env var > config file > `full`.

## More

Full docs: https://github.com/JuliusBrussee/caveman
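The resolution order above (env var > config file > `full`) can be sketched in a few lines. The variable name, config path, and key follow the card; the `resolve_default_mode` function itself is hypothetical, not part of the repo:

```python
import json
import os
from pathlib import Path

VALID_MODES = {"lite", "full", "ultra", "wenyan-lite", "wenyan-full", "wenyan-ultra", "off"}


def resolve_default_mode(env=None, config_path=Path.home() / ".config/caveman/config.json"):
    env = os.environ if env is None else env
    # 1. Environment variable wins.
    mode = env.get("CAVEMAN_DEFAULT_MODE")
    if mode in VALID_MODES:
        return mode
    # 2. Then the config file's "defaultMode" key.
    try:
        mode = json.loads(config_path.read_text()).get("defaultMode")
        if mode in VALID_MODES:
            return mode
    except (OSError, json.JSONDecodeError):
        pass
    # 3. Fall back to "full".
    return "full"
```

Unknown values fall through each step, so a typo'd mode degrades to `full` instead of erroring.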
67
.agents/skills/caveman/SKILL.md
Normal file
@@ -0,0 +1,67 @@
---
name: caveman
description: >
  Ultra-compressed communication mode. Cuts token usage ~75% by speaking like caveman
  while keeping full technical accuracy. Supports intensity levels: lite, full (default), ultra,
  wenyan-lite, wenyan-full, wenyan-ultra.
  Use when user says "caveman mode", "talk like caveman", "use caveman", "less tokens",
  "be brief", or invokes /caveman. Also auto-triggers when token efficiency is requested.
---

Respond terse like smart caveman. All technical substance stay. Only fluff die.

## Persistence

ACTIVE EVERY RESPONSE. No revert after many turns. No filler drift. Still active if unsure. Off only: "stop caveman" / "normal mode".

Default: **full**. Switch: `/caveman lite|full|ultra`.

## Rules

Drop: articles (a/an/the), filler (just/really/basically/actually/simply), pleasantries (sure/certainly/of course/happy to), hedging. Fragments OK. Short synonyms (big not extensive, fix not "implement a solution for"). Technical terms exact. Code blocks unchanged. Errors quoted exact.

Pattern: `[thing] [action] [reason]. [next step].`

Not: "Sure! I'd be happy to help you with that. The issue you're experiencing is likely caused by..."
Yes: "Bug in auth middleware. Token expiry check use `<` not `<=`. Fix:"

## Intensity

| Level | What change |
|-------|------------|
| **lite** | No filler/hedging. Keep articles + full sentences. Professional but tight |
| **full** | Drop articles, fragments OK, short synonyms. Classic caveman |
| **ultra** | Abbreviate (DB/auth/config/req/res/fn/impl), strip conjunctions, arrows for causality (X → Y), one word when one word enough |
| **wenyan-lite** | Semi-classical. Drop filler/hedging but keep grammar structure, classical register |
| **wenyan-full** | Maximum classical terseness. Fully 文言文. 80-90% character reduction. Classical sentence patterns, verbs precede objects, subjects often omitted, classical particles (之/乃/為/其) |
| **wenyan-ultra** | Extreme abbreviation while keeping classical Chinese feel. Maximum compression, ultra terse |

Example — "Why React component re-render?"
- lite: "Your component re-renders because you create a new object reference each render. Wrap it in `useMemo`."
- full: "New object ref each render. Inline object prop = new ref = re-render. Wrap in `useMemo`."
- ultra: "Inline obj prop → new ref → re-render. `useMemo`."
- wenyan-lite: "組件頻重繪,以每繪新生對象參照故。以 useMemo 包之。"
- wenyan-full: "物出新參照,致重繪。useMemo 包之。"
- wenyan-ultra: "新參照→重繪。useMemo 包。"

Example — "Explain database connection pooling."
- lite: "Connection pooling reuses open connections instead of creating new ones per request. Avoids repeated handshake overhead."
- full: "Pool reuse open DB connections. No new connection per request. Skip handshake overhead."
- ultra: "Pool = reuse DB conn. Skip handshake → fast under load."
- wenyan-full: "池reuse open connection。不每req新開。skip handshake overhead。"
- wenyan-ultra: "池reuse conn。skip handshake → fast。"

## Auto-Clarity

Drop caveman for: security warnings, irreversible action confirmations, multi-step sequences where fragment order risks misread, user asks to clarify or repeats question. Resume caveman after clear part done.

Example — destructive op:
> **Warning:** This will permanently delete the `users` table and all its rows, and cannot be undone.
> ```sql
> DROP TABLE users;
> ```
> Caveman resume. Verify backup exist first.

## Boundaries

Code/commits/PRs: write normal. "stop caveman" or "normal mode": revert. Level persist until changed or session end.
111
.agents/skills/compress/SKILL.md
Normal file
@@ -0,0 +1,111 @@
---
name: compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. This SKILL.md lives alongside `scripts/` in the same directory. Find that directory.

2. Run:

       cd <directory_containing_this_SKILL.md> && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user

## Compression Rules

### Remove
- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"

### Preserve EXACTLY (never modify)
- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure
- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress
- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern

CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code

## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
9
.agents/skills/compress/scripts/__init__.py
Normal file
@@ -0,0 +1,9 @@
"""Caveman compress scripts.

This package provides tools to compress natural language markdown files
into caveman format to save input tokens.
"""

__all__ = ["cli", "compress", "detect", "validate"]

__version__ = "1.0.0"
3
.agents/skills/compress/scripts/__main__.py
Normal file
@@ -0,0 +1,3 @@
from .cli import main

main()
78
.agents/skills/compress/scripts/benchmark.py
Normal file
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken
    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
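The saved-percent formula in `benchmark_pair` is worth a sanity check against the README's average row (898 → 481 tokens, reported as 46%). A standalone copy of just that formula:

```python
def saved_percent(orig_tokens: int, comp_tokens: int) -> float:
    # Same expression as benchmark_pair, including the zero-division guard.
    return 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
```

898 → 481 gives 46.4%, matching the README's rounded "46%" average.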
73
.agents/skills/compress/scripts/cli.py
Normal file
@@ -0,0 +1,73 @@
#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)

    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original: {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
227
.agents/skills/compress/scripts/compress.py
Normal file
@@ -0,0 +1,227 @@
#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


from .detect import should_compress
from .validate import validate

MAX_RETRIES = 2


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. Override is
    # intentional: the user must rename the file if the heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️  Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
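The separator normalization inside `is_sensitive_path` is what lets `api-key.md`, `api_key.md`, and `API.KEY.md` all hit the same `apikey` token. A standalone check of just that name-token step, with a trimmed token list for illustration:

```python
import re

# Trimmed subset of compress.py's SENSITIVE_NAME_TOKENS.
TOKENS = ("secret", "credential", "password", "apikey", "token")


def name_is_sensitive(name: str) -> bool:
    # Strip _, -, whitespace, and dots so spelling variants collapse together.
    collapsed = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in collapsed for tok in TOKENS)
```

Substring matching is deliberately aggressive: a false positive only costs a rename, while a false negative ships a secret to a third-party API.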
121
.agents/skills/compress/scripts/detect.py
Normal file
@@ -0,0 +1,121 @@
#!/usr/bin/env python3
"""Detect whether a file is natural language (compressible) or code/config (skip)."""

import json
import re
from pathlib import Path

# Extensions that are natural language and compressible
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}

# Extensions that are code/config and should be skipped
SKIP_EXTENSIONS = {
    ".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
    ".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
    ".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
    ".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
    ".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
}

# Patterns that indicate a line is code
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
    re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
    re.compile(r"^\s*[\}\]\);]+\s*$"),  # closing braces/brackets
    re.compile(r"^\s*@\w+"),  # decorators/annotations
    re.compile(r'^\s*"[^"]+"\s*:\s*'),  # JSON-like key-value
    re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"),  # assignment with literal
]


def _is_code_line(line: str) -> bool:
    """Check if a line looks like code."""
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (like CLAUDE.md, TODO) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]
|
||||
|
||||
if _is_json_content(text[:10000]):
|
||||
return "config"
|
||||
if _is_yaml_content(lines):
|
||||
return "config"
|
||||
|
||||
code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
|
||||
non_empty = sum(1 for l in lines if l.strip())
|
||||
if non_empty > 0 and code_lines / non_empty > 0.4:
|
||||
return "code"
|
||||
|
||||
return "natural_language"
|
||||
|
||||
return "unknown"
|
||||
|
||||
|
||||
def should_compress(filepath: Path) -> bool:
|
||||
"""Return True if the file is natural language and should be compressed."""
|
||||
if not filepath.is_file():
|
||||
return False
|
||||
# Skip backup files
|
||||
if filepath.name.endswith(".original.md"):
|
||||
return False
|
||||
return detect_file_type(filepath) == "natural_language"
|
||||
|
||||
|
||||
if __name__ == "__main__":
|
||||
import sys
|
||||
|
||||
if len(sys.argv) < 2:
|
||||
print("Usage: python detect.py <file1> [file2] ...")
|
||||
sys.exit(1)
|
||||
|
||||
for path_str in sys.argv[1:]:
|
||||
p = Path(path_str).resolve()
|
||||
file_type = detect_file_type(p)
|
||||
compress = should_compress(p)
|
||||
print(f" {p.name:30s} type={file_type:20s} compress={compress}")
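The code-ratio heuristic in `detect_file_type` can be exercised standalone. This sketch re-declares two of the patterns from the `CODE_PATTERNS` list above (abbreviated for the demo; the 0.4 threshold matches the script):

```python
import re

# Two patterns abbreviated from the script's CODE_PATTERNS list
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
]

def looks_like_code(text: str, threshold: float = 0.4) -> bool:
    """Same ratio test detect_file_type applies to extensionless files:
    classify as code when >40% of non-empty lines match a code pattern."""
    lines = [l for l in text.splitlines() if l.strip()]
    if not lines:
        return False
    hits = sum(1 for l in lines if any(p.match(l) for p in CODE_PATTERNS))
    return hits / len(lines) > threshold

print(looks_like_code("import os\ndef main():\n    pass"))  # → True
```

Prose with the occasional inline identifier stays under the threshold, which is why memory files full of sentences classify as natural language even when they mention code.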
189
.agents/skills/compress/scripts/validate.py
Normal file
@@ -0,0 +1,189 @@
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# Crude but effective path detection: requires either a path prefix
# (./ ../ / or a drive letter) or a slash/backslash within the match.
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: the closing
    fence must use the same character and be at least as long as the opening
    one). Supports nested fences (e.g. an outer 4-backtick block wrapping
    inner 3-backtick content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
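The fence-length rules in `extract_code_blocks` are subtle, so here is a minimal self-contained check of the nested-fence behavior (the extractor is re-stated from this file in condensed form):

```python
import re

FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")

def extract_code_blocks(text):
    """Condensed restatement of the extractor above: a fence closes only on
    the same character, at least the opening length, with no info string."""
    blocks, lines, i = [], text.split("\n"), 0
    while i < len(lines):
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        char, length = m.group(2)[0], len(m.group(2))
        block, closed = [lines[i]], False
        i += 1
        while i < len(lines):
            c = FENCE_OPEN_REGEX.match(lines[i])
            block.append(lines[i])
            i += 1
            if c and c.group(2)[0] == char and len(c.group(2)) >= length and c.group(3).strip() == "":
                closed = True
                break
        if closed:
            blocks.append("\n".join(block))
    return blocks

# An outer 4-backtick fence swallows an inner 3-backtick pair whole:
nested = "````md\n```python\nx = 1\n```\n````\nafter"
blocks = extract_code_blocks(nested)
```

`blocks` ends up with a single block whose body still contains the inner 3-backtick fence, which is exactly what lets the validator compare documentation-about-markdown byte for byte.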
1
.claude/skills/caveman
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman

1
.claude/skills/caveman-compress
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman-compress

1
.claude/skills/caveman-help
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman-help

1
.claude/skills/compress
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/compress
@@ -36,24 +36,31 @@ const b = 2;
// const a = 1, b = 2; // WRONG

Types and Interfaces
Prefer Interfaces Over Type Aliases
Prefer Type Aliases Over Interfaces

// Good: interface for object shapes
interface User {
  id: string;
  name: string;
  email?: string;
}

// Avoid: type alias for object shapes
// Good: type alias for object shapes
type User = {
  id: string;
  name: string;
  email?: string;
};

// Type aliases OK for unions, intersections, mapped types
// Avoid: interface for object shapes
// interface User {
//   id: string;
//   name: string;
// }

// Type aliases work for everything: objects, unions, intersections, mapped types
type Status = 'active' | 'inactive';
type Combined = TypeA & TypeB;
type Handler = (event: Event) => void;

// Benefits of types over interfaces:
// 1. Consistent syntax for all type definitions
// 2. Cannot be merged/extended unexpectedly (no declaration merging)
// 3. Better for union types and computed properties
// 4. Works with utility types more naturally

Type Inference
20050
package-lock.json
generated
Normal file
File diff suppressed because it is too large
@@ -86,6 +86,7 @@
    "zustand": "^5.0.11"
  },
  "devDependencies": {
    "@eslint/js": "^10.0.0",
    "@graphql-codegen/cli": "^6.1.2",
    "@graphql-codegen/typescript": "^5.0.8",
    "@graphql-codegen/typescript-operations": "^5.0.8",
@@ -110,15 +111,16 @@
    "@testing-library/react": "^16.3.2",
    "@testing-library/user-event": "^14.6.1",
    "@types/ellipsize": "^0.1.3",
    "@types/html-to-text": "^9.0.4",
    "@types/jest": "^30.0.0",
    "@types/lodash": "^4.17.24",
    "@types/node": "^25.6.0",
    "@types/prop-types": "^15.7.15",
    "@types/react": "^19.2.14",
    "@types/react-dom": "^19.2.3",
    "@types/react-redux": "^7.1.34",
    "@types/react-table": "^7.7.20",
    "autoprefixer": "^10.4.27",
    "docdash": "^2.0.2",
    "@eslint/js": "^10.0.0",
    "eslint": "^10.0.2",
    "eslint-config-prettier": "^10.1.8",
    "eslint-plugin-css-modules": "^2.12.0",
211
plans/import-directory-status.md
Normal file
@@ -0,0 +1,211 @@
# Implementation Plan: Directory Status Check for Import.tsx

## Overview

Add functionality to `Import.tsx` that checks whether the required directories (`comics` and `userdata`) exist before allowing the import process to start. If either directory is missing, display a warning banner to the user and disable the import functionality.

## API Endpoint

- **Endpoint**: `GET /api/library/getDirectoryStatus`
- **Response Structure**:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

## Implementation Details

### 1. Add Directory Status Type

In [`Import.tsx`](src/client/components/Import/Import.tsx:1), add a type definition for the directory status response:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

### 2. Create useQuery Hook for Directory Status

Use `@tanstack/react-query` (already imported) to fetch the directory status on component mount:

```typescript
const { data: directoryStatus, isLoading: isCheckingDirectories, error: directoryError } = useQuery({
  queryKey: ['directoryStatus'],
  queryFn: async (): Promise<DirectoryStatus> => {
    const response = await axios.get('http://localhost:3000/api/library/getDirectoryStatus');
    return response.data;
  },
  refetchOnWindowFocus: false,
  staleTime: 30000, // Cache for 30 seconds
});
```

### 3. Derive Missing Directories State

Compute which directories are missing from the query result:

```typescript
const missingDirectories = useMemo(() => {
  if (!directoryStatus) return [];
  const missing: string[] = [];
  if (!directoryStatus.comics?.exists) missing.push('comics');
  if (!directoryStatus.userdata?.exists) missing.push('userdata');
  return missing;
}, [directoryStatus]);

const hasAllDirectories = missingDirectories.length === 0;
```

### 4. Create Warning Banner Component

Add a warning banner that displays when directories are missing, positioned above the import button. This uses the same styling patterns as the existing error banner:

```tsx
{/* Directory Status Warning */}
{!isCheckingDirectories && missingDirectories.length > 0 && (
  <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-amber-500 bg-amber-50 dark:bg-amber-900/20 p-4">
    <div className="flex items-start gap-3">
      <span className="w-6 h-6 text-amber-600 dark:text-amber-400 mt-0.5">
        <i className="h-6 w-6 icon-[solar--folder-error-bold]"></i>
      </span>
      <div className="flex-1">
        <p className="font-semibold text-amber-800 dark:text-amber-300">
          Required Directories Missing
        </p>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-1">
          The following directories do not exist and must be created before importing:
        </p>
        <ul className="list-disc list-inside text-sm text-amber-700 dark:text-amber-400 mt-2">
          {missingDirectories.map((dir) => (
            <li key={dir}>
              <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">{dir}</code>
            </li>
          ))}
        </ul>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-2">
          Please ensure these directories are mounted correctly in your Docker configuration.
        </p>
      </div>
    </div>
  </div>
)}
```

### 5. Disable Import Button When Directories Missing

Modify the button's `disabled` prop and click handler:

```tsx
<button
  className="..."
  onClick={handleForceReImport}
  disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
  title={!hasAllDirectories
    ? "Cannot import: Required directories are missing"
    : "Re-import all files to fix Elasticsearch indexing issues"}
>
```

### 6. Update handleForceReImport Guard

Add an early return in the handler for missing directories:

```typescript
const handleForceReImport = async () => {
  setImportError(null);

  // Check for missing directories
  if (!hasAllDirectories) {
    setImportError(
      `Cannot start import: Required directories are missing (${missingDirectories.join(', ')}). Please check your Docker volume configuration.`
    );
    return;
  }

  // ... existing logic
};
```

## File Changes Summary

| File | Changes |
|------|---------|
| [`src/client/components/Import/Import.tsx`](src/client/components/Import/Import.tsx) | Add useQuery for directory status, warning banner UI, disable button logic |
| [`src/client/components/Import/Import.test.tsx`](src/client/components/Import/Import.test.tsx) | Add tests for directory status scenarios |

## Test Cases to Add

### Import.test.tsx Updates

1. **Should show warning banner when comics directory is missing**
2. **Should show warning banner when userdata directory is missing**
3. **Should show warning banner when both directories are missing**
4. **Should disable import button when directories are missing**
5. **Should enable import button when all directories exist**
6. **Should handle directory status API error gracefully**

Example test structure:

```typescript
describe('Import Component - Directory Status', () => {
  beforeEach(() => {
    jest.clearAllMocks();
    // Mock successful directory status by default
    (axios.get as jest.Mock) = jest.fn().mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });
  });

  test('should show warning when comics directory is missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
      expect(screen.getByText('comics')).toBeInTheDocument();
    });
  });

  test('should disable import button when directories are missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      const button = screen.getByRole('button', { name: /Force Re-Import/i });
      expect(button).toBeDisabled();
    });
  });
});
```

## Architecture Diagram

```mermaid
flowchart TD
    A[Import Component Mounts] --> B[Fetch Directory Status]
    B --> C{API Success?}
    C -->|Yes| D{All Directories Exist?}
    C -->|No| E[Show Error Banner]
    D -->|Yes| F[Enable Import Button]
    D -->|No| G[Show Warning Banner]
    G --> H[Disable Import Button]
    F --> I[User Clicks Import]
    I --> J[Proceed with Import]
```

## Notes

- The directory status is fetched once on mount with a 30-second stale time
- The warning uses amber/yellow colors to differentiate from error messages (red)
- The existing `importError` state and UI can remain unchanged
- No changes needed to the backend - the endpoint already exists
25
skills-lock.json
Normal file
@@ -0,0 +1,25 @@
{
  "version": 1,
  "skills": {
    "caveman": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "a818cdc41dcfaa50dd891c5cb5e5705968338de02e7e37949ca56e8c30ad4176"
    },
    "caveman-compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "300fb8578258161e1752a2a4142a7e9ff178c960bcb83b84422e2987421f33bf"
    },
    "caveman-help": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "3cd5f7d3f88c8ef7b16a6555dc61f5a11b14151386697609ab6887ab8b5f059d"
    },
    "compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "05c97bc3120108acd0b80bdef7fb4fa7c224ba83c8d384ccbc97f92e8a065918"
    }
  }
}
@@ -1,6 +1,10 @@
@import "tailwindcss";
@config "../tailwind.config.ts";

html, body {
  overflow-x: hidden;
}

/* Custom Project Fonts */
@font-face {
  font-family: "PP Object Sans Regular";
@@ -1,177 +0,0 @@
import {
  SearchQuery,
  SearchInstance,
  PriorityEnum,
  SearchResponse,
} from "threetwo-ui-typings";
import {
  LIBRARY_SERVICE_BASE_URI,
  SEARCH_SERVICE_BASE_URI,
} from "../constants/endpoints";
import {
  AIRDCPP_SEARCH_RESULTS_ADDED,
  AIRDCPP_SEARCH_RESULTS_UPDATED,
  AIRDCPP_HUB_SEARCHES_SENT,
  AIRDCPP_RESULT_DOWNLOAD_INITIATED,
  AIRDCPP_DOWNLOAD_PROGRESS_TICK,
  AIRDCPP_BUNDLES_FETCHED,
  AIRDCPP_SEARCH_IN_PROGRESS,
  AIRDCPP_FILE_DOWNLOAD_COMPLETED,
  LS_SINGLE_IMPORT,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  AIRDCPP_TRANSFERS_FETCHED,
  LIBRARY_ISSUE_BUNDLES,
  AIRDCPP_SOCKET_CONNECTED,
  AIRDCPP_SOCKET_DISCONNECTED,
} from "../constants/action-types";
import { isNil } from "lodash";
import axios from "axios";

interface SearchData {
  query: Pick<SearchQuery, "pattern"> & Partial<Omit<SearchQuery, "pattern">>;
  hub_urls: string[] | undefined | null;
  priority: PriorityEnum;
}

export const sleep = (ms: number): Promise<NodeJS.Timeout> => {
  return new Promise((resolve) => setTimeout(resolve, ms));
};

export const toggleAirDCPPSocketConnectionStatus =
  (status: String, payload?: any) => async (dispatch) => {
    switch (status) {
      case "connected":
        dispatch({
          type: AIRDCPP_SOCKET_CONNECTED,
          data: payload,
        });
        break;

      case "disconnected":
        dispatch({
          type: AIRDCPP_SOCKET_DISCONNECTED,
          data: payload,
        });
        break;

      default:
        break;
    }
  };
export const downloadAirDCPPItem =
  (
    searchInstanceId: Number,
    resultId: String,
    comicObjectId: String,
    name: String,
    size: Number,
    type: any,
    ADCPPSocket: any,
    credentials: any,
  ): void =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      let bundleDBImportResult = {};
      const downloadResult = await ADCPPSocket.post(
        `search/${searchInstanceId}/results/${resultId}/download`,
      );

      if (!isNil(downloadResult)) {
        bundleDBImportResult = await axios({
          method: "POST",
          url: `${LIBRARY_SERVICE_BASE_URI}/applyAirDCPPDownloadMetadata`,
          headers: {
            "Content-Type": "application/json; charset=utf-8",
          },
          data: {
            bundleId: downloadResult.bundle_info.id,
            comicObjectId,
            name,
            size,
            type,
          },
        });

        dispatch({
          type: AIRDCPP_RESULT_DOWNLOAD_INITIATED,
          downloadResult,
          bundleDBImportResult,
        });

        dispatch({
          type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
          comicBookDetail: bundleDBImportResult.data,
          IMS_inProgress: false,
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getBundlesForComic =
  (comicObjectId: string, ADCPPSocket: any, credentials: any) =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const comicObject = await axios({
        method: "POST",
        url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
        headers: {
          "Content-Type": "application/json; charset=utf-8",
        },
        data: {
          id: `${comicObjectId}`,
        },
      });
      // get only the bundles applicable for the comic
      if (comicObject.data.acquisition.directconnect) {
        const filteredBundles =
          comicObject.data.acquisition.directconnect.downloads.map(
            async ({ bundleId }) => {
              return await ADCPPSocket.get(`queue/bundles/${bundleId}`);
            },
          );
        dispatch({
          type: AIRDCPP_BUNDLES_FETCHED,
          bundles: await Promise.all(filteredBundles),
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getTransfers =
  (ADCPPSocket: any, credentials: any) => async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const bundles = await ADCPPSocket.get("queue/bundles/1/85", {});
      if (!isNil(bundles)) {
        dispatch({
          type: AIRDCPP_TRANSFERS_FETCHED,
          bundles,
        });
        const bundleIds = bundles.map((bundle) => bundle.id);
        // get issues with matching bundleIds
        const issue_bundles = await axios({
          url: `${SEARCH_SERVICE_BASE_URI}/groupIssuesByBundles`,
          method: "POST",
          data: { bundleIds },
        });
        dispatch({
          type: LIBRARY_ISSUE_BUNDLES,
          issue_bundles,
        });
      }
    } catch (err) {
      throw err;
    }
  };
@@ -1,207 +0,0 @@
import axios from "axios";
import rateLimiter from "axios-rate-limit";
import { setupCache } from "axios-cache-interceptor";
import {
  CV_SEARCH_SUCCESS,
  CV_API_CALL_IN_PROGRESS,
  CV_API_GENERIC_FAILURE,
  IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  CV_CLEANUP,
  IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
  CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
  CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
  CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
  CV_WEEKLY_PULLLIST_FETCHED,
  LIBRARY_STATISTICS_CALL_IN_PROGRESS,
  LIBRARY_STATISTICS_FETCHED,
} from "../constants/action-types";
import {
  COMICVINE_SERVICE_URI,
  LIBRARY_SERVICE_BASE_URI,
} from "../constants/endpoints";

const http = rateLimiter(axios.create(), {
  maxRequests: 1,
  perMilliseconds: 1000,
  maxRPS: 1,
});
const cachedAxios = setupCache(axios);
export const getWeeklyPullList = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
    });
    await cachedAxios(`${COMICVINE_SERVICE_URI}/getWeeklyPullList`, {
      method: "get",
      params: options,
    }).then((response) => {
      dispatch({
        type: CV_WEEKLY_PULLLIST_FETCHED,
        data: response.data.result,
      });
    });
  } catch (error) {
    // Error handling could be added here if needed
  }
};

export const comicinfoAPICall = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_API_CALL_IN_PROGRESS,
      inProgress: true,
    });
    const serviceURI = `${COMICVINE_SERVICE_URI}/${options.callURIAction}`;
    const response = await http(serviceURI, {
      method: options.callMethod,
      params: options.callParams,
      data: options.data ? options.data : null,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*",
      },
    });

    switch (options.callURIAction) {
      case "search":
        dispatch({
          type: CV_SEARCH_SUCCESS,
          searchResults: response.data,
        });
        break;

      default:
        break;
    }
  } catch (error) {
    dispatch({
      type: CV_API_GENERIC_FAILURE,
      error,
    });
  }
};
export const getIssuesForSeries =
  (comicObjectID: string) => async (dispatch) => {
    dispatch({
      type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
    });
    dispatch({
      type: CV_CLEANUP,
    });

    const issues = await axios({
      url: `${COMICVINE_SERVICE_URI}/getIssuesForSeries`,
      method: "POST",
      params: {
        comicObjectID,
      },
    });
    dispatch({
      type: CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
      issues: issues.data.results,
    });
  };

export const analyzeLibrary = (issues) => async (dispatch) => {
  dispatch({
    type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  });
  const queryObjects = issues.map((issue) => {
    const { id, name, issue_number } = issue;
    return {
      issueId: id,
      issueName: name,
      volumeName: issue.volume.name,
      issueNumber: issue_number,
    };
  });
  const foo = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/findIssueForSeries`,
    method: "POST",
    data: {
      queryObjects,
    },
  });

  dispatch({
    type: CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
    matches: foo.data,
  });
};

export const getLibraryStatistics = () => async (dispatch) => {
  dispatch({
    type: LIBRARY_STATISTICS_CALL_IN_PROGRESS,
  });
  const result = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/libraryStatistics`,
    method: "GET",
  });

  dispatch({
    type: LIBRARY_STATISTICS_FETCHED,
    data: result.data,
  });
};

export const getComicBookDetailById =
  (comicBookObjectId: string) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
      method: "POST",
      data: {
        id: comicBookObjectId,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
      comicBookDetail: result.data,
      IMS_inProgress: false,
    });
  };

export const getComicBooksDetailsByIds =
  (comicBookObjectIds: Array<string>) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooksByIds`,
      method: "POST",
      data: {
        ids: comicBookObjectIds,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
      comicBooks: result.data,
    });
  };

export const applyComicVineMatch =
  (match, comicObjectId) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
      method: "POST",
      data: {
        match,
        comicObjectId,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
      comicBookDetail: result.data,
      IMS_inProgress: false,
    });
  };
@@ -1,383 +0,0 @@
import axios from "axios";
import { IFolderData } from "threetwo-ui-typings";
import {
  COMICVINE_SERVICE_URI,
  IMAGETRANSFORMATION_SERVICE_BASE_URI,
  LIBRARY_SERVICE_BASE_URI,
  SEARCH_SERVICE_BASE_URI,
  JOB_QUEUE_SERVICE_BASE_URI,
} from "../constants/endpoints";
import {
  IMS_COMIC_BOOK_GROUPS_FETCHED,
  IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
  IMS_RECENT_COMICS_FETCHED,
  IMS_WANTED_COMICS_FETCHED,
  CV_API_CALL_IN_PROGRESS,
  CV_SEARCH_SUCCESS,
  CV_CLEANUP,
  IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
  IMS_CV_METADATA_IMPORT_SUCCESSFUL,
  IMS_CV_METADATA_IMPORT_FAILED,
  LS_IMPORT,
  IMG_ANALYSIS_CALL_IN_PROGRESS,
  IMG_ANALYSIS_DATA_FETCH_SUCCESS,
  IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
  SS_SEARCH_RESULTS_FETCHED,
  SS_SEARCH_IN_PROGRESS,
  FILEOPS_STATE_RESET,
  LS_IMPORT_CALL_IN_PROGRESS,
  SS_SEARCH_FAILED,
  SS_SEARCH_RESULTS_FETCHED_SPECIAL,
  WANTED_COMICS_FETCHED,
  VOLUMES_FETCHED,
  LIBRARY_SERVICE_HEALTH,
  LS_SET_QUEUE_STATUS,
  LS_IMPORT_JOB_STATISTICS_FETCHED,
} from "../constants/action-types";

import { isNil } from "lodash";

export const getServiceStatus = (serviceName?: string) => async (dispatch) => {
  axios
    .request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`,
      method: "GET",
      transformResponse: (r: string) => JSON.parse(r),
    })
    .then((response) => {
      const { data } = response;
      dispatch({
        type: LIBRARY_SERVICE_HEALTH,
        status: data,
      });
    });
};
export async function walkFolder(path: string): Promise<Array<IFolderData>> {
  return axios
    .request<Array<IFolderData>>({
      url: `${LIBRARY_SERVICE_BASE_URI}/walkFolders`,
      method: "POST",
      data: {
        basePathToWalk: path,
      },
      transformResponse: (r: string) => JSON.parse(r),
    })
    .then((response) => {
      const { data } = response;
      return data;
    })
    .catch((error) => error);
}
/**
 * Fetches comic book covers along with some metadata
 * @return the comic book metadata
 */
export const fetchComicBookMetadata = () => async (dispatch) => {
  dispatch({
    type: LS_IMPORT_CALL_IN_PROGRESS,
  });

  // dispatch(
  //   success({
  //     // uid: 'once-please', // you can specify your own uid if required
  //     title: "Import Started",
  //     message: `<span class="icon-text has-text-success"><i class="fas fa-plug"></i></span> Socket <span class="has-text-info">${socket.id}</span> connected. <strong>${walkedFolders.length}</strong> comics scanned.`,
  //     dismissible: "click",
  //     position: "tr",
  //     autoDismiss: 0,
  //   }),
  // );
  const sessionId = localStorage.getItem("sessionId");
  dispatch({
    type: LS_IMPORT,
  });

  await axios.request({
    url: `${LIBRARY_SERVICE_BASE_URI}/newImport`,
    method: "POST",
    data: { sessionId },
  });
};

export const getImportJobResultStatistics = () => async (dispatch) => {
  const result = await axios.request({
    url: `${JOB_QUEUE_SERVICE_BASE_URI}/getJobResultStatistics`,
    method: "GET",
  });
  dispatch({
    type: LS_IMPORT_JOB_STATISTICS_FETCHED,
    data: result.data,
  });
};

export const setQueueControl =
  (queueAction: string, queueStatus: string) => async (dispatch) => {
    dispatch({
      type: LS_SET_QUEUE_STATUS,
      meta: { remote: true },
      data: { queueAction, queueStatus },
    });
  };

/**
 * Fetches comic book metadata for various types
 * @return metadata for the comic book object categories
 * @param options
 **/
export const getComicBooks = (options) => async (dispatch) => {
  const { paginationOptions, predicate, comicStatus } = options;

  const response = await axios.request({
    url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
    method: "POST",
    data: {
      paginationOptions,
      predicate,
    },
  });

  switch (comicStatus) {
    case "recent":
      dispatch({
        type: IMS_RECENT_COMICS_FETCHED,
        data: response.data,
      });
      break;
    case "wanted":
      dispatch({
        type: IMS_WANTED_COMICS_FETCHED,
        data: response.data.docs,
      });
      break;
    default:
      break;
  }
};

/**
 * Makes a call to library service to import the comic book metadata into the ThreeTwo data store.
 * @returns Nothing.
 * @param payload
 */
export const importToDB =
  (sourceName: string, metadata?: any) => (dispatch) => {
    try {
      const comicBookMetadata = {
        importType: "new",
        payload: {
          rawFileDetails: {
            name: "",
          },
          importStatus: {
            isImported: true,
            tagged: false,
            matchedResult: {
              score: "0",
            },
          },
          sourcedMetadata: metadata || null,
          acquisition: { source: { wanted: true, name: sourceName } },
        },
      };
      dispatch({
        type: IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
      });
      return axios
        .request({
          url: `${LIBRARY_SERVICE_BASE_URI}/rawImportToDb`,
          method: "POST",
          data: comicBookMetadata,
          // transformResponse: (r: string) => JSON.parse(r),
        })
        .then((response) => {
          const { data } = response;
          dispatch({
            type: IMS_CV_METADATA_IMPORT_SUCCESSFUL,
            importResult: data,
          });
        });
    } catch (error) {
      dispatch({
        type: IMS_CV_METADATA_IMPORT_FAILED,
        importError: error,
      });
    }
  };

export const fetchVolumeGroups = () => async (dispatch) => {
  try {
    dispatch({
      type: IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
    });
    const response = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookGroups`,
      method: "GET",
    });
    dispatch({
      type: IMS_COMIC_BOOK_GROUPS_FETCHED,
      data: response.data,
    });
  } catch (error) {
    // Error handling could be added here if needed
  }
};
export const fetchComicVineMatches =
  (searchPayload, issueSearchQuery, seriesSearchQuery?) => async (dispatch) => {
    try {
      dispatch({
        type: CV_API_CALL_IN_PROGRESS,
      });
      axios
        .request({
          url: `${COMICVINE_SERVICE_URI}/volumeBasedSearch`,
          method: "POST",
          data: {
            format: "json",
            // hack
            query: issueSearchQuery.inferredIssueDetails.name
              .replace(/[^a-zA-Z0-9 ]/g, "")
              .trim(),
            limit: "100",
            page: 1,
            resources: "volume",
            scorerConfiguration: {
              searchParams: issueSearchQuery.inferredIssueDetails,
            },
            rawFileDetails: searchPayload.rawFileDetails,
          },
          transformResponse: (r) => {
            const matches = JSON.parse(r);
            return matches;
            // return sortBy(matches, (match) => -match.score);
          },
        })
        .then((response) => {
          let matches: any = [];
          if (
            !isNil(response.data.results) &&
            response.data.results.length === 1
          ) {
            matches = response.data.results;
          } else {
            matches = response.data.map((match) => match);
          }
          dispatch({
            type: CV_SEARCH_SUCCESS,
            searchResults: matches,
            searchQueryObject: {
              issue: issueSearchQuery,
              series: seriesSearchQuery,
            },
          });
        });
    } catch (error) {
      // Error handling could be added here if needed
    }

    dispatch({
      type: CV_CLEANUP,
    });
  };

/**
 * This method is a proxy to `uncompressFullArchive` which uncompresses complete `rar` or `zip` archives
 * @param {string} path The path to the compressed archive
 * @param {any} options Options object
 * @returns {any}
 */
export const extractComicArchive =
  (path: string, options: any): any =>
  async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
    });
    await axios({
      method: "POST",
      url: `${LIBRARY_SERVICE_BASE_URI}/uncompressFullArchive`,
      headers: {
        "Content-Type": "application/json; charset=utf-8",
      },
      data: {
        filePath: path,
        options,
      },
    });
  };

/**
 * Description
 * @param {any} query
 * @param {any} options
 * @returns {any}
 */
export const searchIssue = (query, options) => async (dispatch) => {
  dispatch({
    type: SS_SEARCH_IN_PROGRESS,
  });

  const response = await axios({
    url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
    method: "POST",
    data: { ...query, ...options },
  });

  if (response.data.code === 404) {
    dispatch({
      type: SS_SEARCH_FAILED,
      data: response.data,
    });
  }

  switch (options.trigger) {
    case "wantedComicsPage":
      dispatch({
        type: WANTED_COMICS_FETCHED,
        data: response.data.hits,
      });
      break;
    case "globalSearchBar":
      dispatch({
        type: SS_SEARCH_RESULTS_FETCHED_SPECIAL,
        data: response.data.hits,
      });
      break;

    case "libraryPage":
      dispatch({
        type: SS_SEARCH_RESULTS_FETCHED,
        data: response.data.hits,
      });
      break;
    case "volumesPage":
      dispatch({
        type: VOLUMES_FETCHED,
        data: response.data.hits,
      });
      break;

    default:
      break;
  }
};
export const analyzeImage =
  (imageFilePath: string | Buffer) => async (dispatch) => {
    dispatch({
      type: FILEOPS_STATE_RESET,
    });

    dispatch({
      type: IMG_ANALYSIS_CALL_IN_PROGRESS,
    });

    const foo = await axios({
      url: `${IMAGETRANSFORMATION_SERVICE_BASE_URI}/analyze`,
      method: "POST",
      data: {
        imageFilePath,
      },
    });
    dispatch({
      type: IMG_ANALYSIS_DATA_FETCH_SUCCESS,
      result: foo.data,
    });
  };
@@ -1,26 +0,0 @@
import axios from "axios";
import { isNil } from "lodash";
import { METRON_SERVICE_URI } from "../constants/endpoints";

export const fetchMetronResource = async (options) => {
  const metronResourceResults = await axios.post(
    `${METRON_SERVICE_URI}/fetchResource`,
    options,
  );
  const results = metronResourceResults.data.results.map((result) => {
    return {
      label: result.name || result.__str__,
      value: result.id,
    };
  });

  return {
    options: results,
    hasMore: !isNil(metronResourceResults.data.next),
    additional: {
      page: !isNil(metronResourceResults.data.next)
        ? options.query.page + 1
        : null,
    },
  };
};
@@ -1,77 +0,0 @@
import axios from "axios";
import {
  SETTINGS_OBJECT_FETCHED,
  SETTINGS_CALL_IN_PROGRESS,
  SETTINGS_DB_FLUSH_SUCCESS,
  SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
} from "../reducers/settings.reducer";
import {
  LIBRARY_SERVICE_BASE_URI,
  SETTINGS_SERVICE_BASE_URI,
  QBITTORRENT_SERVICE_BASE_URI,
} from "../constants/endpoints";

export const getSettings = (settingsKey?) => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/getSettings`,
    method: "POST",
    data: settingsKey,
  });
  {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: result.data,
    });
  }
};

export const deleteSettings = () => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/deleteSettings`,
    method: "POST",
  });

  if (result.data.ok === 1) {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: {},
    });
  }
};

export const flushDb = () => async (dispatch) => {
  dispatch({
    type: SETTINGS_CALL_IN_PROGRESS,
  });

  const flushDbResult = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/flushDb`,
    method: "POST",
  });

  if (flushDbResult) {
    dispatch({
      type: SETTINGS_DB_FLUSH_SUCCESS,
      data: flushDbResult.data,
    });
  }
};

export const getQBitTorrentClientInfo = (hostInfo) => async (dispatch) => {
  await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/connect`,
    method: "POST",
    data: hostInfo,
  });
  const qBittorrentClientInfo = await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/getClientInfo`,
    method: "GET",
  });

  dispatch({
    type: SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
    data: qBittorrentClientInfo.data,
  });
};

export const getProwlarrConnectionInfo = (hostInfo) => async (dispatch) => {};
@@ -1,3 +1,10 @@
/**
 * @fileoverview Root application component.
 * Provides the main layout structure with navigation, content outlet,
 * and toast notifications. Initializes socket connection on mount.
 * @module components/App
 */

import React, { ReactElement, useEffect } from "react";
import { Outlet } from "react-router-dom";
import { Navbar2 } from "./shared/Navbar2";
@@ -5,6 +12,26 @@ import { ToastContainer } from "react-toastify";
import "../../app.css";
import { useStore } from "../store";

/**
 * Root application component that provides the main layout structure.
 *
 * Features:
 * - Initializes WebSocket connection to the server on mount
 * - Renders the navigation bar across all routes
 * - Provides React Router outlet for child routes
 * - Includes toast notification container for app-wide notifications
 *
 * @returns {ReactElement} The root application layout
 * @example
 * // Used as the root element in React Router configuration
 * const router = createBrowserRouter([
 *   {
 *     path: "/",
 *     element: <App />,
 *     children: [...]
 *   }
 * ]);
 */
export const App = (): ReactElement => {
  useEffect(() => {
    useStore.getState().getSocket("/"); // Connect to the base namespace
@@ -1,41 +1,45 @@
|
||||
import React, {
|
||||
useCallback,
|
||||
ReactElement,
|
||||
useEffect,
|
||||
useRef,
|
||||
useState,
|
||||
} from "react";
|
||||
import { SearchQuery, PriorityEnum, SearchResponse } from "threetwo-ui-typings";
|
||||
import { RootState, SearchInstance } from "threetwo-ui-typings";
|
||||
import ellipsize from "ellipsize";
|
||||
import { Form, Field } from "react-final-form";
|
||||
import { difference } from "../../shared/utils/object.utils";
|
||||
import { isEmpty, isNil, map } from "lodash";
|
||||
import { useStore } from "../../store";
|
||||
import { useShallow } from "zustand/react/shallow";
|
||||
import { useQuery, useQueryClient } from "@tanstack/react-query";
|
||||
import { useQuery } from "@tanstack/react-query";
|
||||
import axios from "axios";
|
||||
import { AIRDCPP_SERVICE_BASE_URI } from "../../constants/endpoints";
|
||||
import type { Socket } from "socket.io-client";
|
||||
import type { AcquisitionPanelProps } from "../../types";
|
||||
|
||||
interface IAcquisitionPanelProps {
|
||||
query: any;
|
||||
comicObjectId: any;
|
||||
comicObject: any;
|
||||
settings: any;
|
||||
interface HubData {
|
||||
hub_url: string;
|
||||
identity: { name: string };
|
||||
value: string;
|
||||
}
|
||||
|
||||
interface AirDCPPSearchResult {
|
||||
id: string;
|
||||
dupe?: unknown;
|
||||
type: { id: string; str: string };
|
||||
name: string;
|
||||
slots: { total: number; free: number };
|
||||
users: { user: { nicks: string; flags: string[] } };
|
||||
size: number;
|
||||
}
|
||||
|
||||
export const AcquisitionPanel = (
|
||||
props: IAcquisitionPanelProps,
|
||||
props: AcquisitionPanelProps,
|
||||
): ReactElement => {
|
||||
const socketRef = useRef<Socket>();
|
||||
const queryClient = useQueryClient();
|
||||
const socketRef = useRef<Socket | undefined>(undefined);
|
||||
|
||||
const [dcppQuery, setDcppQuery] = useState({});
|
||||
const [airDCPPSearchResults, setAirDCPPSearchResults] = useState<any[]>([]);
|
||||
const [airDCPPSearchResults, setAirDCPPSearchResults] = useState<AirDCPPSearchResult[]>([]);
|
||||
const [airDCPPSearchStatus, setAirDCPPSearchStatus] = useState(false);
|
||||
const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState<any>({});
|
||||
const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState<any>({});
|
||||
const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState<{ id?: string; owner?: string; expires_in?: number }>({});
|
||||
const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState<{ query?: { pattern: string; extensions: string[]; file_type: string } }>({});
|
||||
|
||||
const { comicObjectId } = props;
|
||||
const issueName = props.query.issue.name || "";
|
||||
@@ -140,13 +144,13 @@ export const AcquisitionPanel = (
|
||||
};
|
||||
|
||||
const download = async (
|
||||
searchInstanceId: Number,
|
||||
resultId: String,
|
||||
comicObjectId: String,
|
||||
name: String,
|
||||
size: Number,
|
||||
type: any,
|
||||
config: any,
|
||||
searchInstanceId: string | number,
|
||||
resultId: string,
|
||||
comicObjectId: string,
|
||||
name: string,
|
||||
size: number,
|
||||
type: unknown,
|
||||
config: Record<string, unknown>,
|
||||
): Promise<void> => {
|
||||
socketRef.current?.emit(
|
||||
"call",
|
||||
@@ -166,7 +170,7 @@ export const AcquisitionPanel = (
|
||||
);
|
||||
};
|
||||
|
||||
const getDCPPSearchResults = async (searchQuery) => {
|
||||
const getDCPPSearchResults = async (searchQuery: { issueName: string }) => {
|
||||
const manualQuery = {
|
||||
query: {
|
||||
pattern: `${searchQuery.issueName}`,
|
||||
@@ -255,7 +259,7 @@ export const AcquisitionPanel = (
|
||||
<dl>
|
||||
<dt>
|
||||
<div className="mb-1">
|
||||
{hubs?.data.map((value, idx: string) => (
|
||||
{hubs?.data.map((value: HubData, idx: number) => (
|
||||
<span className="tag is-warning" key={idx}>
|
||||
{value.identity.name}
|
||||
</span>
|
||||
@@ -266,19 +270,19 @@ export const AcquisitionPanel = (
|
||||
<dt>
|
||||
Query:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query.pattern}
|
||||
{airDCPPSearchInfo.query?.pattern}
|
||||
</span>
|
||||
</dt>
|
||||
<dd>
|
||||
Extensions:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query.extensions.join(", ")}
|
||||
{airDCPPSearchInfo.query?.extensions.join(", ")}
|
||||
</span>
|
||||
</dd>
|
||||
<dd>
|
||||
File type:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query.file_type}
|
||||
{airDCPPSearchInfo.query?.file_type}
|
||||
</span>
|
||||
</dd>
|
||||
</dl>
|
||||
@@ -384,7 +388,7 @@ export const AcquisitionPanel = (
|
||||
className="inline-flex items-center gap-1 rounded border border-green-500 bg-green-500 px-2 py-1 text-xs font-medium text-white hover:bg-transparent hover:text-green-400 dark:border-green-300 dark:bg-green-300 dark:text-slate-900 dark:hover:bg-transparent"
|
||||
onClick={() =>
|
||||
download(
|
||||
airDCPPSearchInstance.id,
|
||||
airDCPPSearchInstance.id ?? "",
|
||||
id,
|
||||
comicObjectId,
|
||||
name,
|
||||
|
||||
@@ -1,17 +1,31 @@
|
||||
import React, { ReactElement } from "react";
|
||||
import Select from "react-select";
|
||||
import Select, { StylesConfig, SingleValue } from "react-select";
|
||||
import { ActionOption } from "../actionMenuConfig";
|
||||
|
||||
export const Menu = (props): ReactElement => {
|
||||
interface MenuConfiguration {
|
||||
filteredActionOptions: ActionOption[];
|
||||
customStyles: StylesConfig<ActionOption, false>;
|
||||
handleActionSelection: (action: SingleValue<ActionOption>) => void;
|
||||
}
|
||||
|
||||
interface MenuProps {
|
||||
data?: unknown;
|
||||
handlers?: {
|
||||
setSlidingPanelContentId: (id: string) => void;
|
||||
setVisible: (visible: boolean) => void;
|
||||
};
|
||||
configuration: MenuConfiguration;
|
||||
}
|
||||
|
||||
export const Menu = (props: MenuProps): ReactElement => {
|
||||
const {
|
||||
filteredActionOptions,
|
||||
customStyles,
|
||||
handleActionSelection,
|
||||
Placeholder,
|
||||
} = props.configuration;
|
||||
|
||||
return (
|
||||
<Select
|
||||
components={{ Placeholder }}
|
||||
<Select<ActionOption, false>
|
||||
placeholder={
|
||||
<span className="inline-flex flex-row items-center gap-2 pt-1">
|
||||
<div className="w-6 h-6">
|
||||
|
||||
@@ -4,7 +4,19 @@ import dayjs from "dayjs";
|
||||
import ellipsize from "ellipsize";
|
||||
import { map } from "lodash";
|
||||
import { DownloadProgressTick } from "./DownloadProgressTick";
|
||||
export const AirDCPPBundles = (props) => {
|
||||
|
||||
interface BundleData {
|
||||
id: string;
|
||||
name: string;
|
||||
target: string;
|
||||
size: number;
|
||||
}
|
||||
|
||||
interface AirDCPPBundlesProps {
|
||||
data: BundleData[];
|
||||
}
|
||||
|
||||
export const AirDCPPBundles = (props: AirDCPPBundlesProps) => {
|
||||
return (
|
||||
<div className="overflow-x-auto w-fit mt-6">
|
||||
<table className="min-w-full text-sm text-gray-900 dark:text-slate-100">
|
||||
|
||||
@@ -1,42 +1,70 @@
|
||||
import React, { ReactElement, useCallback, useState } from "react";
|
||||
import { fetchMetronResource } from "../../../actions/metron.actions";
|
||||
import axios from "axios";
|
||||
import { isNil } from "lodash";
|
||||
import Creatable from "react-select/creatable";
|
||||
import { withAsyncPaginate } from "react-select-async-paginate";
|
||||
import { METRON_SERVICE_URI } from "../../../constants/endpoints";
|
||||
|
||||
const CreatableAsyncPaginate = withAsyncPaginate(Creatable);
|
||||
|
||||
interface AsyncSelectPaginateProps {
|
||||
metronResource: string;
|
||||
placeholder?: string;
|
||||
export interface AsyncSelectPaginateProps {
|
||||
metronResource?: string;
|
||||
placeholder?: string | React.ReactNode;
|
||||
value?: object;
|
||||
onChange?(...args: unknown[]): unknown;
|
||||
meta?: Record<string, unknown>;
|
||||
input?: Record<string, unknown>;
|
||||
name?: string;
|
||||
type?: string;
|
||||
}
|
||||
|
||||
interface AdditionalType {
|
||||
page: number | null;
|
||||
}
|
||||
|
||||
interface MetronResultItem {
|
||||
name?: string;
|
||||
__str__?: string;
|
||||
id: number;
|
||||
}
|
||||
|
||||
export const AsyncSelectPaginate = (props: AsyncSelectPaginateProps): ReactElement => {
|
||||
const [value, setValue] = useState(null);
|
||||
const [isAddingInProgress, setIsAddingInProgress] = useState(false);
|
||||
|
||||
const loadData = useCallback((query, loadedOptions, { page }) => {
|
||||
return fetchMetronResource({
|
||||
const loadData = useCallback(async (
|
||||
query: string,
|
||||
_loadedOptions: unknown,
|
||||
additional?: AdditionalType
|
||||
) => {
|
||||
const page = additional?.page ?? 1;
|
||||
const options = {
|
||||
method: "GET",
|
||||
resource: props.metronResource,
|
||||
query: {
|
||||
name: query,
|
||||
page,
|
||||
resource: props.metronResource || "",
|
||||
query: { name: query, page },
|
||||
};
|
||||
const response = await axios.post(`${METRON_SERVICE_URI}/fetchResource`, options);
|
||||
const results = response.data.results.map((result: MetronResultItem) => ({
|
||||
label: result.name || result.__str__,
|
||||
value: result.id,
|
||||
}));
|
||||
return {
|
||||
options: results,
|
||||
hasMore: !isNil(response.data.next),
|
||||
additional: {
|
||||
page: !isNil(response.data.next) ? page + 1 : null,
|
||||
},
|
||||
});
|
||||
}, []);
|
||||
};
|
||||
}, [props.metronResource]);
|
||||
|
||||
return (
|
||||
<CreatableAsyncPaginate
|
||||
SelectComponent={Creatable}
|
||||
debounceTimeout={200}
|
||||
isDisabled={isAddingInProgress}
|
||||
value={props.value}
|
||||
loadOptions={loadData}
|
||||
// eslint-disable-next-line @typescript-eslint/no-explicit-any
|
||||
loadOptions={loadData as any}
|
||||
placeholder={props.placeholder}
|
||||
// onCreateOption={onCreateOption}
|
||||
onChange={props.onChange}
|
||||
// cacheUniqs={[cacheUniq]}
|
||||
additional={{
|
||||
page: 1,
|
||||
}}
|
||||
|
||||
@@ -10,7 +10,7 @@ import "react-sliding-pane/dist/react-sliding-pane.css";
|
||||
import SlidingPane from "react-sliding-pane";
|
||||
import { determineCoverFile } from "../../shared/utils/metadata.utils";
|
||||
import { styled } from "styled-components";
|
||||
import type { RawFileDetails as RawFileDetailsType, InferredMetadata } from "../../graphql/generated";
|
||||
import type { ComicDetailProps } from "../../types";
|
||||
|
||||
// Extracted modules
|
||||
import { useComicVineMatching } from "./useComicVineMatching";
|
||||
@@ -23,39 +23,6 @@ const StyledSlidingPanel = styled(SlidingPane)`
|
||||
background: #ccc;
|
||||
`;
|
||||
|
||||
interface ComicVineMetadata {
|
||||
name?: string;
|
||||
volumeInformation?: Record<string, unknown>;
|
||||
[key: string]: unknown;
|
||||
}
|
||||
|
||||
interface Acquisition {
|
||||
directconnect?: {
|
||||
downloads?: unknown[];
|
||||
};
|
||||
torrent?: unknown[];
|
||||
[key: string]: unknown;
|
||||
}
|
||||
|
||||
interface ComicDetailProps {
|
||||
data: {
|
||||
_id: string;
|
||||
rawFileDetails?: RawFileDetailsType;
|
||||
inferredMetadata: InferredMetadata;
|
||||
sourcedMetadata: {
|
||||
comicvine?: ComicVineMetadata;
|
||||
locg?: Record<string, unknown>;
|
||||
comicInfo?: Record<string, unknown>;
|
||||
};
|
||||
acquisition?: Acquisition;
|
||||
createdAt: string;
|
||||
updatedAt: string;
|
||||
};
|
||||
userSettings?: Record<string, unknown>;
|
||||
queryClient?: unknown;
|
||||
comicObjectId?: string;
|
||||
}
|
||||
|
||||
/**
|
||||
* Displays full comic detail: cover, file info, action menu, and tabbed panels
|
||||
* for metadata, archive operations, and acquisition.
|
||||
@@ -101,15 +68,15 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
|
||||
// Hide "match on Comic Vine" when there are no raw file details — matching
|
||||
// requires file metadata to seed the search query.
|
||||
const Placeholder = components.Placeholder;
|
||||
const filteredActionOptions = filter(actionOptions, (item) => {
|
||||
const filteredActionOptions: ActionOption[] = actionOptions.filter((item) => {
|
||||
if (isUndefined(rawFileDetails)) {
|
||||
return item.value !== "match-on-comic-vine";
|
||||
}
|
||||
return item;
|
||||
return true;
|
||||
});
|
||||
|
||||
const handleActionSelection = (action: ActionOption) => {
|
||||
const handleActionSelection = (action: ActionOption | null) => {
|
||||
if (!action) return;
|
||||
switch (action.value) {
|
||||
case "match-on-comic-vine":
|
||||
openDrawerWithCVMatches();
|
||||
@@ -223,7 +190,6 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
filteredActionOptions,
|
||||
customStyles,
|
||||
handleActionSelection,
|
||||
Placeholder,
|
||||
}}
|
||||
/>
|
||||
</div>
|
||||
|
||||
@@ -4,19 +4,19 @@ import dayjs from "dayjs";
|
||||
import { isEmpty, isUndefined } from "lodash";
|
||||
import Card from "../shared/Carda";
|
||||
import { convert } from "html-to-text";
|
||||
|
||||
interface ComicVineDetailsProps {
|
||||
updatedAt?: string;
|
||||
data?: {
|
||||
name?: string;
|
||||
number?: string;
|
||||
resource_type?: string;
|
||||
id?: number;
|
||||
};
|
||||
}
|
||||
import type { ComicVineDetailsProps } from "../../types";
|
||||
|
||||
export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement => {
|
||||
const { data, updatedAt } = props;
|
||||
|
||||
if (!data || !data.volumeInformation) {
|
||||
return <div className="text-slate-500 dark:text-gray-400">No ComicVine data available</div>;
|
||||
}
|
||||
|
||||
const detectedIssueType = data.volumeInformation.description
|
||||
? detectIssueTypes(data.volumeInformation.description)
|
||||
: undefined;
|
||||
|
||||
return (
|
||||
<div className="text-slate-500 dark:text-gray-400">
|
||||
<div className="">
|
||||
@@ -24,10 +24,9 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
<div className="flex flex-row gap-4">
|
||||
<div className="min-w-fit">
|
||||
<Card
|
||||
imageUrl={data.volumeInformation.image.thumb_url}
|
||||
imageUrl={data.volumeInformation.image?.thumb_url}
|
||||
orientation={"cover-only"}
|
||||
hasDetails={false}
|
||||
// cardContainerStyle={{ maxWidth: 200 }}
|
||||
/>
|
||||
</div>
|
||||
<div className="flex flex-col gap-5">
|
||||
@@ -49,7 +48,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
<div className="text-md">ComicVine Metadata</div>
|
||||
<div className="text-sm">
|
||||
Last scraped on{" "}
|
||||
{dayjs(updatedAt).format("MMM D YYYY [at] h:mm a")}
|
||||
{updatedAt ? dayjs(updatedAt).format("MMM D YYYY [at] h:mm a") : "Unknown"}
|
||||
</div>
|
||||
<div className="text-sm">
|
||||
ComicVine Issue ID
|
||||
@@ -61,7 +60,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
{/* Publisher details */}
|
||||
<div className="ml-8">
|
||||
Published by{" "}
|
||||
<span>{data.volumeInformation.publisher.name}</span>
|
||||
<span>{data.volumeInformation.publisher?.name}</span>
|
||||
<div>
|
||||
Total issues in this volume{" "}
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
@@ -77,16 +76,11 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
<span>{data.issue_number}</span>
|
||||
</div>
|
||||
)}
|
||||
{!isUndefined(
|
||||
detectIssueTypes(data.volumeInformation.description),
|
||||
) ? (
|
||||
{!isUndefined(detectedIssueType) ? (
|
||||
<div>
|
||||
<span>Detected Type</span>
|
||||
<span>
|
||||
{
|
||||
detectIssueTypes(data.volumeInformation.description)
|
||||
.displayName
|
||||
}
|
||||
{detectedIssueType.displayName}
|
||||
</span>
|
||||
</div>
|
||||
) : data.resource_type ? (
|
||||
@@ -101,6 +95,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
{/* Description */}
|
||||
<div className="mt-3 w-3/4">
|
||||
{!isEmpty(data.description) &&
|
||||
data.description &&
|
||||
convert(data.description, {
|
||||
baseElements: {
|
||||
selectors: ["p"],
|
||||
|
||||
@@ -3,15 +3,7 @@ import MatchResult from "./MatchResult";
|
||||
import { isEmpty } from "lodash";
|
||||
import { useStore } from "../../store";
|
||||
import { useShallow } from "zustand/react/shallow";
|
||||
|
||||
interface ComicVineMatchPanelProps {
|
||||
props: {
|
||||
comicObjectId: string;
|
||||
comicVineMatches: any[];
|
||||
queryClient?: any;
|
||||
onMatchApplied?: () => void;
|
||||
};
|
||||
}
|
||||
import type { ComicVineMatchPanelProps } from "../../types";
|
||||
|
||||
/** Displays ComicVine search results or a status message while searching. */
|
||||
export const ComicVineMatchPanel = ({ props: comicVineData }: ComicVineMatchPanelProps): ReactElement => {
|
||||
|
||||
@@ -1,7 +1,16 @@
import React, { useCallback } from "react";
import { Form, Field } from "react-final-form";
import Collapsible from "react-collapsible";
import { fetchComicVineMatches } from "../../actions/fileops.actions";
import { ValidationErrors } from "final-form";

interface ComicVineSearchFormProps {
rawFileDetails?: Record<string, unknown>;
}

interface SearchFormValues {
issueName?: string;
issueNumber?: string;
issueYear?: string;
}

/**
* Component for performing search against ComicVine
@@ -12,8 +21,8 @@ import { fetchComicVineMatches } from "../../actions/fileops.actions";
* <ComicVineSearchForm data={rawFileDetails} />
* )
*/
export const ComicVineSearchForm = (data) => {
const onSubmit = useCallback((value) => {
export const ComicVineSearchForm = (props: ComicVineSearchFormProps) => {
const onSubmit = useCallback((value: SearchFormValues) => {
const userInititatedQuery = {
inferredIssueDetails: {
name: value.issueName,
@@ -24,8 +33,8 @@ export const ComicVineSearchForm = (data) => {
};
// dispatch(fetchComicVineMatches(data, userInititatedQuery));
}, []);
const validate = () => {
return true;
const validate = (_values: SearchFormValues): ValidationErrors | undefined => {
return undefined;
};

const MyForm = () => (
@@ -34,52 +43,46 @@ export const ComicVineSearchForm = (data) => {
validate={validate}
render={({ handleSubmit }) => (
<form onSubmit={handleSubmit}>
<span className="flex items-center">
<span className="text-md text-slate-500 dark:text-slate-500 pr-5">
Override Search Query
</span>
<span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
</span>
<label className="block py-1">Issue Name</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Issue Name</label>
<Field name="issueName">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-full rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-full rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="Type the issue name"
/>
)}
</Field>
<div className="flex flex-row gap-4">
<div className="flex flex-row gap-4 mt-2">
<div>
<label className="block py-1">Number</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Number</label>
<Field name="issueNumber">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-14 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-14 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="#"
/>
)}
</Field>
</div>
<div>
<label className="block py-1">Year</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Year</label>
<Field name="issueYear">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-20 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-20 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="1984"
/>
)}
</Field>
</div>

<div className="flex justify-end mt-5">
<div className="flex items-end">
<button
type="submit"
className="flex h-10 sm:mt-3 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-4 py-2 text-gray-500 hover:bg-transparent hover:text-red-600 focus:outline-none focus:ring active:text-indigo-500"
className="flex h-10 items-center rounded-lg border border-green-500 dark:border-green-400 bg-green-500 dark:bg-green-600 px-4 py-2 text-white font-medium hover:bg-green-600 dark:hover:bg-green-500 focus:outline-none focus:ring-2 focus:ring-green-500 focus:ring-offset-2 active:bg-green-700"
>
Search
</button>

@@ -2,32 +2,12 @@ import prettyBytes from "pretty-bytes";
import React, { ReactElement, useEffect, useRef, useState } from "react";
import { useStore } from "../../store";
import type { Socket } from "socket.io-client";

/**
* @typedef {Object} DownloadProgressTickProps
* @property {string} bundleId - The bundle ID to filter ticks by (as string)
*/
interface DownloadProgressTickProps {
bundleId: string;
}
import type { DownloadProgressTickProps } from "../../types";

/**
* Shape of the download tick data received over the socket.
*
* @typedef DownloadTickData
* @property {number} id - Unique download ID
* @property {string} name - File name (e.g. "movie.mkv")
* @property {number} downloaded_bytes - Bytes downloaded so far
* @property {number} size - Total size in bytes
* @property {number} speed - Current download speed (bytes/sec)
* @property {number} seconds_left - Estimated seconds remaining
* @property {{ id: string; str: string; completed: boolean; downloaded: boolean; failed: boolean; hook_error: any }} status
* - Status object (e.g. `{ id: "queued", str: "Running (15.1%)", ... }`)
* @property {{ online: number; total: number; str: string }} sources
* - Peer count (e.g. `{ online: 1, total: 1, str: "1/1 online" }`)
* @property {string} target - Download destination (e.g. "/Downloads/movie.mkv")
*/
interface DownloadTickData {
type DownloadTickData = {
id: number;
name: string;
downloaded_bytes: number;
@@ -48,12 +28,12 @@ interface DownloadTickData {
str: string;
};
target: string;
}
};

export const DownloadProgressTick: React.FC<DownloadProgressTickProps> = ({
bundleId,
}): ReactElement | null => {
const socketRef = useRef<Socket>();
const socketRef = useRef<Socket | undefined>(undefined);
const [tick, setTick] = useState<DownloadTickData | null>(null);
useEffect(() => {
const socket = useStore.getState().getSocket("manual");

@@ -1,7 +1,7 @@
import React, { useEffect, ReactElement, useState, useMemo } from "react";
import { isEmpty, isNil, isUndefined, map } from "lodash";
import { AirDCPPBundles } from "./AirDCPPBundles";
import { TorrentDownloads } from "./TorrentDownloads";
import { TorrentDownloads, TorrentData } from "./TorrentDownloads";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import {
@@ -32,7 +32,7 @@ export interface TorrentDetails {
export const DownloadsPanel = (): ReactElement | null => {
const { comicObjectId } = useParams<{ comicObjectId: string }>();
const [infoHashes, setInfoHashes] = useState<string[]>([]);
const [torrentDetails, setTorrentDetails] = useState<TorrentDetails[]>([]);
const [torrentDetails, setTorrentDetails] = useState<TorrentData[]>([]);
const [activeTab, setActiveTab] = useState<"directconnect" | "torrents">(
"directconnect",
);

@@ -5,20 +5,39 @@ import ellipsize from "ellipsize";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import axios from "axios";
import { useGetComicByIdQuery } from "../../graphql/generated";
import type { MatchResultProps } from "../../types";

interface MatchResultProps {
matchData: any;
comicObjectId: string;
queryClient?: any;
onMatchApplied?: () => void;
}

const handleBrokenImage = (e) => {
e.target.src = "http://localhost:3050/dist/img/noimage.svg";
const handleBrokenImage = (e: React.SyntheticEvent<HTMLImageElement>) => {
e.currentTarget.src = "http://localhost:3050/dist/img/noimage.svg";
};

interface ComicVineMatch {
description?: string;
name?: string;
score: string | number;
issue_number: string | number;
cover_date: string;
image: {
thumb_url: string;
};
volume: {
name: string;
};
volumeInformation: {
results: {
image: {
icon_url: string;
};
count_of_issues: number;
publisher: {
name: string;
};
};
};
}

export const MatchResult = (props: MatchResultProps) => {
const applyCVMatch = async (match, comicObjectId) => {
const applyCVMatch = async (match: ComicVineMatch, comicObjectId: string) => {
try {
const response = await axios.request({
url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,

@@ -1,4 +1,4 @@
import React from "react";
import React, { useState } from "react";
import { ComicVineSearchForm } from "./ComicVineSearchForm";
import { ComicVineMatchPanel } from "./ComicVineMatchPanel";
import { EditMetadataPanel } from "./EditMetadataPanel";
@@ -13,6 +13,47 @@ interface CVMatchesPanelProps {
onMatchApplied: () => void;
};

/**
* Collapsible container for manual ComicVine search form.
* Allows users to manually search when auto-match doesn't yield results.
*/
const CollapsibleSearchForm: React.FC<{ rawFileDetails?: RawFileDetails }> = ({
rawFileDetails,
}) => {
const [isExpanded, setIsExpanded] = useState(false);

return (
<div className="border border-slate-300 dark:border-slate-600 rounded-lg overflow-hidden">
<button
type="button"
onClick={() => setIsExpanded(!isExpanded)}
className="w-full flex items-center justify-between px-4 py-3 bg-slate-100 dark:bg-slate-700 hover:bg-slate-200 dark:hover:bg-slate-600 transition-colors text-left"
aria-expanded={isExpanded}
>
<span className="flex items-center gap-2 text-slate-700 dark:text-slate-200 font-medium">
<svg
className={`w-4 h-4 transition-transform ${isExpanded ? "rotate-90" : ""}`}
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
>
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5l7 7-7 7" />
</svg>
Manual Search
</span>
<span className="text-sm text-slate-500 dark:text-slate-400">
{isExpanded ? "Click to collapse" : "No results? Search manually"}
</span>
</button>
{isExpanded && (
<div className="p-4 bg-white dark:bg-slate-800">
<ComicVineSearchForm rawFileDetails={rawFileDetails} />
</div>
)}
</div>
);
};

/**
* Sliding panel content for ComicVine match search.
*
@@ -32,19 +73,18 @@ export const CVMatchesPanel: React.FC<CVMatchesPanelProps> = ({
onMatchApplied,
}) => (
<>
<div>
<ComicVineSearchForm data={rawFileDetails} />
</div>

<div className="border-slate-500 border rounded-lg p-2 mt-3">
<p className="">Searching for:</p>
<div className="border-slate-500 border rounded-lg p-2 mb-3">
<p className="text-slate-600 dark:text-slate-300">Searching for:</p>
{inferredMetadata.issue ? (
<>
<span className="">{inferredMetadata.issue?.name} </span>
<span className=""> # {inferredMetadata.issue?.number} </span>
<span className="text-slate-800 dark:text-slate-100 font-medium">{inferredMetadata.issue?.name} </span>
<span className="text-slate-600 dark:text-slate-300"> # {inferredMetadata.issue?.number} </span>
</>
) : null}
</div>

<CollapsibleSearchForm rawFileDetails={rawFileDetails} />

<ComicVineMatchPanel
props={{
comicVineMatches,

@@ -1,20 +1,41 @@
import React, { ReactElement, Suspense, useState } from "react";
import { isNil } from "lodash";

export const TabControls = (props): ReactElement => {
interface TabItem {
id: number;
name: string;
icon: React.ReactNode;
content: React.ReactNode;
shouldShow?: boolean;
}

interface TabControlsProps {
filteredTabs: TabItem[];
downloadCount: number;
activeTab?: number;
setActiveTab?: (id: number) => void;
}

export const TabControls = (props: TabControlsProps): ReactElement => {
const { filteredTabs, downloadCount, activeTab, setActiveTab } = props;
const [active, setActive] = useState(filteredTabs[0].id);

// Use controlled state if provided, otherwise use internal state
const currentActive = activeTab !== undefined ? activeTab : active;
const handleSetActive = activeTab !== undefined ? setActiveTab : setActive;
const handleSetActive = (id: number) => {
if (setActiveTab) {
setActiveTab(id);
} else {
setActive(id);
}
};

return (
<>
<div className="hidden sm:block mt-7 mb-3 w-fit">
<div className="border-b border-gray-200">
<nav className="flex gap-6" aria-label="Tabs">
{filteredTabs.map(({ id, name, icon }) => (
{filteredTabs.map(({ id, name, icon }: TabItem) => (
<a
key={id}
className={`inline-flex shrink-0 items-center gap-2 px-1 py-1 text-md font-medium text-gray-500 dark:text-gray-400 hover:border-gray-300 hover:border-b hover:dark:text-slate-200 ${
@@ -48,7 +69,7 @@ export const TabControls = (props): ReactElement => {
</div>
</div>
<Suspense fallback={null}>
{filteredTabs.map(({ id, content }) => (
{filteredTabs.map(({ id, content }: TabItem) => (
<React.Fragment key={id}>
{currentActive === id ? content : null}
</React.Fragment>

@@ -139,7 +139,7 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
}, [isSuccess, shouldRefetchComicBookData, queryClient]);

// sliding panel init
const contentForSlidingPanel: Record<string, { content: () => JSX.Element }> = {
const contentForSlidingPanel: Record<string, { content: () => React.ReactElement }> = {
imageAnalysis: {
content: () => {
return (

@@ -2,11 +2,48 @@ import React from "react";
import dayjs from "dayjs";
import prettyBytes from "pretty-bytes";

export const TorrentDownloads = (props) => {
interface TorrentInfo {
name: string;
hash: string;
added_on: number;
progress: number;
downloaded: number;
uploaded: number;
trackers_count: number;
total_size: number;
}

interface TorrentData {
torrent?: TorrentInfo;
// Support direct TorrentDetails format from socket events
infoHash?: string;
downloadSpeed?: number;
uploadSpeed?: number;
name?: string;
}

export interface TorrentDownloadsProps {
data: TorrentData[];
}

export type { TorrentData };

export const TorrentDownloads = (props: TorrentDownloadsProps) => {
const { data } = props;
return (
<>
{data.map(({ torrent }) => {
{data.map((item: TorrentData, index: number) => {
// Support both wrapped format (item.torrent) and direct format
const torrent: TorrentInfo = item.torrent || {
name: item.name || 'Unknown',
hash: item.infoHash || '',
added_on: 0,
progress: (item as any).progress || 0,
downloaded: 0,
uploaded: 0,
trackers_count: 0,
total_size: 0,
};
return (
<dl className="mt-5 dark:text-slate-200 text-slate-600">
<dt className="text-lg">{torrent.name}</dt>

@@ -10,7 +10,31 @@ import { isEmpty, isNil } from "lodash";
import ellipsize from "ellipsize";
import prettyBytes from "pretty-bytes";

export const TorrentSearchPanel = (props) => {
interface TorrentSearchPanelProps {
issueName: string;
comicObjectId: string;
}

interface SearchFormValues {
issueName: string;
}

interface TorrentResult {
fileName: string;
seeders: number;
leechers: number;
size: number;
files: number;
indexer: string;
downloadUrl: string;
}

interface TorrentDownloadPayload {
comicObjectId: string;
torrentToDownload: string;
}

export const TorrentSearchPanel = (props: TorrentSearchPanelProps) => {
const { issueName, comicObjectId } = props;
// Initialize searchTerm with issueName from props
const [searchTerm, setSearchTerm] = useState({ issueName });
@@ -40,19 +64,19 @@ export const TorrentSearchPanel = (props) => {
enabled: !isNil(searchTerm.issueName) && searchTerm.issueName.trim() !== "", // Make sure searchTerm is not empty
});
const mutation = useMutation({
mutationFn: async (newTorrent) =>
mutationFn: async (newTorrent: TorrentDownloadPayload) =>
axios.post(`${QBITTORRENT_SERVICE_BASE_URI}/addTorrent`, newTorrent),
onSuccess: async (data) => {
onSuccess: async () => {
// Torrent added successfully
},
});
const searchIndexer = (values) => {
const searchIndexer = (values: SearchFormValues) => {
setSearchTerm({ issueName: values.issueName }); // Update searchTerm based on the form submission
};
const downloadTorrent = (evt) => {
const newTorrent = {
const downloadTorrent = (downloadUrl: string) => {
const newTorrent: TorrentDownloadPayload = {
comicObjectId,
torrentToDownload: evt,
torrentToDownload: downloadUrl,
};
mutation.mutate(newTorrent);
};
@@ -125,7 +149,7 @@ export const TorrentSearchPanel = (props) => {
</tr>
</thead>
<tbody className="divide-y divide-slate-100 dark:divide-gray-500">
{data?.data.map((result, idx) => (
{data?.data.map((result: TorrentResult, idx: number) => (
<tr key={idx}>
<td className="px-3 py-3 text-gray-700 dark:text-slate-300 text-md">
<p>{ellipsize(result.fileName, 90)}</p>

@@ -1,5 +1,6 @@
import React, { lazy } from "react";
import { isNil, isEmpty } from "lodash";
import type { TabConfig, TabConfigParams } from "../../types";

const VolumeInformation = lazy(() => import("./Tabs/VolumeInformation").then(m => ({ default: m.VolumeInformation })));
const ArchiveOperations = lazy(() => import("./Tabs/ArchiveOperations").then(m => ({ default: m.ArchiveOperations })));
@@ -7,26 +8,6 @@ const AcquisitionPanel = lazy(() => import("./AcquisitionPanel"));
const TorrentSearchPanel = lazy(() => import("./TorrentSearchPanel"));
const DownloadsPanel = lazy(() => import("./DownloadsPanel"));

interface TabConfig {
id: number;
name: string;
icon: React.ReactElement;
content: React.ReactElement | null;
shouldShow: boolean;
}

interface TabConfigParams {
data: any;
hasAnyMetadata: boolean;
areRawFileDetailsAvailable: boolean;
airDCPPQuery: any;
comicObjectId: string;
userSettings: any;
issueName: string;
acquisition?: any;
onReconcileMetadata?: () => void;
}

export const createTabConfig = ({
data,
hasAnyMetadata,

@@ -56,7 +56,7 @@ export const Dashboard = (): ReactElement => {

return (
<>
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<div className="mx-auto max-w-7xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<PullList />
{recentComics.length > 0 && <RecentlyImported comics={recentComics} />}
{/* Wanted comics */}

@@ -1,16 +1,7 @@
import React, { ReactElement } from "react";
import Header from "../shared/Header";
import { GetLibraryStatisticsQuery, DirectorySize } from "../../graphql/generated";

type Stats = Omit<GetLibraryStatisticsQuery["getLibraryStatistics"], "comicDirectorySize"> & {
comicDirectorySize: DirectorySize;
comicsMissingFiles: number;
};

/** Props for {@link LibraryStatistics}. */
interface LibraryStatisticsProps {
stats: Stats | null | undefined;
}
import type { LibraryStatisticsProps } from "../../types";

/**
* Displays a snapshot of library metrics: total comic files, tagging coverage,

@@ -12,10 +12,7 @@ import { Form } from "react-final-form";
import DatePickerDialog from "../shared/DatePicker";
import { format } from "date-fns";
import { LocgMetadata, useGetWeeklyPullListQuery } from "../../graphql/generated";

interface PullListProps {
issues?: LocgMetadata[];
}
import type { PullListProps } from "../../types";

export const PullList = (): ReactElement => {
const queryClient = useQueryClient();
@@ -136,7 +133,7 @@ export const PullList = (): ReactElement => {
/>
</div>
</div>
<div className="w-lvw -mr-4 sm:-mr-6 lg:-mr-8">
<div className="mr-[calc(-1*(1rem+max(0px,(100vw-80rem)/2)))] sm:mr-[calc(-1*(1.5rem+max(0px,(100vw-80rem)/2)))] lg:mr-[calc(-1*(2rem+max(0px,(100vw-80rem)/2)))]">
{isSuccess && !isLoading && (
<div className="overflow-hidden" ref={emblaRef}>
<div className="flex">

@@ -1,9 +1,6 @@
import * as React from "react";
import type { ZeroStateProps } from "../../types";

interface ZeroStateProps {
header: string;
message: string;
}
const ZeroState: React.FunctionComponent<ZeroStateProps> = (props) => {
return (
<article className="">

@@ -1,62 +1,49 @@
import React, { ReactElement, useEffect, useState } from "react";
import { getTransfers } from "../../actions/airdcpp.actions";
import { isEmpty, isNil, isUndefined } from "lodash";
import { isEmpty, isNil } from "lodash";
import { determineCoverFile } from "../../shared/utils/metadata.utils";
import MetadataPanel from "../shared/MetadataPanel";
import type { DownloadsProps } from "../../types";
import { useStore } from "../../store";

interface IDownloadsProps {
data: any;
interface BundleData {
rawFileDetails?: Record<string, unknown>;
inferredMetadata?: Record<string, unknown>;
acquisition?: {
directconnect?: {
downloads?: Array<{
name: string;
size: number;
type: { str: string };
bundleId: string;
}>;
};
};
sourcedMetadata?: {
locg?: unknown;
comicvine?: unknown;
};
issueName?: string;
url?: string;
}

export const Downloads = (props: IDownloadsProps): ReactElement => {
// const airDCPPConfiguration = useContext(AirDCPPSocketContext);
const {
airDCPPState: { settings, socket },
} = airDCPPConfiguration;
// const dispatch = useDispatch();

// const airDCPPTransfers = useSelector(
// (state: RootState) => state.airdcpp.transfers,
// );
// const issueBundles = useSelector(
// (state: RootState) => state.airdcpp.issue_bundles,
// );
const [bundles, setBundles] = useState([]);
// Make the call to get all transfers from AirDC++
export const Downloads = (_props: DownloadsProps): ReactElement => {
// Using Zustand store for socket management
const getSocket = useStore((state) => state.getSocket);

const [bundles, setBundles] = useState<BundleData[]>([]);
const [isLoading, setIsLoading] = useState(true);

// Initialize socket connection and load data
useEffect(() => {
if (!isUndefined(socket) && !isEmpty(settings)) {
dispatch(
getTransfers(socket, {
username: `${settings.directConnect.client.host.username}`,
password: `${settings.directConnect.client.host.password}`,
}),
);
const socket = getSocket();
if (socket) {
// Socket is connected, we could fetch transfers here
// For now, just set loading to false since we don't have direct access to Redux state
setIsLoading(false);
}
}, [socket]);
}, [getSocket]);

useEffect(() => {
if (!isUndefined(issueBundles)) {
const foo = issueBundles.data.map((bundle) => {
const {
rawFileDetails,
inferredMetadata,
acquisition: {
directconnect: { downloads },
},
sourcedMetadata: { locg, comicvine },
} = bundle;
const { issueName, url } = determineCoverFile({
rawFileDetails,
comicvine,
locg,
});
return { ...bundle, issueName, url };
});
setBundles(foo);
}
}, [issueBundles]);

return !isNil(bundles) ? (
return !isNil(bundles) && bundles.length > 0 ? (
<div className="container mx-auto px-4 sm:px-6 lg:px-8">
<section className="section">
<h1 className="title">Downloads</h1>
@@ -87,16 +74,16 @@ export const Downloads = (props: IDownloadsProps): ReactElement => {
</tr>
</thead>
<tbody>
{bundle.acquisition.directconnect.downloads.map(
(bundle, idx) => {
{bundle.acquisition?.directconnect?.downloads?.map(
(download, idx: number) => {
return (
<tr key={idx}>
<td>{bundle.name}</td>
<td>{bundle.size}</td>
<td>{bundle.type.str}</td>
<td>{download.name}</td>
<td>{download.size}</td>
<td>{download.type.str}</td>
<td>
<span className="tag is-warning">
{bundle.bundleId}
{download.bundleId}
</span>
</td>
</tr>

@@ -1,40 +1,28 @@
import { debounce, isEmpty, map } from "lodash";
import React, { ReactElement, useCallback, useState } from "react";
import { useDispatch, useSelector } from "react-redux";
import axios from "axios";
import Card from "../shared/Carda";

import { searchIssue } from "../../actions/fileops.actions";
import MetadataPanel from "../shared/MetadataPanel";
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { GlobalSearchBarProps } from "../../types";

interface ISearchBarProps {
data: any;
}

export const SearchBar = (data: ISearchBarProps): ReactElement => {
const dispatch = useDispatch();
const searchResults = useSelector(
(state: RootState) => state.fileOps.librarySearchResultsFormatted,
);
export const SearchBar = (data: GlobalSearchBarProps): ReactElement => {
const [searchResults, setSearchResults] = useState<Record<string, unknown>[]>([]);

const performSearch = useCallback(
debounce((e) => {
dispatch(
searchIssue(
{
query: {
volumeName: e.target.value,
},
},
{
pagination: {
size: 25,
from: 0,
},
type: "volumeName",
trigger: "globalSearchBar",
},
),
);
debounce(async (e) => {
const response = await axios({
url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
method: "POST",
data: {
query: { volumeName: e.target.value },
pagination: { size: 25, from: 0 },
type: "volumeName",
trigger: "globalSearchBar",
},
});
setSearchResults(response.data?.hits ?? []);
}, 500),
[data],
);

@@ -490,4 +490,188 @@ describe('Import Component - Real-time Updates', () => {
});
});

describe('Import Component - Directory Status', () => {
beforeEach(() => {
jest.clearAllMocks();
(axios as any).mockResolvedValue({ data: [] });
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
// Mock successful directory status by default
(axios.get as jest.Mock) = jest.fn().mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: true } }
});
});

test('should show warning banner when comics directory is missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('comics')).toBeInTheDocument();
});

test('should show warning banner when userdata directory is missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: false } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('userdata')).toBeInTheDocument();
});

test('should show warning banner when both directories are missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: false } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('comics')).toBeInTheDocument();
expect(screen.getByText('userdata')).toBeInTheDocument();
});

test('should disable import button when directories are missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
const button = screen.getByRole('button', { name: /Force Re-Import/i });
expect(button).toBeDisabled();
});
});

test('should enable import button when all directories exist', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'drained',
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
const button = screen.getByRole('button', { name: /Force Re-Import/i });
|
||||
expect(button).not.toBeDisabled();
|
||||
});
|
||||
});
|
||||
|
||||
test('should not show warning banner when all directories exist', async () => {
|
||||
(axios.get as jest.Mock).mockResolvedValue({
|
||||
data: { comics: { exists: true }, userdata: { exists: true } }
|
||||
});
|
||||
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'drained',
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import />, { wrapper: createWrapper() });
|
||||
|
||||
// Wait for the component to finish loading
|
||||
await waitFor(() => {
|
||||
expect(screen.getByRole('button', { name: /Force Re-Import/i })).toBeInTheDocument();
|
||||
});
|
||||
|
||||
// The warning banner should not be present
|
||||
expect(screen.queryByText('Required Directories Missing')).not.toBeInTheDocument();
|
||||
});
|
||||
});
|
||||
|
||||
export {};
|
||||
|
||||
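The tests above all exercise one underlying decision: given the `/getDirectoryStatus` payload, which directory names should appear in the warning banner and should the Force Re-Import button be enabled? A minimal sketch of that gating logic as a pure function — the names `DirectoryPayload` and `deriveImportGate` are illustrative, not taken from the component:

```typescript
// Shape of the mocked /getDirectoryStatus response body used in the tests above.
type DirectoryPayload = {
  comics: { exists: boolean };
  userdata: { exists: boolean };
};

// Derive the UI state the tests assert on: which directory names show up in
// the "Required Directories Missing" banner, and whether importing is allowed.
function deriveImportGate(data: DirectoryPayload): {
  missing: string[];
  showBanner: boolean;
  buttonEnabled: boolean;
} {
  const missing = (Object.keys(data) as (keyof DirectoryPayload)[])
    .filter((dir) => !data[dir].exists)
    .map(String);
  return {
    missing,
    showBanner: missing.length > 0,
    buttonEnabled: missing.length === 0,
  };
}
```

With `{ comics: { exists: false }, userdata: { exists: true } }` this yields `missing: ["comics"]` and a disabled button, which is exactly the pairing the tests assert through the rendered DOM.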
@@ -1,14 +1,36 @@
/**
 * @fileoverview Import page component for managing comic library imports.
 * Provides UI for starting imports, monitoring progress, viewing history,
 * and handling directory configuration issues.
 * @module components/Import/Import
 */

import { ReactElement, useEffect, useRef, useState } from "react";
import { format } from "date-fns";
import { isEmpty } from "lodash";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import axios from "axios";
import { useGetJobResultStatisticsQuery } from "../../graphql/generated";
import { RealTimeImportStats } from "./RealTimeImportStats";
import { PastImportsTable } from "./PastImportsTable";
import { AlertBanner } from "../shared/AlertBanner";
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";
import { SETTINGS_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { DirectoryStatus, DirectoryIssue } from "./import.types";

/**
 * Import page component for managing comic library imports.
 *
 * Features:
 * - Real-time import progress tracking via WebSocket
 * - Directory status validation before import
 * - Force re-import functionality for fixing indexing issues
 * - Past import history table
 * - Session management for import tracking
 *
 * @returns {ReactElement} The import page UI
 */
export const Import = (): ReactElement => {
  const [importError, setImportError] = useState<string | null>(null);
  const queryClient = useQueryClient();
@@ -17,10 +39,36 @@ export const Import = (): ReactElement => {
      importJobQueue: state.importJobQueue,
      getSocket: state.getSocket,
      disconnectSocket: state.disconnectSocket,
    })),
    }))
  );

  // Force re-import mutation - re-imports all files regardless of import status
  // Check if required directories exist
  const {
    data: directoryStatus,
    isLoading: isCheckingDirectories,
    isError: isDirectoryCheckError,
    error: directoryError,
  } = useQuery({
    queryKey: ["directoryStatus"],
    queryFn: async (): Promise<DirectoryStatus> => {
      const response = await axios.get(
        `${SETTINGS_SERVICE_BASE_URI}/getDirectoryStatus`
      );
      return response.data;
    },
    refetchOnWindowFocus: false,
    staleTime: 30000,
    retry: false,
  });

  // Use isValid for quick check, issues array for detailed display
  const directoryCheckFailed = isDirectoryCheckError;
  const hasAllDirectories = directoryCheckFailed
    ? false
    : (directoryStatus?.isValid ?? true);
  const directoryIssues = directoryStatus?.issues ?? [];

  // Force re-import mutation
  const { mutate: forceReImport, isPending: isForceReImporting } = useMutation({
    mutationFn: async () => {
      const sessionId = localStorage.getItem("sessionId") || "";
@@ -37,7 +85,11 @@ export const Import = (): ReactElement => {
    },
    onError: (error: any) => {
      console.error("Failed to start force re-import:", error);
      setImportError(error?.response?.data?.message || error?.message || "Failed to start force re-import. Please try again.");
      setImportError(
        error?.response?.data?.message ||
          error?.message ||
          "Failed to start force re-import. Please try again."
      );
    },
  });

@@ -47,14 +99,11 @@ export const Import = (): ReactElement => {
  const hasActiveSession = importSession.isActive;
  const wasComplete = useRef(false);

  // React to importSession.isComplete rather than socket events — more reliable
  // since it's derived from the actual GraphQL state, not a raw socket event.
  // React to importSession.isComplete for state updates
  useEffect(() => {
    if (importSession.isComplete && !wasComplete.current) {
      wasComplete.current = true;
      // Small delay so the backend has time to commit job result stats
      setTimeout(() => {
        // Invalidate the cache to force a fresh fetch of job result statistics
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
        refetch();
      }, 1500);
@@ -64,21 +113,23 @@ export const Import = (): ReactElement => {
    }
  }, [importSession.isComplete, refetch, importJobQueue, queryClient]);

  // Listen to socket events to update Past Imports table in real-time
  // Listen to socket events to update Past Imports table
  useEffect(() => {
    const socket = getSocket("/");

    const handleImportCompleted = () => {
      console.log("[Import] IMPORT_SESSION_COMPLETED event - refreshing Past Imports");
      // Small delay to ensure backend has committed the job results
      console.log(
        "[Import] IMPORT_SESSION_COMPLETED event - refreshing Past Imports"
      );
      setTimeout(() => {
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
      }, 1500);
    };

    const handleQueueDrained = () => {
      console.log("[Import] LS_IMPORT_QUEUE_DRAINED event - refreshing Past Imports");
      // Small delay to ensure backend has committed the job results
      console.log(
        "[Import] LS_IMPORT_QUEUE_DRAINED event - refreshing Past Imports"
      );
      setTimeout(() => {
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
      }, 1500);
@@ -99,7 +150,22 @@ export const Import = (): ReactElement => {
  const handleForceReImport = async () => {
    setImportError(null);

    // Check for active session before starting using definitive status
    if (!hasAllDirectories) {
      if (directoryCheckFailed) {
        setImportError(
          "Cannot start import: Failed to verify directory status. Please check that the backend service is running."
        );
      } else {
        const issueDetails = directoryIssues
          .map((i) => `${i.directory}: ${i.issue}`)
          .join(", ");
        setImportError(
          `Cannot start import: ${issueDetails || "Required directories are missing"}. Please check your Docker volume configuration.`
        );
      }
      return;
    }

    if (hasActiveSession) {
      setImportError(
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`
@@ -107,10 +173,12 @@ export const Import = (): ReactElement => {
      return;
    }

    if (window.confirm(
      "This will re-import ALL files in your library folder, even those already imported. " +
      "This can help fix Elasticsearch indexing issues. Continue?"
    )) {
    if (
      window.confirm(
        "This will re-import ALL files in your library folder, even those already imported. " +
          "This can help fix Elasticsearch indexing issues. Continue?"
      )
    ) {
      if (importJobQueue.status === "drained") {
        localStorage.removeItem("sessionId");
        disconnectSocket("/");
@@ -126,6 +194,10 @@ export const Import = (): ReactElement => {
    }
  };

  const canStartImport =
    !hasActiveSession &&
    (importJobQueue.status === "drained" || importJobQueue.status === undefined);

  return (
    <div>
      <section>
@@ -136,7 +208,6 @@ export const Import = (): ReactElement => {
        <h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
          Import
        </h1>

        <p className="mt-1.5 text-sm text-gray-500 dark:text-white">
          Import comics into the ThreeTwo library.
        </p>
@@ -172,43 +243,72 @@ export const Import = (): ReactElement => {

        {/* Error Message */}
        {importError && (
          <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-red-500 bg-red-50 dark:bg-red-900/20 p-4">
            <div className="flex items-start gap-3">
              <span className="w-6 h-6 text-red-600 dark:text-red-400 mt-0.5">
                <i className="h-6 w-6 icon-[solar--danger-circle-bold]"></i>
              </span>
              <div className="flex-1">
                <p className="font-semibold text-red-800 dark:text-red-300">
                  Import Error
                </p>
                <p className="text-sm text-red-700 dark:text-red-400 mt-1">
                  {importError}
                </p>
              </div>
              <button
                onClick={() => setImportError(null)}
                className="text-red-600 dark:text-red-400 hover:text-red-800 dark:hover:text-red-200"
              >
                <span className="w-5 h-5">
                  <i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
                </span>
              </button>
            </div>
          <div className="my-6 max-w-screen-lg">
            <AlertBanner
              severity="error"
              title="Import Error"
              onClose={() => setImportError(null)}
            >
              {importError}
            </AlertBanner>
          </div>
        )}

        {/* Force Re-Import Button - always shown when no import is running */}
        {!hasActiveSession &&
          (importJobQueue.status === "drained" || importJobQueue.status === undefined) && (
        {/* Directory Check Error */}
        {!isCheckingDirectories && directoryCheckFailed && (
          <div className="my-6 max-w-screen-lg">
            <AlertBanner severity="error" title="Failed to Check Directory Status">
              <p>
                Unable to verify if required directories exist. Import
                functionality has been disabled.
              </p>
              <p className="mt-2">
                Error: {(directoryError as Error)?.message || "Unknown error"}
              </p>
            </AlertBanner>
          </div>
        )}

        {/* Directory Status Warning */}
        {!isCheckingDirectories &&
          !directoryCheckFailed &&
          directoryIssues.length > 0 && (
            <div className="my-6 max-w-screen-lg">
              <AlertBanner
                severity="warning"
                title="Directory Configuration Issues"
                iconClass="icon-[solar--folder-error-bold]"
              >
                <p>
                  The following issues were detected with your directory
                  configuration:
                </p>
                <DirectoryIssuesList issues={directoryIssues} />
                <p className="mt-2">
                  Please ensure these directories are mounted correctly in your
                  Docker configuration.
                </p>
              </AlertBanner>
            </div>
          )}

        {/* Force Re-Import Button */}
        {canStartImport && (
          <div className="my-6 max-w-screen-lg">
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-orange-400 dark:border-orange-200 bg-orange-200 px-5 py-3 text-gray-700 hover:bg-transparent hover:text-orange-600 focus:outline-none focus:ring active:text-orange-500 disabled:opacity-50 disabled:cursor-not-allowed"
              onClick={handleForceReImport}
              disabled={isForceReImporting || hasActiveSession}
              title="Re-import all files to fix Elasticsearch indexing issues"
              disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
              title={
                !hasAllDirectories
                  ? "Cannot import: Required directories are missing"
                  : "Re-import all files to fix Elasticsearch indexing issues"
              }
            >
              <span className="text-md font-medium">
                {isForceReImporting ? "Starting Re-Import..." : "Force Re-Import All Files"}
                {isForceReImporting
                  ? "Starting Re-Import..."
                  : "Force Re-Import All Files"}
              </span>
              <span className="w-6 h-6">
                <i className="h-6 w-6 icon-[solar--refresh-bold-duotone]"></i>
@@ -217,87 +317,9 @@ export const Import = (): ReactElement => {
          </div>
        )}

        {/* Import activity is now shown in the RealTimeImportStats component above */}

        {/* Past Imports Table */}
        {!isLoading && !isEmpty(data?.getJobResultStatistics) && (
          <div className="max-w-screen-lg">
            <span className="flex items-center mt-6">
              <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
                Past Imports
              </span>
              <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
            </span>

            <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
              <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
                <thead className="ltr:text-left rtl:text-right">
                  <tr>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      #
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Time Started
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Session Id
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Imported
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Failed
                    </th>
                  </tr>
                </thead>

                <tbody className="divide-y divide-gray-200">
                  {data?.getJobResultStatistics.map((jobResult: any, index: number) => {
                    return (
                      <tr key={index}>
                        <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300 font-medium">
                          {index + 1}
                        </td>
                        <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                          {jobResult.earliestTimestamp && !isNaN(parseInt(jobResult.earliestTimestamp))
                            ? format(
                                new Date(parseInt(jobResult.earliestTimestamp)),
                                "EEEE, hh:mma, do LLLL y",
                              )
                            : "N/A"}
                        </td>
                        <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                          <span className="tag is-warning">
                            {jobResult.sessionId}
                          </span>
                        </td>
                        <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                          <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                            <span className="h-5 w-6">
                              <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                            </span>
                            <p className="whitespace-nowrap text-sm">
                              {jobResult.completedJobs}
                            </p>
                          </span>
                        </td>
                        <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                          <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                            <span className="h-5 w-6">
                              <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                            </span>

                            <p className="whitespace-nowrap text-sm">
                              {jobResult.failedJobs}
                            </p>
                          </span>
                        </td>
                      </tr>
                    );
                  })}
                </tbody>
              </table>
            </div>
          </div>
          <PastImportsTable data={data!.getJobResultStatistics as any} />
        )}
      </div>
    </section>
@@ -305,4 +327,20 @@ export const Import = (): ReactElement => {
  );
};

/**
 * Helper component to render directory issues list.
 */
const DirectoryIssuesList = ({ issues }: { issues: DirectoryIssue[] }): ReactElement => (
  <ul className="list-disc list-inside mt-2">
    {issues.map((item) => (
      <li key={item.directory}>
        <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">
          {item.directory}
        </code>
        <span className="ml-1">— {item.issue}</span>
      </li>
    ))}
  </ul>
);

export default Import;
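The `canStartImport` guard introduced in the diff above replaces two inline conditions with one named predicate. The same check, extracted here purely for illustration (the component computes it inline; the extra members of the `QueueStatus` union beyond `"drained"` are an assumption):

```typescript
// Possible job-queue states; only "drained" and undefined are confirmed by the
// diff above, the rest of the union is assumed for the sketch.
type QueueStatus = "drained" | "processing" | undefined;

// Mirrors the component's guard: importing is allowed only when no import
// session is active AND the queue is idle ("drained") or has never run
// (undefined, i.e. no queue state yet).
function canStartImport(hasActiveSession: boolean, status: QueueStatus): boolean {
  return !hasActiveSession && (status === "drained" || status === undefined);
}
```

Note that directory validation is enforced separately: the button's `disabled` prop also checks `!hasAllDirectories`, so a missing directory disables the button even while `canStartImport` keeps it rendered.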
103 src/client/components/Import/PastImportsTable.tsx Normal file
@@ -0,0 +1,103 @@
/**
 * @fileoverview Table component displaying historical import sessions.
 * @module components/Import/PastImportsTable
 */

import { ReactElement } from "react";
import { format } from "date-fns";
import type { JobResultStatistics } from "./import.types";

/**
 * Props for the PastImportsTable component.
 */
export type PastImportsTableProps = {
  /** Array of job result statistics from past imports */
  data: JobResultStatistics[];
};

/**
 * Displays a table of past import sessions with their statistics.
 *
 * @param props - Component props
 * @returns Table element showing import history
 */
export const PastImportsTable = ({ data }: PastImportsTableProps): ReactElement => {
  return (
    <div className="max-w-screen-lg">
      <span className="flex items-center mt-6">
        <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
          Past Imports
        </span>
        <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
      </span>

      <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
        <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
          <thead className="ltr:text-left rtl:text-right">
            <tr>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                #
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Time Started
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Session Id
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Imported
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Failed
              </th>
            </tr>
          </thead>

          <tbody className="divide-y divide-gray-200">
            {data.map((jobResult, index) => (
              <tr key={jobResult.sessionId || index}>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300 font-medium">
                  {index + 1}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  {jobResult.earliestTimestamp &&
                  !isNaN(parseInt(jobResult.earliestTimestamp))
                    ? format(
                        new Date(parseInt(jobResult.earliestTimestamp)),
                        "EEEE, hh:mma, do LLLL y"
                      )
                    : "N/A"}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  <span className="tag is-warning">{jobResult.sessionId}</span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.completedJobs}
                    </p>
                  </span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.failedJobs}
                    </p>
                  </span>
                </td>
              </tr>
            ))}
          </tbody>
        </table>
      </div>
    </div>
  );
};

export default PastImportsTable;
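The table's "Time Started" cell guards against a missing or non-numeric `earliestTimestamp` (stored as a string of epoch milliseconds) before handing it to date-fns `format`. The same guard as a small standalone helper — `formatStartTime` is a hypothetical name, and the formatter is injected so the sketch does not depend on date-fns:

```typescript
// Returns a display string for an epoch-millisecond timestamp stored as a
// string, falling back to "N/A" when the value is absent or not numeric —
// the same condition the table cell checks before calling format().
function formatStartTime(
  earliestTimestamp: string | null | undefined,
  fmt: (d: Date) => string,
): string {
  if (!earliestTimestamp || isNaN(parseInt(earliestTimestamp))) return "N/A";
  return fmt(new Date(parseInt(earliestTimestamp)));
}
```

Keeping the guard on `parseInt` matters because `new Date(NaN)` produces an Invalid Date, which would make `format` throw instead of rendering "N/A".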
@@ -1,6 +1,12 @@
|
||||
import { ReactElement, useEffect, useState } from "react";
|
||||
/**
|
||||
* @fileoverview Real-time import statistics component with live progress tracking.
|
||||
* Displays import statistics, progress bars, and file detection notifications
|
||||
* using WebSocket events for real-time updates.
|
||||
* @module components/Import/RealTimeImportStats
|
||||
*/
|
||||
|
||||
import { ReactElement, useState } from "react";
|
||||
import { Link } from "react-router-dom";
|
||||
import { useQueryClient } from "@tanstack/react-query";
|
||||
import {
|
||||
useGetImportStatisticsQuery,
|
||||
useGetWantedComicsQuery,
|
||||
@@ -9,22 +15,30 @@ import {
|
||||
import { useStore } from "../../store";
|
||||
import { useShallow } from "zustand/react/shallow";
|
||||
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";
|
||||
import { useImportSocketEvents } from "../../hooks/useImportSocketEvents";
|
||||
import { getComicDisplayLabel } from "../../shared/utils/formatting.utils";
|
||||
import { AlertCard } from "../shared/AlertCard";
|
||||
import { StatsCard } from "../shared/StatsCard";
|
||||
import { ProgressBar } from "../shared/ProgressBar";
|
||||
|
||||
/**
|
||||
* Import statistics with card-based layout and progress bar.
|
||||
* Three states: pre-import (idle), importing (active), and post-import (complete).
|
||||
* Also surfaces missing files detected by the file watcher.
|
||||
* Real-time import statistics component with card-based layout and progress tracking.
|
||||
*
|
||||
* This component manages three distinct states:
|
||||
* - **Pre-import (idle)**: Shows current file counts and "Start Import" button when new files exist
|
||||
* - **Importing (active)**: Displays real-time progress bar with completed/total counts
|
||||
* - **Post-import (complete)**: Shows final statistics including failed imports
|
||||
*
|
||||
* Additionally, it surfaces missing files detected by the file watcher, allowing users
|
||||
* to see which previously-imported files are no longer found on disk.
|
||||
*
|
||||
* @returns {ReactElement} The rendered import statistics component
|
||||
*/
|
||||
export const RealTimeImportStats = (): ReactElement => {
|
||||
const [importError, setImportError] = useState<string | null>(null);
|
||||
const [detectedFile, setDetectedFile] = useState<string | null>(null);
|
||||
const [socketImport, setSocketImport] = useState<{
|
||||
active: boolean;
|
||||
completed: number;
|
||||
total: number;
|
||||
failed: number;
|
||||
} | null>(null);
|
||||
const queryClient = useQueryClient();
|
||||
|
||||
const { socketImport, detectedFile } = useImportSocketEvents();
|
||||
const importSession = useImportSessionStatus();
|
||||
|
||||
const { getSocket, disconnectSocket, importJobQueue } = useStore(
|
||||
useShallow((state) => ({
|
||||
@@ -34,14 +48,14 @@ export const RealTimeImportStats = (): ReactElement => {
|
||||
})),
|
||||
);
|
||||
|
||||
const { data: importStats, isLoading } = useGetImportStatisticsQuery(
|
||||
const { data: importStats, isLoading, isError: isStatsError, error: statsError } = useGetImportStatisticsQuery(
|
||||
{},
|
||||
{ refetchOnWindowFocus: false, refetchInterval: false },
|
||||
);
|
||||
|
||||
const stats = importStats?.getImportStatistics?.stats;
|
||||
const missingCount = stats?.missingFiles ?? 0;
|
||||
|
||||
// File list for the detail panel — only fetched when there are missing files
|
||||
const { data: missingComicsData } = useGetWantedComicsQuery(
|
||||
{
|
||||
paginationOptions: { limit: 3, page: 1 },
|
||||
@@ -50,26 +64,12 @@ export const RealTimeImportStats = (): ReactElement => {
|
||||
{
|
||||
refetchOnWindowFocus: false,
|
||||
refetchInterval: false,
|
||||
enabled: (stats?.missingFiles ?? 0) > 0,
|
||||
enabled: missingCount > 0,
|
||||
},
|
||||
);
|
||||
|
||||
const missingDocs = missingComicsData?.getComicBooks?.docs ?? [];
|
||||
|
||||
const getMissingComicLabel = (comic: any): string => {
|
||||
const series =
|
||||
comic.canonicalMetadata?.series?.value ??
|
||||
comic.inferredMetadata?.issue?.name;
|
||||
const issueNum =
|
||||
comic.canonicalMetadata?.issueNumber?.value ??
|
||||
comic.inferredMetadata?.issue?.number;
|
||||
if (series && issueNum) return `${series} #${issueNum}`;
|
||||
if (series) return series;
|
||||
return comic.rawFileDetails?.name ?? comic.id;
|
||||
};
|
||||
|
||||
const importSession = useImportSessionStatus();
|
||||
|
||||
const { mutate: startIncrementalImport, isPending: isStartingImport } =
|
||||
useStartIncrementalImportMutation({
|
||||
onSuccess: (data) => {
|
||||
@@ -79,83 +79,10 @@ export const RealTimeImportStats = (): ReactElement => {
|
||||
}
|
||||
},
|
||||
onError: (error: any) => {
|
||||
setImportError(
|
||||
error?.message || "Failed to start import. Please try again.",
|
||||
);
|
||||
setImportError(error?.message || "Failed to start import. Please try again.");
|
||||
},
|
||||
});
|
||||
|
||||
const hasNewFiles = stats && stats.newFiles > 0;
|
||||
const missingCount = stats?.missingFiles ?? 0;
|
||||
|
||||
// LS_LIBRARY_STATISTICS fires after every filesystem change and every import job completion.
|
||||
// Invalidating GetImportStatistics covers: total files, imported, new files, and missing count.
|
||||
// Invalidating GetWantedComics refreshes the missing file name list in the detail panel.
|
||||
useEffect(() => {
|
||||
const socket = getSocket("/");
|
||||
|
||||
const handleStatsChange = () => {
|
||||
queryClient.invalidateQueries({ queryKey: ["GetImportStatistics"] });
|
||||
queryClient.invalidateQueries({ queryKey: ["GetWantedComics"] });
|
||||
};
|
||||
|
||||
const handleFileDetected = (payload: { filePath: string }) => {
|
||||
handleStatsChange();
|
||||
const name = payload.filePath.split("/").pop() ?? payload.filePath;
|
||||
setDetectedFile(name);
|
||||
setTimeout(() => setDetectedFile(null), 5000);
|
||||
};
|
||||
|
||||
const handleImportStarted = () => {
|
||||
setSocketImport({ active: true, completed: 0, total: 0, failed: 0 });
|
||||
};
|
||||
|
||||
const handleCoverExtracted = (payload: {
|
||||
completedJobCount: number;
|
||||
totalJobCount: number;
|
||||
importResult: unknown;
|
||||
}) => {
setSocketImport((prev) => ({
active: true,
completed: payload.completedJobCount,
total: payload.totalJobCount,
failed: prev?.failed ?? 0,
}));
};

const handleCoverExtractionFailed = (payload: {
failedJobCount: number;
importResult: unknown;
}) => {
setSocketImport((prev) =>
prev ? { ...prev, failed: payload.failedJobCount } : null,
);
};

const handleQueueDrained = () => {
setSocketImport((prev) => (prev ? { ...prev, active: false } : null));
handleStatsChange();
};

socket.on("LS_LIBRARY_STATS", handleStatsChange);
socket.on("LS_FILES_MISSING", handleStatsChange);
socket.on("LS_FILE_DETECTED", handleFileDetected);
socket.on("LS_INCREMENTAL_IMPORT_STARTED", handleImportStarted);
socket.on("LS_COVER_EXTRACTED", handleCoverExtracted);
socket.on("LS_COVER_EXTRACTION_FAILED", handleCoverExtractionFailed);
socket.on("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);

return () => {
socket.off("LS_LIBRARY_STATS", handleStatsChange);
socket.off("LS_FILES_MISSING", handleStatsChange);
socket.off("LS_FILE_DETECTED", handleFileDetected);
socket.off("LS_INCREMENTAL_IMPORT_STARTED", handleImportStarted);
socket.off("LS_COVER_EXTRACTED", handleCoverExtracted);
socket.off("LS_COVER_EXTRACTION_FAILED", handleCoverExtractionFailed);
socket.off("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);
};
}, [getSocket, queryClient]);

const handleStartImport = async () => {
setImportError(null);

@@ -182,59 +109,43 @@ export const RealTimeImportStats = (): ReactElement => {
}
};

if (isLoading || !stats) {
if (isLoading) {
return <div className="text-gray-500 dark:text-gray-400">Loading...</div>;
}

if (isStatsError || !stats) {
return (
<AlertCard variant="error" title="Failed to Load Import Statistics">
<p>Unable to retrieve import statistics from the server. Please check that the backend service is running.</p>
{isStatsError && (
<p className="mt-2">Error: {statsError instanceof Error ? statsError.message : "Unknown error"}</p>
)}
</AlertCard>
);
}

const hasNewFiles = stats.newFiles > 0;
const isFirstImport = stats.alreadyImported === 0;
const buttonText = isFirstImport
? `Start Import (${stats.newFiles} files)`
: `Start Incremental Import (${stats.newFiles} new files)`;

// Determine what to show in each card based on current phase
const sessionStats = importSession.stats;
const hasSessionStats = importSession.isActive && sessionStats !== null;

const totalFiles = stats.totalLocalFiles;
const importedCount = stats.alreadyImported;
const failedCount = hasSessionStats ? sessionStats!.filesFailed : 0;

const showProgressBar = socketImport !== null;
const socketProgressPct =
socketImport && socketImport.total > 0
? Math.round((socketImport.completed / socketImport.total) * 100)
: 0;
const showFailedCard = hasSessionStats && failedCount > 0;
const showMissingCard = missingCount > 0;

return (
<div className="space-y-6">
{/* Error Message */}
{importError && (
<div className="rounded-lg border-l-4 border-red-500 bg-red-50 dark:bg-red-900/20 p-4">
<div className="flex items-start gap-3">
<span className="w-6 h-6 text-red-600 dark:text-red-400 mt-0.5">
<i className="h-6 w-6 icon-[solar--danger-circle-bold]"></i>
</span>
<div className="flex-1">
<p className="font-semibold text-red-800 dark:text-red-300">
Import Error
</p>
<p className="text-sm text-red-700 dark:text-red-400 mt-1">
{importError}
</p>
</div>
<button
onClick={() => setImportError(null)}
className="text-red-600 dark:text-red-400 hover:text-red-800 dark:hover:text-red-200"
>
<i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
</button>
</div>
</div>
<AlertCard variant="error" title="Import Error" onDismiss={() => setImportError(null)}>
{importError}
</AlertCard>
)}

{/* File detected toast */}
{detectedFile && (
<div className="rounded-lg border-l-4 border-blue-500 bg-blue-50 dark:bg-blue-900/20 p-3 flex items-center gap-3">
<i className="h-5 w-5 text-blue-600 dark:text-blue-400 icon-[solar--document-add-bold-duotone] shrink-0"></i>
@@ -244,7 +155,6 @@ export const RealTimeImportStats = (): ReactElement => {
</div>
)}

{/* Start Import button — only when idle with new files */}
{hasNewFiles && !importSession.isActive && (
<button
onClick={handleStartImport}
@@ -256,121 +166,74 @@ export const RealTimeImportStats = (): ReactElement => {
</button>
)}

{/* Progress bar — shown while importing and once complete */}
{showProgressBar && (
<div className="space-y-1.5">
<div className="flex items-center justify-between text-sm">
<span className="font-medium text-gray-700 dark:text-gray-300">
{socketImport!.active
? `Importing ${socketImport!.completed} / ${socketImport!.total}`
: `${socketImport!.completed} / ${socketImport!.total} imported`}
</span>
<span className="font-semibold text-gray-900 dark:text-white">
{socketProgressPct}% complete
</span>
</div>
<div className="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-3 overflow-hidden">
<div
className="bg-blue-600 dark:bg-blue-500 h-3 rounded-full transition-all duration-300 relative"
style={{ width: `${socketProgressPct}%` }}
>
{socketImport!.active && (
<div className="absolute inset-0 bg-linear-to-r from-transparent via-white/20 to-transparent animate-shimmer" />
)}
</div>
</div>
</div>
<ProgressBar
current={socketImport!.completed}
total={socketImport!.total}
isActive={socketImport!.active}
activeLabel={`Importing ${socketImport!.completed} / ${socketImport!.total}`}
completeLabel={`${socketImport!.completed} / ${socketImport!.total} imported`}
/>
)}

{/* Stats cards */}
<div className="grid grid-cols-2 sm:grid-cols-4 gap-4">
{/* Total files */}
<div
className="rounded-lg p-6 text-center"
style={{ backgroundColor: "#6b7280" }}
>
<div className="text-4xl font-bold text-white mb-2">{totalFiles}</div>
<div className="text-sm text-gray-200 font-medium">in import folder</div>
</div>

{/* Imported */}
<div
className="rounded-lg p-6 text-center"
style={{ backgroundColor: "#d8dab2" }}
>
<div className="text-4xl font-bold text-gray-800 mb-2">
{importedCount}
</div>
<div className="text-sm text-gray-700 font-medium">
{importSession.isActive ? "imported so far" : "imported in database"}
</div>
</div>

{/* Failed — only shown after a session with failures */}
<StatsCard
value={stats.totalLocalFiles}
label="in import folder"
backgroundColor="#6b7280"
/>
<StatsCard
value={stats.alreadyImported}
label={importSession.isActive ? "imported so far" : "imported in database"}
backgroundColor="#d8dab2"
valueColor="text-gray-800"
labelColor="text-gray-700"
/>
{showFailedCard && (
<div className="rounded-lg p-6 text-center bg-red-500">
<div className="text-4xl font-bold text-white mb-2">
{failedCount}
</div>
<div className="text-sm text-red-100 font-medium">failed</div>
</div>
<StatsCard
value={failedCount}
label="failed"
backgroundColor="bg-red-500"
labelColor="text-red-100"
/>
)}

{/* Missing files — shown when watcher detects moved/deleted files */}
{showMissingCard && (
<div className="rounded-lg p-6 text-center bg-card-missing">
<div className="text-4xl font-bold text-slate-700 mb-2">
{missingCount}
</div>
<div className="text-sm text-slate-800 font-medium">missing</div>
</div>
<StatsCard
value={missingCount}
label="missing"
backgroundColor="bg-card-missing"
valueColor="text-slate-700"
labelColor="text-slate-800"
/>
)}
</div>

{/* Missing files detail panel */}
{showMissingCard && (
<div className="rounded-lg border border-amber-300 bg-amber-50 dark:bg-amber-900/20 p-4">
<div className="flex items-start gap-3">
<i className="h-6 w-6 text-amber-600 dark:text-amber-400 mt-0.5 icon-[solar--danger-triangle-bold] shrink-0"></i>
<div className="flex-1 min-w-0">
<p className="font-semibold text-amber-800 dark:text-amber-300">
{missingCount} {missingCount === 1 ? "file" : "files"} missing
</p>
<p className="text-sm text-amber-700 dark:text-amber-400 mt-1">
These files were previously imported but can no longer be found
on disk. Move them back to restore access.
</p>
{missingDocs.length > 0 && (
<ul className="mt-2 space-y-1">
{missingDocs.map((comic, i) => (
<li
key={i}
className="text-xs text-amber-700 dark:text-amber-400 truncate"
>
{getMissingComicLabel(comic)} is missing
</li>
))}
{missingCount > 3 && (
<li className="text-xs text-amber-600 dark:text-amber-500">
and {missingCount - 3} more.
</li>
)}
</ul>
<AlertCard variant="warning" title={`${missingCount} ${missingCount === 1 ? "file" : "files"} missing`}>
<p>These files were previously imported but can no longer be found on disk. Move them back to restore access.</p>
{missingDocs.length > 0 && (
<ul className="mt-2 space-y-1">
{missingDocs.map((comic, i) => (
<li key={i} className="text-xs truncate">
{getComicDisplayLabel(comic)} is missing
</li>
))}
{missingCount > 3 && (
<li className="text-xs text-amber-600 dark:text-amber-500">
and {missingCount - 3} more.
</li>
)}
<Link
to="/library?filter=missingFiles"
className="inline-flex items-center gap-1.5 mt-3 text-xs font-medium text-amber-800 dark:text-amber-300 underline underline-offset-2 hover:text-amber-600"
>

<span className="underline">
<i className="icon-[solar--file-corrupted-outline] w-4 h-4 px-3" />
View Missing Files In Library
<i className="icon-[solar--arrow-right-up-outline] w-3 h-3" />
</span>
</Link>
</div>
</div>
</div>
</ul>
)}
<Link
to="/library?filter=missingFiles"
className="inline-flex items-center gap-1.5 mt-3 text-xs font-medium underline underline-offset-2 hover:opacity-70"
>
<i className="icon-[solar--file-corrupted-outline] w-4 h-4" />
View Missing Files In Library
<i className="icon-[solar--arrow-right-up-outline] w-3 h-3" />
</Link>
</AlertCard>
)}
</div>
);

43
src/client/components/Import/import.types.ts
Normal file
@@ -0,0 +1,43 @@
/**
* @fileoverview Type definitions for the Import module.
* @module components/Import/import.types
*/

/**
* Represents an issue with a configured directory.
*/
export type DirectoryIssue = {
/** Path to the directory with issues */
directory: string;
/** Description of the issue */
issue: string;
};

/**
* Result of directory status check from the backend.
*/
export type DirectoryStatus = {
/** Whether all required directories are accessible */
isValid: boolean;
/** List of specific issues found */
issues: DirectoryIssue[];
};

/**
* Statistics for a completed import job session.
*/
export type JobResultStatistics = {
/** Unique session identifier */
sessionId: string;
/** Timestamp of the earliest job in the session (as string for GraphQL compatibility) */
earliestTimestamp: string;
/** Number of successfully completed jobs */
completedJobs: number;
/** Number of failed jobs */
failedJobs: number;
};

/**
* Status of the import job queue.
*/
export type ImportQueueStatus = "running" | "drained" | undefined;
@@ -14,14 +14,7 @@ import axios from "axios";
import { format, parseISO } from "date-fns";
import { useGetWantedComicsQuery } from "../../graphql/generated";

type FilterOption = "all" | "missingFiles";

interface SearchQuery {
query: Record<string, any>;
pagination: { size: number; from: number };
type: string;
trigger: string;
}
import type { LibrarySearchQuery, FilterOption } from "../../types";

const FILTER_OPTIONS: { value: FilterOption; label: string }[] = [
{ value: "all", label: "All Comics" },
@@ -37,7 +30,7 @@ export const Library = (): ReactElement => {
const initialFilter = (searchParams.get("filter") as FilterOption) ?? "all";

const [activeFilter, setActiveFilter] = useState<FilterOption>(initialFilter);
const [searchQuery, setSearchQuery] = useState<SearchQuery>({
const [searchQuery, setSearchQuery] = useState<LibrarySearchQuery>({
query: {},
pagination: { size: 25, from: 0 },
type: "all",
@@ -47,7 +40,7 @@ export const Library = (): ReactElement => {
const queryClient = useQueryClient();

/** Fetches a page of issues from the search API. */
const fetchIssues = async (q: SearchQuery) => {
const fetchIssues = async (q: LibrarySearchQuery) => {
const { pagination, query, type } = q;
return await axios({
method: "POST",

@@ -8,23 +8,52 @@ import {
import { useTable, usePagination } from "react-table";
import prettyBytes from "pretty-bytes";
import ellipsize from "ellipsize";
import { useDispatch, useSelector } from "react-redux";
import { getComicBooks } from "../../actions/fileops.actions";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { isNil, isEmpty, isUndefined } from "lodash";
import Masonry from "react-masonry-css";
import Card from "../shared/Carda";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import { Link } from "react-router-dom";
import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";
import { LIBRARY_SERVICE_HOST, LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { LibraryGridProps } from "../../types";

interface ILibraryGridProps {}
export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
const data = useSelector(
(state: RootState) => state.fileOps.recentComics.docs,
);
const pageTotal = useSelector(
(state: RootState) => state.fileOps.recentComics.totalDocs,
);
interface ComicDoc {
_id: string;
rawFileDetails?: {
cover?: {
filePath: string;
};
name?: string;
};
sourcedMetadata?: {
comicvine?: {
image?: {
small_url?: string;
};
name?: string;
volumeInformation?: {
description?: string;
};
};
};
}

export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
const { data: comicsData } = useQuery({
queryKey: ["recentComics"],
queryFn: async () =>
axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
method: "POST",
data: {
paginationOptions: { size: 25, from: 0 },
predicate: {},
},
}),
});
const data: ComicDoc[] = comicsData?.data?.docs ?? [];
const pageTotal: number = comicsData?.data?.totalDocs ?? 0;
const breakpointColumnsObj = {
default: 5,
1100: 4,
@@ -42,20 +71,20 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
className="my-masonry-grid"
columnClassName="my-masonry-grid_column"
>
{data.map(({ _id, rawFileDetails, sourcedMetadata }) => {
{data.map(({ _id, rawFileDetails, sourcedMetadata }: ComicDoc) => {
let imagePath = "";
let comicName = "";
if (!isEmpty(rawFileDetails.cover)) {
if (rawFileDetails && !isEmpty(rawFileDetails.cover)) {
const encodedFilePath = encodeURI(
`${LIBRARY_SERVICE_HOST}/${removeLeadingPeriod(
rawFileDetails.cover.filePath,
rawFileDetails.cover?.filePath || '',
)}`,
);
imagePath = escapePoundSymbol(encodedFilePath);
comicName = rawFileDetails.name;
} else if (!isNil(sourcedMetadata)) {
comicName = rawFileDetails.name || '';
} else if (!isNil(sourcedMetadata) && sourcedMetadata.comicvine?.image?.small_url) {
imagePath = sourcedMetadata.comicvine.image.small_url;
comicName = sourcedMetadata.comicvine.name;
comicName = sourcedMetadata.comicvine?.name || '';
}
const titleElement = (
<Link to={"/comic/details/" + _id}>
@@ -71,7 +100,7 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
title={comicName ? titleElement : null}
>
<div className="content is-flex is-flex-direction-row">
{!isEmpty(sourcedMetadata.comicvine) && (
{sourcedMetadata && !isEmpty(sourcedMetadata.comicvine) && (
<span className="icon cv-icon is-small inline-block w-6 h-6 md:w-7 md:h-7 flex-shrink-0">
<img
src="/src/client/assets/img/cvlogo.svg"
@@ -85,7 +114,7 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
<i className="fas fa-adjust" />
</span>
)}
{!isUndefined(sourcedMetadata.comicvine.volumeInformation) &&
{sourcedMetadata?.comicvine?.volumeInformation?.description &&
!isEmpty(
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation.description,
@@ -94,8 +123,7 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
<span className="tag is-warning ml-1">
{
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation
.description,
sourcedMetadata.comicvine.volumeInformation.description || '',
).displayName
}
</span>

@@ -3,7 +3,11 @@ import PropTypes from "prop-types";
import { Form, Field } from "react-final-form";
import { Link } from "react-router-dom";

export const SearchBar = (props): ReactElement => {
interface SearchBarProps {
searchHandler: (values: Record<string, unknown>) => void;
}

export const SearchBar = (props: SearchBarProps): ReactElement => {
const { searchHandler } = props;
return (
<Form

@@ -3,10 +3,7 @@ import PullList from "../PullList/PullList";
import { Volumes } from "../Volumes/Volumes";
import WantedComics from "../WantedComics/WantedComics";
import { Library } from "./Library";

interface ITabulatedContentContainerProps {
category: string;
}
import type { TabulatedContentContainerProps } from "../../types";
/**
* Component to draw the contents of a category in a table.
*
@@ -18,7 +15,7 @@ interface ITabulatedContentContainerProps {
*/

const TabulatedContentContainer = (
props: ITabulatedContentContainerProps,
props: TabulatedContentContainerProps,
): ReactElement => {
const { category } = props;
const renderTabulatedContent = () => {

@@ -1,16 +1,27 @@
import React, { ReactElement, useEffect, useMemo } from "react";
import React, { ReactElement, useEffect, useMemo, useState } from "react";
import T2Table from "../shared/T2Table";
import { getWeeklyPullList } from "../../actions/comicinfo.actions";
import Card from "../shared/Carda";
import ellipsize from "ellipsize";
import { isNil } from "lodash";
import type { CellContext } from "@tanstack/react-table";

interface PullListComic {
issue: {
cover: string;
name: string;
publisher: string;
description: string;
price: string;
pulls: number;
};
}

export const PullList = (): ReactElement => {
// const pullListComics = useSelector(
// (state: RootState) => state.comicInfo.pullList,
// );
// Placeholder for pull list comics - would come from API/store
const [pullListComics, setPullListComics] = useState<PullListComic[] | null>(null);

useEffect(() => {
// TODO: Implement pull list fetching
// dispatch(
// getWeeklyPullList({
// startDate: "2023-7-28",
@@ -31,7 +42,7 @@ export const PullList = (): ReactElement => {
id: "comicDetails",
minWidth: 450,
accessorKey: "issue",
cell: (row) => {
cell: (row: CellContext<PullListComic, PullListComic["issue"]>) => {
const item = row.getValue();
return (
<div className="columns">

@@ -1,6 +1,5 @@
import React, { ReactElement, useState } from "react";
import { isNil, isEmpty, isUndefined } from "lodash";
import { IExtractedComicBookCoverFile, RootState } from "threetwo-ui-typings";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import { Form, Field } from "react-final-form";
import Card from "../shared/Carda";
@@ -16,18 +15,35 @@ import {
LIBRARY_SERVICE_BASE_URI,
} from "../../constants/endpoints";
import axios from "axios";
import type { SearchPageProps, ComicVineSearchResult } from "../../types";

interface ISearchProps {}
interface ComicData {
id: number;
api_detail_url: string;
image: { small_url: string; thumb_url?: string };
cover_date?: string;
issue_number?: string;
name?: string;
description?: string;
volume?: { name: string; api_detail_url: string };
start_year?: string;
count_of_issues?: number;
publisher?: { name: string };
resource_type?: string;
}

export const Search = ({}: ISearchProps): ReactElement => {
export const Search = ({}: SearchPageProps): ReactElement => {
const queryClient = useQueryClient();
const formData = {
search: "",
};
const [comicVineMetadata, setComicVineMetadata] = useState({});
const [comicVineMetadata, setComicVineMetadata] = useState<{
sourceName?: string;
comicData?: ComicData;
}>({});
const [selectedResource, setSelectedResource] = useState("volume");
const { t } = useTranslation();
const handleResourceChange = (value) => {
const handleResourceChange = (value: string) => {
setSelectedResource(value);
};

@@ -63,6 +79,11 @@ export const Search = ({}: ISearchProps): ReactElement => {
comicObject,
markEntireVolumeWanted,
resourceType,
}: {
source: string;
comicObject: any;
markEntireVolumeWanted: boolean;
resourceType: string;
}) => {
let volumeInformation = {};
let issues = [];
@@ -143,14 +164,14 @@ export const Search = ({}: ISearchProps): ReactElement => {
},
});

const addToLibrary = (sourceName: string, comicData) =>
const addToLibrary = (sourceName: string, comicData: ComicData) =>
setComicVineMetadata({ sourceName, comicData });

const createDescriptionMarkup = (html) => {
const createDescriptionMarkup = (html: string) => {
return { __html: html };
};

const onSubmit = async (values) => {
const onSubmit = async (values: { search: string }) => {
const formData = { ...values, resource: selectedResource };
try {
mutate(formData);
@@ -270,7 +291,7 @@ export const Search = ({}: ISearchProps): ReactElement => {
)}
{!isEmpty(comicVineSearchResults?.data?.results) ? (
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
{comicVineSearchResults.data.results.map((result) => {
{comicVineSearchResults?.data?.results?.map((result: ComicData) => {
return result.resource_type === "issue" ? (
<div
key={result.id}
@@ -287,8 +308,8 @@ export const Search = ({}: ISearchProps): ReactElement => {
</div>
<div className="w-3/4">
<div className="text-xl">
{!isEmpty(result.volume.name) ? (
result.volume.name
{!isEmpty(result.volume?.name) ? (
result.volume?.name
) : (
<span className="is-size-3">No Name</span>
)}
@@ -306,18 +327,18 @@ export const Search = ({}: ISearchProps): ReactElement => {
{result.api_detail_url}
</a>
<p className="text-sm">
{ellipsize(
{result.description ? ellipsize(
convert(result.description, {
baseElements: {
selectors: ["p", "div"],
},
}),
320,
)}
) : ''}
</p>
<div className="mt-2">
<PopoverButton
content={`This will add ${result.volume.name} to your wanted list.`}
content={`This will add ${result.volume?.name || 'this issue'} to your wanted list.`}
clickHandler={() =>
addToWantedList({
source: "comicvine",
@@ -408,14 +429,14 @@ export const Search = ({}: ISearchProps): ReactElement => {

{/* description */}
<p className="text-sm">
{ellipsize(
{result.description ? ellipsize(
convert(result.description, {
baseElements: {
selectors: ["p", "div"],
},
}),
320,
)}
) : ''}
</p>
<div className="mt-2">
<PopoverButton

@@ -1,16 +1,15 @@
import React, { ReactElement } from "react";
import { useDispatch, useSelector } from "react-redux";
import { useEffect } from "react";
import { getServiceStatus } from "../../actions/fileops.actions";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";

export const ServiceStatuses = (): ReactElement => {
const serviceStatus = useSelector(
(state: RootState) => state.fileOps.libraryServiceStatus,
);
const dispatch = useDispatch();
useEffect(() => {
dispatch(getServiceStatus());
}, []);
const { data } = useQuery({
queryKey: ["serviceStatus"],
queryFn: async () =>
axios({ url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`, method: "GET" }),
});
const serviceStatus = data?.data;
return (
<div className="is-clearfix">
<div className="mt-4">

@@ -38,16 +38,21 @@ export const AirDCPPHubsForm = (): ReactElement => {
enabled: !isEmpty(settings?.data.directConnect?.client?.host),
});

let hubList: any[] = [];
interface HubOption {
value: string;
label: string;
}

let hubList: HubOption[] = [];
if (!isNil(hubs)) {
hubList = hubs?.data.map(({ hub_url, identity }) => ({
hubList = hubs?.data.map(({ hub_url, identity }: { hub_url: string; identity: { name: string } }) => ({
value: hub_url,
label: identity.name,
}));
}

const mutation = useMutation({
mutationFn: async (values) =>
mutationFn: async (values: Record<string, unknown>) =>
await axios({
url: `http://localhost:3000/api/settings/saveSettings`,
method: "POST",
@@ -69,13 +74,24 @@ export const AirDCPPHubsForm = (): ReactElement => {
},
});

const validate = async (values) => {
const errors = {};
const validate = async (values: Record<string, unknown>) => {
const errors: Record<string, string> = {};
// Add any validation logic here if needed
return errors;
};

const SelectAdapter = ({ input, ...rest }) => {
interface SelectAdapterProps {
input: {
value: unknown;
onChange: (value: unknown) => void;
onBlur: () => void;
onFocus: () => void;
name: string;
};
[key: string]: unknown;
}

const SelectAdapter = ({ input, ...rest }: SelectAdapterProps) => {
return <Select {...input} {...rest} isClearable isMulti />;
};

@@ -155,7 +171,7 @@ export const AirDCPPHubsForm = (): ReactElement => {
</span>
<div className="block max-w-sm p-6 bg-white border border-gray-200 rounded-lg shadow dark:bg-slate-400 dark:border-gray-700">
{settings?.data.directConnect?.client.hubs.map(
({ value, label }) => (
({ value, label }: HubOption) => (
<div key={value}>
<div>{label}</div>
<span className="is-size-7">{value}</span>

@@ -1,7 +1,24 @@
import React, { ReactElement } from "react";

export const AirDCPPSettingsConfirmation = (settingsObject): ReactElement => {
const { settings } = settingsObject;
interface AirDCPPSessionInfo {
_id: string;
system_info: {
client_version: string;
hostname: string;
platform: string;
};
user: {
username: string;
active_sessions: number;
permissions: string[];
};
}

interface AirDCPPSettingsConfirmationProps {
settings: AirDCPPSessionInfo;
}

export const AirDCPPSettingsConfirmation = ({ settings }: AirDCPPSettingsConfirmationProps): ReactElement => {
return (
<div>
<span className="flex items-center mt-10 mb-4">

@@ -17,8 +17,16 @@ export const AirDCPPSettingsForm = () => {
queryFn: () => axios.get(`${SETTINGS_SERVICE_BASE_URI}/getAllSettings`),
});

interface HostConfig {
hostname: string;
port: string;
username: string;
password: string;
protocol: string;
}

// Fetch session information
const fetchSessionInfo = (host) => {
const fetchSessionInfo = (host: HostConfig) => {
return axios.post(`${AIRDCPP_SERVICE_BASE_URI}/initialize`, { host });
};

@@ -34,7 +42,7 @@ export const AirDCPPSettingsForm = () => {

// Handle setting update and subsequent AirDC++ initialization
const { mutate } = useMutation({
mutationFn: (values) => {
mutationFn: (values: Record<string, unknown>) => {
return axios.post("http://localhost:3000/api/settings/saveSettings", {
settingsPayload: values,
settingsKey: "directConnect",
@@ -50,12 +58,13 @@ export const AirDCPPSettingsForm = () => {
},
});

const deleteSettingsMutation = useMutation(() =>
axios.post("http://localhost:3000/api/settings/saveSettings", {
settingsPayload: {},
settingsKey: "directConnect",
}),
);
const deleteSettingsMutation = useMutation({
mutationFn: () =>
axios.post("http://localhost:3000/api/settings/saveSettings", {
settingsPayload: {},
settingsKey: "directConnect",
}),
});

const initFormData = settingsData?.data?.directConnect?.client?.host ?? {};


@@ -4,9 +4,13 @@ import { Form, Field } from "react-final-form";
import { PROWLARR_SERVICE_BASE_URI } from "../../../constants/endpoints";
import axios from "axios";

export const ProwlarrSettingsForm = (props) => {
interface ProwlarrSettingsFormProps {
// Add props here if needed
}

export const ProwlarrSettingsForm = (_props: ProwlarrSettingsFormProps) => {
const { data } = useQuery({
queryFn: async (): any => {
queryFn: async () => {
return await axios({
url: `${PROWLARR_SERVICE_BASE_URI}/getIndexers`,
method: "POST",

@@ -3,7 +3,7 @@ import { ConnectionForm } from "../../shared/ConnectionForm/ConnectionForm";
import { useQuery, useMutation, QueryClient } from "@tanstack/react-query";
import axios from "axios";

export const QbittorrentConnectionForm = (): ReactElement => {
export const QbittorrentConnectionForm = (): ReactElement | null => {
const queryClient = new QueryClient();
// fetch settings
const { data, isLoading, isError } = useQuery({
@@ -28,7 +28,7 @@ export const QbittorrentConnectionForm = (): ReactElement => {
});
// Update action using a mutation
const { mutate } = useMutation({
mutationFn: async (values) =>
mutationFn: async (values: Record<string, unknown>) =>
await axios({
url: `http://localhost:3000/api/settings/saveSettings`,
method: "POST",
@@ -77,6 +77,7 @@ export const QbittorrentConnectionForm = (): ReactElement => {
</>
);
}
return null;
};

export default QbittorrentConnectionForm;

@@ -8,10 +8,22 @@ import DockerVars from "./DockerVars/DockerVars";
import { ServiceStatuses } from "../ServiceStatuses/ServiceStatuses";
import settingsObject from "../../constants/settings/settingsMenu.json";
import { isUndefined, map } from "lodash";
import type { SettingsProps } from "../../types";

interface ISettingsProps {}
interface SettingsMenuItem {
id: string | number;
displayName: string;
children?: SettingsMenuItem[];
}

export const Settings = (props: ISettingsProps): ReactElement => {
interface SettingsCategory {
id: number;
category: string;
displayName: string;
children?: SettingsMenuItem[];
}

export const Settings = (props: SettingsProps): ReactElement => {
const [active, setActive] = useState("gen-db");
const [expanded, setExpanded] = useState<Record<string, boolean>>({});

@@ -63,70 +75,70 @@ export const Settings = (props: ISettingsProps): ReactElement => {
overflow-hidden"
>
<div className="px-4 py-6 overflow-y-auto">
{map(settingsObject, (settingObject, idx) => (
<div
key={idx}
className="mb-6 text-slate-700 dark:text-slate-300"
>
<h3 className="text-xs font-semibold text-slate-500 dark:text-slate-400 tracking-wide mb-3">
{settingObject.category.toUpperCase()}
</h3>

{!isUndefined(settingObject.children) && (
<ul>
{map(settingObject.children, (item, idx) => {
const isOpen = expanded[item.id];

return (
<li key={idx} className="mb-1">
<div
onClick={() => toggleExpanded(item.id)}
className={`cursor-pointer flex justify-between items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
item.id === active
? "font-semibold text-blue-600 dark:text-blue-400"
: ""
}`}
>
<span
onClick={() => setActive(item.id.toString())}
className="flex-1"
{map(settingsObject as SettingsCategory[], (settingObject, idx) => (
<div
key={idx}
className="mb-6 text-slate-700 dark:text-slate-300"
>
<h3 className="text-xs font-semibold text-slate-500 dark:text-slate-400 tracking-wide mb-3">
{settingObject.category.toUpperCase()}
</h3>

{!isUndefined(settingObject.children) && (
<ul>
{map(settingObject.children, (item: SettingsMenuItem, idx) => {
const isOpen = expanded[String(item.id)];

return (
<li key={idx} className="mb-1">
<div
onClick={() => toggleExpanded(String(item.id))}
className={`cursor-pointer flex justify-between items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
String(item.id) === active
? "font-semibold text-blue-600 dark:text-blue-400"
: ""
}`}
>
{item.displayName}
</span>
{!isUndefined(item.children) && (
<span className="text-xs opacity-60">
{isOpen ? "−" : "+"}
<span
onClick={() => setActive(String(item.id))}
className="flex-1"
>
{item.displayName}
</span>
{!isUndefined(item.children) && (
<span className="text-xs opacity-60">
{isOpen ? "−" : "+"}
</span>
)}
</div>

{!isUndefined(item.children) && isOpen && (
<ul className="pl-4 mt-1">
{map(item.children, (subItem: SettingsMenuItem) => (
<li key={String(subItem.id)} className="mb-1">
<a
onClick={() =>
setActive(String(subItem.id))
}
className={`cursor-pointer flex items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
String(subItem.id) === active
? "font-semibold text-blue-600 dark:text-blue-400"
: ""
}`}
>
{subItem.displayName}
</a>
</li>
))}
</ul>
)}
</div>

{!isUndefined(item.children) && isOpen && (
<ul className="pl-4 mt-1">
{map(item.children, (subItem) => (
<li key={subItem.id} className="mb-1">
<a
onClick={() =>
setActive(subItem.id.toString())
}
className={`cursor-pointer flex items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
subItem.id.toString() === active
? "font-semibold text-blue-600 dark:text-blue-400"
: ""
}`}
>
{subItem.displayName}
</a>
</li>
))}
</ul>
)}
</li>
);
})}
</ul>
)}
</div>
))}
</li>
);
})}
</ul>
)}
</div>
))}
</div>
</aside>
</div>

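The Settings hunk above types menu items as `id: string | number` and normalizes with `String(item.id)` before comparing against the active-item string. A minimal self-contained sketch of why that normalization matters (the `MenuItem` interface and sample `menu` data here are illustrative, not taken from the diff):

```typescript
interface MenuItem {
  id: string | number;
  displayName: string;
  children?: MenuItem[];
}

// Recursively find the item whose id matches the active id.
// Comparing String(item.id) === active avoids 2 !== "2" mismatches
// when JSON menu data mixes numeric and string ids.
function findActive(items: MenuItem[], active: string): MenuItem | undefined {
  for (const item of items) {
    if (String(item.id) === active) return item;
    const hit = item.children ? findActive(item.children, active) : undefined;
    if (hit) return hit;
  }
  return undefined;
}

const menu: MenuItem[] = [
  {
    id: 1,
    displayName: "General",
    children: [{ id: "gen-db", displayName: "Database" }],
  },
];

console.log(findActive(menu, "gen-db")?.displayName); // Database
console.log(findActive(menu, "1")?.displayName); // General — matches numeric id 1
```

Without the `String()` call, a strict comparison of a numeric `id` against the string state in `useState("gen-db")` would never match.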
@@ -3,7 +3,7 @@ import { useMutation } from "@tanstack/react-query";
import axios from "axios";

export const SystemSettingsForm = (): ReactElement => {
const { mutate: flushDb, isLoading } = useMutation({
const { mutate: flushDb, isPending } = useMutation({
mutationFn: async () => {
await axios({
url: `http://localhost:3000/api/library/flushDb`,

@@ -1,21 +1,41 @@
import { isArray, map } from "lodash";
import React, { useEffect, ReactElement } from "react";
import { useDispatch, useSelector } from "react-redux";
import { getComicBooksDetailsByIds } from "../../actions/comicinfo.actions";
import React, { ReactElement } from "react";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { Card } from "../shared/Carda";
import ellipsize from "ellipsize";
import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";
import { LIBRARY_SERVICE_HOST, LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import { escapePoundSymbol } from "../../shared/utils/formatting.utils";
import prettyBytes from "pretty-bytes";

const PotentialLibraryMatches = (props): ReactElement => {
const dispatch = useDispatch();
const comicBooks = useSelector(
(state: RootState) => state.comicInfo.comicBooksDetails,
);
useEffect(() => {
dispatch(getComicBooksDetailsByIds(props.matches));
}, []);
interface PotentialLibraryMatchesProps {
matches: string[];
}

interface ComicBookMatch {
rawFileDetails: {
cover: {
filePath: string;
};
name: string;
containedIn: string;
extension: string;
fileSize: number;
};
}

const PotentialLibraryMatches = (props: PotentialLibraryMatchesProps): ReactElement => {
const { data } = useQuery({
queryKey: ["comicBooksDetails", props.matches],
queryFn: async () =>
axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooksByIds`,
method: "POST",
data: { ids: props.matches },
}),
enabled: props.matches.length > 0,
});
const comicBooks: ComicBookMatch[] = data?.data ?? [];
return (
<div className="potential-matches-container mt-10">
{isArray(comicBooks) ? (

@@ -1,8 +1,7 @@
import { isEmpty, isNil, isUndefined, map, partialRight, pick } from "lodash";
import React, { ReactElement, useState, useCallback } from "react";
import { useParams } from "react-router";
import { analyzeLibrary } from "../../actions/comicinfo.actions";
import { useQuery, useMutation, QueryClient } from "@tanstack/react-query";
import { useQuery, useMutation } from "@tanstack/react-query";
import PotentialLibraryMatches from "./PotentialLibraryMatches";
import { Card } from "../shared/Carda";
import SlidingPane from "react-sliding-pane";
@@ -14,38 +13,87 @@ import {
} from "../../constants/endpoints";
import axios from "axios";

const VolumeDetails = (props): ReactElement => {
interface VolumeDetailsProps {
[key: string]: unknown;
}

interface ComicObjectData {
sourcedMetadata: {
comicvine: {
id?: string;
volumeInformation: {
id: string;
name: string;
description?: string;
image: {
small_url: string;
};
publisher: {
name: string;
};
};
};
};
}

interface IssueData {
id: string;
name: string;
issue_number: string;
description?: string;
matches?: unknown[];
image: {
small_url: string;
thumb_url: string;
};
}

interface StoryArc {
name?: string;
}

interface MatchItem {
_id?: string;
[key: string]: unknown;
}

interface ContentForSlidingPanel {
[key: string]: {
content: () => React.ReactNode;
};
}

const VolumeDetails = (_props: VolumeDetailsProps): ReactElement => {
// sliding panel config
const [visible, setVisible] = useState(false);
const [slidingPanelContentId, setSlidingPanelContentId] = useState("");
const [matches, setMatches] = useState([]);
const [storyArcsData, setStoryArcsData] = useState([]);
const [matches, setMatches] = useState<MatchItem[]>([]);
const [storyArcsData, setStoryArcsData] = useState<StoryArc[]>([]);
const [active, setActive] = useState(1);

// sliding panel init
const contentForSlidingPanel = {
const contentForSlidingPanel: ContentForSlidingPanel = {
potentialMatchesInLibrary: {
content: () => {
const ids = map(matches, partialRight(pick, "_id"));
const matchIds = ids.map((id: any) => id._id);
{
/* return <PotentialLibraryMatches matches={matchIds} />; */
}
const matchIds = ids.map((id: MatchItem) => id._id).filter((id): id is string => !!id);
return <PotentialLibraryMatches matches={matchIds} />;
},
},
};

// sliding panel handlers
const openPotentialLibraryMatchesPanel = useCallback((potentialMatches) => {
const openPotentialLibraryMatchesPanel = useCallback((potentialMatches: MatchItem[]) => {
setSlidingPanelContentId("potentialMatchesInLibrary");
setMatches(potentialMatches);
setVisible(true);
}, []);

// const analyzeIssues = useCallback((issues) => {
// dispatch(analyzeLibrary(issues));
// }, []);
//
// Function to analyze issues (commented out but typed for future use)
const analyzeIssues = useCallback((issues: IssueData[]) => {
// dispatch(analyzeLibrary(issues));
console.log("Analyzing issues:", issues);
}, []);

const { comicObjectId } = useParams<{ comicObjectId: string }>();

@@ -83,7 +131,7 @@ const VolumeDetails = (props): ReactElement => {
// get story arcs
const useGetStoryArcs = () => {
return useMutation({
mutationFn: async (comicObject) =>
mutationFn: async (comicObject: ComicObjectData) =>
axios({
url: `${COMICVINE_SERVICE_URI}/getResource`,
method: "POST",
@@ -93,7 +141,7 @@ const VolumeDetails = (props): ReactElement => {
filter: `id:${comicObject?.sourcedMetadata.comicvine.id}`,
},
}),
onSuccess: (data) => {
onSuccess: (data: { data: { results: StoryArc[] } }) => {
setStoryArcsData(data?.data.results);
},
});
@@ -111,13 +159,13 @@ const VolumeDetails = (props): ReactElement => {
const IssuesInVolume = () => (
<>
{!isUndefined(issuesForSeries) ? (
<div className="button" onClick={() => analyzeIssues(issuesForSeries)}>
<div className="button" onClick={() => analyzeIssues(issuesForSeries?.data || [])}>
Analyze Library
</div>
) : null}
<>
{isSuccess &&
issuesForSeries.data.map((issue) => {
issuesForSeries.data.map((issue: IssueData) => {
return (
<>
<Card
@@ -157,7 +205,7 @@ const VolumeDetails = (props): ReactElement => {
</article>
<div className="flex flex-wrap">
{isSuccess &&
issuesForSeries?.data.map((issue) => {
issuesForSeries?.data.map((issue: IssueData) => {
return (
<div className="my-3 dark:bg-slate-400 bg-slate-300 p-4 rounded-lg w-3/4">
<div className="flex flex-row gap-4 mb-2">
@@ -170,11 +218,11 @@ const VolumeDetails = (props): ReactElement => {
<div className="w-3/4">
<p className="text-xl">{issue.name}</p>
<p className="text-sm">
{convert(issue.description, {
{issue.description ? convert(issue.description, {
baseElements: {
selectors: ["p"],
},
})}
}) : ''}
</p>
</div>
</div>
@@ -216,9 +264,9 @@ const VolumeDetails = (props): ReactElement => {
{!isEmpty(storyArcsData) && status === "success" && (
<>
<ul>
{storyArcsData.map((storyArc) => {
{storyArcsData.map((storyArc: StoryArc, idx: number) => {
return (
<li>
<li key={idx}>
<span className="text-lg">{storyArc?.name}</span>
</li>
);
@@ -355,7 +403,7 @@ const VolumeDetails = (props): ReactElement => {
width={"600px"}
>
{slidingPanelContentId !== "" &&
contentForSlidingPanel[slidingPanelContentId].content()}
(contentForSlidingPanel as ContentForSlidingPanel)[slidingPanelContentId]?.content()}
</SlidingPane>
</div>
</>

@@ -1,5 +1,4 @@
import React, { ReactElement, useEffect, useMemo } from "react";
import { searchIssue } from "../../actions/fileops.actions";
import React, { ReactElement, useMemo } from "react";
import Card from "../shared/Carda";
import T2Table from "../shared/T2Table";
import ellipsize from "ellipsize";
@@ -8,8 +7,45 @@ import { Link } from "react-router-dom";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
import { CellContext, ColumnDef } from "@tanstack/react-table";

export const Volumes = (props): ReactElement => {
interface VolumesProps {
[key: string]: unknown;
}

interface VolumeSourceData {
_id: string;
_source: {
sourcedMetadata: {
comicvine: {
volumeInformation: {
name: string;
description?: string;
image: {
small_url: string;
};
publisher: {
name: string;
};
count_of_issues: number;
};
};
};
acquisition?: {
directconnect?: unknown[];
};
};
}

interface VolumeInformation {
name: string;
publisher: {
name: string;
};
count_of_issues?: number;
}

export const Volumes = (_props: VolumesProps): ReactElement => {
// const volumes = useSelector((state: RootState) => state.fileOps.volumes);
const {
data: volumes,
@@ -34,17 +70,18 @@ export const Volumes = (props): ReactElement => {
queryKey: ["volumes"],
});
const columnData = useMemo(
(): any => [
(): ColumnDef<VolumeSourceData, unknown>[] => [
{
header: "Volume Details",
id: "volumeDetails",
minWidth: 450,
accessorFn: (row) => row,
cell: (row): any => {
const comicObject = row.getValue();
size: 450,
accessorFn: (row: VolumeSourceData) => row,
cell: (info: CellContext<VolumeSourceData, VolumeSourceData>) => {
const comicObject = info.getValue();
const {
_source: { sourcedMetadata },
} = comicObject;
const description = sourcedMetadata.comicvine.volumeInformation.description || '';
return (
<div className="flex flex-row gap-3 mt-5">
<Link to={`/volume/details/${comicObject._id}`}>
@@ -61,9 +98,9 @@ export const Volumes = (props): ReactElement => {
{sourcedMetadata.comicvine.volumeInformation.name}
</div>
<p>
{ellipsize(
{description ? ellipsize(
convert(
sourcedMetadata.comicvine.volumeInformation.description,
description,
{
baseElements: {
selectors: ["p"],
@@ -71,7 +108,7 @@ export const Volumes = (props): ReactElement => {
},
),
180,
)}
) : ''}
</p>
</div>
</div>
@@ -84,9 +121,8 @@ export const Volumes = (props): ReactElement => {
{
header: "Downloads",
accessorKey: "_source.acquisition.directconnect",
align: "right",
cell: (props) => {
const row = props.getValue();
cell: (props: CellContext<VolumeSourceData, unknown[] | undefined>) => {
const row = props.getValue() || [];
return (
<div
style={{
@@ -105,16 +141,16 @@ export const Volumes = (props): ReactElement => {
{
header: "Publisher",
accessorKey: "_source.sourcedMetadata.comicvine.volumeInformation",
cell: (props): any => {
cell: (props: CellContext<VolumeSourceData, VolumeInformation>) => {
const row = props.getValue();
return <div className="mt-5 text-md">{row.publisher.name}</div>;
return <div className="mt-5 text-md">{row?.publisher?.name}</div>;
},
},
{
header: "Issue Count",
accessorKey:
"_source.sourcedMetadata.comicvine.volumeInformation.count_of_issues",
cell: (props): any => {
cell: (props: CellContext<VolumeSourceData, number>) => {
const row = props.getValue();
return (
<div className="mt-5">

@@ -1,12 +1,39 @@
import React, { ReactElement, useCallback, useEffect, useMemo } from "react";
import SearchBar from "../Library/SearchBar";
import React, { ReactElement } from "react";
import T2Table from "../shared/T2Table";
import MetadataPanel from "../shared/MetadataPanel";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
import { CellContext } from "@tanstack/react-table";

export const WantedComics = (props): ReactElement => {
interface WantedComicsProps {
[key: string]: unknown;
}

interface WantedSourceData {
_id: string;
_source: {
acquisition?: {
directconnect?: {
downloads: DownloadItem[];
};
};
[key: string]: unknown;
};
}

interface DownloadItem {
name: string;
[key: string]: unknown;
}

interface AcquisitionData {
directconnect?: {
downloads: DownloadItem[];
};
}

export const WantedComics = (_props: WantedComicsProps): ReactElement => {
const {
data: wantedComics,
isSuccess,
@@ -39,9 +66,9 @@ export const WantedComics = (props): ReactElement => {
{
header: "Details",
id: "comicDetails",
minWidth: 350,
accessorFn: (data) => data,
cell: (value) => {
size: 350,
accessorFn: (data: WantedSourceData) => data,
cell: (value: CellContext<WantedSourceData, WantedSourceData>) => {
const row = value.getValue()._source;
return row && <MetadataPanel data={row} />;
},
@@ -53,17 +80,14 @@ export const WantedComics = (props): ReactElement => {
columns: [
{
header: "Files",
align: "right",
accessorKey: "_source.acquisition",
cell: (props) => {
const {
directconnect: { downloads },
} = props.getValue();
cell: (props: CellContext<WantedSourceData, AcquisitionData | undefined>) => {
const acquisition = props.getValue();
const downloads = acquisition?.directconnect?.downloads || [];
return (
<div
style={{
display: "flex",
// flexDirection: "column",
justifyContent: "center",
}}
>
@@ -78,17 +102,21 @@ export const WantedComics = (props): ReactElement => {
header: "Download Details",
id: "downloadDetails",
accessorKey: "_source.acquisition",
cell: (data) => (
<ol>
{data.getValue().directconnect.downloads.map((download, idx) => {
return (
<li className="is-size-7" key={idx}>
{download.name}
</li>
);
})}
</ol>
),
cell: (data: CellContext<WantedSourceData, AcquisitionData | undefined>) => {
const acquisition = data.getValue();
const downloads = acquisition?.directconnect?.downloads || [];
return (
<ol>
{downloads.map((download: DownloadItem, idx: number) => {
return (
<li className="is-size-7" key={idx}>
{download.name}
</li>
);
})}
</ol>
);
},
},
{
header: "Type",

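The WantedComics cells above replace nested destructuring of `props.getValue().directconnect.downloads` with optional chaining plus a `[]` default, so a row without acquisition data no longer throws at render time. A minimal sketch of the difference, using hypothetical data (the `issue-1.cbz` name is illustrative):

```typescript
interface DownloadItem {
  name: string;
}

interface Acquisition {
  directconnect?: { downloads: DownloadItem[] };
}

// Old shape: `const { directconnect: { downloads } } = acq;` throws a
// TypeError when directconnect (or acq itself) is undefined.
// Defaulting to [] keeps the cell renderable for sparse rows.
function downloadNames(acq?: Acquisition): string[] {
  const downloads = acq?.directconnect?.downloads || [];
  return downloads.map((d) => d.name);
}

const withData = downloadNames({
  directconnect: { downloads: [{ name: "issue-1.cbz" }] },
}); // → ["issue-1.cbz"]

const withoutData = downloadNames(undefined); // → [] — no throw on missing data

console.log(withData, withoutData);
```

The same guard appears in both the "Files" count cell and the "Download Details" list cell, so the two columns stay consistent on partially populated documents.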
src/client/components/shared/AlertBanner.tsx (new file, 129 lines)
@@ -0,0 +1,129 @@
/**
* @fileoverview Reusable alert banner component for displaying status messages.
* @module components/shared/AlertBanner
*/

import { ReactElement, ReactNode } from "react";

/**
* Alert severity levels that determine styling.
*/
export type AlertSeverity = "error" | "warning" | "info" | "success";

/**
* Props for the AlertBanner component.
*/
export type AlertBannerProps = {
/** Alert severity level */
severity: AlertSeverity;
/** Alert title/heading */
title: string;
/** Alert content - can be string or JSX */
children: ReactNode;
/** Optional close handler - shows close button when provided */
onClose?: () => void;
/** Optional custom icon class (defaults based on severity) */
iconClass?: string;
/** Optional additional CSS classes */
className?: string;
};

const severityConfig: Record<
AlertSeverity,
{
border: string;
bg: string;
titleColor: string;
textColor: string;
iconColor: string;
defaultIcon: string;
}
> = {
error: {
border: "border-red-500",
bg: "bg-red-50 dark:bg-red-900/20",
titleColor: "text-red-800 dark:text-red-300",
textColor: "text-red-700 dark:text-red-400",
iconColor: "text-red-600 dark:text-red-400",
defaultIcon: "icon-[solar--danger-circle-bold]",
},
warning: {
border: "border-amber-500",
bg: "bg-amber-50 dark:bg-amber-900/20",
titleColor: "text-amber-800 dark:text-amber-300",
textColor: "text-amber-700 dark:text-amber-400",
iconColor: "text-amber-600 dark:text-amber-400",
defaultIcon: "icon-[solar--folder-error-bold]",
},
info: {
border: "border-blue-500",
bg: "bg-blue-50 dark:bg-blue-900/20",
titleColor: "text-blue-800 dark:text-blue-300",
textColor: "text-blue-700 dark:text-blue-400",
iconColor: "text-blue-600 dark:text-blue-400",
defaultIcon: "icon-[solar--info-circle-bold]",
},
success: {
border: "border-emerald-500",
bg: "bg-emerald-50 dark:bg-emerald-900/20",
titleColor: "text-emerald-800 dark:text-emerald-300",
textColor: "text-emerald-700 dark:text-emerald-400",
iconColor: "text-emerald-600 dark:text-emerald-400",
defaultIcon: "icon-[solar--check-circle-bold]",
},
};

/**
* Reusable alert banner component for displaying status messages.
*
* @param props - Component props
* @returns Alert banner element
*
* @example
* ```tsx
* <AlertBanner severity="error" title="Import Error" onClose={() => setError(null)}>
* Failed to import files. Please try again.
* </AlertBanner>
* ```
*/
export const AlertBanner = ({
severity,
title,
children,
onClose,
iconClass,
className = "",
}: AlertBannerProps): ReactElement => {
const config = severityConfig[severity];
const icon = iconClass || config.defaultIcon;

return (
<div
className={`rounded-lg border-s-4 ${config.border} ${config.bg} p-4 ${className}`}
role="alert"
>
<div className="flex items-start gap-3">
<span className={`w-6 h-6 ${config.iconColor} mt-0.5`}>
<i className={`h-6 w-6 ${icon}`}></i>
</span>
<div className="flex-1">
<p className={`font-semibold ${config.titleColor}`}>{title}</p>
<div className={`text-sm ${config.textColor} mt-1`}>{children}</div>
</div>
{onClose && (
<button
onClick={onClose}
className={`${config.iconColor} hover:opacity-70`}
aria-label="Close alert"
>
<span className="w-5 h-5">
<i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
</span>
</button>
)}
</div>
</div>
);
};

export default AlertBanner;
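`severityConfig` in the new AlertBanner file is keyed as `Record<AlertSeverity, …>`, so the compiler enforces that every severity in the union has a style entry. A reduced, self-contained sketch of the same pattern (the `bannerClass` helper is illustrative; only the severity names and border classes come from the component above):

```typescript
type AlertSeverity = "error" | "warning" | "info" | "success";

// Record<AlertSeverity, string> forces an entry for every member of the
// union; omitting one (or adding an unknown key) is a compile-time error.
const borderBySeverity: Record<AlertSeverity, string> = {
  error: "border-red-500",
  warning: "border-amber-500",
  info: "border-blue-500",
  success: "border-emerald-500",
};

// Compose the container class string the way the component does,
// with an optional caller-supplied suffix.
function bannerClass(severity: AlertSeverity, extra = ""): string {
  return `rounded-lg border-s-4 ${borderBySeverity[severity]} p-4 ${extra}`.trim();
}

console.log(bannerClass("error")); // rounded-lg border-s-4 border-red-500 p-4
```

The exhaustiveness check is what makes adding a fifth severity safe: the `Record` type flags every lookup table that has not been extended yet.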
src/client/components/shared/AlertCard.tsx (new file, 101 lines)
@@ -0,0 +1,101 @@
/**
* @fileoverview Reusable alert card component for displaying status messages.
* Supports multiple variants (error, warning, info, success) with consistent
* styling and optional dismiss functionality.
* @module components/shared/AlertCard
*/

import { ReactElement, ReactNode } from "react";
import type { AlertVariant, AlertCardProps } from "../../types";

const variantStyles: Record<AlertVariant, {
container: string;
border: string;
icon: string;
iconClass: string;
title: string;
text: string;
}> = {
error: {
container: "bg-red-50 dark:bg-red-900/20",
border: "border-red-500",
icon: "text-red-600 dark:text-red-400",
iconClass: "icon-[solar--danger-circle-bold]",
title: "text-red-800 dark:text-red-300",
text: "text-red-700 dark:text-red-400",
},
warning: {
container: "bg-amber-50 dark:bg-amber-900/20",
border: "border-amber-300",
icon: "text-amber-600 dark:text-amber-400",
iconClass: "icon-[solar--danger-triangle-bold]",
title: "text-amber-800 dark:text-amber-300",
text: "text-amber-700 dark:text-amber-400",
},
info: {
container: "bg-blue-50 dark:bg-blue-900/20",
border: "border-blue-500",
icon: "text-blue-600 dark:text-blue-400",
iconClass: "icon-[solar--info-circle-bold]",
title: "text-blue-800 dark:text-blue-300",
text: "text-blue-700 dark:text-blue-400",
},
success: {
container: "bg-green-50 dark:bg-green-900/20",
border: "border-green-500",
icon: "text-green-600 dark:text-green-400",
iconClass: "icon-[solar--check-circle-bold]",
title: "text-green-800 dark:text-green-300",
text: "text-green-700 dark:text-green-400",
},
};

/**
* A reusable alert card component for displaying messages with consistent styling.
*
* @example
* ```tsx
* <AlertCard variant="error" title="Import Error" onDismiss={() => setError(null)}>
* {errorMessage}
* </AlertCard>
* ```
*/
export function AlertCard({
variant,
title,
children,
onDismiss,
className = "",
}: AlertCardProps): ReactElement {
const styles = variantStyles[variant];

return (
<div
className={`rounded-lg border-l-4 ${styles.border} ${styles.container} p-4 ${className}`}
>
<div className="flex items-start gap-3">
<span className={`w-6 h-6 ${styles.icon} mt-0.5 shrink-0`}>
<i className={`h-6 w-6 ${styles.iconClass}`}></i>
</span>
<div className="flex-1 min-w-0">
{title && (
<p className={`font-semibold ${styles.title}`}>{title}</p>
)}
<div className={`text-sm ${styles.text} ${title ? "mt-1" : ""}`}>
{children}
</div>
</div>
{onDismiss && (
<button
onClick={onDismiss}
className={`${styles.icon} hover:opacity-70 transition-opacity`}
>
<i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
</button>
)}
</div>
</div>
);
}

export default AlertCard;
@@ -1,12 +1,25 @@
import React, { useEffect, useRef } from "react";

export const Canvas = ({ data }) => {
interface ColorHistogramData {
r: number[];
g: number[];
b: number[];
maxBrightness: number;
}

interface CanvasProps {
data: {
colorHistogramData: ColorHistogramData;
};
}

export const Canvas = ({ data }: CanvasProps) => {
const { colorHistogramData } = data;
const width = 559;
const height = 200;
const pixelRatio = window.devicePixelRatio;

const canvas = useRef(null);
const canvas = useRef<HTMLCanvasElement>(null);

useEffect(() => {
const context = canvas.current?.getContext("2d");

@@ -6,8 +6,8 @@ interface ICardProps {
orientation: string;
imageUrl?: string;
hasDetails?: boolean;
title?: PropTypes.ReactElementLike | null;
children?: PropTypes.ReactNodeLike;
title?: React.ReactNode;
children?: React.ReactNode;
borderColorClass?: string;
backgroundColor?: string;
cardState?: "wanted" | "delete" | "scraped" | "uncompressed" | "imported" | "missing";

@@ -3,11 +3,17 @@ import { Form, Field } from "react-final-form";
import { hostNameValidator } from "../../../shared/utils/validator.utils";
import { isEmpty } from "lodash";

interface ConnectionFormProps {
initialData?: Record<string, unknown>;
submitHandler: (values: Record<string, unknown>) => void;
formHeading: string;
}

export const ConnectionForm = ({
initialData,
submitHandler,
formHeading,
}): ReactElement => {
}: ConnectionFormProps): ReactElement => {
return (
<>
<Form

@@ -1,21 +1,25 @@
import React, { useRef, useState } from "react";
import React, { useRef, useState, Dispatch, SetStateAction } from "react";
import { format } from "date-fns";
import FocusTrap from "focus-trap-react";
import { ClassNames, DayPicker } from "react-day-picker";
import { DayPicker } from "react-day-picker";
import { useFloating, offset, flip, autoUpdate } from "@floating-ui/react-dom";
import styles from "react-day-picker/dist/style.module.css";

export const DatePickerDialog = (props) => {
interface DatePickerDialogProps {
setter: Dispatch<SetStateAction<string>>;
apiAction?: () => void;
inputValue?: string;
}

export const DatePickerDialog = (props: DatePickerDialogProps) => {
const { setter, apiAction } = props;
const [selected, setSelected] = useState<Date>();
const [isPopperOpen, setIsPopperOpen] = useState(false);

const classNames: ClassNames = {
...styles,
head: "custom-head",
};
// Use styles without casting - let TypeScript infer
const classNames = styles as unknown as Record<string, string>;
const buttonRef = useRef<HTMLButtonElement>(null);
const { x, y, reference, floating, strategy, refs, update } = useFloating({
const { refs, floatingStyles, strategy, update } = useFloating({
placement: "bottom-end",
middleware: [offset(10), flip()],
strategy: "absolute",
@@ -33,11 +37,11 @@ export const DatePickerDialog = (props) => {
}
};

const handleDaySelect = (date) => {
const handleDaySelect = (date: Date | undefined) => {
setSelected(date);
if (date) {
setter(format(date, "yyyy/MM/dd"));
apiAction();
apiAction?.();
closePopper();
} else {
setter("");
@@ -46,7 +50,7 @@ export const DatePickerDialog = (props) => {

return (
<div>
<div ref={reference}>
<div ref={refs.setReference}>
<button
ref={buttonRef}
type="button"
||||
@@ -69,10 +73,10 @@ export const DatePickerDialog = (props) => {
|
||||
}}
|
||||
>
|
||||
<div
|
||||
ref={floating}
|
||||
ref={refs.setFloating}
|
||||
style={{
|
||||
position: strategy,
|
||||
zIndex: "999",
|
||||
...floatingStyles,
|
||||
zIndex: 999,
|
||||
borderRadius: "10px",
|
||||
boxShadow: "0 4px 6px rgba(0,0,0,0.1)", // Example of adding a shadow
|
||||
}}
|
||||
|
||||
@@ -1,14 +1,21 @@
-import React, { forwardRef } from "react";
+import React, { forwardRef, CSSProperties } from "react";
 
-export const Cover = forwardRef(
+interface CoverProps {
+  url: string;
+  index: number;
+  faded?: boolean;
+  style?: CSSProperties;
+}
+
+export const Cover = forwardRef<HTMLDivElement, CoverProps & React.HTMLAttributes<HTMLDivElement>>(
   ({ url, index, faded, style, ...props }, ref) => {
-    const inlineStyles = {
+    const inlineStyles: CSSProperties = {
       opacity: faded ? "0.2" : "1",
       transformOrigin: "0 0",
       minHeight: index === 0 ? 300 : 300,
       maxWidth: 200,
-      gridRowStart: index === 0 ? "span" : null,
-      gridColumnStart: index === 0 ? "span" : null,
+      gridRowStart: index === 0 ? "span" : undefined,
+      gridColumnStart: index === 0 ? "span" : undefined,
       backgroundImage: `url("${url}")`,
       backgroundSize: "cover",
       backgroundPosition: "center",
@@ -8,6 +8,8 @@ import {
   DragOverlay,
   useSensor,
   useSensors,
+  DragStartEvent,
+  DragEndEvent,
 } from "@dnd-kit/core";
 import {
   arrayMove,
@@ -20,22 +22,27 @@ import { SortableCover } from "./SortableCover";
 import { Cover } from "./Cover";
 import { map } from "lodash";
 
-export const DnD = (data) => {
-  const [items, setItems] = useState(data.data);
-  const [activeId, setActiveId] = useState(null);
+interface DnDProps {
+  data: string[];
+  onClickHandler: (url: string) => void;
+}
+
+export const DnD = ({ data, onClickHandler }: DnDProps) => {
+  const [items, setItems] = useState<string[]>(data);
+  const [activeId, setActiveId] = useState<string | null>(null);
   const sensors = useSensors(useSensor(MouseSensor), useSensor(TouchSensor));
 
-  function handleDragStart(event) {
-    setActiveId(event.active.id);
+  function handleDragStart(event: DragStartEvent) {
+    setActiveId(event.active.id as string);
   }
 
-  function handleDragEnd(event) {
+  function handleDragEnd(event: DragEndEvent) {
     const { active, over } = event;
 
-    if (active.id !== over.id) {
-      setItems((items) => {
-        const oldIndex = items.indexOf(active.id);
-        const newIndex = items.indexOf(over.id);
+    if (over && active.id !== over.id) {
+      setItems((items: string[]) => {
+        const oldIndex = items.indexOf(active.id as string);
+        const newIndex = items.indexOf(over.id as string);
 
         return arrayMove(items, oldIndex, newIndex);
       });
@@ -56,13 +63,13 @@ export const DnD = (data) => {
     >
       <SortableContext items={items} strategy={rectSortingStrategy}>
         <Grid columns={4}>
-          {map(items, (url, index) => {
+          {map(items, (url: string, index: number) => {
             return (
-              <div>
-                <SortableCover key={url} url={url} index={index} />
+              <div key={url}>
+                <SortableCover url={url} index={index} />
                 <div
                   className="mt-2 mb-2"
-                  onClick={(e) => data.onClickHandler(url)}
+                  onClick={() => onClickHandler(url)}
                 >
                   <div className="box p-2 control-palette">
                     <span className="tag is-warning mr-2">{index}</span>
@@ -1,6 +1,11 @@
-import React from "react";
+import React, { ReactNode } from "react";
 
-export function Grid({ children, columns }) {
+interface GridProps {
+  children: ReactNode;
+  columns: number;
+}
+
+export function Grid({ children, columns }: GridProps) {
   return (
     <div
       style={{
@@ -4,12 +4,17 @@ import { CSS } from "@dnd-kit/utilities";
 
 import { Cover } from "./Cover";
 
-export const SortableCover = (props) => {
+interface SortableCoverProps {
+  url: string;
+  index: number;
+  faded?: boolean;
+}
+
+export const SortableCover = (props: SortableCoverProps) => {
   const sortable = useSortable({ id: props.url });
   const {
     attributes,
     listeners,
     isDragging,
     setNodeRef,
     transform,
     transition,
@@ -12,7 +12,7 @@ export const Navbar2 = (): ReactElement => {
 
   return (
     <header className="bg-white dark:bg-gray-900 border-b-2 border-gray-300 dark:border-slate-200">
-      <div className="container mx-auto px-4 sm:px-6 lg:px-8 py-5">
+      <div className="max-w-7xl mx-auto px-4 sm:px-6 lg:px-8 py-5">
         <div className="flex items-center gap-8">
           {/* Logo */}
           <img src="/src/client/assets/img/threetwo.png" alt="ThreeTwo!" />
@@ -1,9 +1,14 @@
-import React, { useState } from "react";
+import React, { ReactNode, useState } from "react";
 import { useFloating, offset, flip } from "@floating-ui/react-dom";
 import { useTranslation } from "react-i18next";
 import "../../shared/utils/i18n.util"; // Ensure you import your i18n configuration
 
-const PopoverButton = ({ content, clickHandler }) => {
+interface PopoverButtonProps {
+  content: ReactNode;
+  clickHandler: () => void;
+}
+
+const PopoverButton = ({ content, clickHandler }: PopoverButtonProps) => {
   const [isVisible, setIsVisible] = useState(false);
   // Use destructuring to obtain the reference and floating setters, among other values.
   const { x, y, refs, strategy, floatingStyles } = useFloating({
src/client/components/shared/ProgressBar.tsx (new file, 61 lines)
@@ -0,0 +1,61 @@
/**
 * @fileoverview Reusable progress bar component with percentage display.
 * Supports animated shimmer effect for active states and customizable labels.
 * @module components/shared/ProgressBar
 */

import { ReactElement } from "react";
import type { ProgressBarProps } from "../../types";

/**
 * A reusable progress bar component with percentage display.
 *
 * @example
 * ```tsx
 * <ProgressBar
 *   current={45}
 *   total={100}
 *   isActive={true}
 *   activeLabel="Importing 45 / 100"
 *   completeLabel="45 / 100 imported"
 * />
 * ```
 */
export function ProgressBar({
  current,
  total,
  isActive = false,
  activeLabel,
  completeLabel,
  className = "",
}: ProgressBarProps): ReactElement {
  const percentage = total > 0 ? Math.round((current / total) * 100) : 0;
  const label = isActive ? activeLabel : completeLabel;

  return (
    <div className={`space-y-1.5 ${className}`}>
      <div className="flex items-center justify-between text-sm">
        {label && (
          <span className="font-medium text-gray-700 dark:text-gray-300">
            {label}
          </span>
        )}
        <span className="font-semibold text-gray-900 dark:text-white">
          {percentage}% complete
        </span>
      </div>
      <div className="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-3 overflow-hidden">
        <div
          className="bg-blue-600 dark:bg-blue-500 h-3 rounded-full transition-all duration-300 relative"
          style={{ width: `${percentage}%` }}
        >
          {isActive && (
            <div className="absolute inset-0 bg-linear-to-r from-transparent via-white/20 to-transparent animate-shimmer" />
          )}
        </div>
      </div>
    </div>
  );
}

export default ProgressBar;
src/client/components/shared/StatsCard.tsx (new file, 44 lines)
@@ -0,0 +1,44 @@
/**
 * @fileoverview Reusable stats card component for displaying numeric metrics.
 * Used for dashboards and import statistics displays.
 * @module components/shared/StatsCard
 */

import { ReactElement } from "react";
import type { StatsCardProps } from "../../types";

/**
 * A reusable stats card component for displaying numeric metrics.
 *
 * @example
 * ```tsx
 * <StatsCard
 *   value={42}
 *   label="imported in database"
 *   backgroundColor="#d8dab2"
 *   valueColor="text-gray-800"
 * />
 * ```
 */
export function StatsCard({
  value,
  label,
  backgroundColor = "#6b7280",
  valueColor = "text-white",
  labelColor = "text-gray-200",
  className = "",
}: StatsCardProps): ReactElement {
  const isHexColor = backgroundColor.startsWith("#") || backgroundColor.startsWith("rgb");

  return (
    <div
      className={`rounded-lg p-6 text-center ${!isHexColor ? backgroundColor : ""} ${className}`}
      style={isHexColor ? { backgroundColor } : undefined}
    >
      <div className={`text-4xl font-bold ${valueColor} mb-2`}>{value}</div>
      <div className={`text-sm ${labelColor} font-medium`}>{label}</div>
    </div>
  );
}

export default StatsCard;
@@ -7,27 +7,7 @@ import {
   useReactTable,
   PaginationState,
 } from "@tanstack/react-table";
-
-/** Props for {@link T2Table}. */
-interface T2TableProps<TData> {
-  /** Row data to render. */
-  sourceData?: TData[];
-  /** Total number of records across all pages, used for pagination display. */
-  totalPages?: number;
-  /** Column definitions (TanStack Table {@link ColumnDef} array). */
-  columns?: ColumnDef<TData>[];
-  /** Callbacks for navigating between pages. */
-  paginationHandlers?: {
-    nextPage?(pageIndex: number, pageSize: number): void;
-    previousPage?(pageIndex: number, pageSize: number): void;
-  };
-  /** Called with the TanStack row object when a row is clicked. */
-  rowClickHandler?(row: Row<TData>): void;
-  /** Returns additional CSS classes for a given row (e.g. for highlight states). */
-  getRowClassName?(row: Row<TData>): string;
-  /** Optional slot rendered in the toolbar area (e.g. a search input). */
-  children?: ReactNode;
-}
+import type { T2TableProps } from "../../types";
 
 /**
  * A paginated data table with a two-row sticky header.
@@ -1,154 +0,0 @@ (entire file deleted)
export const CV_API_CALL_IN_PROGRESS = "CV_SEARCH_IN_PROGRESS";
export const CV_SEARCH_FAILURE = "CV_SEARCH_FAILURE";
export const CV_SEARCH_SUCCESS = "CV_SEARCH_SUCCESS";
export const CV_CLEANUP = "CV_CLEANUP";

export const CV_API_GENERIC_FAILURE = "CV_API_GENERIC_FAILURE";

export const IMS_COMICBOOK_METADATA_FETCHED = "IMS_SOCKET_DATA_FETCHED";

export const IMS_RAW_IMPORT_SUCCESSFUL = "IMS_RAW_IMPORT_SUCCESSFUL";
export const IMS_RAW_IMPORT_FAILED = "IMS_RAW_IMPORT_FAILED";

// Library service generic action types
export const LS_IMPORT_CALL_IN_PROGRESS = "LS_IMPORT_CALL_IN_PROGRESS";
// Library import bull mq queue control
export const LS_TOGGLE_IMPORT_QUEUE = "LS_TOGGLE_IMPORT_QUEUE";

// ComicVine Metadata
export const IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS =
  "IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS";
export const IMS_CV_METADATA_IMPORT_SUCCESSFUL =
  "IMS_CV_METADATA_IMPORT_SUCCESSFUL";
export const IMS_CV_METADATA_IMPORT_FAILED = "IMS_CV_METADATA_IMPORT_FAILED";

export const IMS_RECENT_COMICS_FETCHED = "IMS_RECENT_COMICS_FETCHED";
export const IMS_DATA_FETCH_ERROR = "IMS_DATA_FETCH_ERROR";

// Weekly pull list
export const CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS =
  "CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS";
export const CV_WEEKLY_PULLLIST_FETCHED = "CV_WEEKLY_PULLLIST_FETCHED";
export const CV_WEEKLY_PULLLIST_ERROR = "CV_WEEKLY_PULLLIST_ERROR";

// Single or multiple comic book mongo objects
export const IMS_COMIC_BOOK_DB_OBJECT_FETCHED =
  "IMS_COMIC_BOOK_DB_OBJECT_FETCHED";
export const IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED =
  "IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED";
export const IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS =
  "IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS";
export const IMS_COMIC_BOOK_DB_OBJECT_CALL_FAILED =
  "IMS_COMIC_BOOK_DB_OBJECT_CALL_FAILED";

// wanted comics from CV, LoCG and other sources
export const IMS_WANTED_COMICS_FETCHED = "IMS_WANTED_COMICS_FETCHED";

// volume groups
export const IMS_COMIC_BOOK_GROUPS_FETCHED = "IMS_COMIC_BOOK_GROUPS_FETCHED";
export const IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS =
  "IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS";
export const IMS_COMIC_BOOK_GROUPS_CALL_FAILED =
  "IMS_COMIC_BOOK_GROUPS_CALL_FAILED";
export const VOLUMES_FETCHED = "VOLUMES_FETCHED";

// search results from the Search service
export const SS_SEARCH_RESULTS_FETCHED = "SS_SEARCH_RESULTS_FETCHED";
export const SS_SEARCH_RESULTS_FETCHED_SPECIAL =
  "SS_SEARCH_RESULTS_FETCHED_SPECIAL";
export const SS_SEARCH_IN_PROGRESS = "SS_SEARCH_IN_PROGRESS";
export const SS_SEARCH_FAILED = "SS_SEARCH_FAILED";

// issues for a given volume
export const CV_ISSUES_METADATA_CALL_IN_PROGRESS =
  "CV_ISSUES_METADATA_CALL_IN_PROGRESS";
export const CV_ISSUES_METADATA_FETCH_SUCCESS =
  "CV_ISSUES_METADATA_FETCH_SUCCESS";
export const CV_ISSUES_METADATA_FETCH_FAILED =
  "CV_ISSUES_METADATA_FETCH_FAILED";
export const CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS =
  "CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS";
export const CV_ISSUES_FOR_VOLUME_IN_LIBRARY_UPDATED =
  "CV_ISSUES_FOR_VOLUME_IN_LIBRARY_UPDATED";
export const CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED =
  "CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED";

// extracted comic archive
export const IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_SUCCESS =
  "IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_SUCCESS";
export const IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS =
  "IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS";
export const IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_FAILED =
  "IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_FAILED";

export const COMICBOOK_EXTRACTION_SUCCESS = "COMICBOOK_EXTRACTION_SUCCESS";

// Image file stats
export const IMG_ANALYSIS_CALL_IN_PROGRESS = "IMG_ANALYSIS_CALL_IN_PROGRESS";
export const IMG_ANALYSIS_DATA_FETCH_SUCCESS =
  "IMG_ANALYSIS_DATA_FETCH_SUCCESS";
export const IMG_ANALYSIS_DATA_FETCH_ERROR = "IMG_ANALYSIS_DATA_FETCH_ERROR";

// library statistics
export const LIBRARY_STATISTICS_CALL_IN_PROGRESS =
  "LIBRARY_STATISTICS_CALL_IN_PROGRESS";
export const LIBRARY_STATISTICS_FETCHED = "LIBRARY_STATISTICS_FETCHED";
export const LIBRARY_STATISTICS_FETCH_ERROR = "LIBRARY_STATISTICS_FETCH_ERROR";

// fileops cleanup
export const FILEOPS_STATE_RESET = "FILEOPS_STATE_RESET";

// AirDC++
export const AIRDCPP_SEARCH_IN_PROGRESS = "AIRDCPP_SEARCH_IN_PROGRESS";
export const AIRDCPP_SEARCH_RESULTS_ADDED = "AIRDCPP_SEARCH_RESULTS_ADDED";
export const AIRDCPP_SEARCH_RESULTS_UPDATED = "AIRDCPP_SEARCH_RESULTS_UPDATED";
export const AIRDCPP_SEARCH_COMPLETE = "AIRDCPP_SEARCH_COMPLETE";

// AirDC++ related library query for issues with bundles associated with them
export const LIBRARY_ISSUE_BUNDLES = "LIBRARY_ISSUE_BUNDLES";

export const AIRDCPP_HUB_SEARCHES_SENT = "AIRDCPP_HUB_SEARCHES_SENT";
export const AIRDCPP_RESULT_DOWNLOAD_INITIATED =
  "AIRDCPP_RESULT_DOWNLOAD_INITIATED";
export const AIRDCPP_FILE_DOWNLOAD_COMPLETED =
  "AIRDCPP_FILE_DOWNLOAD_COMPLETED";
export const LS_SINGLE_IMPORT = "LS_SINGLE_IMPORT";
export const AIRDCPP_BUNDLES_FETCHED = "AIRDCPP_BUNDLES_FETCHED";
export const AIRDCPP_DOWNLOAD_PROGRESS_TICK = "AIRDCPP_DOWNLOAD_PROGRESS_TICK";
export const AIRDCPP_SOCKET_CONNECTED = "AIRDCPP_SOCKET_CONNECTED";
export const AIRDCPP_SOCKET_DISCONNECTED = "AIRDCPP_SOCKET_DISCONNECTED";

// Transfers
export const AIRDCPP_TRANSFERS_FETCHED = "AIRDCPP_TRANSFERS_FETCHED";

// Comics marked as "wanted"
export const WANTED_COMICS_FETCHED = "WANTED_COMICS_FETCHED";

// LIBRARY Service import queue-related action types
export const LS_IMPORT = "LS_IMPORT";
export const LS_COVER_EXTRACTED = "LS_COVER_EXTRACTED";
export const LS_COVER_EXTRACTION_FAILED = "LS_COVER_EXTRACTION_FAILED";
export const LS_COMIC_ADDED = "LS_COMIC_ADDED";
export const LS_IMPORT_QUEUE_DRAINED = "LS_IMPORT_QUEUE_DRAINED";
export const LS_SET_QUEUE_STATUS = "LS_SET_QUEUE_STATUS";
export const RESTORE_JOB_COUNTS_AFTER_SESSION_RESTORATION =
  "RESTORE_JOB_COUNTS_AFTER_SESSION_RESTORATION";
export const LS_IMPORT_JOB_STATISTICS_FETCHED =
  "LS_IMPORT_JOB_STATISTICS_FETCHED";

// Settings
export const SETTINGS_CALL_IN_PROGRESS = "SETTINGS_CALL_IN_PROGRESS";
export const SETTINGS_OBJECT_FETCHED = "SETTINGS_OBJECT_FETCHED";
export const SETTINGS_CALL_FAILED = "SETTINGS_CALL_FAILED";
export const SETTINGS_OBJECT_DELETED = "SETTINGS_OBJECT_DELETED";
export const SETTINGS_DB_FLUSH_SUCCESS = "SETTINGS_DB_FLUSH_SUCCESS";

export const SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED = "SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED";

// Metron Metadata
export const METRON_DATA_FETCH_SUCCESS = "METRON_DATA_FETCH_SUCCESS";
export const METRON_DATA_FETCH_IN_PROGRESS = "METRON_DATA_FETCH_IN_PROGRESS";
export const METRON_DATA_FETCH_ERROR = "METRON_DATA_FETCH_ERROR";

// service health statuses
export const LIBRARY_SERVICE_HEALTH = "LIBRARY_SERVICE_HEALTH";
@@ -1,3 +1,23 @@
+/**
+ * @fileoverview API endpoint configuration constants.
+ * Builds URIs for all microservices used by the application.
+ * Supports environment-based configuration via Vite environment variables.
+ * @module constants/endpoints
+ */
+
+/**
+ * Constructs a full URI from protocol, host, port, and path components.
+ *
+ * @param {Record<string, string>} options - URI component options
+ * @param {string} options.protocol - Protocol (http, https, ws, wss)
+ * @param {string} options.host - Hostname or IP address
+ * @param {string} options.port - Port number
+ * @param {string} options.apiPath - API path prefix
+ * @returns {string} Complete URI string
+ * @example
+ * hostURIBuilder({ protocol: "http", host: "localhost", port: "3000", apiPath: "/api" })
+ * // Returns "http://localhost:3000/api"
+ */
 export const hostURIBuilder = (options: Record<string, string>): string => {
   return (
     options.protocol +
@@ -9,6 +29,14 @@ export const hostURIBuilder = (options: Record<string, string>): string => {
   );
 };
 
+// =============================================================================
+// SERVICE ENDPOINT CONSTANTS
+// =============================================================================
+
+/**
+ * CORS proxy server URI for bypassing cross-origin restrictions.
+ * @constant {string}
+ */
 export const CORS_PROXY_SERVER_URI = hostURIBuilder({
   protocol: "http",
   host: import.meta.env.VITE_UNDERLYING_HOSTNAME || "localhost",
@@ -1,8 +1,41 @@
+/**
+ * @fileoverview Adapter functions for transforming GraphQL responses to legacy formats.
+ * Enables gradual migration from REST API to GraphQL while maintaining backward
+ * compatibility with existing components and data structures.
+ * @module graphql/adapters/comicAdapter
+ */
+
 import { GetComicByIdQuery } from '../generated';
 
 /**
  * Adapter to transform GraphQL Comic response to legacy REST API format
  * This allows gradual migration while maintaining compatibility with existing components
+ * @typedef {Object} LegacyComicFormat
+ * @property {string} _id - Comic document ID
+ * @property {Object} rawFileDetails - Original file information
+ * @property {Object} inferredMetadata - Auto-detected metadata from parsing
+ * @property {Object} sourcedMetadata - Metadata from external sources (ComicVine, LOCG, etc.)
+ * @property {Object} acquisition - Download/acquisition tracking data
+ * @property {string} createdAt - ISO timestamp of creation
+ * @property {string} updatedAt - ISO timestamp of last update
+ * @property {Object} __graphql - Original GraphQL response for forward compatibility
  */
+
+/**
+ * Transforms a GraphQL Comic query response to the legacy REST API format.
+ * This adapter enables gradual migration by allowing components to work with
+ * both new GraphQL data and legacy data structures.
+ *
+ * Handles:
+ * - Parsing stringified JSON in sourcedMetadata fields
+ * - Building inferredMetadata from canonical metadata as fallback
+ * - Mapping rawFileDetails to expected structure
+ * - Preserving original GraphQL data for forward compatibility
+ *
+ * @param {GetComicByIdQuery['comic']} graphqlComic - The GraphQL comic response object
+ * @returns {LegacyComicFormat|null} Transformed comic in legacy format, or null if input is null
+ * @example
+ * const { data } = useGetComicByIdQuery({ id: comicId });
+ * const legacyComic = adaptGraphQLComicToLegacy(data?.comic);
+ * // legacyComic now has _id, rawFileDetails, sourcedMetadata, etc.
+ */
 export function adaptGraphQLComicToLegacy(graphqlComic: GetComicByIdQuery['comic']) {
   if (!graphqlComic) return null;
Some files were not shown because too many files have changed in this diff.