Compare commits: dependabot...dep-hell (30 commits)

Commit SHAs: 4b8693fe68, e0a383042e, eb9070966a, 00adbb2c4a, 3ea9b83ed9,
0c363dd8ae, 4514f578ae, 2dc38b6c95, 6deab0b87e, 81f4654b50, 4e53f23e79,
91e99c50d9, 733a453352, 3d88920f39, 0949ebc637, 3e045f4c10, 17db1e64e1,
d7ab553120, 91592019c4, 0e8f63101c, 4e2cad790b, ba1b5bb965, 8546641152,
867935be39, d506cf8ba8, 71d7034d01, a217d447fa, 20336e5569, 8913e9cd99,
c392333170
.agents/skills/caveman-compress/README.md — new file, 163 lines
@@ -0,0 +1,163 @@
<p align="center">
  <img src="https://em-content.zobj.net/source/apple/391/rock_1faa8.png" width="80" />
</p>

<h1 align="center">caveman-compress</h1>

<p align="center">
  <strong>shrink memory file. save token every session.</strong>
</p>

---

A Claude Code skill that compresses your project memory files (`CLAUDE.md`, todos, preferences) into caveman format — so every session loads fewer tokens automatically.

Claude read `CLAUDE.md` on every session start. If file big, cost big. Caveman make file small. Cost go down forever.

## What It Do

```
/caveman:compress CLAUDE.md
```

```
CLAUDE.md          ← compressed (Claude reads this — fewer tokens every session)
CLAUDE.original.md ← human-readable backup (you edit this)
```

Original never lost. You can read and edit `.original.md`. Run skill again to re-compress after edits.

## Benchmarks

Real results on real project files:

| File | Original | Compressed | Saved |
|------|---------:|-----------:|------:|
| `claude-md-preferences.md` | 706 | 285 | **59.6%** |
| `project-notes.md` | 1145 | 535 | **53.3%** |
| `claude-md-project.md` | 1122 | 636 | **43.3%** |
| `todo-list.md` | 627 | 388 | **38.1%** |
| `mixed-with-code.md` | 888 | 560 | **36.9%** |
| **Average** | **898** | **481** | **46%** |

All validations passed ✅ — headings, code blocks, URLs, file paths preserved exactly.

## Before / After

<table>
<tr>
<td width="50%">

### 📄 Original (706 tokens)

> "I strongly prefer TypeScript with strict mode enabled for all new code. Please don't use `any` type unless there's genuinely no way around it, and if you do, leave a comment explaining the reasoning. I find that taking the time to properly type things catches a lot of bugs before they ever make it to runtime."

</td>
<td width="50%">

### 🪨 Caveman (285 tokens)

> "Prefer TypeScript strict mode always. No `any` unless unavoidable — comment why if used. Proper types catch bugs early."

</td>
</tr>
</table>

**Same instructions. 60% fewer tokens. Every. Single. Session.**

## Security

`caveman-compress` is flagged as Snyk High Risk due to subprocess and file I/O patterns detected by static analysis. This is a false positive — see [SECURITY.md](./SECURITY.md) for a full explanation of what the skill does and does not do.

## Install

Compress is built into the `caveman` plugin. Install `caveman` once, then use `/caveman:compress`.

If you need the local files, the compress skill lives at:

```bash
caveman-compress/
```

**Requires:** Python 3.10+

## Usage

```
/caveman:compress <filepath>
```

Examples:

```
/caveman:compress CLAUDE.md
/caveman:compress docs/preferences.md
/caveman:compress todos.md
```

### What files work

| Type | Compress? |
|------|-----------|
| `.md`, `.txt`, `.rst` | ✅ Yes |
| Extensionless natural language | ✅ Yes |
| `.py`, `.js`, `.ts`, `.json`, `.yaml` | ❌ Skip (code/config) |
| `*.original.md` | ❌ Skip (backup files) |

## How It Work

```
/caveman:compress CLAUDE.md
        ↓
detect file type (no tokens)
        ↓
Claude compresses (tokens — one call)
        ↓
validate output (no tokens)
  checks: headings, code blocks, URLs, file paths, bullets
        ↓
if errors: Claude fixes cherry-picked issues only (tokens — targeted fix)
  does NOT recompress — only patches broken parts
        ↓
retry up to 2 times
        ↓
write compressed → CLAUDE.md
write original  → CLAUDE.original.md
```

Only two things use tokens: initial compression + targeted fix if validation fails. Everything else is local Python.
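The validate-and-retry step above can be sketched in a few lines. This is a simplification of the real flow in `scripts/compress.py`; here `call_claude` and `validate` are stand-in callables supplied by the caller, not the real implementations.

```python
MAX_RETRIES = 2

def compress_with_retry(original, call_claude, validate):
    # One full compression call, then up to MAX_RETRIES validate/fix rounds.
    compressed = call_claude("compress", original)
    for attempt in range(MAX_RETRIES):
        errors = validate(original, compressed)
        if not errors:
            return compressed
        if attempt == MAX_RETRIES - 1:
            return None  # give up; the caller restores the original file
        # Targeted fix only: the prompt carries the specific errors,
        # the model patches those spots and leaves the rest alone.
        compressed = call_claude("fix", compressed, errors)
    return None
```

Validation itself is local and free; only the two `call_claude` invocations cost tokens, which is the point of the pipeline above.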

## What Is Preserved

Caveman compress natural language. It never touch:

- Code blocks (` ``` ` fenced or indented)
- Inline code (`` `backtick content` ``)
- URLs and links
- File paths (`/src/components/...`)
- Commands (`npm install`, `git commit`)
- Technical terms, library names, API names
- Headings (exact text preserved)
- Tables (structure preserved, cell text compressed)
- Dates, version numbers, numeric values

## Why This Matter

`CLAUDE.md` loads on **every session start**. A 1000-token project memory file costs tokens every single time you open a project. Over 100 sessions that's 100,000 tokens of overhead — just for context you already wrote.
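That arithmetic, made concrete (the numbers are illustrative, using the ~46% average saving from the benchmark table above):

```python
# Back-of-envelope cost of loading a memory file on every session start.
tokens_per_load = 1000   # example CLAUDE.md size
sessions = 100
savings_rate = 0.46      # average saving from the benchmark table

overhead = tokens_per_load * sessions                          # 100000 tokens uncompressed
compressed_load = round(tokens_per_load * (1 - savings_rate))  # 540 tokens per session
saved = overhead - compressed_load * sessions                  # 46000 tokens over 100 sessions
```

The saving compounds with session count, since the one-time compression cost is paid once while the smaller file is loaded every session.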

Caveman cut that by ~46% on average. Same instructions. Same accuracy. Less waste.

```
┌────────────────────────────────────────────┐
│  TOKEN SAVINGS PER FILE   █████      46%   │
│  SESSIONS THAT BENEFIT    ██████████ 100%  │
│  INFORMATION PRESERVED    ██████████ 100%  │
│  SETUP TIME               █          1x    │
└────────────────────────────────────────────┘
```

## Part of Caveman

This skill is part of the [caveman](https://github.com/JuliusBrussee/caveman) toolkit — making Claude use fewer tokens without losing accuracy.

- **caveman** — make Claude *speak* like caveman (cuts response tokens ~65%)
- **caveman-compress** — make Claude *read* less (cuts context tokens ~46%)
.agents/skills/caveman-compress/SECURITY.md — new file, 31 lines
@@ -0,0 +1,31 @@

# Security

## Snyk High Risk Rating

`caveman-compress` receives a Snyk High Risk rating due to static analysis heuristics. This document explains what the skill does and does not do.

### What triggers the rating

1. **subprocess usage**: The skill calls the `claude` CLI via `subprocess.run()` as a fallback when `ANTHROPIC_API_KEY` is not set. The subprocess call uses a fixed argument list — no shell interpolation occurs. User file content is passed via stdin, not as a shell argument.
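In sketch form, the pattern described is (argument list taken from `scripts/compress.py`):

```python
import subprocess

CLAUDE_ARGS = ["claude", "--print"]  # fixed argv; never built from user input

def call_claude_cli(prompt: str) -> str:
    # shell=False (the default): argv is passed to exec directly, so user
    # file content in `prompt` reaches the CLI via stdin, never via a shell
    # string where it could be interpreted.
    result = subprocess.run(
        CLAUDE_ARGS,
        input=prompt,
        text=True,
        capture_output=True,
        check=True,
    )
    return result.stdout.strip()
```

Because the argv list is a constant and the prompt travels over stdin, there is no path for file content to be interpreted by a shell.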

2. **File read/write**: The skill reads the file the user explicitly points it at, compresses it, and writes the result back to the same path. A `.original.md` backup is saved alongside it. No files outside the user-specified path are read or written.

### What the skill does NOT do

- Does not execute user file content as code
- Does not make network requests except to Anthropic's API (via SDK or CLI)
- Does not access files outside the path the user provides
- Does not use `shell=True` or string interpolation in subprocess calls
- Does not collect or transmit any data beyond the file being compressed

### Auth behavior

If `ANTHROPIC_API_KEY` is set, the skill uses the Anthropic Python SDK directly (no subprocess). If not set, it falls back to the `claude` CLI, which uses the user's existing Claude desktop authentication.
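A minimal sketch of that dispatch (simplified from `call_claude` in `scripts/compress.py`, which also falls back to the CLI if the `anthropic` package is not installed):

```python
import os

def pick_backend() -> str:
    # SDK when an API key is present; otherwise fall back to the claude CLI,
    # which reuses the desktop app's existing authentication.
    return "sdk" if os.environ.get("ANTHROPIC_API_KEY") else "cli"
```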

### File size limit

Files larger than 500KB are rejected before any API call is made.

### Reporting a vulnerability

If you believe you've found a genuine security issue, please open a GitHub issue with the label `security`.
.agents/skills/caveman-compress/SKILL.md — new file, 111 lines
@@ -0,0 +1,111 @@

---
name: caveman-compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. The compression scripts live in `caveman-compress/scripts/` (adjacent to this SKILL.md). If the path is not immediately available, search for `caveman-compress/scripts/__main__.py`.

2. Run:

       cd caveman-compress && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user

## Compression Rules

### Remove

- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"

### Preserve EXACTLY (never modify)

- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure

- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress

- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern

CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code

## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
.agents/skills/caveman-compress/scripts/__init__.py — new file, 9 lines
@@ -0,0 +1,9 @@

"""Caveman compress scripts.

This package provides tools to compress natural language markdown files
into caveman format to save input tokens.
"""

__all__ = ["cli", "compress", "detect", "validate"]

__version__ = "1.0.0"
.agents/skills/caveman-compress/scripts/__main__.py — new file, 3 lines
@@ -0,0 +1,3 @@

from .cli import main

main()
.agents/skills/caveman-compress/scripts/benchmark.py — new file, 78 lines
@@ -0,0 +1,78 @@

#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken
    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
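The word-count fallback in `count_tokens` is easy to exercise by hand. A standalone sketch (the function is re-implemented here rather than imported, so it runs without `tiktoken` or the package on the path):

```python
def count_tokens(text: str) -> int:
    # Word-count fallback used when tiktoken is not installed.
    return len(text.split())

# Saved-percentage arithmetic matching benchmark_pair, on the SKILL.md
# example pair.
orig = "You should always make sure to run the test suite before pushing"
comp = "Run tests before push"
saved = 100 * (count_tokens(orig) - count_tokens(comp)) / count_tokens(orig)
```

Word counts understate real tokenizer counts, but the relative saving they report is usually in the same ballpark, which is all the benchmark table needs.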
.agents/skills/caveman-compress/scripts/cli.py — new file, 73 lines
@@ -0,0 +1,73 @@

#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)
    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original: {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
.agents/skills/caveman-compress/scripts/compress.py — new file, 227 lines
@@ -0,0 +1,227 @@

#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

from .detect import should_compress
from .validate import validate

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)

MAX_RETRIES = 2


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. Override is
    # intentional: the user must rename the file if the heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️ Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
.agents/skills/caveman-compress/scripts/detect.py — new file, 121 lines
@@ -0,0 +1,121 @@

#!/usr/bin/env python3
"""Detect whether a file is natural language (compressible) or code/config (skip)."""

import json
import re
from pathlib import Path

# Extensions that are natural language and compressible
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}

# Extensions that are code/config and should be skipped
SKIP_EXTENSIONS = {
    ".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
    ".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
    ".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
    ".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
    ".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
}

# Patterns that indicate a line is code
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
    re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
    re.compile(r"^\s*[\}\]\);]+\s*$"),  # closing braces/brackets
    re.compile(r"^\s*@\w+"),  # decorators/annotations
    re.compile(r'^\s*"[^"]+"\s*:\s*'),  # JSON-like key-value
    re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"),  # assignment with literal
]


def _is_code_line(line: str) -> bool:
    """Check if a line looks like code."""
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (like TODO, NOTES) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]

        if _is_json_content(text[:10000]):
            return "config"
        if _is_yaml_content(lines):
            return "config"

        code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
        non_empty = sum(1 for l in lines if l.strip())
        if non_empty > 0 and code_lines / non_empty > 0.4:
            return "code"

        return "natural_language"

    # Unrecognized extension
    return "unknown"


def should_compress(filepath: Path) -> bool:
    """Return True if the file is natural language and should be compressed."""
    if not filepath.is_file():
        return False
    # Skip backup files
    if filepath.name.endswith(".original.md"):
        return False
    return detect_file_type(filepath) == "natural_language"


if __name__ == "__main__":
    import sys

    if len(sys.argv) < 2:
        print("Usage: python detect.py <file1> [file2] ...")
        sys.exit(1)

    for path_str in sys.argv[1:]:
        p = Path(path_str).resolve()
        file_type = detect_file_type(p)
        compress = should_compress(p)
        print(f"  {p.name:30s} type={file_type:20s} compress={compress}")
189  .agents/skills/caveman-compress/scripts/validate.py  Normal file
@@ -0,0 +1,189 @@
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# Crude but effective path detection.
# Requires either a path prefix (./ ../ / or drive letter) or a slash/backslash within the match.
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: closing
    fence must use same char and be at least as long as opening). Supports
    nested fences (e.g. an outer 4-backtick block wrapping inner 3-backtick
    content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
59  .agents/skills/caveman-help/SKILL.md  Normal file
@@ -0,0 +1,59 @@
---
name: caveman-help
description: >
  Quick-reference card for all caveman modes, skills, and commands.
  One-shot display, not a persistent mode. Trigger: /caveman-help,
  "caveman help", "what caveman commands", "how do I use caveman".
---

# Caveman Help

Display this reference card when invoked. One-shot — do NOT change mode, write flag files, or persist anything. Output in caveman style.

## Modes

| Mode | Trigger | What change |
|------|---------|-------------|
| **Lite** | `/caveman lite` | Drop filler. Keep sentence structure. |
| **Full** | `/caveman` | Drop articles, filler, pleasantries, hedging. Fragments OK. Default. |
| **Ultra** | `/caveman ultra` | Extreme compression. Bare fragments. Tables over prose. |
| **Wenyan-Lite** | `/caveman wenyan-lite` | Classical Chinese style, light compression. |
| **Wenyan-Full** | `/caveman wenyan` | Full 文言文 (Classical Chinese). Maximum classical terseness. |
| **Wenyan-Ultra** | `/caveman wenyan-ultra` | Extreme. Ancient scholar on a budget. |

Mode stick until changed or session end.

## Skills

| Skill | Trigger | What it do |
|-------|---------|-----------|
| **caveman-commit** | `/caveman-commit` | Terse commit messages. Conventional Commits. ≤50 char subject. |
| **caveman-review** | `/caveman-review` | One-line PR comments: `L42: bug: user null. Add guard.` |
| **caveman-compress** | `/caveman:compress <file>` | Compress .md files to caveman prose. Saves ~46% input tokens. |
| **caveman-help** | `/caveman-help` | This card. |

## Deactivate

Say "stop caveman" or "normal mode". Resume anytime with `/caveman`.

## Configure Default Mode

Default mode = `full`. Change it:

**Environment variable** (highest priority):
```bash
export CAVEMAN_DEFAULT_MODE=ultra
```

**Config file** (`~/.config/caveman/config.json`):
```json
{ "defaultMode": "lite" }
```

Set `"off"` to disable auto-activation on session start. User can still activate manually with `/caveman`.

Resolution: env var > config file > `full`.
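Resolution order, sketched in Python. Illustrative only — `resolve_default_mode` and the `VALID_MODES` set are hypothetical names, not the shipped implementation:

```python
import json
import os
from pathlib import Path

VALID_MODES = {"lite", "full", "ultra", "wenyan-lite", "wenyan-full", "wenyan-ultra", "off"}


def resolve_default_mode(env=None, config_path=Path.home() / ".config/caveman/config.json"):
    """Resolve the default mode: env var > config file > 'full'."""
    env = os.environ if env is None else env
    mode = env.get("CAVEMAN_DEFAULT_MODE")
    if mode in VALID_MODES:
        return mode
    try:
        # Config file is second priority; ignore it if missing or malformed.
        mode = json.loads(config_path.read_text()).get("defaultMode")
    except (OSError, ValueError):
        mode = None
    return mode if mode in VALID_MODES else "full"
```

Unknown values fall through to the next source, so a typo in the env var still lands on a valid mode.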
## More

Full docs: https://github.com/JuliusBrussee/caveman
67  .agents/skills/caveman/SKILL.md  Normal file
@@ -0,0 +1,67 @@
---
name: caveman
description: >
  Ultra-compressed communication mode. Cuts token usage ~75% by speaking like caveman
  while keeping full technical accuracy. Supports intensity levels: lite, full (default), ultra,
  wenyan-lite, wenyan-full, wenyan-ultra.
  Use when user says "caveman mode", "talk like caveman", "use caveman", "less tokens",
  "be brief", or invokes /caveman. Also auto-triggers when token efficiency is requested.
---

Respond terse like smart caveman. All technical substance stay. Only fluff die.

## Persistence

ACTIVE EVERY RESPONSE. No revert after many turns. No filler drift. Still active if unsure. Off only: "stop caveman" / "normal mode".

Default: **full**. Switch: `/caveman lite|full|ultra`.

## Rules

Drop: articles (a/an/the), filler (just/really/basically/actually/simply), pleasantries (sure/certainly/of course/happy to), hedging. Fragments OK. Short synonyms (big not extensive, fix not "implement a solution for"). Technical terms exact. Code blocks unchanged. Errors quoted exact.

Pattern: `[thing] [action] [reason]. [next step].`

Not: "Sure! I'd be happy to help you with that. The issue you're experiencing is likely caused by..."
Yes: "Bug in auth middleware. Token expiry check use `<` not `<=`. Fix:"

## Intensity

| Level | What change |
|-------|------------|
| **lite** | No filler/hedging. Keep articles + full sentences. Professional but tight |
| **full** | Drop articles, fragments OK, short synonyms. Classic caveman |
| **ultra** | Abbreviate (DB/auth/config/req/res/fn/impl), strip conjunctions, arrows for causality (X → Y), one word when one word enough |
| **wenyan-lite** | Semi-classical. Drop filler/hedging but keep grammar structure, classical register |
| **wenyan-full** | Maximum classical terseness. Fully 文言文 (Classical Chinese). 80-90% character reduction. Classical sentence patterns, verbs precede objects, subjects often omitted, classical particles (之/乃/為/其) |
| **wenyan-ultra** | Extreme abbreviation while keeping classical Chinese feel. Maximum compression, ultra terse |

Example — "Why React component re-render?"
- lite: "Your component re-renders because you create a new object reference each render. Wrap it in `useMemo`."
- full: "New object ref each render. Inline object prop = new ref = re-render. Wrap in `useMemo`."
- ultra: "Inline obj prop → new ref → re-render. `useMemo`."
- wenyan-lite: "組件頻重繪,以每繪新生對象參照故。以 useMemo 包之。"
- wenyan-full: "物出新參照,致重繪。useMemo 包之。"
- wenyan-ultra: "新參照→重繪。useMemo 包。"

Example — "Explain database connection pooling."
- lite: "Connection pooling reuses open connections instead of creating new ones per request. Avoids repeated handshake overhead."
- full: "Pool reuse open DB connections. No new connection per request. Skip handshake overhead."
- ultra: "Pool = reuse DB conn. Skip handshake → fast under load."
- wenyan-full: "池reuse open connection。不每req新開。skip handshake overhead。"
- wenyan-ultra: "池reuse conn。skip handshake → fast。"

## Auto-Clarity

Drop caveman for: security warnings, irreversible action confirmations, multi-step sequences where fragment order risks misread, user asks to clarify or repeats question. Resume caveman after clear part done.

Example — destructive op:
> **Warning:** This will permanently delete the `users` table and all its rows, and cannot be undone.
> ```sql
> DROP TABLE users;
> ```
> Caveman resume. Verify backup exist first.

## Boundaries

Code/commits/PRs: write normal. "stop caveman" or "normal mode": revert. Level persist until changed or session end.
111  .agents/skills/compress/SKILL.md  Normal file
@@ -0,0 +1,111 @@
---
name: compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. This SKILL.md lives alongside `scripts/` in the same directory. Find that directory.

2. Run:

       cd <directory_containing_this_SKILL.md> && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user
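The validate/fix/retry flow in step 3, sketched as plain control flow. Illustrative only — `compress`, `validate`, and `fix` stand in for the Claude calls and the validator that live in `scripts/`:

```python
def compress_with_retries(original, compress, validate, fix, max_retries=2):
    """Compress once, then validate; on errors, apply targeted fixes and
    re-validate up to max_retries times. Returns the compressed text, or
    None so the caller can leave the original file untouched."""
    compressed = compress(original)
    for attempt in range(max_retries):
        errors = validate(original, compressed)
        if not errors:
            return compressed  # success: caller overwrites the file
        if attempt == max_retries - 1:
            return None  # give up: report error, keep original
        # Cherry-pick fixes only — no full recompression.
        compressed = fix(original, compressed, errors)
    return None
```

Note the fix step never recompresses: it only restores what validation flagged as lost.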
## Compression Rules

### Remove
- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"

### Preserve EXACTLY (never modify)
- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure
- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress
- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern

CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code
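One way to enforce read-only code regions: split the file into prose and code segments before compressing, then rewrite only the prose. Sketch only — `split_prose_and_code` is an illustrative helper, not part of the shipped scripts:

```python
import re

# Matches an opening or closing fence line (``` or ~~~, up to 3 leading spaces).
FENCE = re.compile(r"^\s{0,3}(`{3,}|~{3,})")


def split_prose_and_code(markdown: str):
    """Return [("prose", text), ("code", text), ...] segments in order.
    Code segments include their fence lines and are treated as read-only."""
    segments, buf, kind, fence = [], [], "prose", None
    for line in markdown.split("\n"):
        m = FENCE.match(line)
        if kind == "prose" and m:
            if buf:
                segments.append(("prose", "\n".join(buf)))
            buf, kind, fence = [line], "code", m.group(1)
        elif kind == "code" and m and m.group(1)[0] == fence[0] and len(m.group(1)) >= len(fence):
            # Closing fence: same char, at least as long as the opener.
            buf.append(line)
            segments.append(("code", "\n".join(buf)))
            buf, kind, fence = [], "prose", None
        else:
            buf.append(line)
    if buf:
        segments.append((kind, "\n".join(buf)))
    return segments
```

A compressor can then map over the segments, rewriting `"prose"` parts and passing `"code"` parts through untouched, which also prevents sections on either side of a block from being merged.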
## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
9  .agents/skills/compress/scripts/__init__.py  Normal file
@@ -0,0 +1,9 @@
"""Caveman compress scripts.
|
||||
|
||||
This package provides tools to compress natural language markdown files
|
||||
into caveman format to save input tokens.
|
||||
"""
|
||||
|
||||
__all__ = ["cli", "compress", "detect", "validate"]
|
||||
|
||||
__version__ = "1.0.0"
|
||||
3  .agents/skills/compress/scripts/__main__.py  Normal file
@@ -0,0 +1,3 @@
from .cli import main

main()
78  .agents/skills/compress/scripts/benchmark.py  Normal file
@@ -0,0 +1,78 @@
#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken
    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
73  .agents/skills/compress/scripts/cli.py  Normal file
@@ -0,0 +1,73 @@
#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)

    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original: {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
227  .agents/skills/compress/scripts/compress.py  Normal file
@@ -0,0 +1,227 @@
#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

from .detect import should_compress
from .validate import validate

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)

MAX_RETRIES = 2


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. There is
    # deliberately no override flag: the user must rename the file if the
    # heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️ Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
121  .agents/skills/compress/scripts/detect.py  Normal file
@@ -0,0 +1,121 @@
#!/usr/bin/env python3
"""Detect whether a file is natural language (compressible) or code/config (skip)."""

import json
import re
from pathlib import Path

# Extensions that are natural language and compressible
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}

# Extensions that are code/config and should be skipped
SKIP_EXTENSIONS = {
    ".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
    ".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
    ".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
    ".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
    ".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
}

# Patterns that indicate a line is code
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
    re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
    re.compile(r"^\s*[\}\]\);]+\s*$"),  # closing braces/brackets
    re.compile(r"^\s*@\w+"),  # decorators/annotations
    re.compile(r'^\s*"[^"]+"\s*:\s*'),  # JSON-like key-value
    re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"),  # assignment with literal
]


def _is_code_line(line: str) -> bool:
    """Check if a line looks like code."""
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (like TODO, NOTES) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]

        if _is_json_content(text[:10000]):
            return "config"
        if _is_yaml_content(lines):
            return "config"

        code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
        non_empty = sum(1 for l in lines if l.strip())
        if non_empty > 0 and code_lines / non_empty > 0.4:
            return "code"

        return "natural_language"

    return "unknown"


def should_compress(filepath: Path) -> bool:
    """Return True if the file is natural language and should be compressed."""
    if not filepath.is_file():
        return False
    # Skip backup files
    if filepath.name.endswith(".original.md"):
        return False
    return detect_file_type(filepath) == "natural_language"


if __name__ == "__main__":
    import sys

    if len(sys.argv) < 2:
        print("Usage: python detect.py <file1> [file2] ...")
        sys.exit(1)

    for path_str in sys.argv[1:]:
        p = Path(path_str).resolve()
        file_type = detect_file_type(p)
        compress = should_compress(p)
        print(f"  {p.name:30s} type={file_type:20s} compress={compress}")
189
.agents/skills/compress/scripts/validate.py
Normal file
@@ -0,0 +1,189 @@
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# Crude but effective path detection.
# Requires either a path prefix (./ ../ / or drive letter) or a slash/backslash within the match
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: closing
    fence must use same char and be at least as long as opening). Supports
    nested fences (e.g. an outer 4-backtick block wrapping inner 3-backtick
    content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
1
.claude/skills/caveman
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman
1
.claude/skills/caveman-compress
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman-compress
1
.claude/skills/caveman-help
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/caveman-help
1
.claude/skills/compress
Symbolic link
@@ -0,0 +1 @@
../../.agents/skills/compress
379
.claude/skills/jsdoc/SKILL.md
Normal file
@@ -0,0 +1,379 @@
---
name: jsdoc
description: Commenting and documentation guidelines. Auto-activate when the user discusses comments, documentation, docstrings, code clarity, API docs, JSDoc, or asks about commenting strategies.
---

Auto-activate when: User discusses comments, documentation, docstrings, code clarity, code quality, API docs, JSDoc, Python docstrings, or asks about commenting strategies.

## Core Principle

Write code that speaks for itself. Comment only when necessary to explain WHY, not WHAT.

Most code does not need comments. Well-written code with clear naming and structure is self-documenting.

The best comment is the one you don't need to write because the code is already obvious.

## The Commenting Philosophy

### When to Comment

✅ DO comment when explaining:

- WHY something is done (business logic, design decisions)
- Complex algorithms and their reasoning
- Non-obvious trade-offs or constraints
- Workarounds for bugs or limitations
- API contracts and public interfaces
- Regex patterns and what they match
- Performance considerations or optimizations
- Constants and magic numbers
- Gotchas or surprising behaviors

❌ DON'T comment when:

- The code is obvious and self-explanatory
- The comment repeats the code (redundant)
- Better naming would eliminate the need
- The comment would become outdated quickly
- It's decorative or organizational noise
- It states what a standard language construct does

## Comment Anti-Patterns

### ❌ 1. Obvious Comments

BAD:

```python
counter = 0  # Initialize counter to zero
counter += 1  # Increment counter by one
user_name = input("Enter name: ")  # Get user name from input
```

Better: No comment needed - the code is self-explanatory.

### ❌ 2. Redundant Comments

BAD:

```python
def get_user_name(user):
    return user.name  # Return the user's name

def calculate_total(items):
    # Loop through items and sum the prices
    total = 0
    for item in items:
        total += item.price
    return total
```

Better:

```python
def get_user_name(user):
    return user.name

def calculate_total(items):
    return sum(item.price for item in items)
```

### ❌ 3. Outdated Comments

BAD:

```python
# Calculate tax at 5% rate
tax = price * 0.08  # Actually 8%, comment is wrong

# DEPRECATED: Use new_api_function() instead
def old_function():  # Still being used, comment is misleading
    pass
```

Better: Keep comments in sync with code, or remove them entirely.

### ❌ 4. Noise Comments

BAD:

```python
# Start of function
def calculate():
    # Declare variable
    result = 0
    # Return result
    return result
# End of function
```

Better: Remove all of these comments.

### ❌ 5. Dead Code & Changelog Comments

BAD:

```python
# Don't comment out code - use version control
# def old_function():
#     return "deprecated"

# Don't maintain history in comments
# Modified by John on 2023-01-15
# Fixed bug reported by Sarah on 2023-02-03
```

Better: Delete the code. Git has the history.

## Good Comment Examples

### ✅ Complex Business Logic

```python
# Apply progressive tax brackets: 10% up to $10k, 20% above
# This matches IRS publication 501 for 2024
def calculate_progressive_tax(income):
    if income <= 10000:
        return income * 0.10
    else:
        return 1000 + (income - 10000) * 0.20
```

### ✅ Non-obvious Algorithms

```python
# Using Floyd-Warshall for all-pairs shortest paths
# because we need distances between all nodes.
# Time: O(n³), Space: O(n²)
for k in range(vertices):
    for i in range(vertices):
        for j in range(vertices):
            dist[i][j] = min(dist[i][j], dist[i][k] + dist[k][j])
```

### ✅ Regex Patterns

```python
# Match email format: username@domain.extension
# Allows letters, numbers, dots, hyphens in username
# Requires valid domain and 2+ char extension
email_pattern = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
```

### ✅ API Constraints or Gotchas

```python
# GitHub API rate limit: 5000 requests/hour for authenticated users
# We implement exponential backoff to handle rate limiting
await rate_limiter.wait()
response = await fetch(github_api_url)
```

### ✅ Workarounds for Bugs

```python
# HACK: Workaround for bug in library v2.1.0
# Remove after upgrading to v2.2.0
# See: https://github.com/library/issues/123
if library_version == "2.1.0":
    apply_workaround()
```

## Decision Framework

Before writing a comment, ask yourself:

Step 1: Is the code self-explanatory?

- If YES → No comment needed
- If NO → Continue to step 2

Step 2: Would a better variable/function name eliminate the need?

- If YES → Refactor the code instead
- If NO → Continue to step 3

Step 3: Does this explain WHY, not WHAT?

- If explaining WHAT → Refactor code to be clearer
- If explaining WHY → Good comment candidate

Step 4: Will this help future maintainers?

- If YES → Write the comment
- If NO → Skip it
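Applied to a concrete case, the framework plays out like this (a minimal sketch; the status code and retention rule are illustrative, not part of this guide):

```python
from dataclasses import dataclass

SUSPENDED = 2  # illustrative status code

@dataclass
class User:
    status: int
    days_suspended: int = 0

# Steps 1-2: the WHAT comment "# check if user is suspended"
# becomes a named function, so the comment disappears.
def is_suspended(user: User) -> bool:
    return user.status == SUSPENDED

# Steps 3-4: the surviving comment explains WHY, which helps maintainers.
# Suspended users keep read access for 30 days (hypothetical retention rule).
def can_read(user: User) -> bool:
    return not is_suspended(user) or user.days_suspended <= 30
```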
## Special Cases for Comments

### Public APIs and Docstrings

#### Python Docstrings

```python
def calculate_compound_interest(
    principal: float,
    rate: float,
    time: int,
    compound_frequency: int = 1
) -> float:
    """
    Calculate compound interest using the standard formula.

    Args:
        principal: Initial amount invested
        rate: Annual interest rate as decimal (e.g., 0.05 for 5%)
        time: Time period in years
        compound_frequency: Times per year interest compounds (default: 1)

    Returns:
        Final amount after compound interest

    Raises:
        ValueError: If any parameter is negative

    Example:
        >>> calculate_compound_interest(1000, 0.05, 10)
        1628.89
    """
    if principal < 0 or rate < 0 or time < 0:
        raise ValueError("Parameters must be non-negative")

    # Compound interest formula: A = P(1 + r/n)^(nt)
    return principal * (1 + rate / compound_frequency) ** (compound_frequency * time)
```

#### JavaScript/TypeScript JSDoc

```javascript
/**
 * Fetch user data from the API.
 *
 * @param {string} userId - The unique user identifier
 * @param {Object} options - Configuration options
 * @param {boolean} options.includeProfile - Include profile data (default: true)
 * @param {number} options.timeout - Request timeout in ms (default: 5000)
 *
 * @returns {Promise<User>} User object with requested fields
 *
 * @throws {Error} If userId is invalid or request fails
 *
 * @example
 * const user = await fetchUser('123', { includeProfile: true });
 */
async function fetchUser(userId, options = {}) {
  // Implementation
}
```

### Constants and Configuration

```python
# Based on network reliability studies (95th percentile)
MAX_RETRIES = 3

# AWS Lambda timeout is 15s, leaving 5s buffer for cleanup
API_TIMEOUT = 10000  # milliseconds

# Cache duration optimized for balance between freshness and load
# See: docs/performance-tuning.md
CACHE_TTL = 300  # 5 minutes
```

### Annotations for TODOs and Warnings

```python
# TODO: Replace with proper authentication after security review
# Issue: #456
def temporary_auth(user):
    return True

# WARNING: This function modifies the original array instead of creating a copy
def sort_in_place(arr):
    arr.sort()
    return arr

# FIXME: Memory leak in production - investigate connection pooling
# Ticket: JIRA-789
def get_connection():
    return create_connection()

# PERF: Consider caching this result if called frequently in hot path
def expensive_calculation(data):
    return complex_algorithm(data)

# SECURITY: Validate input to prevent SQL injection before using in query
def build_query(user_input):
    sanitized = escape_sql(user_input)
    return f"SELECT * FROM users WHERE name = '{sanitized}'"
```

### Common Annotation Keywords

- `TODO:` - Work that needs to be done
- `FIXME:` - Known bugs that need fixing
- `HACK:` - Temporary workarounds
- `NOTE:` - Important information or context
- `WARNING:` - Critical information about usage
- `PERF:` - Performance considerations
- `SECURITY:` - Security-related notes
- `BUG:` - Known bug documentation
- `REFACTOR:` - Code that needs refactoring
- `DEPRECATED:` - Soon-to-be-removed code

## Refactoring Over Commenting

### Instead of Commenting Complex Code...

BAD: Complex code with comment

```python
# Check if user is admin or has special permissions
if user.role == "admin" or (user.permissions and "special" in user.permissions):
    grant_access()
```

### ...Extract to Named Function

GOOD: Self-explanatory through naming

```python
def user_has_admin_access(user):
    return user.role == "admin" or has_special_permission(user)

def has_special_permission(user):
    return user.permissions and "special" in user.permissions

if user_has_admin_access(user):
    grant_access()
```

## Language-Specific Examples

### JavaScript

```javascript
// Good: Explains WHY we debounce
// Debounce search to reduce API calls (500ms wait after last keystroke)
const debouncedSearch = debounce(searchAPI, 500);

// Bad: Obvious
let count = 0; // Initialize count to zero
count++; // Increment count

// Good: Explains algorithm choice
// Using Set for O(1) lookup instead of Array.includes() which is O(n)
const seen = new Set(ids);
```

### Python

```python
# Good: Explains the algorithm choice
# Using binary search because data is sorted and we need O(log n) performance
index = bisect.bisect_left(sorted_list, target)

# Bad: Redundant
def get_total(items):
    return sum(items)  # Return the sum of items

# Good: Explains why we're doing this
# Extract to separate function for type checking in mypy
def validate_user(user):
    if not user or not user.id:
        raise ValueError("Invalid user")
    return user
```

### TypeScript

```typescript
// Good: Explains the type assertion
// TypeScript can't infer this is never null after the check
const element = document.getElementById('app') as HTMLElement;

// Bad: Obvious
const sum = a + b; // Add a and b

// Good: Explains non-obvious behavior
// spread operator creates shallow copy; use JSON for deep copy
const newConfig = { ...config };
```

## Comment Quality Checklist

Before committing, ensure your comments:

- Explain WHY, not WHAT
- Are grammatically correct and clear
- Will remain accurate as code evolves
- Add genuine value to code understanding
- Are placed appropriately (above the code they describe)
- Use proper spelling and professional language
- Follow team conventions for annotation keywords
- Could not be replaced by better naming or structure
- Are not obvious statements about language features
- Reference tickets/issues when applicable

## Summary

Priority order:

1. Clear code - Self-explanatory through naming and structure
2. Good comments - Explain WHY when necessary
3. Documentation - API docs, docstrings for public interfaces
4. No comments - Better than bad comments that lie or clutter

Remember: Comments are a failure to make the code self-explanatory. Use them sparingly and wisely.

### Key Takeaways

| Goal | Approach |
|------|----------|
| Reduce comments | Improve naming, extract functions, simplify logic |
| Improve clarity | Use self-explanatory code structure, clear variable names |
| Document APIs | Use docstrings/JSDoc for public interfaces |
| Explain WHY | Comment only business logic, algorithms, workarounds |
| Maintain accuracy | Update comments when code changes, or remove them |
360
.claude/skills/typescript/SKILL.md
Normal file
@@ -0,0 +1,360 @@
---
name: typescript
description: TypeScript engineering guidelines based on Google's style guide. Use when writing, reviewing, or refactoring TypeScript code in this project.
---

Comprehensive guidelines for writing production-quality TypeScript based on Google's TypeScript Style Guide.

## Naming Conventions

| Type | Convention | Example |
|------|------------|---------|
| Classes, Interfaces, Types, Enums | UpperCamelCase | `UserService`, `HttpClient` |
| Variables, Parameters, Functions | lowerCamelCase | `userName`, `processData` |
| Global Constants, Enum Values | CONSTANT_CASE | `MAX_RETRIES`, `Status.ACTIVE` |
| Type Parameters | Single letter or UpperCamelCase | `T`, `ResponseType` |

### Naming Principles

- Descriptive names, avoid ambiguous abbreviations
- Treat acronyms as words: `loadHttpUrl` not `loadHTTPURL`
- No prefixes like `opt_` for optional parameters
- No trailing underscores for private properties
- Single-letter variables only when scope is <10 lines

## Variable Declarations

```typescript
// Always use const by default
const users = getUsers();

// Use let only when reassignment is needed
let count = 0;
count++;

// Never use var
// var x = 1; // WRONG

// One variable per declaration
const a = 1;
const b = 2;
// const a = 1, b = 2; // WRONG
```

## Types and Interfaces

### Prefer Type Aliases Over Interfaces

```typescript
// Good: type alias for object shapes
type User = {
  id: string;
  name: string;
  email?: string;
};

// Avoid: interface for object shapes
// interface User {
//   id: string;
//   name: string;
// }

// Type aliases work for everything: objects, unions, intersections, mapped types
type Status = 'active' | 'inactive';
type Combined = TypeA & TypeB;
type Handler = (event: Event) => void;

// Benefits of types over interfaces:
// 1. Consistent syntax for all type definitions
// 2. Cannot be merged/extended unexpectedly (no declaration merging)
// 3. Better for union types and computed properties
// 4. Works with utility types more naturally
```

### Type Inference

Leverage inference for trivially inferred types:

```typescript
// Good: inference is clear
const name = 'Alice';
const items = [1, 2, 3];

// Good: explicit for complex expressions
const result: ProcessedData = complexTransformation(input);
```

### Array Types

```typescript
// Simple types: use T[]
let numbers: number[];
let names: readonly string[];

// Multi-dimensional: use T[][]
let matrix: number[][];

// Complex types: use Array<T>
let handlers: Array<(event: Event) => void>;
```

### Null and Undefined

```typescript
// Prefer optional fields over union with undefined
interface Config {
  timeout?: number; // Good
  // timeout: number | undefined; // Avoid
}

// Type aliases must NOT include |null or |undefined
type UserId = string; // Good
// type UserId = string | null; // WRONG

// May use == for null comparison (catches both null and undefined)
if (value == null) {
  // handles both null and undefined
}
```

### Types to Avoid

```typescript
// Avoid any - use unknown instead
function parse(input: unknown): Data { }

// Avoid {} - use unknown, Record<string, T>, or object
function process(obj: Record<string, unknown>): void { }

// Use lowercase primitives
let name: string; // Good
// let name: String; // WRONG

// Never use wrapper objects
// new String('hello') // WRONG
```

## Classes

### Structure

```typescript
class UserService {
  // Fields first, initialized where declared
  private readonly cache = new Map<string, User>();
  private lastAccess: Date | null = null;

  // Constructor with parameter properties
  constructor(
    private readonly api: ApiClient,
    private readonly logger: Logger,
  ) {}

  // Methods separated by blank lines
  async getUser(id: string): Promise<User> {
    // ...
  }

  private validateId(id: string): boolean {
    // ...
  }
}
```

### Visibility

```typescript
class Example {
  // private by default, only use public when needed externally
  private internalState = 0;

  // readonly for properties never reassigned after construction
  readonly id: string;

  // Never use #private syntax - use TypeScript visibility
  // #field = 1; // WRONG
  private field = 1; // Good
}
```

### Avoid Arrow Functions as Properties

```typescript
class Handler {
  // Avoid: arrow function as property
  // handleClick = () => { ... };

  // Good: instance method
  handleClick(): void {
    // ...
  }
}

// Bind at call site if needed
element.addEventListener('click', () => handler.handleClick());
```

### Static Methods

- Never use `this` in static methods
- Call on defining class, not subclasses
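A short sketch of the static-method rule (the `Registry` class here is illustrative, not from the guide):

```typescript
class Registry {
  private static items: string[] = [];

  static add(item: string): void {
    // Good: reference the defining class by name, never `this`
    Registry.items.push(item);
  }

  static count(): number {
    return Registry.items.length;
  }
}

// Call statics on the defining class, not through a subclass
Registry.add('a');
Registry.add('b');
console.log(Registry.count()); // 2
```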
Functions
|
||||
Prefer Function Declarations
|
||||
|
||||
// Good: function declaration for named functions
|
||||
function processData(input: Data): Result {
|
||||
return transform(input);
|
||||
}
|
||||
|
||||
// Arrow functions when type annotation needed
|
||||
const handler: EventHandler = (event) => {
|
||||
// ...
|
||||
};
|
||||
|
||||
Arrow Function Bodies
|
||||
|
||||
// Concise body only when return value is used
|
||||
const double = (x: number) => x * 2;
|
||||
|
||||
// Block body when return should be void
|
||||
const log = (msg: string) => {
|
||||
console.log(msg);
|
||||
};
|
||||
|
||||
Parameters
|
||||
|
||||
// Use rest parameters, not arguments
|
||||
function sum(...numbers: number[]): number {
|
||||
return numbers.reduce((a, b) => a + b, 0);
|
||||
}
|
||||
|
||||
// Destructuring for multiple optional params
|
||||
interface Options {
|
||||
timeout?: number;
|
||||
retries?: number;
|
||||
}
|
||||
function fetch(url: string, { timeout = 5000, retries = 3 }: Options = {}) {
|
||||
// ...
|
||||
}
|
||||
|
||||
// Never name a parameter 'arguments'
|
||||
|
||||
Imports and Exports
|
||||
Always Use Named Exports
|
||||
|
||||
// Good: named exports
|
||||
export function processData() { }
|
||||
export class UserService { }
|
||||
export interface Config { }
|
||||
|
||||
// Never use default exports
|
||||
// export default class UserService { } // WRONG
|
||||
|
||||
Import Styles
|
||||
|
||||
// Module import for large APIs
|
||||
import * as fs from 'fs';
|
||||
|
||||
// Named imports for frequently used symbols
|
||||
import { readFile, writeFile } from 'fs/promises';
|
||||
|
||||
// Type-only imports when only used as types
|
||||
import type { User, Config } from './types';
|
||||
|
||||
Module Organization
|
||||
|
||||
Use modules, never namespace Foo { }
|
||||
Never use require() - use ES6 imports
|
||||
Use relative imports within same project
|
||||
Avoid excessive ../../../
|
||||
### Control Structures

#### Always Use Braces

```typescript
// Good
if (condition) {
  doSomething();
}

// Exception: single-line if
if (condition) return early;
```

#### Loops

```typescript
// Prefer for...of for arrays
for (const item of items) {
  process(item);
}

// Use Object methods with for...of for objects
for (const [key, value] of Object.entries(obj)) {
  // ...
}

// Never use unfiltered for...in on arrays
```

#### Equality

```typescript
// Always use === and !==
if (a === b) { }

// Exception: == null catches both null and undefined
if (value == null) { }
```

#### Switch Statements

```typescript
switch (status) {
  case Status.Active:
    handleActive();
    break;
  case Status.Inactive:
    handleInactive();
    break;
  default:
    // Always include default, even if empty
    break;
}
```

#### Exception Handling

```typescript
// Always throw Error instances
throw new Error('Something went wrong');
// throw 'error'; // WRONG

// Catch with unknown type
try {
  riskyOperation();
} catch (e: unknown) {
  if (e instanceof Error) {
    logger.error(e.message);
  }
  throw e;
}

// Empty catch needs justification comment
try {
  optional();
} catch {
  // Intentionally ignored: fallback behavior handles this
}
```

#### Type Assertions

```typescript
// Use 'as' syntax, not angle brackets
const input = value as string;
// const input = <string>value; // WRONG in TSX, avoid everywhere

// Double assertion through unknown when needed
const config = (rawData as unknown) as Config;

// Add comment explaining why assertion is safe
const element = document.getElementById('app') as HTMLElement;
// Safe: element exists in index.html
```

#### Strings

```typescript
// Use single quotes for string literals
const name = 'Alice';

// Template literals for interpolation or multiline
const message = `Hello, ${name}!`;
const query = `
  SELECT *
  FROM users
  WHERE id = ?
`;

// Never use backslash line continuations
```

### Disallowed Features

| Feature | Alternative |
|---------|-------------|
| `var` | `const` or `let` |
| `Array()` constructor | `[]` literal |
| `Object()` constructor | `{}` literal |
| `any` type | `unknown` |
| `namespace` | modules |
| `require()` | `import` |
| Default exports | Named exports |
| `#private` fields | `private` modifier |
| `eval()` | Never use |
| `const enum` | Regular `enum` |
| `debugger` | Remove before commit |
| `with` | Never use |
| Prototype modification | Never modify |
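Several rows of the table can be combined into one compact sketch, pairing each disallowed feature with its sanctioned replacement. The names (`asTitle`, `Counter`, `Status`) are illustrative, not from the codebase:

```typescript
// [] and {} literals instead of Array()/Object() constructors
const issues: string[] = [];
const countsByTitle: Record<string, number> = {};

// unknown instead of any: forces a type check before use
function asTitle(raw: unknown): string {
  if (typeof raw === 'string') return raw;
  throw new Error('expected a string');
}

// Regular enum instead of const enum
enum Status {
  Active = 'active',
  Inactive = 'inactive',
}

// `private` modifier instead of #private fields
class Counter {
  private value = 0;
  increment(): number {
    this.value += 1;
    return this.value;
  }
}

// const/let instead of var
const counter = new Counter();
issues.push(asTitle('Saga #1'));
countsByTitle[issues[0]] = counter.increment();
console.log(Status.Active, countsByTitle);
```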
`.eslintrc.js` (28 lines removed)

@@ -1,28 +0,0 @@
```javascript
module.exports = {
  extends: ["plugin:react/recommended", "plugin:@typescript-eslint/recommended", "plugin:prettier/recommended", "plugin:css-modules/recommended", "plugin:storybook/recommended", "plugin:storybook/recommended", "plugin:storybook/recommended", "plugin:storybook/recommended"],
  parser: "@typescript-eslint/parser",
  parserOptions: {
    sourceType: "module",
    ecmaVersion: 2020,
    ecmaFeatures: {
      jsx: true // Allows for the parsing of JSX
    }
  },

  plugins: ["@typescript-eslint", "css-modules"],
  settings: {
    "import/resolver": {
      node: {
        extensions: [".js", ".jsx", ".ts", ".tsx"]
      }
    },
    react: {
      version: "detect" // Tells eslint-plugin-react to automatically detect the version of React to use
    }
  },

  // Fine tune rules
  rules: {
    "@typescript-eslint/no-var-requires": 0
  }
};
```
```diff
@@ -1,4 +1,4 @@
-module.exports = {
+export default {
   semi: true,
   trailingComma: "all",
 };
```
`eslint.config.js` (new file, 59 lines)

@@ -0,0 +1,59 @@
```javascript
import js from "@eslint/js";
import typescript from "@typescript-eslint/eslint-plugin";
import typescriptParser from "@typescript-eslint/parser";
import react from "eslint-plugin-react";
import prettier from "eslint-plugin-prettier";
import cssModules from "eslint-plugin-css-modules";
import storybook from "eslint-plugin-storybook";

export default [
  js.configs.recommended,
  {
    files: ["**/*.{js,jsx,ts,tsx}"],
    languageOptions: {
      parser: typescriptParser,
      parserOptions: {
        sourceType: "module",
        ecmaVersion: 2020,
        ecmaFeatures: {
          jsx: true,
        },
      },
    },
    plugins: {
      "@typescript-eslint": typescript,
      react,
      prettier,
      "css-modules": cssModules,
      storybook,
    },
    settings: {
      "import/resolver": {
        node: {
          extensions: [".js", ".jsx", ".ts", ".tsx"],
        },
      },
      react: {
        version: "detect",
      },
    },
    rules: {
      ...typescript.configs.recommended.rules,
      ...react.configs.recommended.rules,
      ...prettier.configs.recommended.rules,
      "@typescript-eslint/no-var-requires": "off",
      "@typescript-eslint/no-explicit-any": "off",
      "react/react-in-jsx-scope": "off",
      "no-undef": "off",
    },
  },
  {
    files: ["**/*.stories.{js,jsx,ts,tsx}"],
    rules: {
      ...storybook.configs.recommended.rules,
    },
  },
  {
    ignores: ["dist/**", "node_modules/**", "build/**"],
  },
];
```
```diff
@@ -1,10 +1,10 @@
 module.exports = {
   preset: 'ts-jest',
   testEnvironment: 'jsdom',
-  setupFilesAfterEnv: ['<rootDir>/jest.setup.js'],
+  setupFilesAfterEnv: ['<rootDir>/jest.setup.cjs'],
   moduleNameMapper: {
     '\\.(css|less|scss|sass)$': 'identity-obj-proxy',
-    '\\.(jpg|jpeg|png|gif|svg)$': '<rootDir>/__mocks__/fileMock.js',
+    '\\.(jpg|jpeg|png|gif|svg)$': '<rootDir>/__mocks__/fileMock.cjs',
   },
   testMatch: [
     '**/__tests__/**/*.+(ts|tsx|js)',
```
`package-lock.json` (generated, 1143 lines): diff suppressed because it is too large.
`package.json` (22 lines changed; the extraction dropped the +/- markers, so old/new variants of a line appear as adjacent pairs)

```
@@ -1,6 +1,7 @@
 {
   "name": "threetwo",
   "version": "0.1.0",
   "type": "module",
   "description": "ThreeTwo! A good comic book curator.",
   "scripts": {
     "build": "vite build",
@@ -13,7 +14,8 @@
     "storybook": "storybook dev -p 6006",
     "build-storybook": "storybook build",
     "codegen": "wait-on http-get://localhost:3000/graphql/health && graphql-codegen",
     "codegen:watch": "graphql-codegen --config codegen.yml --watch"
     "codegen:watch": "graphql-codegen --config codegen.yml --watch",
     "knip": "knip"
   },
   "author": "Rishi Ghan",
   "license": "MIT",
@@ -23,9 +25,9 @@
     "@dnd-kit/utilities": "^3.2.2",
     "@floating-ui/react": "^0.27.18",
     "@floating-ui/react-dom": "^2.1.7",
     "@fortawesome/fontawesome-free": "^7.2.0",
     "@popperjs/core": "^2.11.8",
     "@tanstack/react-query": "^5.90.21",
     "@tailwindcss/vite": "^4.2.2",
     "@tanstack/react-query": "^5.90.21",
     "@tanstack/react-table": "^8.21.3",
     "@types/mime-types": "^3.0.1",
     "@types/react-router-dom": "^5.3.3",
@@ -52,6 +54,7 @@
     "immer": "^11.1.4",
     "jsdoc": "^4.0.5",
     "lodash": "^4.17.23",
     "motion": "^12.38.0",
     "pretty-bytes": "^7.1.0",
     "prop-types": "^15.8.1",
     "qs": "^6.15.0",
@@ -73,16 +76,17 @@
     "react-sliding-pane": "^7.3.0",
     "react-textarea-autosize": "^8.5.9",
     "react-toastify": "^11.0.5",
     "rxjs": "^7.8.2",
     "socket.io-client": "^4.8.3",
     "styled-components": "^6.3.11",
     "threetwo-ui-typings": "^1.0.14",
     "vaul": "^1.1.2",
     "vite": "^7.3.1",
     "vite-plugin-html": "^3.2.2",
     "websocket": "^1.0.35",
     "zustand": "^5.0.11"
   },
   "devDependencies": {
     "@eslint/js": "^10.0.0",
     "@graphql-codegen/cli": "^6.1.2",
     "@graphql-codegen/typescript": "^5.0.8",
     "@graphql-codegen/typescript-operations": "^5.0.8",
@@ -107,12 +111,14 @@
     "@testing-library/react": "^16.3.2",
     "@testing-library/user-event": "^14.6.1",
     "@types/ellipsize": "^0.1.3",
     "@types/html-to-text": "^9.0.4",
     "@types/jest": "^30.0.0",
     "@types/lodash": "^4.17.24",
     "@types/node": "^25.3.0",
     "@types/node": "^25.6.0",
     "@types/prop-types": "^15.7.15",
     "@types/react": "^19.2.14",
     "@types/react-dom": "^19.2.3",
     "@types/react-redux": "^7.1.34",
     "@types/react-table": "^7.7.20",
     "autoprefixer": "^10.4.27",
     "docdash": "^2.0.2",
     "eslint": "^10.0.2",
@@ -135,10 +141,10 @@
     "rimraf": "^6.1.3",
     "sass": "^1.97.3",
     "storybook": "^8.6.17",
     "tailwindcss": "^4.2.1",
     "tailwindcss": "^4.2.2",
     "ts-jest": "^29.4.6",
     "tui-jsdoc-template": "^1.2.2",
     "typescript": "^5.9.3",
     "typescript": "^6.0.2",
     "wait-on": "^9.0.4"
   },
   "resolutions": {
```
`plans/import-directory-status.md` (new file, 211 lines)

@@ -0,0 +1,211 @@

# Implementation Plan: Directory Status Check for Import.tsx

## Overview

Add functionality to `Import.tsx` that checks if the required directories (`comics` and `userdata`) exist before allowing the import process to start. If either directory is missing, display a warning banner to the user and disable the import functionality.

## API Endpoint

- **Endpoint**: `GET /api/library/getDirectoryStatus`
- **Response Structure**:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

## Implementation Details

### 1. Add Directory Status Type

In [`Import.tsx`](src/client/components/Import/Import.tsx:1), add a type definition for the directory status response:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

### 2. Create useQuery Hook for Directory Status

Use `@tanstack/react-query` (already imported) to fetch directory status on component mount:

```typescript
const { data: directoryStatus, isLoading: isCheckingDirectories, error: directoryError } = useQuery({
  queryKey: ['directoryStatus'],
  queryFn: async (): Promise<DirectoryStatus> => {
    const response = await axios.get('http://localhost:3000/api/library/getDirectoryStatus');
    return response.data;
  },
  refetchOnWindowFocus: false,
  staleTime: 30000, // Cache for 30 seconds
});
```

### 3. Derive Missing Directories State

Compute which directories are missing from the query result:

```typescript
const missingDirectories = useMemo(() => {
  if (!directoryStatus) return [];
  const missing: string[] = [];
  if (!directoryStatus.comics?.exists) missing.push('comics');
  if (!directoryStatus.userdata?.exists) missing.push('userdata');
  return missing;
}, [directoryStatus]);

const hasAllDirectories = missingDirectories.length === 0;
```

### 4. Create Warning Banner Component

Add a warning banner that displays when directories are missing, positioned above the import button. This uses the same styling patterns as the existing error banner:

```tsx
{/* Directory Status Warning */}
{!isCheckingDirectories && missingDirectories.length > 0 && (
  <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-amber-500 bg-amber-50 dark:bg-amber-900/20 p-4">
    <div className="flex items-start gap-3">
      <span className="w-6 h-6 text-amber-600 dark:text-amber-400 mt-0.5">
        <i className="h-6 w-6 icon-[solar--folder-error-bold]"></i>
      </span>
      <div className="flex-1">
        <p className="font-semibold text-amber-800 dark:text-amber-300">
          Required Directories Missing
        </p>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-1">
          The following directories do not exist and must be created before importing:
        </p>
        <ul className="list-disc list-inside text-sm text-amber-700 dark:text-amber-400 mt-2">
          {missingDirectories.map((dir) => (
            <li key={dir}>
              <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">{dir}</code>
            </li>
          ))}
        </ul>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-2">
          Please ensure these directories are mounted correctly in your Docker configuration.
        </p>
      </div>
    </div>
  </div>
)}
```

### 5. Disable Import Button When Directories Missing

Modify the button's `disabled` prop and click handler:

```tsx
<button
  className="..."
  onClick={handleForceReImport}
  disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
  title={!hasAllDirectories
    ? "Cannot import: Required directories are missing"
    : "Re-import all files to fix Elasticsearch indexing issues"}
>
```

### 6. Update handleForceReImport Guard

Add early return in the handler for missing directories:

```typescript
const handleForceReImport = async () => {
  setImportError(null);

  // Check for missing directories
  if (!hasAllDirectories) {
    setImportError(
      `Cannot start import: Required directories are missing (${missingDirectories.join(', ')}). Please check your Docker volume configuration.`
    );
    return;
  }

  // ... existing logic
};
```

## File Changes Summary

| File | Changes |
|------|---------|
| [`src/client/components/Import/Import.tsx`](src/client/components/Import/Import.tsx) | Add useQuery for directory status, warning banner UI, disable button logic |
| [`src/client/components/Import/Import.test.tsx`](src/client/components/Import/Import.test.tsx) | Add tests for directory status scenarios |

## Test Cases to Add

### Import.test.tsx Updates

1. **Should show warning banner when comics directory is missing**
2. **Should show warning banner when userdata directory is missing**
3. **Should show warning banner when both directories are missing**
4. **Should disable import button when directories are missing**
5. **Should enable import button when all directories exist**
6. **Should handle directory status API error gracefully**

Example test structure:

```typescript
describe('Import Component - Directory Status', () => {
  beforeEach(() => {
    jest.clearAllMocks();
    // Mock successful directory status by default
    (axios.get as jest.Mock) = jest.fn().mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });
  });

  test('should show warning when comics directory is missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
      expect(screen.getByText('comics')).toBeInTheDocument();
    });
  });

  test('should disable import button when directories are missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      const button = screen.getByRole('button', { name: /Force Re-Import/i });
      expect(button).toBeDisabled();
    });
  });
});
```

## Architecture Diagram

```mermaid
flowchart TD
    A[Import Component Mounts] --> B[Fetch Directory Status]
    B --> C{API Success?}
    C -->|Yes| D{All Directories Exist?}
    C -->|No| E[Show Error Banner]
    D -->|Yes| F[Enable Import Button]
    D -->|No| G[Show Warning Banner]
    G --> H[Disable Import Button]
    F --> I[User Clicks Import]
    I --> J[Proceed with Import]
```

## Notes

- The directory status is fetched once on mount with a 30-second stale time
- The warning uses amber/yellow colors to differentiate from error messages (red)
- The existing `importError` state and UI can remain unchanged
- No changes needed to the backend - the endpoint already exists
```diff
@@ -1,4 +1,4 @@
-module.exports = {
+export default {
   plugins: {
     "postcss-import": {},
     "@tailwindcss/postcss": {},
```
`skills-lock.json` (new file, 25 lines)

@@ -0,0 +1,25 @@
```json
{
  "version": 1,
  "skills": {
    "caveman": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "a818cdc41dcfaa50dd891c5cb5e5705968338de02e7e37949ca56e8c30ad4176"
    },
    "caveman-compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "300fb8578258161e1752a2a4142a7e9ff178c960bcb83b84422e2987421f33bf"
    },
    "caveman-help": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "3cd5f7d3f88c8ef7b16a6555dc61f5a11b14151386697609ab6887ab8b5f059d"
    },
    "compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "05c97bc3120108acd0b80bdef7fb4fa7c224ba83c8d384ccbc97f92e8a065918"
    }
  }
}
```
`src/app.css` (new file, 47 lines)

@@ -0,0 +1,47 @@
```css
@import "tailwindcss";
@config "../tailwind.config.ts";

html, body {
  overflow-x: hidden;
}

/* Custom Project Fonts */
@font-face {
  font-family: "PP Object Sans Regular";
  src: url("/fonts/PPObjectSans-Regular.otf") format("opentype");
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans Heavy";
  src: url("/fonts/PPObjectSans-Heavy.otf") format("opentype");
  font-weight: 700;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans Slanted";
  src: url("/fonts/PPObjectSans-Slanted.otf") format("opentype");
  font-weight: 400;
  font-style: italic;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans HeavySlanted";
  src: url("/fonts/PPObjectSans-HeavySlanted.otf") format("opentype");
  font-weight: 700;
  font-style: italic;
  font-display: swap;
}

@font-face {
  font-family: "Hasklig Regular";
  src: url("/fonts/Hasklig-Regular.otf") format("opentype");
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}
```
```diff
@@ -7,11 +7,12 @@ This folder houses all the components, utils and libraries that make up ThreeTwo
 
 It is based on React 18, and uses:
 
-1. _Redux_ for state management
+1. _zustand_ for state management
 2. _socket.io_ for transferring data in real-time
 3. _React Router_ for routing
 4. React DnD for drag-and-drop
 5. @tanstack/react-table for all tables
 6. @tanstack/react-query for API calls
 
```
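The README change above swaps Redux for zustand. At its core, zustand's `create()` builds a store where state lives in a closure and `setState` merges partial updates while notifying subscribers. The sketch below hand-rolls that pattern so it stays self-contained (the `libraryStore` shape is hypothetical; real components would use zustand's hooks):

```typescript
type Listener = () => void;

// Minimal store: state in a closure, setState merges partial updates
// and notifies subscribers — the core of what zustand's create() does.
function createStore<T extends object>(initial: T) {
  let state = initial;
  const listeners = new Set<Listener>();
  return {
    getState: () => state,
    setState: (partial: Partial<T>) => {
      state = { ...state, ...partial };
      listeners.forEach((listener) => listener());
    },
    subscribe: (listener: Listener) => {
      listeners.add(listener);
      return () => listeners.delete(listener);
    },
  };
}

// Hypothetical slice of library state:
const libraryStore = createStore({ comics: [] as string[], importing: false });
const unsubscribe = libraryStore.subscribe(() => {
  console.log('state changed:', libraryStore.getState());
});
libraryStore.setState({ importing: true });
unsubscribe();
```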
@@ -1,177 +0,0 @@
```typescript
import {
  SearchQuery,
  SearchInstance,
  PriorityEnum,
  SearchResponse,
} from "threetwo-ui-typings";
import {
  LIBRARY_SERVICE_BASE_URI,
  SEARCH_SERVICE_BASE_URI,
} from "../constants/endpoints";
import {
  AIRDCPP_SEARCH_RESULTS_ADDED,
  AIRDCPP_SEARCH_RESULTS_UPDATED,
  AIRDCPP_HUB_SEARCHES_SENT,
  AIRDCPP_RESULT_DOWNLOAD_INITIATED,
  AIRDCPP_DOWNLOAD_PROGRESS_TICK,
  AIRDCPP_BUNDLES_FETCHED,
  AIRDCPP_SEARCH_IN_PROGRESS,
  AIRDCPP_FILE_DOWNLOAD_COMPLETED,
  LS_SINGLE_IMPORT,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  AIRDCPP_TRANSFERS_FETCHED,
  LIBRARY_ISSUE_BUNDLES,
  AIRDCPP_SOCKET_CONNECTED,
  AIRDCPP_SOCKET_DISCONNECTED,
} from "../constants/action-types";
import { isNil } from "lodash";
import axios from "axios";

interface SearchData {
  query: Pick<SearchQuery, "pattern"> & Partial<Omit<SearchQuery, "pattern">>;
  hub_urls: string[] | undefined | null;
  priority: PriorityEnum;
}

export const sleep = (ms: number): Promise<NodeJS.Timeout> => {
  return new Promise((resolve) => setTimeout(resolve, ms));
};

export const toggleAirDCPPSocketConnectionStatus =
  (status: String, payload?: any) => async (dispatch) => {
    switch (status) {
      case "connected":
        dispatch({
          type: AIRDCPP_SOCKET_CONNECTED,
          data: payload,
        });
        break;

      case "disconnected":
        dispatch({
          type: AIRDCPP_SOCKET_DISCONNECTED,
          data: payload,
        });
        break;

      default:
        break;
    }
  };
export const downloadAirDCPPItem =
  (
    searchInstanceId: Number,
    resultId: String,
    comicObjectId: String,
    name: String,
    size: Number,
    type: any,
    ADCPPSocket: any,
    credentials: any,
  ): void =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      let bundleDBImportResult = {};
      const downloadResult = await ADCPPSocket.post(
        `search/${searchInstanceId}/results/${resultId}/download`,
      );

      if (!isNil(downloadResult)) {
        bundleDBImportResult = await axios({
          method: "POST",
          url: `${LIBRARY_SERVICE_BASE_URI}/applyAirDCPPDownloadMetadata`,
          headers: {
            "Content-Type": "application/json; charset=utf-8",
          },
          data: {
            bundleId: downloadResult.bundle_info.id,
            comicObjectId,
            name,
            size,
            type,
          },
        });

        dispatch({
          type: AIRDCPP_RESULT_DOWNLOAD_INITIATED,
          downloadResult,
          bundleDBImportResult,
        });

        dispatch({
          type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
          comicBookDetail: bundleDBImportResult.data,
          IMS_inProgress: false,
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getBundlesForComic =
  (comicObjectId: string, ADCPPSocket: any, credentials: any) =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const comicObject = await axios({
        method: "POST",
        url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
        headers: {
          "Content-Type": "application/json; charset=utf-8",
        },
        data: {
          id: `${comicObjectId}`,
        },
      });
      // get only the bundles applicable for the comic
      if (comicObject.data.acquisition.directconnect) {
        const filteredBundles =
          comicObject.data.acquisition.directconnect.downloads.map(
            async ({ bundleId }) => {
              return await ADCPPSocket.get(`queue/bundles/${bundleId}`);
            },
          );
        dispatch({
          type: AIRDCPP_BUNDLES_FETCHED,
          bundles: await Promise.all(filteredBundles),
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getTransfers =
  (ADCPPSocket: any, credentials: any) => async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const bundles = await ADCPPSocket.get("queue/bundles/1/85", {});
      if (!isNil(bundles)) {
        dispatch({
          type: AIRDCPP_TRANSFERS_FETCHED,
          bundles,
        });
        const bundleIds = bundles.map((bundle) => bundle.id);
        // get issues with matching bundleIds
        const issue_bundles = await axios({
          url: `${SEARCH_SERVICE_BASE_URI}/groupIssuesByBundles`,
          method: "POST",
          data: { bundleIds },
        });
        dispatch({
          type: LIBRARY_ISSUE_BUNDLES,
          issue_bundles,
        });
      }
    } catch (err) {
      throw err;
    }
  };
```
@@ -1,207 +0,0 @@
```typescript
import axios from "axios";
import rateLimiter from "axios-rate-limit";
import { setupCache } from "axios-cache-interceptor";
import {
  CV_SEARCH_SUCCESS,
  CV_API_CALL_IN_PROGRESS,
  CV_API_GENERIC_FAILURE,
  IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  CV_CLEANUP,
  IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
  CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
  CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
  CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
  CV_WEEKLY_PULLLIST_FETCHED,
  LIBRARY_STATISTICS_CALL_IN_PROGRESS,
  LIBRARY_STATISTICS_FETCHED,
} from "../constants/action-types";
import {
  COMICVINE_SERVICE_URI,
  LIBRARY_SERVICE_BASE_URI,
} from "../constants/endpoints";

const http = rateLimiter(axios.create(), {
  maxRequests: 1,
  perMilliseconds: 1000,
  maxRPS: 1,
});
const cachedAxios = setupCache(axios);
export const getWeeklyPullList = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
    });
    await cachedAxios(`${COMICVINE_SERVICE_URI}/getWeeklyPullList`, {
      method: "get",
      params: options,
    }).then((response) => {
      dispatch({
        type: CV_WEEKLY_PULLLIST_FETCHED,
        data: response.data.result,
      });
    });
  } catch (error) {
    // Error handling could be added here if needed
  }
};

export const comicinfoAPICall = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_API_CALL_IN_PROGRESS,
      inProgress: true,
    });
    const serviceURI = `${COMICVINE_SERVICE_URI}/${options.callURIAction}`;
    const response = await http(serviceURI, {
      method: options.callMethod,
      params: options.callParams,
      data: options.data ? options.data : null,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*",
      },
    });

    switch (options.callURIAction) {
      case "search":
        dispatch({
          type: CV_SEARCH_SUCCESS,
          searchResults: response.data,
        });
        break;

      default:
        break;
    }
  } catch (error) {
    dispatch({
      type: CV_API_GENERIC_FAILURE,
      error,
    });
  }
};
export const getIssuesForSeries =
  (comicObjectID: string) => async (dispatch) => {
    dispatch({
      type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
    });
    dispatch({
      type: CV_CLEANUP,
    });

    const issues = await axios({
      url: `${COMICVINE_SERVICE_URI}/getIssuesForSeries`,
      method: "POST",
      params: {
        comicObjectID,
      },
    });
    dispatch({
      type: CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
      issues: issues.data.results,
    });
  };

export const analyzeLibrary = (issues) => async (dispatch) => {
  dispatch({
    type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  });
  const queryObjects = issues.map((issue) => {
    const { id, name, issue_number } = issue;
    return {
      issueId: id,
      issueName: name,
      volumeName: issue.volume.name,
      issueNumber: issue_number,
    };
  });
  const foo = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/findIssueForSeries`,
    method: "POST",
    data: {
      queryObjects,
    },
  });

  dispatch({
    type: CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
    matches: foo.data,
  });
};

export const getLibraryStatistics = () => async (dispatch) => {
  dispatch({
    type: LIBRARY_STATISTICS_CALL_IN_PROGRESS,
  });
  const result = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/libraryStatistics`,
    method: "GET",
  });

  dispatch({
    type: LIBRARY_STATISTICS_FETCHED,
    data: result.data,
  });
};

export const getComicBookDetailById =
  (comicBookObjectId: string) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
      method: "POST",
      data: {
        id: comicBookObjectId,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
      comicBookDetail: result.data,
      IMS_inProgress: false,
    });
  };

export const getComicBooksDetailsByIds =
  (comicBookObjectIds: Array<string>) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooksByIds`,
      method: "POST",
      data: {
        ids: comicBookObjectIds,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
      comicBooks: result.data,
    });
  };

export const applyComicVineMatch =
  (match, comicObjectId) => async (dispatch) => {
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
      IMS_inProgress: true,
    });
    const result = await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
      method: "POST",
      data: {
        match,
        comicObjectId,
      },
    });
    dispatch({
      type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
      comicBookDetail: result.data,
      IMS_inProgress: false,
    });
  };
```
@@ -1,383 +0,0 @@
```typescript
import axios from "axios";
import { IFolderData } from "threetwo-ui-typings";
import {
  COMICVINE_SERVICE_URI,
  IMAGETRANSFORMATION_SERVICE_BASE_URI,
  LIBRARY_SERVICE_BASE_URI,
  SEARCH_SERVICE_BASE_URI,
  JOB_QUEUE_SERVICE_BASE_URI,
} from "../constants/endpoints";
import {
  IMS_COMIC_BOOK_GROUPS_FETCHED,
  IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
  IMS_RECENT_COMICS_FETCHED,
  IMS_WANTED_COMICS_FETCHED,
  CV_API_CALL_IN_PROGRESS,
  CV_SEARCH_SUCCESS,
  CV_CLEANUP,
  IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
  IMS_CV_METADATA_IMPORT_SUCCESSFUL,
  IMS_CV_METADATA_IMPORT_FAILED,
  LS_IMPORT,
  IMG_ANALYSIS_CALL_IN_PROGRESS,
  IMG_ANALYSIS_DATA_FETCH_SUCCESS,
  IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
  SS_SEARCH_RESULTS_FETCHED,
  SS_SEARCH_IN_PROGRESS,
  FILEOPS_STATE_RESET,
  LS_IMPORT_CALL_IN_PROGRESS,
  SS_SEARCH_FAILED,
  SS_SEARCH_RESULTS_FETCHED_SPECIAL,
  WANTED_COMICS_FETCHED,
  VOLUMES_FETCHED,
  LIBRARY_SERVICE_HEALTH,
  LS_SET_QUEUE_STATUS,
  LS_IMPORT_JOB_STATISTICS_FETCHED,
} from "../constants/action-types";

import { isNil } from "lodash";

export const getServiceStatus = (serviceName?: string) => async (dispatch) => {
  axios
    .request({
      url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`,
      method: "GET",
      transformResponse: (r: string) => JSON.parse(r),
    })
    .then((response) => {
      const { data } = response;
      dispatch({
        type: LIBRARY_SERVICE_HEALTH,
        status: data,
      });
    });
};
export async function walkFolder(path: string): Promise<Array<IFolderData>> {
  return axios
    .request<Array<IFolderData>>({
      url: `${LIBRARY_SERVICE_BASE_URI}/walkFolders`,
      method: "POST",
      data: {
        basePathToWalk: path,
      },
      transformResponse: (r: string) => JSON.parse(r),
    })
    .then((response) => {
      const { data } = response;
      return data;
    })
    .catch((error) => error);
}
/**
 * Fetches comic book covers along with some metadata
 * @return the comic book metadata
 */
export const fetchComicBookMetadata = () => async (dispatch) => {
  dispatch({
    type: LS_IMPORT_CALL_IN_PROGRESS,
  });

  // dispatch(
  //   success({
  //     // uid: 'once-please', // you can specify your own uid if required
  //     title: "Import Started",
  //     message: `<span class="icon-text has-text-success"><i class="fas fa-plug"></i></span> Socket <span class="has-text-info">${socket.id}</span> connected. <strong>${walkedFolders.length}</strong> comics scanned.`,
  //     dismissible: "click",
  //     position: "tr",
  //     autoDismiss: 0,
  //   }),
  // );
  const sessionId = localStorage.getItem("sessionId");
  dispatch({
    type: LS_IMPORT,
  });

  await axios.request({
    url: `${LIBRARY_SERVICE_BASE_URI}/newImport`,
    method: "POST",
    data: { sessionId },
  });
};

export const getImportJobResultStatistics = () => async (dispatch) => {
  const result = await axios.request({
    url: `${JOB_QUEUE_SERVICE_BASE_URI}/getJobResultStatistics`,
    method: "GET",
  });
  dispatch({
    type: LS_IMPORT_JOB_STATISTICS_FETCHED,
    data: result.data,
  });
};

export const setQueueControl =
  (queueAction: string, queueStatus: string) => async (dispatch) => {
    dispatch({
      type: LS_SET_QUEUE_STATUS,
      meta: { remote: true },
      data: { queueAction, queueStatus },
    });
  };

/**
 * Fetches comic book metadata for various types
 * @return metadata for the comic book object categories
 * @param options
 **/
export const getComicBooks = (options) => async (dispatch) => {
  const { paginationOptions, predicate, comicStatus } = options;

  const response = await axios.request({
    url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
    method: "POST",
    data: {
      paginationOptions,
```
predicate,
|
||||
},
|
||||
});
|
||||
|
||||
switch (comicStatus) {
|
||||
case "recent":
|
||||
dispatch({
|
||||
type: IMS_RECENT_COMICS_FETCHED,
|
||||
data: response.data,
|
||||
});
|
||||
break;
|
||||
case "wanted":
|
||||
dispatch({
|
||||
type: IMS_WANTED_COMICS_FETCHED,
|
||||
data: response.data.docs,
|
||||
});
|
||||
break;
|
||||
default:
|
||||
break;
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Makes a call to library service to import the comic book metadata into the ThreeTwo data store.
|
||||
* @returns Nothing.
|
||||
* @param payload
|
||||
*/
|
||||
export const importToDB =
|
||||
(sourceName: string, metadata?: any) => (dispatch) => {
|
||||
try {
|
||||
const comicBookMetadata = {
|
||||
importType: "new",
|
||||
payload: {
|
||||
rawFileDetails: {
|
||||
name: "",
|
||||
},
|
||||
importStatus: {
|
||||
isImported: true,
|
||||
tagged: false,
|
||||
matchedResult: {
|
||||
score: "0",
|
||||
},
|
||||
},
|
||||
sourcedMetadata: metadata || null,
|
||||
acquisition: { source: { wanted: true, name: sourceName } },
|
||||
},
|
||||
};
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
|
||||
});
|
||||
return axios
|
||||
.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/rawImportToDb`,
|
||||
method: "POST",
|
||||
data: comicBookMetadata,
|
||||
// transformResponse: (r: string) => JSON.parse(r),
|
||||
})
|
||||
.then((response) => {
|
||||
const { data } = response;
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_SUCCESSFUL,
|
||||
importResult: data,
|
||||
});
|
||||
});
|
||||
} catch (error) {
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_FAILED,
|
||||
importError: error,
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
export const fetchVolumeGroups = () => async (dispatch) => {
|
||||
try {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
|
||||
});
|
||||
const response = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookGroups`,
|
||||
method: "GET",
|
||||
});
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_GROUPS_FETCHED,
|
||||
data: response.data,
|
||||
});
|
||||
} catch (error) {
|
||||
// Error handling could be added here if needed
|
||||
}
|
||||
};
|
||||
export const fetchComicVineMatches =
|
||||
(searchPayload, issueSearchQuery, seriesSearchQuery?) => async (dispatch) => {
|
||||
try {
|
||||
dispatch({
|
||||
type: CV_API_CALL_IN_PROGRESS,
|
||||
});
|
||||
axios
|
||||
.request({
|
||||
url: `${COMICVINE_SERVICE_URI}/volumeBasedSearch`,
|
||||
method: "POST",
|
||||
data: {
|
||||
format: "json",
|
||||
// hack
|
||||
query: issueSearchQuery.inferredIssueDetails.name
|
||||
.replace(/[^a-zA-Z0-9 ]/g, "")
|
||||
.trim(),
|
||||
limit: "100",
|
||||
page: 1,
|
||||
resources: "volume",
|
||||
scorerConfiguration: {
|
||||
searchParams: issueSearchQuery.inferredIssueDetails,
|
||||
},
|
||||
rawFileDetails: searchPayload.rawFileDetails,
|
||||
},
|
||||
transformResponse: (r) => {
|
||||
const matches = JSON.parse(r);
|
||||
return matches;
|
||||
// return sortBy(matches, (match) => -match.score);
|
||||
},
|
||||
})
|
||||
.then((response) => {
|
||||
let matches: any = [];
|
||||
if (
|
||||
!isNil(response.data.results) &&
|
||||
response.data.results.length === 1
|
||||
) {
|
||||
matches = response.data.results;
|
||||
} else {
|
||||
matches = response.data.map((match) => match);
|
||||
}
|
||||
dispatch({
|
||||
type: CV_SEARCH_SUCCESS,
|
||||
searchResults: matches,
|
||||
searchQueryObject: {
|
||||
issue: issueSearchQuery,
|
||||
series: seriesSearchQuery,
|
||||
},
|
||||
});
|
||||
});
|
||||
} catch (error) {
|
||||
// Error handling could be added here if needed
|
||||
}
|
||||
|
||||
dispatch({
|
||||
type: CV_CLEANUP,
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* This method is a proxy to `uncompressFullArchive` which uncompresses complete `rar` or `zip` archives
|
||||
* @param {string} path The path to the compressed archive
|
||||
* @param {any} options Options object
|
||||
* @returns {any}
|
||||
*/
|
||||
export const extractComicArchive =
|
||||
(path: string, options: any): any =>
|
||||
async (dispatch) => {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
|
||||
});
|
||||
await axios({
|
||||
method: "POST",
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/uncompressFullArchive`,
|
||||
headers: {
|
||||
"Content-Type": "application/json; charset=utf-8",
|
||||
},
|
||||
data: {
|
||||
filePath: path,
|
||||
options,
|
||||
},
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* Description
|
||||
* @param {any} query
|
||||
* @param {any} options
|
||||
* @returns {any}
|
||||
*/
|
||||
export const searchIssue = (query, options) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: SS_SEARCH_IN_PROGRESS,
|
||||
});
|
||||
|
||||
const response = await axios({
|
||||
url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
|
||||
method: "POST",
|
||||
data: { ...query, ...options },
|
||||
});
|
||||
|
||||
if (response.data.code === 404) {
|
||||
dispatch({
|
||||
type: SS_SEARCH_FAILED,
|
||||
data: response.data,
|
||||
});
|
||||
}
|
||||
|
||||
switch (options.trigger) {
|
||||
case "wantedComicsPage":
|
||||
dispatch({
|
||||
type: WANTED_COMICS_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
case "globalSearchBar":
|
||||
dispatch({
|
||||
type: SS_SEARCH_RESULTS_FETCHED_SPECIAL,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
|
||||
case "libraryPage":
|
||||
dispatch({
|
||||
type: SS_SEARCH_RESULTS_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
case "volumesPage":
|
||||
dispatch({
|
||||
type: VOLUMES_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
|
||||
default:
|
||||
break;
|
||||
}
|
||||
};
|
||||
export const analyzeImage =
|
||||
(imageFilePath: string | Buffer) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: FILEOPS_STATE_RESET,
|
||||
});
|
||||
|
||||
dispatch({
|
||||
type: IMG_ANALYSIS_CALL_IN_PROGRESS,
|
||||
});
|
||||
|
||||
const foo = await axios({
|
||||
url: `${IMAGETRANSFORMATION_SERVICE_BASE_URI}/analyze`,
|
||||
method: "POST",
|
||||
data: {
|
||||
imageFilePath,
|
||||
},
|
||||
});
|
||||
dispatch({
|
||||
type: IMG_ANALYSIS_DATA_FETCH_SUCCESS,
|
||||
result: foo.data,
|
||||
});
|
||||
};
|
||||
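The thunks above all share one shape: dispatch an "in progress" action, await the HTTP call, then dispatch the result. A minimal self-contained sketch of that pattern, with a stubbed fetcher standing in for axios (all names here are illustrative, not from this repository):

```typescript
// Illustrative dispatch/await/dispatch thunk pattern; none of these
// identifiers come from the repository itself.
type Action = { type: string; data?: unknown };
type Dispatch = (action: Action) => void;

const makeFetchThunk =
  (fetcher: () => Promise<unknown>) =>
  async (dispatch: Dispatch): Promise<void> => {
    dispatch({ type: "CALL_IN_PROGRESS" }); // signal the UI before the request
    const data = await fetcher();           // the real code awaits axios here
    dispatch({ type: "FETCH_SUCCESS", data });
  };

// Usage with a stubbed fetcher and a recording dispatch:
const dispatched: Action[] = [];
makeFetchThunk(async () => ({ ok: true }))((a) => dispatched.push(a));
```

The error-dispatching variants (e.g. `importToDB`) add a try/catch around the same shape and dispatch a failure action instead.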
@@ -1,26 +0,0 @@
import axios from "axios";
import { isNil } from "lodash";
import { METRON_SERVICE_URI } from "../constants/endpoints";

export const fetchMetronResource = async (options) => {
  const metronResourceResults = await axios.post(
    `${METRON_SERVICE_URI}/fetchResource`,
    options,
  );
  const results = metronResourceResults.data.results.map((result) => {
    return {
      label: result.name || result.__str__,
      value: result.id,
    };
  });

  return {
    options: results,
    hasMore: !isNil(metronResourceResults.data.next),
    additional: {
      page: !isNil(metronResourceResults.data.next)
        ? options.query.page + 1
        : null,
    },
  };
};
@@ -1,77 +0,0 @@
import axios from "axios";
import {
  SETTINGS_OBJECT_FETCHED,
  SETTINGS_CALL_IN_PROGRESS,
  SETTINGS_DB_FLUSH_SUCCESS,
  SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
} from "../reducers/settings.reducer";
import {
  LIBRARY_SERVICE_BASE_URI,
  SETTINGS_SERVICE_BASE_URI,
  QBITTORRENT_SERVICE_BASE_URI,
} from "../constants/endpoints";

export const getSettings = (settingsKey?) => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/getSettings`,
    method: "POST",
    data: settingsKey,
  });
  {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: result.data,
    });
  }
};

export const deleteSettings = () => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/deleteSettings`,
    method: "POST",
  });

  if (result.data.ok === 1) {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: {},
    });
  }
};

export const flushDb = () => async (dispatch) => {
  dispatch({
    type: SETTINGS_CALL_IN_PROGRESS,
  });

  const flushDbResult = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/flushDb`,
    method: "POST",
  });

  if (flushDbResult) {
    dispatch({
      type: SETTINGS_DB_FLUSH_SUCCESS,
      data: flushDbResult.data,
    });
  }
};

export const getQBitTorrentClientInfo = (hostInfo) => async (dispatch) => {
  await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/connect`,
    method: "POST",
    data: hostInfo,
  });
  const qBittorrentClientInfo = await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/getClientInfo`,
    method: "GET",
  });

  dispatch({
    type: SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
    data: qBittorrentClientInfo.data,
  });
};

export const getProwlarrConnectionInfo = (hostInfo) => async (dispatch) => {};
@@ -1,10 +1,37 @@
/**
 * @fileoverview Root application component.
 * Provides the main layout structure with navigation, content outlet,
 * and toast notifications. Initializes socket connection on mount.
 * @module components/App
 */

import React, { ReactElement, useEffect } from "react";
import { Outlet } from "react-router-dom";
import { Navbar2 } from "./shared/Navbar2";
import { ToastContainer } from "react-toastify";
import "../assets/scss/App.css";
import "../../app.css";
import { useStore } from "../store";

/**
 * Root application component that provides the main layout structure.
 *
 * Features:
 * - Initializes WebSocket connection to the server on mount
 * - Renders the navigation bar across all routes
 * - Provides React Router outlet for child routes
 * - Includes toast notification container for app-wide notifications
 *
 * @returns {ReactElement} The root application layout
 * @example
 * // Used as the root element in React Router configuration
 * const router = createBrowserRouter([
 *   {
 *     path: "/",
 *     element: <App />,
 *     children: [...]
 *   }
 * ]);
 */
export const App = (): ReactElement => {
  useEffect(() => {
    useStore.getState().getSocket("/"); // Connect to the base namespace

@@ -1,41 +1,45 @@
 import React, {
   useCallback,
   ReactElement,
   useEffect,
   useRef,
   useState,
 } from "react";
 import { SearchQuery, PriorityEnum, SearchResponse } from "threetwo-ui-typings";
 import { RootState, SearchInstance } from "threetwo-ui-typings";
 import ellipsize from "ellipsize";
 import { Form, Field } from "react-final-form";
 import { difference } from "../../shared/utils/object.utils";
 import { isEmpty, isNil, map } from "lodash";
 import { useStore } from "../../store";
 import { useShallow } from "zustand/react/shallow";
-import { useQuery, useQueryClient } from "@tanstack/react-query";
+import { useQuery } from "@tanstack/react-query";
 import axios from "axios";
 import { AIRDCPP_SERVICE_BASE_URI } from "../../constants/endpoints";
 import type { Socket } from "socket.io-client";
+import type { AcquisitionPanelProps } from "../../types";

-interface IAcquisitionPanelProps {
-  query: any;
-  comicObjectId: any;
-  comicObject: any;
-  settings: any;
+interface HubData {
+  hub_url: string;
+  identity: { name: string };
+  value: string;
 }

+interface AirDCPPSearchResult {
+  id: string;
+  dupe?: unknown;
+  type: { id: string; str: string };
+  name: string;
+  slots: { total: number; free: number };
+  users: { user: { nicks: string; flags: string[] } };
+  size: number;
+}
+
 export const AcquisitionPanel = (
-  props: IAcquisitionPanelProps,
+  props: AcquisitionPanelProps,
 ): ReactElement => {
-  const socketRef = useRef<Socket>();
-  const queryClient = useQueryClient();
+  const socketRef = useRef<Socket | undefined>(undefined);

   const [dcppQuery, setDcppQuery] = useState({});
-  const [airDCPPSearchResults, setAirDCPPSearchResults] = useState<any[]>([]);
+  const [airDCPPSearchResults, setAirDCPPSearchResults] = useState<AirDCPPSearchResult[]>([]);
   const [airDCPPSearchStatus, setAirDCPPSearchStatus] = useState(false);
-  const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState<any>({});
-  const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState<any>({});
+  const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState<{ id?: string; owner?: string; expires_in?: number }>({});
+  const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState<{ query?: { pattern: string; extensions: string[]; file_type: string } }>({});

   const { comicObjectId } = props;
   const issueName = props.query.issue.name || "";
@@ -140,13 +144,13 @@ export const AcquisitionPanel = (
   };

   const download = async (
-    searchInstanceId: Number,
-    resultId: String,
-    comicObjectId: String,
-    name: String,
-    size: Number,
-    type: any,
-    config: any,
+    searchInstanceId: string | number,
+    resultId: string,
+    comicObjectId: string,
+    name: string,
+    size: number,
+    type: unknown,
+    config: Record<string, unknown>,
   ): Promise<void> => {
     socketRef.current?.emit(
       "call",
@@ -166,7 +170,7 @@ export const AcquisitionPanel = (
     );
   };

-  const getDCPPSearchResults = async (searchQuery) => {
+  const getDCPPSearchResults = async (searchQuery: { issueName: string }) => {
     const manualQuery = {
       query: {
         pattern: `${searchQuery.issueName}`,
@@ -255,7 +259,7 @@ export const AcquisitionPanel = (
             <dl>
               <dt>
                 <div className="mb-1">
-                  {hubs?.data.map((value, idx: string) => (
+                  {hubs?.data.map((value: HubData, idx: number) => (
                     <span className="tag is-warning" key={idx}>
                       {value.identity.name}
                     </span>
@@ -266,19 +270,19 @@ export const AcquisitionPanel = (
               <dt>
                 Query:
                 <span className="has-text-weight-semibold">
-                  {airDCPPSearchInfo.query.pattern}
+                  {airDCPPSearchInfo.query?.pattern}
                 </span>
               </dt>
               <dd>
                 Extensions:
                 <span className="has-text-weight-semibold">
-                  {airDCPPSearchInfo.query.extensions.join(", ")}
+                  {airDCPPSearchInfo.query?.extensions.join(", ")}
                 </span>
               </dd>
               <dd>
                 File type:
                 <span className="has-text-weight-semibold">
-                  {airDCPPSearchInfo.query.file_type}
+                  {airDCPPSearchInfo.query?.file_type}
                 </span>
               </dd>
             </dl>
@@ -329,6 +333,7 @@ export const AcquisitionPanel = (
               {/* NAME */}
               <td className="whitespace-nowrap px-3 py-3 text-gray-700 dark:text-slate-300 max-w-xs">
                 <p className="mb-2">
+                  {/* TODO: Switch to Solar icon */}
                   {type.id === "directory" && (
                     <i className="fas fa-folder mr-1"></i>
                   )}
@@ -383,7 +388,7 @@ export const AcquisitionPanel = (
                   className="inline-flex items-center gap-1 rounded border border-green-500 bg-green-500 px-2 py-1 text-xs font-medium text-white hover:bg-transparent hover:text-green-400 dark:border-green-300 dark:bg-green-300 dark:text-slate-900 dark:hover:bg-transparent"
                   onClick={() =>
                     download(
-                      airDCPPSearchInstance.id,
+                      airDCPPSearchInstance.id ?? "",
                       id,
                       comicObjectId,
                       name,

@@ -1,17 +1,31 @@
 import React, { ReactElement } from "react";
-import Select from "react-select";
+import Select, { StylesConfig, SingleValue } from "react-select";
 import { ActionOption } from "../actionMenuConfig";

-export const Menu = (props): ReactElement => {
+interface MenuConfiguration {
+  filteredActionOptions: ActionOption[];
+  customStyles: StylesConfig<ActionOption, false>;
+  handleActionSelection: (action: SingleValue<ActionOption>) => void;
+}
+
+interface MenuProps {
+  data?: unknown;
+  handlers?: {
+    setSlidingPanelContentId: (id: string) => void;
+    setVisible: (visible: boolean) => void;
+  };
+  configuration: MenuConfiguration;
+}
+
+export const Menu = (props: MenuProps): ReactElement => {
   const {
     filteredActionOptions,
     customStyles,
     handleActionSelection,
-    Placeholder,
   } = props.configuration;

   return (
-    <Select
-      components={{ Placeholder }}
+    <Select<ActionOption, false>
       placeholder={
         <span className="inline-flex flex-row items-center gap-2 pt-1">
           <div className="w-6 h-6">

@@ -4,7 +4,19 @@ import dayjs from "dayjs";
 import ellipsize from "ellipsize";
 import { map } from "lodash";
 import { DownloadProgressTick } from "./DownloadProgressTick";
-export const AirDCPPBundles = (props) => {
+
+interface BundleData {
+  id: string;
+  name: string;
+  target: string;
+  size: number;
+}
+
+interface AirDCPPBundlesProps {
+  data: BundleData[];
+}
+
+export const AirDCPPBundles = (props: AirDCPPBundlesProps) => {
   return (
     <div className="overflow-x-auto w-fit mt-6">
       <table className="min-w-full text-sm text-gray-900 dark:text-slate-100">

@@ -1,42 +1,70 @@
 import React, { ReactElement, useCallback, useState } from "react";
-import { fetchMetronResource } from "../../../actions/metron.actions";
+import axios from "axios";
+import { isNil } from "lodash";
 import Creatable from "react-select/creatable";
 import { withAsyncPaginate } from "react-select-async-paginate";
+import { METRON_SERVICE_URI } from "../../../constants/endpoints";

 const CreatableAsyncPaginate = withAsyncPaginate(Creatable);

-interface AsyncSelectPaginateProps {
-  metronResource: string;
-  placeholder?: string;
+export interface AsyncSelectPaginateProps {
+  metronResource?: string;
+  placeholder?: string | React.ReactNode;
   value?: object;
   onChange?(...args: unknown[]): unknown;
   meta?: Record<string, unknown>;
   input?: Record<string, unknown>;
   name?: string;
   type?: string;
 }

+interface AdditionalType {
+  page: number | null;
+}
+
+interface MetronResultItem {
+  name?: string;
+  __str__?: string;
+  id: number;
+}
+
 export const AsyncSelectPaginate = (props: AsyncSelectPaginateProps): ReactElement => {
   const [value, setValue] = useState(null);
   const [isAddingInProgress, setIsAddingInProgress] = useState(false);

-  const loadData = useCallback((query, loadedOptions, { page }) => {
-    return fetchMetronResource({
+  const loadData = useCallback(async (
+    query: string,
+    _loadedOptions: unknown,
+    additional?: AdditionalType
+  ) => {
+    const page = additional?.page ?? 1;
+    const options = {
       method: "GET",
-      resource: props.metronResource,
-      query: {
-        name: query,
-        page,
+      resource: props.metronResource || "",
+      query: { name: query, page },
+    };
+    const response = await axios.post(`${METRON_SERVICE_URI}/fetchResource`, options);
+    const results = response.data.results.map((result: MetronResultItem) => ({
+      label: result.name || result.__str__,
+      value: result.id,
+    }));
+    return {
+      options: results,
+      hasMore: !isNil(response.data.next),
+      additional: {
+        page: !isNil(response.data.next) ? page + 1 : null,
       },
-    });
-  }, []);
+    };
+  }, [props.metronResource]);

   return (
     <CreatableAsyncPaginate
       SelectComponent={Creatable}
       debounceTimeout={200}
       isDisabled={isAddingInProgress}
       value={props.value}
-      loadOptions={loadData}
+      // eslint-disable-next-line @typescript-eslint/no-explicit-any
+      loadOptions={loadData as any}
       placeholder={props.placeholder}
       // onCreateOption={onCreateOption}
       onChange={props.onChange}
       // cacheUniqs={[cacheUniq]}
       additional={{
         page: 1,
       }}

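The rewritten `loadData` above follows the `loadOptions` contract used by react-select-async-paginate: return `{ options, hasMore, additional }`, carrying the next page number in `additional`. A standalone sketch of that contract with a stubbed page fetcher (all names here are illustrative, not from the repository):

```typescript
// Illustrative paginated loader returning the { options, hasMore, additional }
// shape that react-select-async-paginate consumes; fetchPage stands in for the
// real Metron API call.
interface Option { label: string; value: number }
interface PageResult { results: Option[]; next: string | null }
interface LoadResult {
  options: Option[];
  hasMore: boolean;
  additional: { page: number | null };
}

async function loadPage(
  fetchPage: (page: number) => Promise<PageResult>,
  page: number,
): Promise<LoadResult> {
  const { results, next } = await fetchPage(page);
  return {
    options: results,
    hasMore: next !== null,
    // Advance the page cursor only while the backend reports another page.
    additional: { page: next !== null ? page + 1 : null },
  };
}
```

The `additional` value returned here is handed back to the loader on the next call, which is how the component above threads the page number through successive requests.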
@@ -1,4 +1,4 @@
-import React, { useState, ReactElement, useCallback } from "react";
+import React, { useState, ReactElement, useCallback, useMemo } from "react";
 import { useParams } from "react-router-dom";
 import Card from "../shared/Carda";
 import { RawFileDetails } from "./RawFileDetails";
@@ -10,7 +10,7 @@ import "react-sliding-pane/dist/react-sliding-pane.css";
 import SlidingPane from "react-sliding-pane";
 import { determineCoverFile } from "../../shared/utils/metadata.utils";
 import { styled } from "styled-components";
-import { RawFileDetails as RawFileDetailsType } from "../../graphql/generated";
+import type { ComicDetailProps } from "../../types";

 // Extracted modules
 import { useComicVineMatching } from "./useComicVineMatching";
@@ -23,57 +23,14 @@ const StyledSlidingPanel = styled(SlidingPane)`
   background: #ccc;
 `;

-type InferredIssue = {
-  name?: string;
-  number?: number;
-  year?: string;
-  subtitle?: string;
-  [key: string]: any;
-};
-
-type ComicVineMetadata = {
-  name?: string;
-  volumeInformation?: any;
-  [key: string]: any;
-};
-
-type Acquisition = {
-  directconnect?: {
-    downloads?: any[];
-  };
-  torrent?: any[];
-  [key: string]: any;
-};
-
-type ComicDetailProps = {
-  data: {
-    _id: string;
-    rawFileDetails?: RawFileDetailsType;
-    inferredMetadata: {
-      issue?: InferredIssue;
-    };
-    sourcedMetadata: {
-      comicvine?: ComicVineMetadata;
-      locg?: any;
-      comicInfo?: any;
-    };
-    acquisition?: Acquisition;
-    createdAt: string;
-    updatedAt: string;
-  };
-  userSettings?: any;
-  queryClient?: any;
-  comicObjectId?: string;
-};
-
 /**
- * Component for displaying the metadata for a comic in greater detail.
+ * Displays full comic detail: cover, file info, action menu, and tabbed panels
+ * for metadata, archive operations, and acquisition.
  *
  * @component
  * @example
  * return (
  *   <ComicDetail/>
  * )
+ * @param data.queryClient - react-query client passed through to the CV match
+ *   panel so it can invalidate queries after a match is applied.
+ * @param data.comicObjectId - optional override for the comic ID; used when the
+ *   component is rendered outside a route that provides the ID via `useParams`.
  */
 export const ComicDetail = (data: ComicDetailProps): ReactElement => {
   const {
@@ -84,7 +41,6 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
     sourcedMetadata: { comicvine, locg, comicInfo },
     acquisition,
     createdAt,
-    updatedAt,
   },
   userSettings,
   queryClient,
@@ -94,24 +50,10 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
   const [activeTab, setActiveTab] = useState<number | undefined>(undefined);
   const [visible, setVisible] = useState(false);
   const [slidingPanelContentId, setSlidingPanelContentId] = useState("");
-  const [modalIsOpen, setIsOpen] = useState(false);

   const { comicObjectId } = useParams<{ comicObjectId: string }>();
   const { comicVineMatches, prepareAndFetchMatches } = useComicVineMatching();

-  // Modal handlers (currently unused but kept for future use)
-  const openModal = useCallback((filePath: string) => {
-    setIsOpen(true);
-  }, []);
-
-  const afterOpenModal = useCallback((things: any) => {
-    // Modal opened callback
-  }, []);
-
-  const closeModal = useCallback(() => {
-    setIsOpen(false);
-  }, []);
-
   // Action event handlers
   const openDrawerWithCVMatches = () => {
     prepareAndFetchMatches(rawFileDetails, comicvine);
@@ -124,16 +66,17 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
     setVisible(true);
   }, []);

-  // Action menu handler
-  const Placeholder = components.Placeholder;
-  const filteredActionOptions = filter(actionOptions, (item) => {
+  // Hide "match on Comic Vine" when there are no raw file details — matching
+  // requires file metadata to seed the search query.
+  const filteredActionOptions: ActionOption[] = actionOptions.filter((item) => {
     if (isUndefined(rawFileDetails)) {
       return item.value !== "match-on-comic-vine";
     }
-    return item;
+    return true;
   });

-  const handleActionSelection = (action: ActionOption) => {
+  const handleActionSelection = (action: ActionOption | null) => {
+    if (!action) return;
     switch (action.value) {
       case "match-on-comic-vine":
         openDrawerWithCVMatches();
@@ -150,6 +93,11 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
   const isComicBookMetadataAvailable =
     !isUndefined(comicvine) && !isUndefined(comicvine?.volumeInformation);

+  const hasAnyMetadata =
+    isComicBookMetadataAvailable ||
+    !isEmpty(comicInfo) ||
+    !isNil(locg);
+
   const areRawFileDetailsAvailable =
     !isUndefined(rawFileDetails) && !isEmpty(rawFileDetails);

@@ -160,26 +108,29 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
   });

   // Query for airdc++
-  const airDCPPQuery = {
-    issue: {
-      name: issueName,
-    },
-  };
+  const airDCPPQuery = useMemo(() => ({
+    issue: { name: issueName },
+  }), [issueName]);

-  // Create tab configuration
-  const tabGroup = createTabConfig({
+  const openReconcilePanel = useCallback(() => {
+    setSlidingPanelContentId("metadataReconciliation");
+    setVisible(true);
+  }, []);
+
+  const tabGroup = useMemo(() => createTabConfig({
     data: data.data,
-    comicInfo,
-    isComicBookMetadataAvailable,
+    hasAnyMetadata,
     areRawFileDetailsAvailable,
     airDCPPQuery,
     comicObjectId: _id,
     userSettings,
     issueName,
     acquisition,
-  });
+    onReconcileMetadata: openReconcilePanel,
+  }), [data.data, hasAnyMetadata, areRawFileDetailsAvailable, airDCPPQuery, _id, userSettings, issueName, acquisition, openReconcilePanel]);

-  const filteredTabs = tabGroup.filter((tab) => tab.shouldShow);
+  const filteredTabs = useMemo(() => tabGroup.filter((tab) => tab.shouldShow), [tabGroup]);

   // Sliding panel content mapping
   const renderSlidingPanelContent = () => {
@@ -190,6 +141,7 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
             rawFileDetails={rawFileDetails}
             inferredMetadata={inferredMetadata}
             comicVineMatches={comicVineMatches}
+            // Prefer the route param; fall back to the data ID when rendered outside a route.
             comicObjectId={comicObjectId || _id}
             queryClient={queryClient}
             onMatchApplied={() => {
@@ -224,10 +176,9 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
           <div className="grid">
             <RawFileDetails
               data={{
-                rawFileDetails: rawFileDetails,
-                inferredMetadata: inferredMetadata,
-                created_at: createdAt,
-                updated_at: updatedAt,
+                rawFileDetails,
+                inferredMetadata,
+                createdAt,
               }}
             >
               {/* action dropdown */}
@@ -239,7 +190,6 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
                   filteredActionOptions,
                   customStyles,
                   handleActionSelection,
-                  Placeholder,
                 }}
               />
             </div>

@@ -4,19 +4,19 @@ import dayjs from "dayjs";
 import { isEmpty, isUndefined } from "lodash";
 import Card from "../shared/Carda";
 import { convert } from "html-to-text";

-interface ComicVineDetailsProps {
-  updatedAt?: string;
-  data?: {
-    name?: string;
-    number?: string;
-    resource_type?: string;
-    id?: number;
-  };
-}
+import type { ComicVineDetailsProps } from "../../types";

 export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement => {
   const { data, updatedAt } = props;

+  if (!data || !data.volumeInformation) {
+    return <div className="text-slate-500 dark:text-gray-400">No ComicVine data available</div>;
+  }
+
+  const detectedIssueType = data.volumeInformation.description
+    ? detectIssueTypes(data.volumeInformation.description)
+    : undefined;
+
   return (
     <div className="text-slate-500 dark:text-gray-400">
       <div className="">
@@ -24,10 +24,9 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
         <div className="flex flex-row gap-4">
           <div className="min-w-fit">
             <Card
-              imageUrl={data.volumeInformation.image.thumb_url}
+              imageUrl={data.volumeInformation.image?.thumb_url}
               orientation={"cover-only"}
               hasDetails={false}
-              // cardContainerStyle={{ maxWidth: 200 }}
             />
           </div>
           <div className="flex flex-col gap-5">
@@ -49,7 +48,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
             <div className="text-md">ComicVine Metadata</div>
             <div className="text-sm">
               Last scraped on{" "}
-              {dayjs(updatedAt).format("MMM D YYYY [at] h:mm a")}
+              {updatedAt ? dayjs(updatedAt).format("MMM D YYYY [at] h:mm a") : "Unknown"}
             </div>
             <div className="text-sm">
               ComicVine Issue ID
@@ -61,7 +60,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
             {/* Publisher details */}
             <div className="ml-8">
               Published by{" "}
-              <span>{data.volumeInformation.publisher.name}</span>
+              <span>{data.volumeInformation.publisher?.name}</span>
               <div>
                 Total issues in this volume{" "}
                 <span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
@@ -77,16 +76,11 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
               <span>{data.issue_number}</span>
             </div>
           )}
-          {!isUndefined(
-            detectIssueTypes(data.volumeInformation.description),
-          ) ? (
+          {!isUndefined(detectedIssueType) ? (
             <div>
               <span>Detected Type</span>
               <span>
-                {
-                  detectIssueTypes(data.volumeInformation.description)
-                    .displayName
-                }
+                {detectedIssueType.displayName}
               </span>
             </div>
           ) : data.resource_type ? (
@@ -101,6 +95,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
|
||||
{/* Description */}
|
||||
<div className="mt-3 w-3/4">
|
||||
{!isEmpty(data.description) &&
|
||||
data.description &&
|
||||
convert(data.description, {
|
||||
baseElements: {
|
||||
selectors: ["p"],
|
||||
|
@@ -1,12 +1,13 @@
import React, { ReactElement } from "react";
import { ComicVineSearchForm } from "../ComicVineSearchForm";
import MatchResult from "./MatchResult";
import { isEmpty } from "lodash";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import type { ComicVineMatchPanelProps } from "../../types";

export const ComicVineMatchPanel = (comicVineData): ReactElement => {
const { comicObjectId, comicVineMatches, queryClient, onMatchApplied } = comicVineData.props;
/** Displays ComicVine search results or a status message while searching. */
export const ComicVineMatchPanel = ({ props: comicVineData }: ComicVineMatchPanelProps): ReactElement => {
const { comicObjectId, comicVineMatches, queryClient, onMatchApplied } = comicVineData;
const { comicvine } = useStore(
useShallow((state) => ({
comicvine: state.comicvine,

@@ -1,7 +1,16 @@
import React, { useCallback } from "react";
import { Form, Field } from "react-final-form";
import Collapsible from "react-collapsible";
import { fetchComicVineMatches } from "../../actions/fileops.actions";
import { ValidationErrors } from "final-form";

interface ComicVineSearchFormProps {
rawFileDetails?: Record<string, unknown>;
}

interface SearchFormValues {
issueName?: string;
issueNumber?: string;
issueYear?: string;
}

/**
* Component for performing search against ComicVine
@@ -12,8 +21,8 @@ import { fetchComicVineMatches } from "../../actions/fileops.actions";
* <ComicVineSearchForm data={rawFileDetails} />
* )
*/
export const ComicVineSearchForm = (data) => {
const onSubmit = useCallback((value) => {
export const ComicVineSearchForm = (props: ComicVineSearchFormProps) => {
const onSubmit = useCallback((value: SearchFormValues) => {
const userInititatedQuery = {
inferredIssueDetails: {
name: value.issueName,
@@ -24,8 +33,8 @@ export const ComicVineSearchForm = (data) => {
};
// dispatch(fetchComicVineMatches(data, userInititatedQuery));
}, []);
const validate = () => {
return true;
const validate = (_values: SearchFormValues): ValidationErrors | undefined => {
return undefined;
};

const MyForm = () => (
@@ -34,52 +43,46 @@ export const ComicVineSearchForm = (data) => {
validate={validate}
render={({ handleSubmit }) => (
<form onSubmit={handleSubmit}>
<span className="flex items-center">
<span className="text-md text-slate-500 dark:text-slate-500 pr-5">
Override Search Query
</span>
<span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
</span>
<label className="block py-1">Issue Name</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Issue Name</label>
<Field name="issueName">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-full rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-full rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="Type the issue name"
/>
)}
</Field>
<div className="flex flex-row gap-4">
<div className="flex flex-row gap-4 mt-2">
<div>
<label className="block py-1">Number</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Number</label>
<Field name="issueNumber">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-14 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-14 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="#"
/>
)}
</Field>
</div>
<div>
<label className="block py-1">Year</label>
<label className="block py-1 text-slate-700 dark:text-slate-200">Year</label>
<Field name="issueYear">
{(props) => (
<input
{...props.input}
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-20 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-20 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
placeholder="1984"
/>
)}
</Field>
</div>

<div className="flex justify-end mt-5">
<div className="flex items-end">
<button
type="submit"
className="flex h-10 sm:mt-3 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-4 py-2 text-gray-500 hover:bg-transparent hover:text-red-600 focus:outline-none focus:ring active:text-indigo-500"
className="flex h-10 items-center rounded-lg border border-green-500 dark:border-green-400 bg-green-500 dark:bg-green-600 px-4 py-2 text-white font-medium hover:bg-green-600 dark:hover:bg-green-500 focus:outline-none focus:ring-2 focus:ring-green-500 focus:ring-offset-2 active:bg-green-700"
>
Search
</button>

@@ -2,32 +2,12 @@ import prettyBytes from "pretty-bytes";
import React, { ReactElement, useEffect, useRef, useState } from "react";
import { useStore } from "../../store";
import type { Socket } from "socket.io-client";

/**
* @typedef {Object} DownloadProgressTickProps
* @property {string} bundleId - The bundle ID to filter ticks by (as string)
*/
interface DownloadProgressTickProps {
bundleId: string;
}
import type { DownloadProgressTickProps } from "../../types";

/**
* Shape of the download tick data received over the socket.
*
* @typedef DownloadTickData
* @property {number} id - Unique download ID
* @property {string} name - File name (e.g. "movie.mkv")
* @property {number} downloaded_bytes - Bytes downloaded so far
* @property {number} size - Total size in bytes
* @property {number} speed - Current download speed (bytes/sec)
* @property {number} seconds_left - Estimated seconds remaining
* @property {{ id: string; str: string; completed: boolean; downloaded: boolean; failed: boolean; hook_error: any }} status
* - Status object (e.g. `{ id: "queued", str: "Running (15.1%)", ... }`)
* @property {{ online: number; total: number; str: string }} sources
* - Peer count (e.g. `{ online: 1, total: 1, str: "1/1 online" }`)
* @property {string} target - Download destination (e.g. "/Downloads/movie.mkv")
*/
interface DownloadTickData {
type DownloadTickData = {
id: number;
name: string;
downloaded_bytes: number;
@@ -48,12 +28,12 @@ interface DownloadTickData {
str: string;
};
target: string;
}
};

export const DownloadProgressTick: React.FC<DownloadProgressTickProps> = ({
bundleId,
}): ReactElement | null => {
const socketRef = useRef<Socket>();
const socketRef = useRef<Socket | undefined>(undefined);
const [tick, setTick] = useState<DownloadTickData | null>(null);
useEffect(() => {
const socket = useStore.getState().getSocket("manual");

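The `DownloadTickData` shape documented above (`downloaded_bytes`, `size`, `seconds_left`) lends itself to a small progress-summary helper. The sketch below is illustrative only, not part of this diff; the function name and output format are assumptions:

```typescript
// Hypothetical helper (not in the codebase): summarize a download tick of the
// shape documented above into a percentage and a human-readable time remaining.
type TickSummaryInput = {
  downloaded_bytes: number;
  size: number;
  seconds_left: number;
};

function summarizeTick({ downloaded_bytes, size, seconds_left }: TickSummaryInput): string {
  // Guard against a zero total size to avoid a NaN percentage.
  const percent = size > 0 ? Math.round((downloaded_bytes / size) * 100) : 0;
  const minutes = Math.floor(seconds_left / 60);
  const seconds = Math.round(seconds_left % 60);
  return `${percent}%, ${minutes}m ${seconds}s left`;
}
```

A component like `DownloadProgressTick` could render this string directly instead of formatting inline.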
@@ -1,7 +1,7 @@
import React, { useEffect, ReactElement, useState, useMemo } from "react";
import { isEmpty, isNil, isUndefined, map } from "lodash";
import { AirDCPPBundles } from "./AirDCPPBundles";
import { TorrentDownloads } from "./TorrentDownloads";
import { TorrentDownloads, TorrentData } from "./TorrentDownloads";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import {
@@ -32,7 +32,7 @@ export interface TorrentDetails {
export const DownloadsPanel = (): ReactElement | null => {
const { comicObjectId } = useParams<{ comicObjectId: string }>();
const [infoHashes, setInfoHashes] = useState<string[]>([]);
const [torrentDetails, setTorrentDetails] = useState<TorrentDetails[]>([]);
const [torrentDetails, setTorrentDetails] = useState<TorrentData[]>([]);
const [activeTab, setActiveTab] = useState<"directconnect" | "torrents">(
"directconnect",
);

@@ -1,55 +1,41 @@
import React, { ReactElement, useCallback, useEffect, useState } from "react";
import { Form, Field } from "react-final-form";
import React, { ReactElement } from "react";
import { Form, Field, FieldRenderProps } from "react-final-form";
import arrayMutators from "final-form-arrays";
import { FieldArray } from "react-final-form-arrays";
import AsyncSelectPaginate from "./AsyncSelectPaginate/AsyncSelectPaginate";
import TextareaAutosize from "react-textarea-autosize";

export const EditMetadataPanel = (props): ReactElement => {
const validate = async () => {};
interface EditMetadataPanelProps {
data: {
name?: string | null;
[key: string]: any;
};
}

/** Adapts react-final-form's Field render prop to AsyncSelectPaginate. */
const AsyncSelectPaginateAdapter = ({ input, ...rest }: FieldRenderProps<any>) => (
<AsyncSelectPaginate {...input} {...rest} onChange={(value) => input.onChange(value)} />
);

/** Adapts react-final-form's Field render prop to TextareaAutosize. */
const TextareaAutosizeAdapter = ({ input, ...rest }: FieldRenderProps<any>) => (
<TextareaAutosize {...input} {...rest} onChange={(value) => input.onChange(value)} />
);

/** Sliding panel form for manually editing comic metadata fields. */
export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElement => {
const onSubmit = async () => {};

const { data } = props;

const AsyncSelectPaginateAdapter = ({ input, ...rest }) => {
return (
<AsyncSelectPaginate
{...input}
{...rest}
onChange={(value) => input.onChange(value)}
/>
);
};
const TextareaAutosizeAdapter = ({ input, ...rest }) => {
return (
<TextareaAutosize
{...input}
{...rest}
onChange={(value) => input.onChange(value)}
/>
);
};
// const rawFileDetails = useSelector(
// (state: RootState) => state.comicInfo.comicBookDetail.rawFileDetails.name,
// );

return (
<>
<Form
onSubmit={onSubmit}
validate={validate}
mutators={{
...arrayMutators,
}}
mutators={{ ...arrayMutators }}
render={({
handleSubmit,
form: {
mutators: { push, pop },
}, // injected from final-form-arrays above
pristine,
form,
submitting,
values,
},
}) => (
<form onSubmit={handleSubmit}>
{/* Issue Name */}
@@ -80,7 +66,6 @@ export const EditMetadataPanel = (props): ReactElement => {
<p className="text-xs">Do not enter the first zero</p>
</div>
<div>
{/* year */}
<div className="text-sm">Issue Year</div>
<Field
name="issue_year"
@@ -100,8 +85,6 @@ export const EditMetadataPanel = (props): ReactElement => {
</div>
</div>

{/* page count */}

{/* Description */}
<div className="mt-2">
<label className="text-sm">Description</label>
@@ -113,7 +96,7 @@ export const EditMetadataPanel = (props): ReactElement => {
/>
</div>

<hr size="1" />
<hr />

<div className="field is-horizontal">
<div className="field-label">
@@ -129,6 +112,7 @@ export const EditMetadataPanel = (props): ReactElement => {
className="input"
placeholder="SKU"
/>
{/* TODO: Switch to Solar icon */}
<span className="icon is-small is-left">
<i className="fa-solid fa-barcode"></i>
</span>
@@ -145,6 +129,7 @@ export const EditMetadataPanel = (props): ReactElement => {
className="input"
placeholder="UPC Code"
/>
{/* TODO: Switch to Solar icon */}
<span className="icon is-small is-left">
<i className="fa-solid fa-box"></i>
</span>
@@ -153,7 +138,7 @@ export const EditMetadataPanel = (props): ReactElement => {
</div>
</div>

<hr size="1" />
<hr />

{/* Publisher */}
<div className="field is-horizontal">
@@ -167,6 +152,7 @@ export const EditMetadataPanel = (props): ReactElement => {
name={"publisher"}
component={AsyncSelectPaginateAdapter}
placeholder={
/* TODO: Switch to Solar icon */
<div>
<i className="fas fa-print mr-2"></i> Publisher
</div>
@@ -190,6 +176,7 @@ export const EditMetadataPanel = (props): ReactElement => {
name={"story_arc"}
component={AsyncSelectPaginateAdapter}
placeholder={
/* TODO: Switch to Solar icon */
<div>
<i className="fas fa-book-open mr-2"></i> Story Arc
</div>
@@ -213,6 +200,7 @@ export const EditMetadataPanel = (props): ReactElement => {
name={"series"}
component={AsyncSelectPaginateAdapter}
placeholder={
/* TODO: Switch to Solar icon */
<div>
<i className="fas fa-layer-group mr-2"></i> Series
</div>
@@ -224,7 +212,7 @@ export const EditMetadataPanel = (props): ReactElement => {
</div>
</div>

<hr size="1" />
<hr />

{/* team credits */}
<div className="field is-horizontal">
@@ -267,6 +255,7 @@ export const EditMetadataPanel = (props): ReactElement => {
name={`${name}.creator`}
component={AsyncSelectPaginateAdapter}
placeholder={
/* TODO: Switch to Solar icon */
<div>
<i className="fa-solid fa-ghost"></i> Creator
</div>
@@ -282,6 +271,7 @@ export const EditMetadataPanel = (props): ReactElement => {
name={`${name}.role`}
metronResource={"role"}
placeholder={
/* TODO: Switch to Solar icon */
<div>
<i className="fa-solid fa-key"></i> Role
</div>
@@ -290,6 +280,7 @@ export const EditMetadataPanel = (props): ReactElement => {
/>
</p>
</div>
{/* TODO: Switch to Solar icon */}
<span
className="icon is-danger mt-2"
onClick={() => fields.remove(index)}
@@ -302,7 +293,6 @@ export const EditMetadataPanel = (props): ReactElement => {
))
}
</FieldArray>
<pre>{JSON.stringify(values, undefined, 2)}</pre>
</form>
)}
/>

@@ -4,20 +4,40 @@ import { convert } from "html-to-text";
import ellipsize from "ellipsize";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import axios from "axios";
import { useGetComicByIdQuery } from "../../graphql/generated";
import type { MatchResultProps } from "../../types";

interface MatchResultProps {
matchData: any;
comicObjectId: string;
queryClient?: any;
onMatchApplied?: () => void;
}

const handleBrokenImage = (e) => {
e.target.src = "http://localhost:3050/dist/img/noimage.svg";
const handleBrokenImage = (e: React.SyntheticEvent<HTMLImageElement>) => {
e.currentTarget.src = "http://localhost:3050/dist/img/noimage.svg";
};

interface ComicVineMatch {
description?: string;
name?: string;
score: string | number;
issue_number: string | number;
cover_date: string;
image: {
thumb_url: string;
};
volume: {
name: string;
};
volumeInformation: {
results: {
image: {
icon_url: string;
};
count_of_issues: number;
publisher: {
name: string;
};
};
};
}

export const MatchResult = (props: MatchResultProps) => {
const applyCVMatch = async (match, comicObjectId) => {
const applyCVMatch = async (match: ComicVineMatch, comicObjectId: string) => {
try {
const response = await axios.request({
url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
@@ -31,7 +51,7 @@ export const MatchResult = (props: MatchResultProps) => {
// Invalidate and refetch the comic book metadata
if (props.queryClient) {
await props.queryClient.invalidateQueries({
queryKey: ["comicBookMetadata", comicObjectId],
queryKey: useGetComicByIdQuery.getKey({ id: comicObjectId }),
});
}

@@ -1,29 +1,24 @@
import React, { ReactElement, ReactNode } from "react";
import prettyBytes from "pretty-bytes";
import { isEmpty } from "lodash";
import { format, parseISO } from "date-fns";
import { RawFileDetails as RawFileDetailsType } from "../../graphql/generated";
import { format, parseISO, isValid } from "date-fns";
import {
RawFileDetails as RawFileDetailsType,
InferredMetadata,
} from "../../graphql/generated";

type RawFileDetailsProps = {
data?: {
rawFileDetails?: RawFileDetailsType;
inferredMetadata?: {
issue?: {
year?: string;
name?: string;
number?: number;
subtitle?: string;
};
};
created_at?: string;
updated_at?: string;
inferredMetadata?: InferredMetadata;
createdAt?: string;
};
children?: ReactNode;
};

/** Renders raw file info, inferred metadata, and import timestamp for a comic. */
export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
const { rawFileDetails, inferredMetadata, created_at, updated_at } =
props.data || {};
const { rawFileDetails, inferredMetadata, createdAt } = props.data || {};
return (
<>
<div className="max-w-2xl ml-5">
@@ -97,10 +92,10 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
Import Details
</dt>
<dd className="mt-1 text-sm text-gray-900 dark:text-gray-400">
{created_at ? (
{createdAt && isValid(parseISO(createdAt)) ? (
<>
{format(parseISO(created_at), "dd MMMM, yyyy")},{" "}
{format(parseISO(created_at), "h aaaa")}
{format(parseISO(createdAt), "dd MMMM, yyyy")},{" "}
{format(parseISO(createdAt), "h aaaa")}
</>
) : "N/A"}
</dd>

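The hunk above replaces a bare `parseISO(created_at)` with a validity check before formatting, falling back to "N/A". A minimal standalone sketch of the same guard pattern, using the built-in `Date` API rather than date-fns (the helper name and output format are illustrative only):

```typescript
// Illustrative sketch: format an optional ISO timestamp, returning "N/A" when
// the value is missing or unparseable, mirroring the guard added in the diff.
function formatImportDate(createdAt?: string): string {
  if (!createdAt) return "N/A";
  const parsed = new Date(createdAt);
  // An invalid date yields NaN from getTime().
  if (Number.isNaN(parsed.getTime())) return "N/A";
  return parsed.toISOString().slice(0, 10); // e.g. "2024-01-31"
}
```

The point of the change is that formatting is only ever attempted on a value already proven valid, so a corrupt timestamp can no longer throw at render time.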
@@ -1,28 +1,69 @@
import React from "react";
import React, { useState } from "react";
import { ComicVineSearchForm } from "./ComicVineSearchForm";
import { ComicVineMatchPanel } from "./ComicVineMatchPanel";
import { EditMetadataPanel } from "./EditMetadataPanel";
import { RawFileDetails } from "../../graphql/generated";
import type { RawFileDetails, InferredMetadata } from "../../graphql/generated";

type InferredIssue = {
name?: string;
number?: number;
year?: string;
subtitle?: string;
[key: string]: any;
};

type CVMatchesPanelProps = {
interface CVMatchesPanelProps {
rawFileDetails?: RawFileDetails;
inferredMetadata: {
issue?: InferredIssue;
};
inferredMetadata: InferredMetadata;
comicVineMatches: any[];
comicObjectId: string;
queryClient: any;
onMatchApplied: () => void;
};

/**
* Collapsible container for manual ComicVine search form.
* Allows users to manually search when auto-match doesn't yield results.
*/
const CollapsibleSearchForm: React.FC<{ rawFileDetails?: RawFileDetails }> = ({
rawFileDetails,
}) => {
const [isExpanded, setIsExpanded] = useState(false);

return (
<div className="border border-slate-300 dark:border-slate-600 rounded-lg overflow-hidden">
<button
type="button"
onClick={() => setIsExpanded(!isExpanded)}
className="w-full flex items-center justify-between px-4 py-3 bg-slate-100 dark:bg-slate-700 hover:bg-slate-200 dark:hover:bg-slate-600 transition-colors text-left"
aria-expanded={isExpanded}
>
<span className="flex items-center gap-2 text-slate-700 dark:text-slate-200 font-medium">
<svg
className={`w-4 h-4 transition-transform ${isExpanded ? "rotate-90" : ""}`}
fill="none"
stroke="currentColor"
viewBox="0 0 24 24"
>
<path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5l7 7-7 7" />
</svg>
Manual Search
</span>
<span className="text-sm text-slate-500 dark:text-slate-400">
{isExpanded ? "Click to collapse" : "No results? Search manually"}
</span>
</button>
{isExpanded && (
<div className="p-4 bg-white dark:bg-slate-800">
<ComicVineSearchForm rawFileDetails={rawFileDetails} />
</div>
)}
</div>
);
};

/**
* Sliding panel content for ComicVine match search.
*
* Renders a search form pre-populated from `rawFileDetails`, a preview of the
* inferred issue being searched for, and a list of ComicVine match candidates
* the user can apply to the comic.
*
* @param props.onMatchApplied - Called after the user selects and applies a match,
* allowing the parent to close the panel and refresh state.
*/
export const CVMatchesPanel: React.FC<CVMatchesPanelProps> = ({
rawFileDetails,
inferredMetadata,
@@ -32,19 +73,18 @@ export const CVMatchesPanel: React.FC<CVMatchesPanelProps> = ({
onMatchApplied,
}) => (
<>
<div>
<ComicVineSearchForm data={rawFileDetails} />
</div>

<div className="border-slate-500 border rounded-lg p-2 mt-3">
<p className="">Searching for:</p>
<div className="border-slate-500 border rounded-lg p-2 mb-3">
<p className="text-slate-600 dark:text-slate-300">Searching for:</p>
{inferredMetadata.issue ? (
<>
<span className="">{inferredMetadata.issue?.name} </span>
<span className=""> # {inferredMetadata.issue?.number} </span>
<span className="text-slate-800 dark:text-slate-100 font-medium">{inferredMetadata.issue?.name} </span>
<span className="text-slate-600 dark:text-slate-300"> # {inferredMetadata.issue?.number} </span>
</>
) : null}
</div>

<CollapsibleSearchForm rawFileDetails={rawFileDetails} />

<ComicVineMatchPanel
props={{
comicVineMatches,
@@ -62,4 +102,4 @@ type EditMetadataPanelWrapperProps = {

export const EditMetadataPanelWrapper: React.FC<EditMetadataPanelWrapperProps> = ({
rawFileDetails,
}) => <EditMetadataPanel data={rawFileDetails} />;
}) => <EditMetadataPanel data={rawFileDetails ?? {}} />;

@@ -1,20 +1,41 @@
import React, { ReactElement, Suspense, useState } from "react";
import { isNil } from "lodash";

export const TabControls = (props): ReactElement => {
interface TabItem {
id: number;
name: string;
icon: React.ReactNode;
content: React.ReactNode;
shouldShow?: boolean;
}

interface TabControlsProps {
filteredTabs: TabItem[];
downloadCount: number;
activeTab?: number;
setActiveTab?: (id: number) => void;
}

export const TabControls = (props: TabControlsProps): ReactElement => {
const { filteredTabs, downloadCount, activeTab, setActiveTab } = props;
const [active, setActive] = useState(filteredTabs[0].id);

// Use controlled state if provided, otherwise use internal state
const currentActive = activeTab !== undefined ? activeTab : active;
const handleSetActive = activeTab !== undefined ? setActiveTab : setActive;
const handleSetActive = (id: number) => {
if (setActiveTab) {
setActiveTab(id);
} else {
setActive(id);
}
};

return (
<>
<div className="hidden sm:block mt-7 mb-3 w-fit">
<div className="border-b border-gray-200">
<nav className="flex gap-6" aria-label="Tabs">
{filteredTabs.map(({ id, name, icon }) => (
{filteredTabs.map(({ id, name, icon }: TabItem) => (
<a
key={id}
className={`inline-flex shrink-0 items-center gap-2 px-1 py-1 text-md font-medium text-gray-500 dark:text-gray-400 hover:border-gray-300 hover:border-b hover:dark:text-slate-200 ${
@@ -47,10 +68,12 @@ export const TabControls = (props): ReactElement => {
</nav>
</div>
</div>
<Suspense>
{filteredTabs.map(({ id, content }) => {
return currentActive === id ? content : null;
})}
<Suspense fallback={null}>
{filteredTabs.map(({ id, content }: TabItem) => (
<React.Fragment key={id}>
{currentActive === id ? content : null}
</React.Fragment>
))}
</Suspense>
</>
);

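The TabControls hunk above resolves between a controlled `activeTab` prop and internal `useState` state. The decision logic can be isolated as a pure function; a hedged sketch (the function name is illustrative, not part of the diff):

```typescript
// Illustrative: prefer the controlled value when the parent supplies one,
// otherwise fall back to the component's internal state. Using an explicit
// undefined check (rather than truthiness) keeps tab id 0 working.
function resolveActiveTab(controlled: number | undefined, internal: number): number {
  return controlled !== undefined ? controlled : internal;
}
```

This is the standard controlled/uncontrolled component pattern: the same component works standalone or driven by a parent, as long as the fallback check never mistakes a falsy-but-valid value for "not provided".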
@@ -131,13 +131,15 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
enabled: false,
});

if (isSuccess && shouldRefetchComicBookData) {
queryClient.invalidateQueries({ queryKey: ["comicBookMetadata"] });
setShouldRefetchComicBookData(false);
}
useEffect(() => {
if (isSuccess && shouldRefetchComicBookData) {
queryClient.invalidateQueries({ queryKey: ["comicBookMetadata"] });
setShouldRefetchComicBookData(false);
}
}, [isSuccess, shouldRefetchComicBookData, queryClient]);

// sliding panel init
const contentForSlidingPanel: Record<string, { content: () => JSX.Element }> = {
const contentForSlidingPanel: Record<string, { content: () => React.ReactElement }> = {
imageAnalysis: {
content: () => {
return (

src/client/components/ComicDetail/Tabs/ReconcilerDrawer.tsx (new file, 522 lines)
@@ -0,0 +1,522 @@
||||
import React, { ReactElement, useMemo, useState } from "react"
|
||||
import { Drawer } from "vaul"
|
||||
import { FIELD_CONFIG, FIELD_GROUPS } from "./reconciler.fieldConfig"
|
||||
import {
|
||||
useReconciler,
|
||||
SourceKey,
|
||||
SOURCE_LABELS,
|
||||
RawSourcedMetadata,
|
||||
RawInferredMetadata,
|
||||
CanonicalRecord,
|
||||
} from "./useReconciler"
|
||||
|
||||
// ── Source styling ─────────────────────────────────────────────────────────────
|
||||
|
||||
const SOURCE_BADGE: Record<SourceKey, string> = {
|
||||
comicvine: "bg-blue-100 text-blue-800 dark:bg-blue-900/40 dark:text-blue-300",
|
||||
metron: "bg-purple-100 text-purple-800 dark:bg-purple-900/40 dark:text-purple-300",
|
||||
gcd: "bg-orange-100 text-orange-800 dark:bg-orange-900/40 dark:text-orange-300",
|
||||
locg: "bg-teal-100 text-teal-800 dark:bg-teal-900/40 dark:text-teal-300",
|
||||
comicInfo: "bg-slate-100 text-slate-700 dark:bg-slate-700/60 dark:text-slate-300",
|
||||
inferredMetadata: "bg-gray-100 text-gray-700 dark:bg-gray-700/60 dark:text-gray-300",
|
||||
}
|
||||
|
||||
const SOURCE_SELECTED: Record<SourceKey, string> = {
|
||||
comicvine: "ring-2 ring-blue-400 bg-blue-50 dark:bg-blue-900/20",
|
||||
metron: "ring-2 ring-purple-400 bg-purple-50 dark:bg-purple-900/20",
|
||||
gcd: "ring-2 ring-orange-400 bg-orange-50 dark:bg-orange-900/20",
|
||||
locg: "ring-2 ring-teal-400 bg-teal-50 dark:bg-teal-900/20",
|
||||
comicInfo: "ring-2 ring-slate-400 bg-slate-50 dark:bg-slate-700/40",
|
||||
inferredMetadata: "ring-2 ring-gray-400 bg-gray-50 dark:bg-gray-700/40",
|
||||
}
|
||||
|
||||
/** Abbreviated source names for compact badge display. */
|
||||
const SOURCE_SHORT: Record<SourceKey, string> = {
|
||||
comicvine: "CV",
|
||||
metron: "Metron",
|
||||
gcd: "GCD",
|
||||
locg: "LoCG",
|
||||
comicInfo: "XML",
|
||||
inferredMetadata: "Local",
|
||||
}
|
||||
|
||||
const SOURCE_ORDER: SourceKey[] = [
|
||||
"comicvine", "metron", "gcd", "locg", "comicInfo", "inferredMetadata",
|
||||
]
|
||||
|
||||
type FilterMode = "all" | "conflicts" | "unresolved"
|
||||
|
||||
// ── Props ──────────────────────────────────────────────────────────────────────
|
||||
|
||||
export interface ReconcilerDrawerProps {
|
||||
open: boolean
|
||||
onOpenChange: (open: boolean) => void
|
||||
sourcedMetadata: RawSourcedMetadata
|
||||
inferredMetadata?: RawInferredMetadata
|
||||
onSave: (record: CanonicalRecord) => void
|
||||
}
|
||||
|
||||
// ── Scalar cell ────────────────────────────────────────────────────────────────
|
||||
|
||||
interface ScalarCellProps {
  value: string | null
  isSelected: boolean
  isImage: boolean
  isLongtext: boolean
  /** Source-specific ring/background class (one of SOURCE_SELECTED), supplied by the parent. */
  selectedClass: string
  onClick: () => void
}

function ScalarCell({ value, isSelected, isImage, isLongtext, selectedClass, onClick }: ScalarCellProps): ReactElement {
  if (!value) {
    return <span className="text-slate-300 dark:text-slate-600 text-sm px-2 pt-1.5 block">—</span>
  }

  return (
    <button
      onClick={onClick}
      className={`w-full text-left text-sm px-2 py-1.5 rounded-md border transition-all ${
        isSelected
          ? `border-transparent ${selectedClass}`
          : "border-slate-200 dark:border-slate-700 hover:border-slate-300 dark:hover:border-slate-600 bg-white dark:bg-slate-800 hover:bg-slate-50 dark:hover:bg-slate-750"
      }`}
    >
      {isImage ? (
        <img
          src={value}
          alt="cover"
          className="w-full h-24 object-cover rounded"
          onError={(e) => { (e.target as HTMLImageElement).style.display = "none" }}
        />
      ) : (
        <span className={`block text-slate-700 dark:text-slate-300 ${isLongtext ? "line-clamp-3 whitespace-normal" : "truncate"}`}>
          {value}
        </span>
      )}
      {isSelected && (
        <i className="icon-[solar--check-circle-bold] w-3.5 h-3.5 text-green-500 mt-0.5 block" />
      )}
    </button>
  )
}
// ── Main component ─────────────────────────────────────────────────────────────

export function ReconcilerDrawer({
  open,
  onOpenChange,
  sourcedMetadata,
  inferredMetadata,
  onSave,
}: ReconcilerDrawerProps): ReactElement {
  const [filter, setFilter] = useState<FilterMode>("all")

  const {
    state,
    unresolvedCount,
    canonicalRecord,
    selectScalar,
    toggleItem,
    setBaseSource,
    reset,
  } = useReconciler(sourcedMetadata, inferredMetadata)

  // Derive which sources actually contributed data
  const activeSources = useMemo<SourceKey[]>(() => {
    const seen = new Set<SourceKey>()
    for (const fieldState of Object.values(state)) {
      if (fieldState.kind === "scalar") {
        for (const c of fieldState.candidates) seen.add(c.source)
      } else if (fieldState.kind === "array" || fieldState.kind === "credits") {
        for (const item of fieldState.items) seen.add((item as { source: SourceKey }).source)
      }
    }
    return SOURCE_ORDER.filter((s) => seen.has(s))
  }, [state])

  // Grid: 180px label + one equal column per active source
  const gridCols = `180px repeat(${Math.max(activeSources.length, 1)}, minmax(0, 1fr))`

  function shouldShow(fieldKey: string): boolean {
    const fs = state[fieldKey]
    if (!fs) return false
    if (filter === "all") return true
    if (filter === "conflicts") {
      if (fs.kind === "scalar") return fs.candidates.length > 1
      if (fs.kind === "array" || fs.kind === "credits") {
        const srcs = new Set((fs.items as Array<{ source: SourceKey }>).map((i) => i.source))
        return srcs.size > 1
      }
      return false
    }
    // unresolved
    return (
      fs.kind === "scalar" &&
      fs.candidates.length > 1 &&
      fs.selectedSource === null &&
      fs.userValue === undefined
    )
  }

  const allResolved = unresolvedCount === 0

  return (
    <Drawer.Root open={open} onOpenChange={onOpenChange}>
      <Drawer.Portal>
        <Drawer.Overlay className="fixed inset-0 bg-black/50 z-40" />
        <Drawer.Content
          aria-describedby={undefined}
          className="fixed inset-0 z-50 flex flex-col bg-white dark:bg-slate-900 outline-none"
        >
          <Drawer.Title className="sr-only">Reconcile metadata sources</Drawer.Title>

          {/* ── Header ── */}
          <div className="flex-none border-b border-slate-200 dark:border-slate-700 shadow-sm">
            {/* Title + controls */}
            <div className="flex items-center justify-between px-4 py-3">
              <div className="flex items-center gap-3">
                <i className="icon-[solar--refresh-circle-outline] w-5 h-5 text-slate-500 dark:text-slate-400" />
                <span className="font-semibold text-slate-800 dark:text-slate-100 text-base">
                  Reconcile Metadata
                </span>
                {unresolvedCount > 0 && (
                  <span className="inline-flex items-center px-2 py-0.5 rounded-full text-xs font-medium bg-amber-100 text-amber-700 dark:bg-amber-900/40 dark:text-amber-300">
                    {unresolvedCount} unresolved
                  </span>
                )}
              </div>

              <div className="flex items-center gap-2">
                {/* Filter pill */}
                <div className="flex items-center bg-slate-100 dark:bg-slate-800 rounded-lg p-0.5 gap-0.5">
                  {(["all", "conflicts", "unresolved"] as FilterMode[]).map((mode) => (
                    <button
                      key={mode}
                      onClick={() => setFilter(mode)}
                      className={`px-3 py-1 rounded-md text-xs font-medium transition-colors capitalize ${
                        filter === mode
                          ? "bg-white dark:bg-slate-700 text-slate-800 dark:text-slate-100 shadow-sm"
                          : "text-slate-500 hover:text-slate-700 dark:hover:text-slate-300"
                      }`}
                    >
                      {mode}
                    </button>
                  ))}
                </div>

                <button
                  onClick={reset}
                  title="Reset all selections"
                  className="px-3 py-1.5 text-xs rounded-md border border-slate-200 dark:border-slate-600 text-slate-600 dark:text-slate-400 hover:bg-slate-50 dark:hover:bg-slate-800 transition-colors"
                >
                  Reset
                </button>

                <button
                  onClick={() => onOpenChange(false)}
                  title="Close"
                  className="p-1.5 rounded-md text-slate-400 hover:text-slate-600 dark:hover:text-slate-300 hover:bg-slate-100 dark:hover:bg-slate-800 transition-colors"
                >
                  <i className="icon-[solar--close-square-outline] w-5 h-5 block" />
                </button>
              </div>
            </div>

            {/* Source column headers */}
            <div
              className="px-4 pb-3"
              style={{ display: "grid", gridTemplateColumns: gridCols, gap: "8px" }}
            >
              <div className="text-xs font-medium text-slate-400 dark:text-slate-500 uppercase tracking-wider flex items-end pb-0.5">
                Field
              </div>
              {activeSources.map((src) => (
                <div key={src} className="flex flex-col gap-1.5">
                  <span className={`text-xs font-semibold px-2 py-0.5 rounded w-fit ${SOURCE_BADGE[src]}`}>
                    {SOURCE_LABELS[src]}
                  </span>
                  <button
                    onClick={() => setBaseSource(src)}
                    className="text-xs text-slate-400 hover:text-slate-600 dark:hover:text-slate-300 text-left transition-colors"
                  >
                    Use all ↓
                  </button>
                </div>
              ))}
            </div>
          </div>

          {/* ── Scrollable body ── */}
          <div className="flex-1 overflow-y-auto">
            {FIELD_GROUPS.map((group) => {
              const fieldsInGroup = Object.entries(FIELD_CONFIG)
                .filter(([, cfg]) => cfg.group === group)
                .filter(([key]) => shouldShow(key))

              if (fieldsInGroup.length === 0) return null

              return (
                <div key={group}>
                  {/* Group sticky header */}
                  <div className="sticky top-0 z-10 px-4 py-2 bg-slate-50 dark:bg-slate-800/90 backdrop-blur-sm border-b border-slate-200 dark:border-slate-700">
                    <span className="text-xs font-bold text-slate-400 dark:text-slate-500 uppercase tracking-widest">
                      {group}
                    </span>
                  </div>

                  {/* Field rows */}
                  {fieldsInGroup.map(([fieldKey, fieldCfg]) => {
                    const fs = state[fieldKey]
                    if (!fs) return null

                    const isUnresolved =
                      fs.kind === "scalar" &&
                      fs.candidates.length > 1 &&
                      fs.selectedSource === null &&
                      fs.userValue === undefined

                    return (
                      <div
                        key={fieldKey}
                        className={`border-b border-slate-100 dark:border-slate-800/60 transition-colors ${
                          isUnresolved ? "bg-amber-50/50 dark:bg-amber-950/20" : ""
                        }`}
                        style={{
                          display: "grid",
                          gridTemplateColumns: gridCols,
                          gap: "8px",
                          padding: "10px 16px",
                          alignItems: "start",
                        }}
                      >
                        {/* Label column */}
                        <div className="flex flex-col gap-0.5 pt-1.5 pr-2">
                          <span className="text-sm font-medium text-slate-700 dark:text-slate-300 leading-tight">
                            {fieldCfg.label}
                          </span>
                          {fieldCfg.comicInfoKey && (
                            <span className="text-xs text-slate-400 font-mono leading-none">
                              {fieldCfg.comicInfoKey}
                            </span>
                          )}
                          {isUnresolved && (
                            <span className="inline-flex items-center gap-0.5 text-xs text-amber-600 dark:text-amber-400 mt-0.5">
                              <i className="icon-[solar--danger-triangle-outline] w-3 h-3" />
                              conflict
                            </span>
                          )}
                        </div>

                        {/* Content — varies by kind */}
                        {fs.kind === "scalar" ? (
                          // One cell per active source
                          activeSources.map((src) => {
                            const candidate = fs.candidates.find((c) => c.source === src)
                            const isSelected = fs.selectedSource === src

                            // For selected state we need the source-specific color
                            const selectedClass = isSelected ? SOURCE_SELECTED[src] : ""

                            if (!candidate) {
                              return (
                                <span
                                  key={src}
                                  className="text-slate-300 dark:text-slate-600 text-sm px-2 pt-1.5 block"
                                >
                                  —
                                </span>
                              )
                            }

                            return (
                              <button
                                key={src}
                                onClick={() => selectScalar(fieldKey, src)}
                                className={`w-full text-left text-sm px-2 py-1.5 rounded-md border transition-all ${
                                  isSelected
                                    ? `border-transparent ${selectedClass}`
                                    : "border-slate-200 dark:border-slate-700 hover:border-slate-300 dark:hover:border-slate-600 bg-white dark:bg-slate-800 hover:bg-slate-50 dark:hover:bg-slate-750"
                                }`}
                              >
                                {fieldCfg.renderAs === "image" ? (
                                  <img
                                    src={candidate.value}
                                    alt="cover"
                                    className="w-full h-24 object-cover rounded"
                                    onError={(e) => {
                                      (e.target as HTMLImageElement).style.display = "none"
                                    }}
                                  />
                                ) : (
                                  <span
                                    className={`block text-slate-700 dark:text-slate-300 ${
                                      fieldCfg.renderAs === "longtext"
                                        ? "line-clamp-3 whitespace-normal text-xs leading-relaxed"
                                        : "truncate"
                                    }`}
                                  >
                                    {candidate.value}
                                  </span>
                                )}
                                {isSelected && (
                                  <i className="icon-[solar--check-circle-bold] w-3.5 h-3.5 text-green-500 mt-0.5 block" />
                                )}
                              </button>
                            )
                          })
                        ) : fs.kind === "array" ? (
                          // Merged list spanning all source columns
                          <div
                            className="flex flex-wrap gap-1.5"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            {fs.items.length === 0 ? (
                              <span className="text-slate-400 dark:text-slate-500 text-sm">No data</span>
                            ) : (
                              fs.items.map((item) => (
                                <label
                                  key={item.itemKey}
                                  className={`inline-flex items-center gap-1.5 px-2 py-1 rounded-md border cursor-pointer transition-all text-sm select-none ${
                                    item.selected
                                      ? "border-slate-200 dark:border-slate-600 bg-white dark:bg-slate-800"
                                      : "border-dashed border-slate-200 dark:border-slate-700 opacity-40"
                                  }`}
                                >
                                  <input
                                    type="checkbox"
                                    checked={item.selected}
                                    onChange={(e) =>
                                      toggleItem(fieldKey, item.itemKey, e.target.checked)
                                    }
                                    className="w-3 h-3 rounded accent-slate-600 flex-none"
                                  />
                                  <span className="text-slate-700 dark:text-slate-300">
                                    {item.displayValue}
                                  </span>
                                  <span
                                    className={`text-xs px-1.5 py-0.5 rounded font-medium ${SOURCE_BADGE[item.source]}`}
                                  >
                                    {SOURCE_SHORT[item.source]}
                                  </span>
                                </label>
                              ))
                            )}
                          </div>
                        ) : fs.kind === "credits" ? (
                          // Credits spanning all source columns
                          <div
                            className="flex flex-col gap-1"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            {fs.items.length === 0 ? (
                              <span className="text-slate-400 dark:text-slate-500 text-sm">No data</span>
                            ) : (
                              fs.items.map((item) => (
                                <label
                                  key={item.itemKey}
                                  className={`inline-flex items-center gap-2 px-2 py-1.5 rounded-md border cursor-pointer transition-all text-sm select-none ${
                                    item.selected
                                      ? "border-slate-200 dark:border-slate-600 bg-white dark:bg-slate-800"
                                      : "border-dashed border-slate-200 dark:border-slate-700 opacity-40"
                                  }`}
                                >
                                  <input
                                    type="checkbox"
                                    checked={item.selected}
                                    onChange={(e) =>
                                      toggleItem(fieldKey, item.itemKey, e.target.checked)
                                    }
                                    className="w-3 h-3 rounded accent-slate-600 flex-none"
                                  />
                                  <span className="font-medium text-slate-700 dark:text-slate-300">
                                    {item.name}
                                  </span>
                                  <span className="text-slate-400 dark:text-slate-500">·</span>
                                  <span className="text-slate-500 dark:text-slate-400 text-xs">
                                    {item.role}
                                  </span>
                                  <span
                                    className={`ml-auto text-xs px-1.5 py-0.5 rounded font-medium flex-none ${SOURCE_BADGE[item.source]}`}
                                  >
                                    {SOURCE_SHORT[item.source]}
                                  </span>
                                </label>
                              ))
                            )}
                          </div>
                        ) : (
                          // GTIN and other complex types
                          <div
                            className="pt-1.5"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            <span className="text-slate-400 dark:text-slate-500 text-sm italic">
                              Structured field — editor coming soon
                            </span>
                          </div>
                        )}
                      </div>
                    )
                  })}
                </div>
              )
            })}

            {/* Empty state when filter hides everything */}
            {FIELD_GROUPS.every((group) =>
              Object.entries(FIELD_CONFIG)
                .filter(([, cfg]) => cfg.group === group)
                .every(([key]) => !shouldShow(key)),
            ) && (
              <div className="flex flex-col items-center justify-center py-24 gap-3 text-slate-400 dark:text-slate-500">
                <i className="icon-[solar--check-circle-bold] w-10 h-10 text-green-400" />
                <span className="text-sm">
                  {filter === "unresolved" ? "No unresolved conflicts" : "No fields match the current filter"}
                </span>
              </div>
            )}
          </div>

          {/* ── Footer ── */}
          <div className="flex-none border-t border-slate-200 dark:border-slate-700 px-4 py-3 flex items-center justify-between bg-white dark:bg-slate-900">
            <div className="text-sm">
              {allResolved ? (
                <span className="flex items-center gap-1.5 text-green-600 dark:text-green-400">
                  <i className="icon-[solar--check-circle-bold] w-4 h-4" />
                  All conflicts resolved
                </span>
              ) : (
                <span className="flex items-center gap-1.5 text-amber-600 dark:text-amber-400">
                  <i className="icon-[solar--danger-triangle-outline] w-4 h-4" />
                  {unresolvedCount} field{unresolvedCount !== 1 ? "s" : ""} still need{unresolvedCount === 1 ? "s" : ""} a value
                </span>
              )}
            </div>

            <div className="flex items-center gap-2">
              <button
                onClick={() => onOpenChange(false)}
                className="px-4 py-2 text-sm text-slate-600 dark:text-slate-400 hover:bg-slate-100 dark:hover:bg-slate-800 rounded-lg transition-colors"
              >
                Cancel
              </button>
              <button
                onClick={() => {
                  onSave(canonicalRecord)
                  onOpenChange(false)
                }}
                disabled={!allResolved}
                className={`px-4 py-2 text-sm rounded-lg font-medium transition-colors ${
                  allResolved
                    ? "bg-green-600 text-white hover:bg-green-700 dark:bg-green-700 dark:hover:bg-green-600"
                    : "bg-slate-100 text-slate-400 dark:bg-slate-800 dark:text-slate-600 cursor-not-allowed"
                }`}
              >
                Save Canonical Record
              </button>
            </div>
          </div>
        </Drawer.Content>
      </Drawer.Portal>
    </Drawer.Root>
  )
}
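The "unresolved" filter in the drawer above reduces to one predicate: a field is unresolved when it is a scalar with two or more candidate values, no source selected, and no manual override. A minimal standalone sketch of that predicate (the `ScalarState` type here is a simplified stand-in for the hook's real state shape, not the repo's exact type):

```typescript
// Simplified model of a scalar field's reconciliation state (names assumed):
// candidate values from sources, plus an optional resolution.
type ScalarState = {
  kind: "scalar"
  candidates: { source: string; value: string }[]
  selectedSource: string | null
  userValue?: string
}

// Mirrors the predicate ReconcilerDrawer uses for the "unresolved" filter:
// a conflict (2+ candidates) with no selection and no manual override.
function isUnresolved(fs: ScalarState): boolean {
  return (
    fs.candidates.length > 1 &&
    fs.selectedSource === null &&
    fs.userValue === undefined
  )
}

const conflicted: ScalarState = {
  kind: "scalar",
  candidates: [
    { source: "comicvine", value: "Saga" },
    { source: "metron", value: "Saga (2012)" },
  ],
  selectedSource: null,
}

console.log(isUnresolved(conflicted)) // true
console.log(isUnresolved({ ...conflicted, selectedSource: "metron" })) // false
```

Either picking a source or supplying a user value clears the conflict, which is why the Save button enables only when `unresolvedCount` reaches zero.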
@@ -1,14 +1,201 @@
import React, { ReactElement, useMemo, useState } from "react";
import { isEmpty, isNil } from "lodash";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import ComicVineDetails from "../ComicVineDetails";
import { ReconcilerDrawer } from "./ReconcilerDrawer";
import { fetcher } from "../../../graphql/fetcher";
import { useGetComicByIdQuery } from "../../../graphql/generated";
import type { CanonicalRecord } from "./useReconciler";

interface ComicVineMetadata {
  volumeInformation?: Record<string, unknown>;
  name?: string;
  number?: string;
  resource_type?: string;
  id?: number;
}

interface SourcedMetadata {
  comicvine?: ComicVineMetadata;
  locg?: Record<string, unknown>;
  comicInfo?: unknown;
  metron?: unknown;
  gcd?: unknown;
  [key: string]: unknown;
}

interface VolumeInformationData {
  id?: string;
  sourcedMetadata?: SourcedMetadata;
  inferredMetadata?: { issue?: unknown };
  updatedAt?: string;
}

interface VolumeInformationProps {
  data: VolumeInformationData;
  onReconcile?: () => void;
}

const SET_METADATA_FIELD = `
  mutation SetMetadataField($comicId: ID!, $field: String!, $value: String!) {
    setMetadataField(comicId: $comicId, field: $field, value: $value) {
      id
    }
  }
`;

/** Sources stored under `sourcedMetadata` — excludes `inferredMetadata`, which is checked separately. */
const SOURCED_METADATA_KEYS = [
  "comicvine",
  "locg",
  "comicInfo",
  "metron",
  "gcd",
];

const SOURCE_LABELS: Record<string, string> = {
  comicvine: "ComicVine",
  locg: "League of Comic Geeks",
  comicInfo: "ComicInfo.xml",
  metron: "Metron",
  gcd: "Grand Comics Database",
  inferredMetadata: "Local File",
};

const SOURCE_ICONS: Record<string, string> = {
  comicvine: "icon-[solar--database-bold]",
  locg: "icon-[solar--users-group-rounded-outline]",
  comicInfo: "icon-[solar--file-text-outline]",
  metron: "icon-[solar--planet-outline]",
  gcd: "icon-[solar--book-outline]",
  inferredMetadata: "icon-[solar--folder-outline]",
};

const MetadataSourceChips = ({
  sources,
  onOpenReconciler,
}: {
  sources: string[];
  onOpenReconciler: () => void;
}): ReactElement => {
  return (
    <div className="flex flex-col gap-2 mb-5 p-3 w-fit">
      <div className="flex flex-row items-center justify-between">
        <span className="text-md text-slate-500 dark:text-slate-400">
          <i className="icon-[solar--database-outline] w-4 h-4 inline-block align-middle mr-1" />
          {sources.length} metadata sources detected
        </span>
      </div>
      <div className="flex flex-row flex-wrap gap-2">
        {sources.map((source) => (
          <span
            key={source}
            className="inline-flex items-center gap-1 bg-white dark:bg-slate-700 text-slate-700 dark:text-slate-300 text-xs font-medium px-2 py-1 rounded-md border border-slate-200 dark:border-slate-600"
          >
            <i
              className={`${SOURCE_ICONS[source] ?? "icon-[solar--check-circle-outline]"} w-3 h-3`}
            />
            {SOURCE_LABELS[source] ?? source}
          </span>
        ))}
      </div>
      <button
        className="flex space-x-1 mb-2 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-2 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
        onClick={onOpenReconciler}
      >
        <i className="icon-[solar--refresh-outline] w-4 h-4 px-3" />
        Reconcile sources
      </button>
    </div>
  );
};

/**
 * Displays volume metadata for a comic.
 *
 * - When multiple sources are present, renders a chip bar listing each source
 *   with a "Reconcile sources" action to merge them.
 * - When exactly one source is present and it is ComicVine, renders the full
 *   ComicVine detail panel directly.
 *
 * @param props.data - Comic data containing sourced and inferred metadata.
 * @param props.onReconcile - Called when the user triggers source reconciliation.
 */
export const VolumeInformation = (
  props: VolumeInformationProps,
): ReactElement => {
  const { data } = props;
  const [isReconcilerOpen, setReconcilerOpen] = useState(false);
  const queryClient = useQueryClient();

  const { mutate: saveCanonical } = useMutation({
    mutationFn: async (record: CanonicalRecord) => {
      const saves = Object.entries(record)
        .filter(([, fv]) => fv != null)
        .map(([field, fv]) => ({
          field,
          value:
            typeof fv!.value === "string"
              ? fv!.value
              : JSON.stringify(fv!.value),
        }));
      await Promise.all(
        saves.map(({ field, value }) =>
          fetcher<unknown, { comicId: string; field: string; value: string }>(
            SET_METADATA_FIELD,
            { comicId: data.id ?? "", field, value },
          )(),
        ),
      );
    },
    onSuccess: () => {
      queryClient.invalidateQueries({
        queryKey: useGetComicByIdQuery.getKey({ id: data.id ?? "" }),
      });
    },
  });

  const presentSources = useMemo(() => {
    const sources = SOURCED_METADATA_KEYS.filter((key) => {
      const val = (data?.sourcedMetadata ?? {})[key];
      if (isNil(val) || isEmpty(val)) return false;
      // locg returns an object even when empty; require at least one non-null value
      if (key === "locg")
        return Object.values(val as Record<string, unknown>).some(
          (v) => !isNil(v) && v !== "",
        );
      return true;
    });
    if (
      !isNil(data?.inferredMetadata?.issue) &&
      !isEmpty(data?.inferredMetadata?.issue)
    ) {
      sources.push("inferredMetadata");
    }
    return sources;
  }, [data?.sourcedMetadata, data?.inferredMetadata]);

  return (
    <div key={1}>
      {presentSources.length > 1 && (
        <MetadataSourceChips
          sources={presentSources}
          onOpenReconciler={() => setReconcilerOpen(true)}
        />
      )}
      {presentSources.length === 1 &&
        data.sourcedMetadata?.comicvine?.volumeInformation && (
          <ComicVineDetails
            data={data.sourcedMetadata.comicvine}
            updatedAt={data.updatedAt}
          />
        )}
      <ReconcilerDrawer
        open={isReconcilerOpen}
        onOpenChange={setReconcilerOpen}
        sourcedMetadata={(data.sourcedMetadata ?? {}) as import("./useReconciler").RawSourcedMetadata}
        inferredMetadata={data.inferredMetadata as import("./useReconciler").RawInferredMetadata | undefined}
        onSave={saveCanonical}
      />
    </div>
  );
};
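The `presentSources` memo above counts a source only when its payload is non-empty, with a special case for `locg`, which can arrive as an object whose values are all null. A standalone sketch of that check (plain emptiness tests stand in for lodash's `isNil`/`isEmpty`; the key list matches `SOURCED_METADATA_KEYS`):

```typescript
// Stand-ins for lodash isNil / isEmpty, limited to the shapes used here.
const isNil = (v: unknown): v is null | undefined => v === null || v === undefined
const isEmptyObj = (v: object): boolean => Object.keys(v).length === 0

const SOURCED_METADATA_KEYS = ["comicvine", "locg", "comicInfo", "metron", "gcd"]

function presentSources(sourced: Record<string, unknown>): string[] {
  return SOURCED_METADATA_KEYS.filter((key) => {
    const val = sourced[key]
    if (isNil(val) || (typeof val === "object" && isEmptyObj(val as object))) return false
    // locg returns an object even when empty; require one non-null, non-blank value
    if (key === "locg")
      return Object.values(val as Record<string, unknown>).some(
        (v) => !isNil(v) && v !== "",
      )
    return true
  })
}

console.log(presentSources({ comicvine: { id: 1 }, locg: { price: null } }))
// ["comicvine"]
```

An `{ price: null }` locg payload is filtered out, so a comic with data from only one real source skips the chip bar and renders the single-source panel directly.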
src/client/components/ComicDetail/Tabs/reconciler.fieldConfig.ts (Normal file, +285)
@@ -0,0 +1,285 @@
/**
 * UI field configuration for the metadata reconciler.
 *
 * Each entry maps a CanonicalMetadata field key to:
 * - label         Display name shown in the reconciler table
 * - group         Which section the field belongs to
 * - renderAs      How the field's cell is rendered (drives component selection)
 * - comicInfoKey  The ComicInfo.xml v1 key this field exports to, or null if
 *                 the field has no v1 equivalent (shown with a badge in the UI)
 *
 * The order of entries within each group controls row order in the table.
 */

export type RenderType =
  | "scalar"      // Single string/number — click to select
  | "date"        // ISO date string — click to select
  | "longtext"    // Multi-line text — click to select, expandable preview
  | "image"       // Cover image — thumbnail grid picker
  | "array"       // Flat list of strings with source badges
  | "arcs"        // [{name, number}] — arc name + position number
  | "universes"   // [{name, designation}] — universe name + designation
  | "credits"     // [{name, role}] — role-grouped, toggleable list
  | "seriesInfo"  // Structured series object — rendered as sub-fields
  | "prices"      // [{country, amount, currency}]
  | "gtin"        // {isbn, upc}
  | "reprints"    // [{description}]
  | "urls"        // [{url, primary}]
  | "externalIDs" // [{source, externalId, primary}]

export type FieldGroup =
  | "Identity"
  | "Series"
  | "Publication"
  | "Content"
  | "Credits"
  | "Classification"
  | "Physical"
  | "Commercial"
  | "External"

/** Ordered list of groups — controls section order in the reconciler table. */
export const FIELD_GROUPS: FieldGroup[] = [
  "Identity",
  "Series",
  "Publication",
  "Content",
  "Credits",
  "Classification",
  "Physical",
  "Commercial",
  "External",
]

export interface FieldConfig {
  label: string
  group: FieldGroup
  renderAs: RenderType
  /**
   * ComicInfo.xml v1 key this field maps to on export.
   * null means the field is not exported to ComicInfo v1.
   */
  comicInfoKey: string | null
}

/**
 * Master field registry for the reconciler.
 * Keys match CanonicalMetadata field names from the core-service GraphQL schema.
 */
export const FIELD_CONFIG: Record<string, FieldConfig> = {
  // ── Identity ──────────────────────────────────────────────────────────────
  title: {
    label: "Title",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  series: {
    label: "Series",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: "series",
  },
  issueNumber: {
    label: "Issue Number",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: "number",
  },
  volume: {
    label: "Volume",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  collectionTitle: {
    label: "Collection Title",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Series ────────────────────────────────────────────────────────────────
  seriesInfo: {
    label: "Series Info",
    group: "Series",
    renderAs: "seriesInfo",
    comicInfoKey: null,
  },

  // ── Publication ───────────────────────────────────────────────────────────
  publisher: {
    label: "Publisher",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: "publisher",
  },
  imprint: {
    label: "Imprint",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  coverDate: {
    label: "Cover Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  storeDate: {
    label: "Store Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  publicationDate: {
    label: "Publication Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  language: {
    label: "Language",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: "languageiso",
  },

  // ── Content ───────────────────────────────────────────────────────────────
  description: {
    label: "Description",
    group: "Content",
    renderAs: "longtext",
    comicInfoKey: "summary",
  },
  notes: {
    label: "Notes",
    group: "Content",
    renderAs: "longtext",
    comicInfoKey: "notes",
  },
  stories: {
    label: "Stories",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  storyArcs: {
    label: "Story Arcs",
    group: "Content",
    renderAs: "arcs",
    comicInfoKey: null,
  },
  characters: {
    label: "Characters",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  teams: {
    label: "Teams",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  locations: {
    label: "Locations",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  universes: {
    label: "Universes",
    group: "Content",
    renderAs: "universes",
    comicInfoKey: null,
  },
  coverImage: {
    label: "Cover Image",
    group: "Content",
    renderAs: "image",
    comicInfoKey: null,
  },

  // ── Credits ───────────────────────────────────────────────────────────────
  creators: {
    label: "Credits",
    group: "Credits",
    renderAs: "credits",
    comicInfoKey: null,
  },

  // ── Classification ────────────────────────────────────────────────────────
  genres: {
    label: "Genres",
    group: "Classification",
    renderAs: "array",
    comicInfoKey: "genre",
  },
  tags: {
    label: "Tags",
    group: "Classification",
    renderAs: "array",
    comicInfoKey: null,
  },
  ageRating: {
    label: "Age Rating",
    group: "Classification",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Physical ──────────────────────────────────────────────────────────────
  pageCount: {
    label: "Page Count",
    group: "Physical",
    renderAs: "scalar",
    comicInfoKey: "pagecount",
  },
  format: {
    label: "Format",
    group: "Physical",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Commercial ────────────────────────────────────────────────────────────
  prices: {
    label: "Prices",
    group: "Commercial",
    renderAs: "prices",
    comicInfoKey: null,
  },
  gtin: {
    label: "ISBN / UPC",
    group: "Commercial",
    renderAs: "gtin",
    comicInfoKey: null,
  },
  reprints: {
    label: "Reprints",
    group: "Commercial",
    renderAs: "reprints",
    comicInfoKey: null,
  },
  communityRating: {
    label: "Community Rating",
    group: "Commercial",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── External ──────────────────────────────────────────────────────────────
  externalIDs: {
    label: "Source IDs",
    group: "External",
    renderAs: "externalIDs",
    comicInfoKey: null,
  },
  urls: {
    label: "URLs",
    group: "External",
    renderAs: "urls",
    comicInfoKey: "web",
  },
} as const
745 src/client/components/ComicDetail/Tabs/useReconciler.ts (new file)
@@ -0,0 +1,745 @@
import { useReducer, useMemo } from "react";
import { isNil, isEmpty } from "lodash";

// ── Source keys ────────────────────────────────────────────────────────────────

export type SourceKey =
  | "comicvine"
  | "metron"
  | "gcd"
  | "locg"
  | "comicInfo"
  | "inferredMetadata";

export const SOURCE_LABELS: Record<SourceKey, string> = {
  comicvine: "ComicVine",
  metron: "Metron",
  gcd: "Grand Comics Database",
  locg: "League of Comic Geeks",
  comicInfo: "ComicInfo.xml",
  inferredMetadata: "Local File",
};

// ── Candidate types ────────────────────────────────────────────────────────────

/** One source's value for a scalar field. Multiple candidates for the same field = conflict. */
export interface ScalarCandidate {
  source: SourceKey;
  value: string;
}

/** One item in an array field (characters, genres, arcs…). Pre-selected; user may deselect. */
export interface ArrayItem {
  /** Lowercase dedup key. */
  itemKey: string;
  displayValue: string;
  /** Raw value passed through to the canonical record. */
  rawValue: unknown;
  source: SourceKey;
  selected: boolean;
}

/** One person credit. Dedup key is `"${name}:${role}"` (lowercased). */
export interface CreditItem {
  itemKey: string;
  id?: string;
  name: string;
  role: string;
  source: SourceKey;
  selected: boolean;
}

// ── Per-field state ────────────────────────────────────────────────────────────

/** Unresolved when `selectedSource === null` and `userValue` is absent. */
interface ScalarFieldState {
  kind: "scalar";
  candidates: ScalarCandidate[];
  selectedSource: SourceKey | null;
  /** User-typed override; takes precedence over any source value. */
  userValue?: string;
}

interface ArrayFieldState {
  kind: "array";
  items: ArrayItem[];
}

interface CreditsFieldState {
  kind: "credits";
  items: CreditItem[];
}

interface GTINFieldState {
  kind: "gtin";
  candidates: Array<{ source: SourceKey; isbn?: string; upc?: string }>;
  selectedIsbnSource: SourceKey | null;
  selectedUpcSource: SourceKey | null;
}

type FieldState = ScalarFieldState | ArrayFieldState | CreditsFieldState | GTINFieldState;

/** Full reconciler state — one entry per field that has data from at least one source. */
export type ReconcilerState = Record<string, FieldState>;

// ── Raw source data ────────────────────────────────────────────────────────────

/** Raw metadata payloads keyed by source, as stored on the comic document. */
export interface RawSourcedMetadata {
  comicvine?: Record<string, unknown>;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  metron?: unknown;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  gcd?: unknown;
  locg?: Record<string, unknown>;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  comicInfo?: Record<string, unknown>;
}

/** Metadata inferred from the local file name / path. */
export interface RawInferredMetadata {
  issue?: {
    name?: string;
    number?: number;
    year?: string;
    subtitle?: string;
  };
}

// ── Helpers ────────────────────────────────────────────────────────────────────

function safeString(v: unknown): string | null {
  if (isNil(v) || v === "") return null;
  return String(v);
}

/** xml2js with `normalizeTags` wraps every value in a single-element array. */
function xmlVal(obj: Record<string, unknown>, key: string): string | null {
  const arr = obj[key];
  if (!Array.isArray(arr) || arr.length === 0) return null;
  return safeString(arr[0]);
}

/** Parse a JSON string if it hasn't been parsed yet. */
function ensureParsed(v: unknown): Record<string, unknown> | null {
  if (isNil(v)) return null;
  if (typeof v === "string") {
    try {
      return JSON.parse(v);
    } catch {
      return null;
    }
  }
  if (typeof v === "object") return v as Record<string, unknown>;
  return null;
}

function makeScalarCandidate(
  source: SourceKey,
  value: unknown,
): ScalarCandidate | undefined {
  const val = safeString(value);
  return val ? { source, value: val } : undefined;
}

function makeArrayItem(
  source: SourceKey,
  rawValue: unknown,
  displayValue: string,
): ArrayItem {
  return {
    itemKey: displayValue.toLowerCase().trim(),
    displayValue,
    rawValue,
    source,
    selected: true,
  };
}

function makeCreditItem(
  source: SourceKey,
  name: string,
  role: string,
  id?: string,
): CreditItem {
  return {
    itemKey: `${name.toLowerCase().trim()}:${role.toLowerCase().trim()}`,
    id,
    name,
    role,
    source,
    selected: true,
  };
}

// ── Source adapters ────────────────────────────────────────────────────────────

type AdapterResult = Partial<Record<string, ScalarCandidate | ArrayItem[] | CreditItem[]>>;

/**
 * Extract canonical fields from a ComicVine issue payload.
 * Volume info lives under `volumeInformation`; credits under `person_credits` etc.
 */
function fromComicVine(cv: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "comicvine";
  const vi = cv.volumeInformation as Record<string, unknown> | undefined;
  const img = cv.image as Record<string, unknown> | undefined;
  const publisher = vi?.publisher as Record<string, unknown> | undefined;

  return {
    title: makeScalarCandidate(s, cv.name),
    series: makeScalarCandidate(s, vi?.name),
    issueNumber: makeScalarCandidate(s, cv.issue_number),
    volume: makeScalarCandidate(s, vi?.id),
    description: makeScalarCandidate(s, cv.description),
    publisher: makeScalarCandidate(s, publisher?.name),
    coverDate: makeScalarCandidate(s, cv.cover_date),
    storeDate: makeScalarCandidate(s, cv.store_date),
    coverImage: makeScalarCandidate(s, img?.super_url ?? img?.small_url),
    characters: ((cv.character_credits as unknown[]) ?? [])
      .filter((c): c is Record<string, unknown> => !isNil(c))
      .map((c) => makeArrayItem(s, c, safeString(c.name) ?? "")),
    teams: ((cv.team_credits as unknown[]) ?? [])
      .filter((t): t is Record<string, unknown> => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t.name) ?? "")),
    locations: ((cv.location_credits as unknown[]) ?? [])
      .filter((l): l is Record<string, unknown> => !isNil(l))
      .map((l) => makeArrayItem(s, l, safeString(l.name) ?? "")),
    storyArcs: ((cv.story_arc_credits as unknown[]) ?? [])
      .filter((a): a is Record<string, unknown> => !isNil(a))
      .map((a) => makeArrayItem(s, a, safeString(a.name) ?? "")),
    creators: ((cv.person_credits as unknown[]) ?? [])
      .filter((p): p is Record<string, unknown> => !isNil(p))
      .map((p) =>
        makeCreditItem(s, safeString(p.name) ?? "", safeString(p.role) ?? ""),
      ),
  };
}

/**
 * Extract canonical fields from a Metron / MetronInfo payload.
 * Keys are PascalCase mirroring the MetronInfo XSD schema.
 */
function fromMetron(raw: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "metron";
  const series = raw.Series as Record<string, unknown> | undefined;
  const pub = raw.Publisher as Record<string, unknown> | undefined;

  const nameList = (arr: unknown[]): ArrayItem[] =>
    arr
      .filter((x): x is Record<string, unknown> => !isNil(x))
      .map((x) => makeArrayItem(s, x, safeString(x.name) ?? ""));

  return {
    title: makeScalarCandidate(s, (raw.Stories as unknown[])?.[0]),
    series: makeScalarCandidate(s, series?.Name),
    issueNumber: makeScalarCandidate(s, raw.Number),
    collectionTitle: makeScalarCandidate(s, raw.CollectionTitle),
    publisher: makeScalarCandidate(s, pub?.Name),
    imprint: makeScalarCandidate(s, pub?.Imprint),
    coverDate: makeScalarCandidate(s, raw.CoverDate),
    storeDate: makeScalarCandidate(s, raw.StoreDate),
    description: makeScalarCandidate(s, raw.Summary),
    notes: makeScalarCandidate(s, raw.Notes),
    ageRating: makeScalarCandidate(s, raw.AgeRating),
    pageCount: makeScalarCandidate(s, raw.PageCount),
    format: makeScalarCandidate(s, series?.Format),
    language: makeScalarCandidate(s, series?.lang),
    genres: nameList((raw.Genres as unknown[]) ?? []),
    tags: ((raw.Tags as unknown[]) ?? [])
      .filter((t) => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t) ?? "")),
    characters: nameList((raw.Characters as unknown[]) ?? []),
    teams: nameList((raw.Teams as unknown[]) ?? []),
    locations: nameList((raw.Locations as unknown[]) ?? []),
    universes: ((raw.Universes as unknown[]) ?? [])
      .filter((u): u is Record<string, unknown> => !isNil(u))
      .map((u) =>
        makeArrayItem(
          s,
          u,
          [u.Name, u.Designation].filter(Boolean).join(" — "),
        ),
      ),
    storyArcs: ((raw.Arcs as unknown[]) ?? [])
      .filter((a): a is Record<string, unknown> => !isNil(a))
      .map((a) =>
        makeArrayItem(
          s,
          a,
          [a.Name, a.Number ? `#${a.Number}` : null].filter(Boolean).join(" "),
        ),
      ),
    stories: ((raw.Stories as unknown[]) ?? [])
      .filter((t) => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t) ?? "")),
    creators: ((raw.Credits as unknown[]) ?? [])
      .filter((c): c is Record<string, unknown> => !isNil(c))
      .flatMap((c) => {
        const creator = c.Creator as Record<string, unknown> | undefined;
        const roles = (c.Roles as unknown[]) ?? [];
        return roles
          .filter((r): r is Record<string, unknown> => !isNil(r))
          .map((r) =>
            makeCreditItem(
              s,
              safeString(creator?.name) ?? "",
              safeString(r.name ?? r) ?? "",
              safeString(creator?.id) ?? undefined,
            ),
          );
      }),
    reprints: ((raw.Reprints as unknown[]) ?? [])
      .filter((r) => !isNil(r))
      .map((r) => makeArrayItem(s, r, safeString(r) ?? "")),
    urls: ((raw.URLs as unknown[]) ?? [])
      .filter((u) => !isNil(u))
      .map((u) => makeArrayItem(s, u, safeString(u) ?? "")),
  };
}

/**
 * Extract canonical fields from a ComicInfo.xml payload.
 * Values are xml2js-parsed with `normalizeTags` (each key wraps its value in a single-element array).
 * Genre is a comma-separated string; the web URL maps to `urls`.
 */
function fromComicInfo(ci: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "comicInfo";
  const webUrl = xmlVal(ci, "web");
  const genreItems: ArrayItem[] = (xmlVal(ci, "genre") ?? "")
    .split(",")
    .map((g) => g.trim())
    .filter(Boolean)
    .map((g) => makeArrayItem(s, g, g));

  return {
    series: makeScalarCandidate(s, xmlVal(ci, "series")),
    issueNumber: makeScalarCandidate(s, xmlVal(ci, "number")),
    publisher: makeScalarCandidate(s, xmlVal(ci, "publisher")),
    description: makeScalarCandidate(s, xmlVal(ci, "summary")),
    notes: makeScalarCandidate(s, xmlVal(ci, "notes")),
    pageCount: makeScalarCandidate(s, xmlVal(ci, "pagecount")),
    language: makeScalarCandidate(s, xmlVal(ci, "languageiso")),
    urls: webUrl ? [makeArrayItem(s, webUrl, webUrl)] : [],
    genres: genreItems,
  };
}

/** GCD free-text credit fields: field key → role name. */
const GCD_CREDIT_FIELDS: Array<{ key: string; role: string }> = [
  { key: "script", role: "Writer" },
  { key: "pencils", role: "Penciller" },
  { key: "inks", role: "Inker" },
  { key: "colors", role: "Colorist" },
  { key: "letters", role: "Letterer" },
  { key: "editing", role: "Editor" },
];

/** Split a GCD free-text credit string (semicolon-separated; strips bracketed annotations). */
function splitGCDCreditString(raw: string): string[] {
  return raw
    .split(/;/)
    .map((name) => name.replace(/\[.*?\]/g, "").trim())
    .filter(Boolean);
}
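As a quick sanity check, the splitter behaves like this on a typical GCD credit string (a standalone sketch; the sample credit string is hypothetical):

```typescript
// Standalone copy of splitGCDCreditString, for illustration only.
function splitGCDCreditString(raw: string): string[] {
  return raw
    .split(/;/)
    .map((name) => name.replace(/\[.*?\]/g, "").trim())
    .filter(Boolean);
}

// GCD stores credits as semicolon-separated names, with optional
// bracketed annotations that are stripped.
console.log(splitGCDCreditString("Stan Lee; Jack Kirby [layouts]"));
// → ["Stan Lee", "Jack Kirby"]
```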

/** Parse a GCD price string like "0.10 USD" or "10p". Returns null on failure. */
function parseGCDPrice(
  raw: string,
): { amount: number; currency: string } | null {
  const match = raw.trim().match(/^([\d.,]+)\s*([A-Z]{2,3}|p|¢|€|£|\$)?/);
  if (!match) return null;
  const amount = parseFloat(match[1].replace(",", "."));
  const currency = match[2] ?? "USD";
  if (isNaN(amount)) return null;
  return { amount, currency };
}
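The parser defaults to USD when no currency symbol is present and keeps shorthand symbols like "p" (pence) unnormalised. A standalone sketch:

```typescript
// Standalone copy of parseGCDPrice, for illustration only.
function parseGCDPrice(raw: string): { amount: number; currency: string } | null {
  const match = raw.trim().match(/^([\d.,]+)\s*([A-Z]{2,3}|p|¢|€|£|\$)?/);
  if (!match) return null;
  const amount = parseFloat(match[1].replace(",", "."));
  const currency = match[2] ?? "USD";
  if (isNaN(amount)) return null;
  return { amount, currency };
}

console.log(parseGCDPrice("0.10 USD")); // → { amount: 0.1, currency: "USD" }
console.log(parseGCDPrice("10p"));      // → { amount: 10, currency: "p" }
console.log(parseGCDPrice("free"));     // → null
```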

function fromGCD(raw: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "gcd";
  const series = raw.series as Record<string, unknown> | undefined;
  const language = series?.language as Record<string, unknown> | undefined;
  const publisher = series?.publisher as Record<string, unknown> | undefined;
  const indiciaPublisher = raw.indicia_publisher as
    | Record<string, unknown>
    | undefined;
  const stories = (raw.stories as Record<string, unknown>[]) ?? [];
  const primaryStory = stories[0] ?? {};

  const creditItems: CreditItem[] = [];
  if (raw.editing) {
    splitGCDCreditString(String(raw.editing)).forEach((name) =>
      creditItems.push(makeCreditItem(s, name, "Editor")),
    );
  }
  GCD_CREDIT_FIELDS.forEach(({ key, role }) => {
    const val = safeString(primaryStory[key]);
    if (!val) return;
    splitGCDCreditString(val).forEach((name) =>
      creditItems.push(makeCreditItem(s, name, role)),
    );
  });

  const genreItems: ArrayItem[] = (safeString(primaryStory.genre) ?? "")
    .split(",")
    .map((g) => g.trim())
    .filter(Boolean)
    .map((g) => makeArrayItem(s, g, g));

  const characterItems: ArrayItem[] = (
    safeString(primaryStory.characters) ?? ""
  )
    .split(/[;,]/)
    .map((c) => c.trim())
    .filter(Boolean)
    .map((c) => makeArrayItem(s, c, c));

  const storyTitles: ArrayItem[] = stories
    .map((st) => safeString(st.title))
    .filter((t): t is string => Boolean(t))
    .map((t) => makeArrayItem(s, t, t));

  const priceItems: ArrayItem[] = [];
  const priceStr = safeString(raw.price);
  if (priceStr) {
    const parsed = parseGCDPrice(priceStr);
    if (parsed) {
      priceItems.push(makeArrayItem(s, { ...parsed, country: "US" }, priceStr));
    }
  }

  return {
    series: makeScalarCandidate(s, series?.name),
    issueNumber: makeScalarCandidate(s, raw.number),
    title: makeScalarCandidate(s, raw.title ?? primaryStory.title),
    volume: makeScalarCandidate(s, raw.volume),
    // Prefer indicia publisher (as-printed) over series publisher
    publisher: makeScalarCandidate(s, indiciaPublisher?.name ?? publisher?.name),
    coverDate: makeScalarCandidate(s, raw.publication_date),
    storeDate: makeScalarCandidate(s, raw.on_sale_date ?? raw.key_date),
    pageCount: makeScalarCandidate(s, raw.page_count),
    notes: makeScalarCandidate(s, raw.notes),
    language: makeScalarCandidate(s, language?.code),
    ageRating: makeScalarCandidate(s, raw.rating),
    genres: genreItems,
    characters: characterItems,
    stories: storyTitles,
    creators: creditItems,
    prices: priceItems,
  };
}

function fromLocg(locg: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "locg";
  return {
    title: makeScalarCandidate(s, locg.name),
    publisher: makeScalarCandidate(s, locg.publisher),
    description: makeScalarCandidate(s, locg.description),
    coverImage: makeScalarCandidate(s, locg.cover),
    communityRating: makeScalarCandidate(s, locg.rating),
    publicationDate: makeScalarCandidate(s, locg.publicationDate),
  };
}

function fromInferred(inf: RawInferredMetadata["issue"]): AdapterResult {
  if (!inf) return {};
  const s: SourceKey = "inferredMetadata";
  return {
    title: makeScalarCandidate(s, inf.name),
    issueNumber: makeScalarCandidate(s, inf.number),
    volume: makeScalarCandidate(s, inf.year),
  };
}

// ── State building ─────────────────────────────────────────────────────────────

/**
 * Merge all adapter results directly into a `ReconcilerState`.
 * Array and credit items are deduplicated by `itemKey` using a Set (O(n)).
 * Scalar conflicts are auto-resolved when all sources agree on the same value.
 */
function buildState(
  sources: Partial<Record<SourceKey, AdapterResult>>,
): ReconcilerState {
  const state: ReconcilerState = {};
  const scalarMap: Record<string, ScalarCandidate[]> = {};

  for (const adapterResult of Object.values(sources)) {
    if (!adapterResult) continue;
    for (const [field, value] of Object.entries(adapterResult)) {
      if (!value) continue;

      if (Array.isArray(value)) {
        // Presence of `role` distinguishes CreditItem[] from ArrayItem[].
        const isCredits = value.length > 0 && "role" in value[0];
        if (isCredits) {
          const prev = state[field];
          const existing: CreditItem[] =
            prev?.kind === "credits" ? prev.items : [];
          const seen = new Set(existing.map((i) => i.itemKey));
          const merged = [...existing];
          for (const item of value as CreditItem[]) {
            if (!seen.has(item.itemKey)) {
              seen.add(item.itemKey);
              merged.push(item);
            }
          }
          state[field] = { kind: "credits", items: merged };
        } else {
          const prev = state[field];
          const existing: ArrayItem[] =
            prev?.kind === "array" ? prev.items : [];
          const seen = new Set(existing.map((i) => i.itemKey));
          const merged = [...existing];
          for (const item of value as ArrayItem[]) {
            if (!seen.has(item.itemKey)) {
              seen.add(item.itemKey);
              merged.push(item);
            }
          }
          state[field] = { kind: "array", items: merged };
        }
      } else {
        (scalarMap[field] ??= []).push(value as ScalarCandidate);
      }
    }
  }

  for (const [field, candidates] of Object.entries(scalarMap)) {
    const allAgree =
      candidates.length === 1 ||
      candidates.every((c) => c.value === candidates[0].value);
    state[field] = {
      kind: "scalar",
      candidates,
      selectedSource: allAgree ? candidates[0].source : null,
    };
  }

  return state;
}
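The Set-based dedup that `buildState` applies to array and credit items reduces to this pattern (a minimal sketch with a simplified item shape; field names are illustrative):

```typescript
interface Item { itemKey: string; source: string }

// Merge incoming items into an existing list, keeping only the first
// occurrence of each itemKey. A Set of seen keys makes this O(n).
function mergeByKey(existing: Item[], incoming: Item[]): Item[] {
  const seen = new Set(existing.map((i) => i.itemKey));
  const merged = [...existing];
  for (const item of incoming) {
    if (!seen.has(item.itemKey)) {
      seen.add(item.itemKey);
      merged.push(item);
    }
  }
  return merged;
}

// "batman" already exists, so metron's duplicate is dropped; "robin" is new.
const merged = mergeByKey(
  [{ itemKey: "batman", source: "comicvine" }],
  [
    { itemKey: "batman", source: "metron" },
    { itemKey: "robin", source: "metron" },
  ],
);
console.log(merged.map((i) => `${i.itemKey}:${i.source}`));
// → ["batman:comicvine", "robin:metron"]
```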

// ── Reducer ────────────────────────────────────────────────────────────────────

type Action =
  | { type: "SELECT_SCALAR"; field: string; source: SourceKey }
  | { type: "SET_USER_VALUE"; field: string; value: string }
  | { type: "TOGGLE_ITEM"; field: string; itemKey: string; selected: boolean }
  | { type: "SET_BASE_SOURCE"; source: SourceKey }
  | { type: "RESET"; initial: ReconcilerState };

function reducer(state: ReconcilerState, action: Action): ReconcilerState {
  switch (action.type) {
    case "SELECT_SCALAR": {
      const field = state[action.field];
      if (field?.kind !== "scalar") return state;
      return {
        ...state,
        [action.field]: {
          ...field,
          selectedSource: action.source,
          userValue: undefined,
        },
      };
    }

    case "SET_USER_VALUE": {
      const field = state[action.field];
      if (field?.kind !== "scalar") return state;
      return {
        ...state,
        [action.field]: {
          ...field,
          selectedSource: null,
          userValue: action.value,
        },
      };
    }

    case "TOGGLE_ITEM": {
      const field = state[action.field];
      if (field?.kind === "array" || field?.kind === "credits") {
        return {
          ...state,
          [action.field]: {
            ...field,
            items: field.items.map((item) =>
              item.itemKey === action.itemKey
                ? { ...item, selected: action.selected }
                : item,
            ),
          } as FieldState,
        };
      }
      return state;
    }

    case "SET_BASE_SOURCE": {
      const next = { ...state };
      for (const [field, fieldState] of Object.entries(next)) {
        if (fieldState.kind !== "scalar") continue;
        if (fieldState.candidates.some((c) => c.source === action.source)) {
          next[field] = {
            ...fieldState,
            selectedSource: action.source,
            userValue: undefined,
          };
        }
      }
      return next;
    }

    case "RESET":
      return action.initial;

    default:
      return state;
  }
}

// ── Canonical record ───────────────────────────────────────────────────────────

export interface CanonicalFieldValue {
  value: unknown;
  source: SourceKey | "user";
}

export type CanonicalRecord = Partial<Record<string, CanonicalFieldValue>>;

function deriveCanonicalRecord(state: ReconcilerState): CanonicalRecord {
  const record: CanonicalRecord = {};

  for (const [field, fieldState] of Object.entries(state)) {
    if (fieldState.kind === "scalar") {
      if (fieldState.userValue !== undefined) {
        record[field] = { value: fieldState.userValue, source: "user" };
      } else if (fieldState.selectedSource !== null) {
        const candidate = fieldState.candidates.find(
          (c) => c.source === fieldState.selectedSource,
        );
        if (candidate) {
          record[field] = { value: candidate.value, source: candidate.source };
        }
      }
    } else if (fieldState.kind === "array") {
      const selected = fieldState.items.filter((i) => i.selected);
      if (selected.length > 0) {
        const counts = selected.reduce<Record<string, number>>((acc, i) => {
          acc[i.source] = (acc[i.source] ?? 0) + 1;
          return acc;
        }, {});
        const dominant = Object.entries(counts).sort(
          ([, a], [, b]) => b - a,
        )[0][0] as SourceKey;
        record[field] = {
          value: selected.map((i) => i.rawValue),
          source: dominant,
        };
      }
    } else if (fieldState.kind === "credits") {
      const selected = fieldState.items.filter((i) => i.selected);
      if (selected.length > 0) {
        record[field] = { value: selected, source: selected[0].source };
      }
    }
  }

  return record;
}
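For array fields, `deriveCanonicalRecord` attributes the merged list to whichever source contributed the most selected items. That tally, isolated from the rest of the function, is just (a sketch with a simplified item shape):

```typescript
// Count selected items per source, then pick the most frequent source.
function dominantSource(
  items: Array<{ source: string; selected: boolean }>,
): string {
  const selected = items.filter((i) => i.selected);
  const counts = selected.reduce<Record<string, number>>((acc, i) => {
    acc[i.source] = (acc[i.source] ?? 0) + 1;
    return acc;
  }, {});
  // Sort [source, count] pairs by count descending; take the top source.
  return Object.entries(counts).sort(([, a], [, b]) => b - a)[0][0];
}

console.log(
  dominantSource([
    { source: "comicvine", selected: true },
    { source: "comicvine", selected: true },
    { source: "metron", selected: true },
    { source: "metron", selected: false }, // deselected: not counted
  ]),
); // → "comicvine"
```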

// ── Hook ───────────────────────────────────────────────────────────────────────

export interface UseReconcilerResult {
  state: ReconcilerState;
  /** Number of scalar fields with a conflict that has no selection yet. */
  unresolvedCount: number;
  /** True if any field has candidates from more than one source. */
  hasConflicts: boolean;
  canonicalRecord: CanonicalRecord;
  selectScalar: (field: string, source: SourceKey) => void;
  /** Override a scalar field with a user-typed value. */
  setUserValue: (field: string, value: string) => void;
  toggleItem: (field: string, itemKey: string, selected: boolean) => void;
  /** Adopt all available fields from a single source. */
  setBaseSource: (source: SourceKey) => void;
  reset: () => void;
}

export function useReconciler(
  sourcedMetadata: RawSourcedMetadata,
  inferredMetadata?: RawInferredMetadata,
): UseReconcilerResult {
  const initial = useMemo(() => {
    const adapters: Partial<Record<SourceKey, AdapterResult>> = {};

    if (!isEmpty(sourcedMetadata.comicvine)) {
      adapters.comicvine = fromComicVine(
        sourcedMetadata.comicvine as Record<string, unknown>,
      );
    }
    const metron = ensureParsed(sourcedMetadata.metron);
    if (metron) adapters.metron = fromMetron(metron);

    const gcd = ensureParsed(sourcedMetadata.gcd);
    if (gcd) adapters.gcd = fromGCD(gcd);

    if (!isEmpty(sourcedMetadata.locg)) {
      adapters.locg = fromLocg(
        sourcedMetadata.locg as Record<string, unknown>,
      );
    }
    const ci = ensureParsed(sourcedMetadata.comicInfo);
    if (ci) adapters.comicInfo = fromComicInfo(ci);

    if (inferredMetadata?.issue) {
      adapters.inferredMetadata = fromInferred(inferredMetadata.issue);
    }

    return buildState(adapters);
  }, [sourcedMetadata, inferredMetadata]);

  const [state, dispatch] = useReducer(reducer, initial);

  const unresolvedCount = useMemo(
    () =>
      Object.values(state).filter(
        (f) =>
          f.kind === "scalar" &&
          f.selectedSource === null &&
          f.userValue === undefined &&
          f.candidates.length > 1,
      ).length,
    [state],
  );

  const hasConflicts = useMemo(
    () =>
      Object.values(state).some(
        (f) =>
          (f.kind === "scalar" && f.candidates.length > 1) ||
          ((f.kind === "array" || f.kind === "credits") &&
            new Set(
              (f.items as Array<ArrayItem | CreditItem>).map((i) => i.source),
            ).size > 1),
      ),
    [state],
  );

  const canonicalRecord = useMemo(() => deriveCanonicalRecord(state), [state]);

  return {
    state,
    unresolvedCount,
    hasConflicts,
    canonicalRecord,
    selectScalar: (field, source) =>
      dispatch({ type: "SELECT_SCALAR", field, source }),
    setUserValue: (field, value) =>
      dispatch({ type: "SET_USER_VALUE", field, value }),
    toggleItem: (field, itemKey, selected) =>
      dispatch({ type: "TOGGLE_ITEM", field, itemKey, selected }),
    setBaseSource: (source) =>
      dispatch({ type: "SET_BASE_SOURCE", source }),
    reset: () => dispatch({ type: "RESET", initial }),
  };
}
@@ -2,11 +2,48 @@ import React from "react";
 import dayjs from "dayjs";
 import prettyBytes from "pretty-bytes";

-export const TorrentDownloads = (props) => {
+interface TorrentInfo {
+  name: string;
+  hash: string;
+  added_on: number;
+  progress: number;
+  downloaded: number;
+  uploaded: number;
+  trackers_count: number;
+  total_size: number;
+}
+
+interface TorrentData {
+  torrent?: TorrentInfo;
+  // Support direct TorrentDetails format from socket events
+  infoHash?: string;
+  downloadSpeed?: number;
+  uploadSpeed?: number;
+  name?: string;
+}
+
+export interface TorrentDownloadsProps {
+  data: TorrentData[];
+}
+
+export type { TorrentData };
+
+export const TorrentDownloads = (props: TorrentDownloadsProps) => {
   const { data } = props;
   return (
     <>
-      {data.map(({ torrent }) => {
+      {data.map((item: TorrentData, index: number) => {
+        // Support both wrapped format (item.torrent) and direct format
+        const torrent: TorrentInfo = item.torrent || {
+          name: item.name || 'Unknown',
+          hash: item.infoHash || '',
+          added_on: 0,
+          progress: (item as any).progress || 0,
+          downloaded: 0,
+          uploaded: 0,
+          trackers_count: 0,
+          total_size: 0,
+        };
         return (
           <dl className="mt-5 dark:text-slate-200 text-slate-600">
             <dt className="text-lg">{torrent.name}</dt>
@@ -10,7 +10,31 @@ import { isEmpty, isNil } from "lodash";
 import ellipsize from "ellipsize";
 import prettyBytes from "pretty-bytes";

-export const TorrentSearchPanel = (props) => {
+interface TorrentSearchPanelProps {
+  issueName: string;
+  comicObjectId: string;
+}
+
+interface SearchFormValues {
+  issueName: string;
+}
+
+interface TorrentResult {
+  fileName: string;
+  seeders: number;
+  leechers: number;
+  size: number;
+  files: number;
+  indexer: string;
+  downloadUrl: string;
+}
+
+interface TorrentDownloadPayload {
+  comicObjectId: string;
+  torrentToDownload: string;
+}
+
+export const TorrentSearchPanel = (props: TorrentSearchPanelProps) => {
   const { issueName, comicObjectId } = props;
   // Initialize searchTerm with issueName from props
   const [searchTerm, setSearchTerm] = useState({ issueName });
@@ -40,19 +64,19 @@ export const TorrentSearchPanel = (props) => {
     enabled: !isNil(searchTerm.issueName) && searchTerm.issueName.trim() !== "", // Make sure searchTerm is not empty
   });
   const mutation = useMutation({
-    mutationFn: async (newTorrent) =>
+    mutationFn: async (newTorrent: TorrentDownloadPayload) =>
       axios.post(`${QBITTORRENT_SERVICE_BASE_URI}/addTorrent`, newTorrent),
-    onSuccess: async (data) => {
+    onSuccess: async () => {
       // Torrent added successfully
     },
   });
-  const searchIndexer = (values) => {
+  const searchIndexer = (values: SearchFormValues) => {
     setSearchTerm({ issueName: values.issueName }); // Update searchTerm based on the form submission
   };
-  const downloadTorrent = (evt) => {
-    const newTorrent = {
+  const downloadTorrent = (downloadUrl: string) => {
+    const newTorrent: TorrentDownloadPayload = {
       comicObjectId,
-      torrentToDownload: evt,
+      torrentToDownload: downloadUrl,
     };
     mutation.mutate(newTorrent);
   };
@@ -125,7 +149,7 @@ export const TorrentSearchPanel = (props) => {
             </tr>
           </thead>
           <tbody className="divide-y divide-slate-100 dark:divide-gray-500">
-            {data?.data.map((result, idx) => (
+            {data?.data.map((result: TorrentResult, idx: number) => (
               <tr key={idx}>
                 <td className="px-3 py-3 text-gray-700 dark:text-slate-300 text-md">
                   <p>{ellipsize(result.fileName, 90)}</p>
@@ -1,43 +1,23 @@
import React, { lazy } from "react";
import { isNil, isEmpty } from "lodash";
import type { TabConfig, TabConfigParams } from "../../types";

const VolumeInformation = lazy(() => import("./Tabs/VolumeInformation").then(m => ({ default: m.VolumeInformation })));
const ComicInfoXML = lazy(() => import("./Tabs/ComicInfoXML").then(m => ({ default: m.ComicInfoXML })));
const ArchiveOperations = lazy(() => import("./Tabs/ArchiveOperations").then(m => ({ default: m.ArchiveOperations })));
const AcquisitionPanel = lazy(() => import("./AcquisitionPanel"));
const TorrentSearchPanel = lazy(() => import("./TorrentSearchPanel"));
const DownloadsPanel = lazy(() => import("./DownloadsPanel"));

interface TabConfig {
id: number;
name: string;
icon: React.ReactElement;
content: React.ReactElement | null;
shouldShow: boolean;
}

interface TabConfigParams {
data: any;
comicInfo: any;
isComicBookMetadataAvailable: boolean;
areRawFileDetailsAvailable: boolean;
airDCPPQuery: any;
comicObjectId: string;
userSettings: any;
issueName: string;
acquisition?: any;
}

export const createTabConfig = ({
data,
comicInfo,
isComicBookMetadataAvailable,
hasAnyMetadata,
areRawFileDetailsAvailable,
airDCPPQuery,
comicObjectId,
userSettings,
issueName,
acquisition,
onReconcileMetadata,
}: TabConfigParams): TabConfig[] => {
return [
{
@@ -46,23 +26,10 @@ export const createTabConfig = ({
icon: (
<i className="h-5 w-5 icon-[solar--book-2-bold] text-slate-500 dark:text-slate-300"></i>
),
content: isComicBookMetadataAvailable ? (
<VolumeInformation data={data} key={1} />
content: hasAnyMetadata ? (
<VolumeInformation data={data} onReconcile={onReconcileMetadata} />
) : null,
shouldShow: isComicBookMetadataAvailable,
},
{
id: 2,
name: "ComicInfo.xml",
icon: (
<i className="h-5 w-5 icon-[solar--code-file-bold-duotone] text-slate-500 dark:text-slate-300" />
),
content: (
<div key={2}>
{!isNil(comicInfo) && <ComicInfoXML json={comicInfo} />}
</div>
),
shouldShow: !isEmpty(comicInfo),
shouldShow: hasAnyMetadata,
},
{
id: 3,
@@ -70,7 +37,7 @@ export const createTabConfig = ({
<i className="h-5 w-5 icon-[solar--winrar-bold-duotone] text-slate-500 dark:text-slate-300" />
),
name: "Archive Operations",
content: <ArchiveOperations data={data} key={3} />,
content: <ArchiveOperations data={data} />,
shouldShow: areRawFileDetailsAvailable,
},
{
@@ -85,7 +52,6 @@ export const createTabConfig = ({
comicObjectId={comicObjectId}
comicObject={data}
settings={userSettings}
key={4}
/>
),
shouldShow: true,
@@ -112,7 +78,7 @@ export const createTabConfig = ({
),
content:
!isNil(data) && !isEmpty(data) ? (
<DownloadsPanel key={5} />
<DownloadsPanel />
) : (
<div className="column is-three-fifths">
<article className="message is-info">

@@ -56,7 +56,7 @@ export const Dashboard = (): ReactElement => {

return (
<>
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<div className="mx-auto max-w-7xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<PullList />
{recentComics.length > 0 && <RecentlyImported comics={recentComics} />}
{/* Wanted comics */}

@@ -1,105 +1,97 @@
import React, { ReactElement } from "react";
import { isEmpty, isUndefined, map } from "lodash";
import Header from "../shared/Header";
import { GetLibraryStatisticsQuery } from "../../graphql/generated";
import { GetLibraryStatisticsQuery, DirectorySize } from "../../graphql/generated";
import type { LibraryStatisticsProps } from "../../types";

type LibraryStatisticsProps = {
stats: GetLibraryStatisticsQuery['getLibraryStatistics'];
};
/**
* Displays a snapshot of library metrics: total comic files, tagging coverage,
* file-type breakdown, and the publisher with the most issues.
*
* Returns `null` when `stats` is absent or the statistics array is empty.
*/
export const LibraryStatistics = ({ stats }: LibraryStatisticsProps): ReactElement | null => {
if (!stats || !stats.totalDocuments) return null;

const facet = stats.statistics?.[0];
if (!facet) return null;

const { issues, issuesWithComicInfoXML, fileTypes, publisherWithMostComicsInLibrary } = facet;
const topPublisher = publisherWithMostComicsInLibrary?.[0];

export const LibraryStatistics = (
props: LibraryStatisticsProps,
): ReactElement => {
const { stats } = props;
return (
<div className="mt-5">
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Your Library In Numbers"
subHeaderContent={
<span className="text-md">A brief snapshot of your library.</span>
}
subHeaderContent={<span className="text-md">A brief snapshot of your library.</span>}
iconClassNames="fa-solid fa-binoculars mr-2"
/>

<div className="mt-3">
<div className="flex flex-row gap-5">
<div className="flex flex-col rounded-lg bg-green-100 dark:bg-green-200 px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Library size</dt>
<dd className="text-3xl text-green-600 md:text-5xl">
{props.stats.totalDocuments} files
</dd>
{props.stats.comicDirectorySize?.fileCount && (
<dd>
<span className="text-2xl text-green-600">
{props.stats.comicDirectorySize.fileCount} comic files
</span>
</dd>
)}
</div>
{/* comicinfo and comicvine tagged issues */}
<div className="flex flex-col gap-4">
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics?.[0]?.issues) && (
<div className="flex flex-col h-fit rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center">
<span className="text-xl">
{props.stats.statistics?.[0]?.issues?.length || 0}
</span>{" "}
tagged with ComicVine
</div>
)}
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics?.[0]?.issuesWithComicInfoXML) && (
<div className="flex flex-col h-fit rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center">
<span className="text-xl">
{props.stats.statistics?.[0]?.issuesWithComicInfoXML?.length || 0}
</span>{" "}
<span className="tag is-warning has-text-weight-bold mr-2 ml-1">
with ComicInfo.xml
</span>
</div>
)}
</div>

<div className="">
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics?.[0]?.fileTypes) &&
map(props.stats.statistics?.[0]?.fileTypes, (fileType, idx) => {
return (
<span
key={idx}
className="flex flex-col mb-4 h-fit text-xl rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center"
>
{fileType.data.length} {fileType.id}
</span>
);
})}
</div>

{/* file types */}
<div className="flex flex-col h-fit text-lg rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3">
{/* publisher with most issues */}
{!isUndefined(props.stats.statistics) &&
!isEmpty(
props.stats.statistics?.[0]?.publisherWithMostComicsInLibrary?.[0],
) && (
<>
<span className="">
{
props.stats.statistics?.[0]
?.publisherWithMostComicsInLibrary?.[0]?.id
}
</span>
{" has the most issues "}
<span className="">
{
props.stats.statistics?.[0]
?.publisherWithMostComicsInLibrary?.[0]?.count
}
</span>
</>
)}
</div>
<div className="mt-3 flex flex-row gap-5">
{/* Total records in database */}
<div className="flex flex-col rounded-lg bg-card-info px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">In database</dt>
<dd className="text-3xl text-gray-700 md:text-5xl">
{stats.totalDocuments} comics
</dd>
</div>

{/* Missing files */}
<div className="flex flex-col rounded-lg bg-card-missing px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Missing files</dt>
<dd className="text-3xl text-red-600 md:text-5xl">
{stats.comicsMissingFiles}
</dd>
</div>

{/* Disk space consumed */}
{stats.comicDirectorySize.totalSizeInGB != null && (
<div className="flex flex-col rounded-lg bg-card-info px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Size on disk</dt>
<dd className="text-3xl text-gray-700 md:text-5xl">
{stats.comicDirectorySize.totalSizeInGB.toFixed(2)} GB
</dd>
</div>
)}

{/* Tagging coverage */}
<div className="flex flex-col gap-4">
{issues && issues.length > 0 && (
<div className="flex flex-col h-fit rounded-lg bg-card-info px-4 py-3 text-center">
<span className="text-xl text-gray-700">{issues.length}</span>
tagged with ComicVine
</div>
)}
{issuesWithComicInfoXML && issuesWithComicInfoXML.length > 0 && (
<div className="flex flex-col h-fit rounded-lg bg-card-info px-4 py-3 text-center">
<span className="text-xl text-gray-700">{issuesWithComicInfoXML.length}</span>
with ComicInfo.xml
</div>
)}
</div>

{/* File-type breakdown */}
{fileTypes && fileTypes.length > 0 && (
<div>
{fileTypes.map((ft) => (
<span
key={ft.id}
className="flex flex-col mb-4 h-fit text-xl rounded-lg bg-card-info px-4 py-3 text-center text-gray-700"
>
{ft.data.length} {ft.id}
</span>
))}
</div>
)}

{/* Publisher with most issues */}
{topPublisher && (
<div className="flex flex-col h-fit text-lg rounded-lg bg-card-info px-4 py-3 text-gray-700">
<span>{topPublisher.id}</span>
{" has the most issues "}
<span>{topPublisher.count}</span>
</div>
)}
</div>
</div>
);

@@ -12,10 +12,7 @@ import { Form } from "react-final-form";
import DatePickerDialog from "../shared/DatePicker";
import { format } from "date-fns";
import { LocgMetadata, useGetWeeklyPullListQuery } from "../../graphql/generated";

interface PullListProps {
issues?: LocgMetadata[];
}
import type { PullListProps } from "../../types";

export const PullList = (): ReactElement => {
const queryClient = useQueryClient();
@@ -92,6 +89,7 @@ export const PullList = (): ReactElement => {

return (
<>
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Discover"
subHeaderContent={
@@ -135,7 +133,7 @@ export const PullList = (): ReactElement => {
/>
</div>
</div>
<div className="w-lvw -mr-4 sm:-mr-6 lg:-mr-8">
<div className="mr-[calc(-1*(1rem+max(0px,(100vw-80rem)/2)))] sm:mr-[calc(-1*(1.5rem+max(0px,(100vw-80rem)/2)))] lg:mr-[calc(-1*(2rem+max(0px,(100vw-80rem)/2)))]">
{isSuccess && !isLoading && (
<div className="overflow-hidden" ref={emblaRef}>
<div className="flex">

@@ -27,6 +27,7 @@ export const RecentlyImported = (

return (
<div>
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Recently Imported"
subHeaderContent="Recent Library activity such as imports, tagging, etc."
@@ -42,6 +43,7 @@ export const RecentlyImported = (
sourcedMetadata,
canonicalMetadata,
inferredMetadata,
importStatus,
} = comic;

// Parse sourced metadata (GraphQL returns as strings)
@@ -63,7 +65,10 @@ export const RecentlyImported = (
!isUndefined(comicvine) &&
!isUndefined(comicvine.volumeInformation);
const hasComicInfo = !isNil(comicInfo) && !isEmpty(comicInfo);
const cardState = (hasComicInfo || isComicVineMetadataAvailable) ? "scraped" : "imported";
const isMissingFile = importStatus?.isRawFileMissing === true;
const cardState = isMissingFile
? "missing"
: (hasComicInfo || isComicVineMetadataAvailable) ? "scraped" : "imported";
return (
<div
key={idx}
@@ -127,12 +132,6 @@ export const RecentlyImported = (
</span>
)}
</div>
{/* Raw file presence */}
{isNil(rawFileDetails) && (
<span className="h-6 w-5 sm:shrink-0 sm:items-center sm:gap-2">
<i className="icon-[solar--file-corrupted-outline] h-5 w-5" />
</span>
)}
</div>
</Card>
</div>

@@ -11,9 +11,11 @@ type VolumeGroupsProps = {
volumeGroups?: GetVolumeGroupsQuery['getComicBookGroups'];
};

export const VolumeGroups = (props: VolumeGroupsProps): ReactElement => {
export const VolumeGroups = (props: VolumeGroupsProps): ReactElement | null => {
// Till mongo gives us back the deduplicated results with the ObjectId
const deduplicatedGroups = unionBy(props.volumeGroups, "volumes.id");
if (!deduplicatedGroups || deduplicatedGroups.length === 0) return null;

const navigate = useNavigate();
const navigateToVolumes = (row: any) => {
navigate(`/volumes/all`);
@@ -29,6 +31,7 @@ export const VolumeGroups = (props: VolumeGroupsProps): ReactElement => {

return (
<div>
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Volumes"
subHeaderContent={<>Based on ComicVine Volume information</>}

@@ -15,7 +15,9 @@ type WantedComicsListProps = {

export const WantedComicsList = ({
comics,
}: WantedComicsListProps): ReactElement => {
}: WantedComicsListProps): ReactElement | null => {
if (!comics || comics.length === 0) return null;

const navigate = useNavigate();

// embla carousel
@@ -28,6 +30,7 @@ export const WantedComicsList = ({

return (
<div>
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Wanted Comics"
subHeaderContent={<>Comics marked as wanted from various sources</>}

@@ -1,9 +1,6 @@
import * as React from "react";
import type { ZeroStateProps } from "../../types";

interface ZeroStateProps {
header: string;
message: string;
}
const ZeroState: React.FunctionComponent<ZeroStateProps> = (props) => {
return (
<article className="">

@@ -1,62 +1,49 @@
import React, { ReactElement, useEffect, useState } from "react";
import { getTransfers } from "../../actions/airdcpp.actions";
import { isEmpty, isNil, isUndefined } from "lodash";
import { isEmpty, isNil } from "lodash";
import { determineCoverFile } from "../../shared/utils/metadata.utils";
import MetadataPanel from "../shared/MetadataPanel";
import type { DownloadsProps } from "../../types";
import { useStore } from "../../store";

interface IDownloadsProps {
data: any;
interface BundleData {
rawFileDetails?: Record<string, unknown>;
inferredMetadata?: Record<string, unknown>;
acquisition?: {
directconnect?: {
downloads?: Array<{
name: string;
size: number;
type: { str: string };
bundleId: string;
}>;
};
};
sourcedMetadata?: {
locg?: unknown;
comicvine?: unknown;
};
issueName?: string;
url?: string;
}

export const Downloads = (props: IDownloadsProps): ReactElement => {
// const airDCPPConfiguration = useContext(AirDCPPSocketContext);
const {
airDCPPState: { settings, socket },
} = airDCPPConfiguration;
// const dispatch = useDispatch();

// const airDCPPTransfers = useSelector(
// (state: RootState) => state.airdcpp.transfers,
// );
// const issueBundles = useSelector(
// (state: RootState) => state.airdcpp.issue_bundles,
// );
const [bundles, setBundles] = useState([]);
// Make the call to get all transfers from AirDC++
export const Downloads = (_props: DownloadsProps): ReactElement => {
// Using Zustand store for socket management
const getSocket = useStore((state) => state.getSocket);

const [bundles, setBundles] = useState<BundleData[]>([]);
const [isLoading, setIsLoading] = useState(true);

// Initialize socket connection and load data
useEffect(() => {
if (!isUndefined(socket) && !isEmpty(settings)) {
dispatch(
getTransfers(socket, {
username: `${settings.directConnect.client.host.username}`,
password: `${settings.directConnect.client.host.password}`,
}),
);
const socket = getSocket();
if (socket) {
// Socket is connected, we could fetch transfers here
// For now, just set loading to false since we don't have direct access to Redux state
setIsLoading(false);
}
}, [socket]);
}, [getSocket]);

useEffect(() => {
if (!isUndefined(issueBundles)) {
const foo = issueBundles.data.map((bundle) => {
const {
rawFileDetails,
inferredMetadata,
acquisition: {
directconnect: { downloads },
},
sourcedMetadata: { locg, comicvine },
} = bundle;
const { issueName, url } = determineCoverFile({
rawFileDetails,
comicvine,
locg,
});
return { ...bundle, issueName, url };
});
setBundles(foo);
}
}, [issueBundles]);

return !isNil(bundles) ? (
return !isNil(bundles) && bundles.length > 0 ? (
<div className="container mx-auto px-4 sm:px-6 lg:px-8">
<section className="section">
<h1 className="title">Downloads</h1>
@@ -87,16 +74,16 @@ export const Downloads = (props: IDownloadsProps): ReactElement => {
</tr>
</thead>
<tbody>
{bundle.acquisition.directconnect.downloads.map(
(bundle, idx) => {
{bundle.acquisition?.directconnect?.downloads?.map(
(download, idx: number) => {
return (
<tr key={idx}>
<td>{bundle.name}</td>
<td>{bundle.size}</td>
<td>{bundle.type.str}</td>
<td>{download.name}</td>
<td>{download.size}</td>
<td>{download.type.str}</td>
<td>
<span className="tag is-warning">
{bundle.bundleId}
{download.bundleId}
</span>
</td>
</tr>

@@ -1,40 +1,28 @@
import { debounce, isEmpty, map } from "lodash";
import React, { ReactElement, useCallback, useState } from "react";
import { useDispatch, useSelector } from "react-redux";
import axios from "axios";
import Card from "../shared/Carda";

import { searchIssue } from "../../actions/fileops.actions";
import MetadataPanel from "../shared/MetadataPanel";
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { GlobalSearchBarProps } from "../../types";

interface ISearchBarProps {
data: any;
}

export const SearchBar = (data: ISearchBarProps): ReactElement => {
const dispatch = useDispatch();
const searchResults = useSelector(
(state: RootState) => state.fileOps.librarySearchResultsFormatted,
);
export const SearchBar = (data: GlobalSearchBarProps): ReactElement => {
const [searchResults, setSearchResults] = useState<Record<string, unknown>[]>([]);

const performSearch = useCallback(
debounce((e) => {
dispatch(
searchIssue(
{
query: {
volumeName: e.target.value,
},
},
{
pagination: {
size: 25,
from: 0,
},
type: "volumeName",
trigger: "globalSearchBar",
},
),
);
debounce(async (e) => {
const response = await axios({
url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
method: "POST",
data: {
query: { volumeName: e.target.value },
pagination: { size: 25, from: 0 },
type: "volumeName",
trigger: "globalSearchBar",
},
});
setSearchResults(response.data?.hits ?? []);
}, 500),
[data],
);
@@ -47,6 +35,7 @@ export const SearchBar = (data: ISearchBarProps): ReactElement => {
onChange={(e) => performSearch(e)}
/>

{/* TODO: Switch to Solar icon */}
<span className="icon is-right mt-2">
<i className="fa-solid fa-magnifying-glass"></i>
</span>

@@ -490,4 +490,188 @@ describe('Import Component - Real-time Updates', () => {
});
});

describe('Import Component - Directory Status', () => {
beforeEach(() => {
jest.clearAllMocks();
(axios as any).mockResolvedValue({ data: [] });
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
// Mock successful directory status by default
(axios.get as jest.Mock) = jest.fn().mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: true } }
});
});

test('should show warning banner when comics directory is missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('comics')).toBeInTheDocument();
});

test('should show warning banner when userdata directory is missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: false } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('userdata')).toBeInTheDocument();
});

test('should show warning banner when both directories are missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: false } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
});
expect(screen.getByText('comics')).toBeInTheDocument();
expect(screen.getByText('userdata')).toBeInTheDocument();
});

test('should disable import button when directories are missing', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: false }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
const button = screen.getByRole('button', { name: /Force Re-Import/i });
expect(button).toBeDisabled();
});
});

test('should enable import button when all directories exist', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

await waitFor(() => {
const button = screen.getByRole('button', { name: /Force Re-Import/i });
expect(button).not.toBeDisabled();
});
});

test('should not show warning banner when all directories exist', async () => {
(axios.get as jest.Mock).mockResolvedValue({
data: { comics: { exists: true }, userdata: { exists: true } }
});

const { useStore } = require('../../store');
useStore.mockImplementation((selector: any) =>
selector({
importJobQueue: {
status: 'drained',
successfulJobCount: 0,
failedJobCount: 0,
mostRecentImport: '',
setStatus: mockSetStatus,
},
getSocket: mockGetSocket,
disconnectSocket: mockDisconnectSocket,
})
);

render(<Import />, { wrapper: createWrapper() });

// Wait for the component to finish loading
await waitFor(() => {
expect(screen.getByRole('button', { name: /Force Re-Import/i })).toBeInTheDocument();
});

// The warning banner should not be present
expect(screen.queryByText('Required Directories Missing')).not.toBeInTheDocument();
});
});

export {};

@@ -1,63 +1,74 @@
import React, { ReactElement, useCallback, useEffect, useState } from "react";
import { format } from "date-fns";
import { isEmpty, isNil, isUndefined } from "lodash";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
/**
* @fileoverview Import page component for managing comic library imports.
* Provides UI for starting imports, monitoring progress, viewing history,
* and handling directory configuration issues.
* @module components/Import/Import
*/

import { ReactElement, useEffect, useRef, useState } from "react";
import { isEmpty } from "lodash";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import axios from "axios";
import {
useGetJobResultStatisticsQuery,
useGetImportStatisticsQuery,
useStartIncrementalImportMutation
} from "../../graphql/generated";
import { useGetJobResultStatisticsQuery } from "../../graphql/generated";
import { RealTimeImportStats } from "./RealTimeImportStats";
import { PastImportsTable } from "./PastImportsTable";
import { AlertBanner } from "../shared/AlertBanner";
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";

interface ImportProps {
path: string;
}
import { SETTINGS_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { DirectoryStatus, DirectoryIssue } from "./import.types";

/**
* Import component for adding comics to the ThreeTwo library.
* Provides preview statistics, smart import, and queue management.
* Import page component for managing comic library imports.
*
* Features:
* - Real-time import progress tracking via WebSocket
* - Directory status validation before import
* - Force re-import functionality for fixing indexing issues
* - Past import history table
* - Session management for import tracking
*
* @returns {ReactElement} The import page UI
*/
export const Import = (props: ImportProps): ReactElement => {
const queryClient = useQueryClient();
const [socketReconnectTrigger, setSocketReconnectTrigger] = useState(0);
export const Import = (): ReactElement => {
const [importError, setImportError] = useState<string | null>(null);
const queryClient = useQueryClient();
const { importJobQueue, getSocket, disconnectSocket } = useStore(
useShallow((state) => ({
importJobQueue: state.importJobQueue,
getSocket: state.getSocket,
disconnectSocket: state.disconnectSocket,
})),
}))
);

const { mutate: startIncrementalImport, isPending: isStartingImport } = useStartIncrementalImportMutation({
onSuccess: (data) => {
if (data.startIncrementalImport.success) {
importJobQueue.setStatus("running");
setImportError(null);
}
},
onError: (error: any) => {
console.error("Failed to start import:", error);
setImportError(error?.message || "Failed to start import. Please try again.");
// Check if required directories exist
const {
data: directoryStatus,
isLoading: isCheckingDirectories,
isError: isDirectoryCheckError,
error: directoryError,
} = useQuery({
queryKey: ["directoryStatus"],
queryFn: async (): Promise<DirectoryStatus> => {
const response = await axios.get(
`${SETTINGS_SERVICE_BASE_URI}/getDirectoryStatus`
);
return response.data;
},
refetchOnWindowFocus: false,
staleTime: 30000,
retry: false,
});

const { mutate: initiateImport } = useMutation({
mutationFn: async () => {
const sessionId = localStorage.getItem("sessionId");
return await axios.request({
url: `http://localhost:3000/api/library/newImport`,
method: "POST",
data: { sessionId },
});
},
});
// Use isValid for quick check, issues array for detailed display
const directoryCheckFailed = isDirectoryCheckError;
const hasAllDirectories = directoryCheckFailed
? false
: (directoryStatus?.isValid ?? true);
const directoryIssues = directoryStatus?.issues ?? [];

// Force re-import mutation - re-imports all files regardless of import status
// Force re-import mutation
const { mutate: forceReImport, isPending: isForceReImporting } = useMutation({
mutationFn: async () => {
const sessionId = localStorage.getItem("sessionId") || "";
@@ -74,105 +85,64 @@ export const Import = (props: ImportProps): ReactElement => {
},
onError: (error: any) => {
console.error("Failed to start force re-import:", error);
setImportError(error?.response?.data?.message || error?.message || "Failed to start force re-import. Please try again.");
setImportError(
error?.response?.data?.message ||
error?.message ||
"Failed to start force re-import. Please try again."
);
},
});

const { data, isError, isLoading, refetch } = useGetJobResultStatisticsQuery();

// Get import statistics to determine if Start Import button should be shown
const { data: importStats } = useGetImportStatisticsQuery(
{},
{
refetchOnWindowFocus: false,
refetchInterval: false,
}
);
const { data, isLoading, refetch } = useGetJobResultStatisticsQuery();

// Use custom hook for definitive import session status tracking
// NO POLLING - relies on Socket.IO events only
const importSession = useImportSessionStatus();

const hasActiveSession = importSession.isActive;
const wasComplete = useRef(false);

// Determine if we should show the Start Import button
const hasNewFiles = importStats?.getImportStatistics?.success &&
importStats.getImportStatistics.stats &&
importStats.getImportStatistics.stats.newFiles > 0;
|
||||
// React to importSession.isComplete for state updates
|
||||
useEffect(() => {
|
||||
if (importSession.isComplete && !wasComplete.current) {
|
||||
wasComplete.current = true;
|
||||
setTimeout(() => {
|
||||
queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
|
||||
refetch();
|
||||
}, 1500);
|
||||
importJobQueue.setStatus("drained");
|
||||
} else if (!importSession.isComplete) {
|
||||
wasComplete.current = false;
|
||||
}
|
||||
}, [importSession.isComplete, refetch, importJobQueue, queryClient]);
|
||||
|
||||
// Listen to socket events to update Past Imports table
|
||||
useEffect(() => {
|
||||
const socket = getSocket("/");
|
||||
const handleQueueDrained = () => refetch();
|
||||
const handleCoverExtracted = () => refetch();
|
||||
|
||||
const handleSessionStarted = () => {
|
||||
importJobQueue.setStatus("running");
|
||||
|
||||
const handleImportCompleted = () => {
|
||||
console.log(
|
||||
"[Import] IMPORT_SESSION_COMPLETED event - refreshing Past Imports"
|
||||
);
|
||||
setTimeout(() => {
|
||||
queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
|
||||
}, 1500);
|
||||
};
|
||||
|
||||
const handleSessionCompleted = () => {
|
||||
refetch();
|
||||
importJobQueue.setStatus("drained");
|
||||
const handleQueueDrained = () => {
|
||||
console.log(
|
||||
"[Import] LS_IMPORT_QUEUE_DRAINED event - refreshing Past Imports"
|
||||
);
|
||||
setTimeout(() => {
|
||||
queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
|
||||
}, 1500);
|
||||
};
|
||||
|
||||
socket.on("IMPORT_SESSION_COMPLETED", handleImportCompleted);
|
||||
socket.on("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);
|
||||
socket.on("LS_COVER_EXTRACTED", handleCoverExtracted);
|
||||
socket.on("IMPORT_SESSION_STARTED", handleSessionStarted);
|
||||
socket.on("IMPORT_SESSION_COMPLETED", handleSessionCompleted);
|
||||
|
||||
return () => {
|
||||
socket.off("IMPORT_SESSION_COMPLETED", handleImportCompleted);
|
||||
socket.off("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);
|
||||
socket.off("LS_COVER_EXTRACTED", handleCoverExtracted);
|
||||
socket.off("IMPORT_SESSION_STARTED", handleSessionStarted);
|
||||
socket.off("IMPORT_SESSION_COMPLETED", handleSessionCompleted);
|
||||
};
|
||||
}, [getSocket, refetch, importJobQueue, socketReconnectTrigger]);
|
||||
|
||||
/**
|
||||
* Toggles import queue pause/resume state
|
||||
*/
|
||||
const toggleQueue = (queueAction: string, queueStatus: string) => {
|
||||
const socket = getSocket("/");
|
||||
socket.emit(
|
||||
"call",
|
||||
"socket.setQueueStatus",
|
||||
{
|
||||
queueAction,
|
||||
queueStatus,
|
||||
},
|
||||
);
|
||||
};
|
||||
|
||||
/**
|
||||
* Starts smart import with race condition prevention
|
||||
*/
|
||||
const handleStartSmartImport = async () => {
|
||||
// Clear any previous errors
|
||||
setImportError(null);
|
||||
|
||||
// Check for active session before starting using definitive status
|
||||
if (hasActiveSession) {
|
||||
setImportError(
|
||||
`Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`
|
||||
);
|
||||
return;
|
||||
}
|
||||
|
||||
if (importJobQueue.status === "drained") {
|
||||
localStorage.removeItem("sessionId");
|
||||
disconnectSocket("/");
|
||||
setTimeout(() => {
|
||||
getSocket("/");
|
||||
setSocketReconnectTrigger(prev => prev + 1);
|
||||
setTimeout(() => {
|
||||
const sessionId = localStorage.getItem("sessionId") || "";
|
||||
startIncrementalImport({ sessionId });
|
||||
}, 500);
|
||||
}, 100);
|
||||
} else {
|
||||
const sessionId = localStorage.getItem("sessionId") || "";
|
||||
startIncrementalImport({ sessionId });
|
||||
}
|
||||
};
|
||||
}, [getSocket, queryClient]);
|
||||
|
||||
/**
|
||||
* Handles force re-import - re-imports all files to fix indexing issues
|
||||
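The `onError` handlers above all build the user-facing message with the same fallback chain: prefer the server-provided message on the Axios response, then the generic `Error` message, then a caller-supplied default. A minimal sketch of that chain as a pure helper (the name `extractErrorMessage` and the `AxiosLikeError` type are illustrative, not part of this diff):

```typescript
// Hypothetical helper mirroring the fallback chain used in the onError
// handlers above: server message -> error message -> supplied default.
type AxiosLikeError = {
  message?: string;
  response?: { data?: { message?: string } };
};

function extractErrorMessage(
  error: AxiosLikeError | null | undefined,
  fallback: string,
): string {
  // Optional chaining makes each level of the chain safe to probe.
  return error?.response?.data?.message || error?.message || fallback;
}
```

Extracting the chain into one helper keeps the several `setImportError(...)` call sites consistent with each other.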
@@ -180,7 +150,22 @@ export const Import = (props: ImportProps): ReactElement => {
  const handleForceReImport = async () => {
    setImportError(null);

    // Check for active session before starting using definitive status
    if (!hasAllDirectories) {
      if (directoryCheckFailed) {
        setImportError(
          "Cannot start import: Failed to verify directory status. Please check that the backend service is running."
        );
      } else {
        const issueDetails = directoryIssues
          .map((i) => `${i.directory}: ${i.issue}`)
          .join(", ");
        setImportError(
          `Cannot start import: ${issueDetails || "Required directories are missing"}. Please check your Docker volume configuration.`
        );
      }
      return;
    }

    if (hasActiveSession) {
      setImportError(
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`
@@ -188,16 +173,17 @@ export const Import = (props: ImportProps): ReactElement => {
      return;
    }

    if (window.confirm(
      "This will re-import ALL files in your library folder, even those already imported. " +
      "This can help fix Elasticsearch indexing issues. Continue?"
    )) {
    if (
      window.confirm(
        "This will re-import ALL files in your library folder, even those already imported. " +
        "This can help fix Elasticsearch indexing issues. Continue?"
      )
    ) {
      if (importJobQueue.status === "drained") {
        localStorage.removeItem("sessionId");
        disconnectSocket("/");
        setTimeout(() => {
          getSocket("/");
          setSocketReconnectTrigger(prev => prev + 1);
          setTimeout(() => {
            forceReImport();
          }, 500);
@@ -208,53 +194,9 @@ export const Import = (props: ImportProps): ReactElement => {
    }
  };

  /**
   * Renders pause/resume controls based on queue status
   */
  const renderQueueControls = (status: string): ReactElement | null => {
    switch (status) {
      case "running":
        return (
          <div>
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-3 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
              onClick={() => {
                toggleQueue("pause", "paused");
                importJobQueue.setStatus("paused");
              }}
            >
              <span className="text-md">Pause</span>
              <span className="w-5 h-5">
                <i className="h-5 w-5 icon-[solar--pause-bold]"></i>
              </span>
            </button>
          </div>
        );
      case "paused":
        return (
          <div>
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-3 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
              onClick={() => {
                toggleQueue("resume", "running");
                importJobQueue.setStatus("running");
              }}
            >
              <span className="text-md">Resume</span>
              <span className="w-5 h-5">
                <i className="h-5 w-5 icon-[solar--play-bold]"></i>
              </span>
            </button>
          </div>
        );

      case "drained":
        return null;

      default:
        return null;
    }
  };
  const canStartImport =
    !hasActiveSession &&
    (importJobQueue.status === "drained" || importJobQueue.status === undefined);

  return (
    <div>
@@ -266,7 +208,6 @@ export const Import = (props: ImportProps): ReactElement => {
      <h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
        Import
      </h1>

      <p className="mt-1.5 text-sm text-gray-500 dark:text-white">
        Import comics into the ThreeTwo library.
      </p>
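The `canStartImport` flag introduced above gates both import buttons on the same condition: no active session, and a job queue that is either idle ("drained") or not yet initialized. That gating can be sketched as a pure predicate (the standalone function form and the `QueueStatus` union are illustrative assumptions, not code from this diff):

```typescript
// Hypothetical pure predicate capturing the gating used for canStartImport:
// imports may start only when no session is active and the queue is idle
// ("drained") or has no status yet (undefined).
type QueueStatus = "running" | "paused" | "drained" | undefined;

function canStartImport(
  hasActiveSession: boolean,
  queueStatus: QueueStatus,
): boolean {
  return (
    !hasActiveSession &&
    (queueStatus === "drained" || queueStatus === undefined)
  );
}
```

Hoisting the condition into one named value (as the diff does) avoids the earlier duplication where the Start Import and Force Re-Import buttons each repeated the same inline expression.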
@@ -302,180 +243,104 @@ export const Import = (props: ImportProps): ReactElement => {

      {/* Error Message */}
      {importError && (
        <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-red-500 bg-red-50 dark:bg-red-900/20 p-4">
          <div className="flex items-start gap-3">
            <span className="w-6 h-6 text-red-600 dark:text-red-400 mt-0.5">
              <i className="h-6 w-6 icon-[solar--danger-circle-bold]"></i>
            </span>
            <div className="flex-1">
              <p className="font-semibold text-red-800 dark:text-red-300">
                Import Error
              </p>
              <p className="text-sm text-red-700 dark:text-red-400 mt-1">
                {importError}
              </p>
            </div>
            <button
              onClick={() => setImportError(null)}
              className="text-red-600 dark:text-red-400 hover:text-red-800 dark:hover:text-red-200"
            >
              <span className="w-5 h-5">
                <i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
              </span>
            </button>
          </div>
        </div>
      )}

      {/* Active Session Warning */}
      {hasActiveSession && !hasNewFiles && (
        <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-yellow-500 bg-yellow-50 dark:bg-yellow-900/20 p-4">
          <div className="flex items-start gap-3">
            <span className="w-6 h-6 text-yellow-600 dark:text-yellow-400 mt-0.5">
              <i className="h-6 w-6 icon-[solar--info-circle-bold]"></i>
            </span>
            <div className="flex-1">
              <p className="font-semibold text-yellow-800 dark:text-yellow-300">
                Import In Progress
              </p>
              <p className="text-sm text-yellow-700 dark:text-yellow-400 mt-1">
                An import session is currently active. New imports cannot be started until it completes.
              </p>
            </div>
          </div>
        </div>
      )}

      {/* Import Action Buttons */}
      <div className="my-6 max-w-screen-lg flex flex-col sm:flex-row gap-3">
        {/* Start Smart Import Button - shown when there are new files, no active session, and no import is running */}
        {hasNewFiles &&
          !hasActiveSession &&
          (importJobQueue.status === "drained" || importJobQueue.status === undefined) && (
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-5 py-3 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-green-500 disabled:opacity-50 disabled:cursor-not-allowed"
              onClick={handleStartSmartImport}
              disabled={isStartingImport || hasActiveSession}
        <div className="my-6 max-w-screen-lg">
          <AlertBanner
            severity="error"
            title="Import Error"
            onClose={() => setImportError(null)}
          >
              <span className="text-md font-medium">
                {isStartingImport
                  ? "Starting Import..."
                  : importStats?.getImportStatistics?.stats?.alreadyImported === 0
                    ? `Start Import (${importStats?.getImportStatistics?.stats?.newFiles} files)`
                    : `Start Incremental Import (${importStats?.getImportStatistics?.stats?.newFiles} new files)`
                }
              </span>
              <span className="w-6 h-6">
                <i className="h-6 w-6 icon-[solar--file-left-bold-duotone]"></i>
              </span>
            </button>
            {importError}
          </AlertBanner>
        </div>
      )}

      {/* Directory Check Error */}
      {!isCheckingDirectories && directoryCheckFailed && (
        <div className="my-6 max-w-screen-lg">
          <AlertBanner severity="error" title="Failed to Check Directory Status">
            <p>
              Unable to verify if required directories exist. Import
              functionality has been disabled.
            </p>
            <p className="mt-2">
              Error: {(directoryError as Error)?.message || "Unknown error"}
            </p>
          </AlertBanner>
        </div>
      )}

      {/* Directory Status Warning */}
      {!isCheckingDirectories &&
        !directoryCheckFailed &&
        directoryIssues.length > 0 && (
          <div className="my-6 max-w-screen-lg">
            <AlertBanner
              severity="warning"
              title="Directory Configuration Issues"
              iconClass="icon-[solar--folder-error-bold]"
            >
              <p>
                The following issues were detected with your directory
                configuration:
              </p>
              <DirectoryIssuesList issues={directoryIssues} />
              <p className="mt-2">
                Please ensure these directories are mounted correctly in your
                Docker configuration.
              </p>
            </AlertBanner>
          </div>
        )}

      {/* Force Re-Import Button - always shown when no import is running */}
      {!hasActiveSession &&
        (importJobQueue.status === "drained" || importJobQueue.status === undefined) && (
      {/* Force Re-Import Button */}
      {canStartImport && (
        <div className="my-6 max-w-screen-lg">
          <button
            className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-orange-400 dark:border-orange-200 bg-orange-200 px-5 py-3 text-gray-700 hover:bg-transparent hover:text-orange-600 focus:outline-none focus:ring active:text-orange-500 disabled:opacity-50 disabled:cursor-not-allowed"
            onClick={handleForceReImport}
            disabled={isForceReImporting || hasActiveSession}
            title="Re-import all files to fix Elasticsearch indexing issues"
            disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
            title={
              !hasAllDirectories
                ? "Cannot import: Required directories are missing"
                : "Re-import all files to fix Elasticsearch indexing issues"
            }
          >
            <span className="text-md font-medium">
              {isForceReImporting ? "Starting Re-Import..." : "Force Re-Import All Files"}
              {isForceReImporting
                ? "Starting Re-Import..."
                : "Force Re-Import All Files"}
            </span>
            <span className="w-6 h-6">
              <i className="h-6 w-6 icon-[solar--refresh-bold-duotone]"></i>
            </span>
          </button>
        )}
      </div>

      {/* Import activity is now shown in the RealTimeImportStats component above */}

      {!isLoading && !isEmpty(data?.getJobResultStatistics) && (
        <div className="max-w-screen-lg">
          <span className="flex items-center mt-6">
            <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
              Past Imports
            </span>
            <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
          </span>

          <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
            <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
              <thead className="ltr:text-left rtl:text-right">
                <tr>
                  <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                    #
                  </th>
                  <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                    Time Started
                  </th>
                  <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                    Session Id
                  </th>
                  <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                    Imported
                  </th>
                  <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                    Failed
                  </th>
                </tr>
              </thead>

              <tbody className="divide-y divide-gray-200">
                {data?.getJobResultStatistics.map((jobResult: any, index: number) => {
                  return (
                    <tr key={index}>
                      <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300 font-medium">
                        {index + 1}
                      </td>
                      <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                        {jobResult.earliestTimestamp && !isNaN(parseInt(jobResult.earliestTimestamp))
                          ? format(
                              new Date(parseInt(jobResult.earliestTimestamp)),
                              "EEEE, hh:mma, do LLLL y",
                            )
                          : "N/A"}
                      </td>
                      <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                        <span className="tag is-warning">
                          {jobResult.sessionId}
                        </span>
                      </td>
                      <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                        <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                          <span className="h-5 w-6">
                            <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                          </span>
                          <p className="whitespace-nowrap text-sm">
                            {jobResult.completedJobs}
                          </p>
                        </span>
                      </td>
                      <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                        <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                          <span className="h-5 w-6">
                            <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                          </span>

                          <p className="whitespace-nowrap text-sm">
                            {jobResult.failedJobs}
                          </p>
                        </span>
                      </td>
                    </tr>
                  );
                })}
              </tbody>
            </table>
          </div>
        </div>
      )}

      {/* Past Imports Table */}
      {!isLoading && !isEmpty(data?.getJobResultStatistics) && (
        <PastImportsTable data={data!.getJobResultStatistics as any} />
      )}
      </div>
    </section>
    </div>
  );
};

/**
 * Helper component to render directory issues list.
 */
const DirectoryIssuesList = ({ issues }: { issues: DirectoryIssue[] }): ReactElement => (
  <ul className="list-disc list-inside mt-2">
    {issues.map((item) => (
      <li key={item.directory}>
        <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">
          {item.directory}
        </code>
        <span className="ml-1">— {item.issue}</span>
      </li>
    ))}
  </ul>
);

export default Import;

103
src/client/components/Import/PastImportsTable.tsx
Normal file
@@ -0,0 +1,103 @@
/**
 * @fileoverview Table component displaying historical import sessions.
 * @module components/Import/PastImportsTable
 */

import { ReactElement } from "react";
import { format } from "date-fns";
import type { JobResultStatistics } from "./import.types";

/**
 * Props for the PastImportsTable component.
 */
export type PastImportsTableProps = {
  /** Array of job result statistics from past imports */
  data: JobResultStatistics[];
};

/**
 * Displays a table of past import sessions with their statistics.
 *
 * @param props - Component props
 * @returns Table element showing import history
 */
export const PastImportsTable = ({ data }: PastImportsTableProps): ReactElement => {
  return (
    <div className="max-w-screen-lg">
      <span className="flex items-center mt-6">
        <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
          Past Imports
        </span>
        <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
      </span>

      <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
        <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
          <thead className="ltr:text-left rtl:text-right">
            <tr>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                #
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Time Started
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Session Id
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Imported
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Failed
              </th>
            </tr>
          </thead>

          <tbody className="divide-y divide-gray-200">
            {data.map((jobResult, index) => (
              <tr key={jobResult.sessionId || index}>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300 font-medium">
                  {index + 1}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  {jobResult.earliestTimestamp &&
                  !isNaN(parseInt(jobResult.earliestTimestamp))
                    ? format(
                        new Date(parseInt(jobResult.earliestTimestamp)),
                        "EEEE, hh:mma, do LLLL y"
                      )
                    : "N/A"}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  <span className="tag is-warning">{jobResult.sessionId}</span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.completedJobs}
                    </p>
                  </span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.failedJobs}
                    </p>
                  </span>
                </td>
              </tr>
            ))}
          </tbody>
        </table>
      </div>
    </div>
  );
};

export default PastImportsTable;
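Both the inline table in `Import.tsx` and the extracted `PastImportsTable` guard `earliestTimestamp` the same way: the raw value is a millisecond-epoch string that may be missing or non-numeric, and the cell falls back to "N/A" when the guard fails. A sketch of that guard as a standalone helper (the function name and the injectable `formatMs` parameter are hypothetical, added here so the logic is testable without date-fns):

```typescript
// Hypothetical helper isolating the timestamp guard used in both tables.
// `raw` is a millisecond-epoch string; anything missing or non-numeric
// renders as "N/A" instead of a formatted date.
function formatImportTimestamp(
  raw: string | null | undefined,
  formatMs: (ms: number) => string = (ms) => new Date(ms).toISOString(),
): string {
  if (!raw) return "N/A";
  const ms = parseInt(raw, 10);
  if (isNaN(ms)) return "N/A";
  return formatMs(ms);
}
```

In the components themselves, `formatMs` corresponds to the date-fns `format(new Date(ms), "EEEE, hh:mma, do LLLL y")` call.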
@@ -1,93 +1,94 @@
import React, { ReactElement, useEffect, useState } from "react";
/**
 * @fileoverview Real-time import statistics component with live progress tracking.
 * Displays import statistics, progress bars, and file detection notifications
 * using WebSocket events for real-time updates.
 * @module components/Import/RealTimeImportStats
 */

import { ReactElement, useState } from "react";
import { Link } from "react-router-dom";
import {
  useGetImportStatisticsQuery,
  useStartIncrementalImportMutation
  useGetWantedComicsQuery,
  useStartIncrementalImportMutation,
} from "../../graphql/generated";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";
import { useImportSocketEvents } from "../../hooks/useImportSocketEvents";
import { getComicDisplayLabel } from "../../shared/utils/formatting.utils";
import { AlertCard } from "../shared/AlertCard";
import { StatsCard } from "../shared/StatsCard";
import { ProgressBar } from "../shared/ProgressBar";

/**
 * Import statistics with card-based layout and progress bar
 * Updates in real-time via the useImportSessionStatus hook
 * Real-time import statistics component with card-based layout and progress tracking.
 *
 * This component manages three distinct states:
 * - **Pre-import (idle)**: Shows current file counts and "Start Import" button when new files exist
 * - **Importing (active)**: Displays real-time progress bar with completed/total counts
 * - **Post-import (complete)**: Shows final statistics including failed imports
 *
 * Additionally, it surfaces missing files detected by the file watcher, allowing users
 * to see which previously-imported files are no longer found on disk.
 *
 * @returns {ReactElement} The rendered import statistics component
 */
export const RealTimeImportStats = (): ReactElement => {
  const [importError, setImportError] = useState<string | null>(null);

  const { socketImport, detectedFile } = useImportSocketEvents();
  const importSession = useImportSessionStatus();

  const { getSocket, disconnectSocket, importJobQueue } = useStore(
    useShallow((state) => ({
      getSocket: state.getSocket,
      disconnectSocket: state.disconnectSocket,
      importJobQueue: state.importJobQueue,
    }))
    })),
  );

  // Get filesystem statistics (new files vs already imported)
  const { data: importStats, isLoading, refetch: refetchStats } = useGetImportStatisticsQuery(
  const { data: importStats, isLoading, isError: isStatsError, error: statsError } = useGetImportStatisticsQuery(
    {},
    { refetchOnWindowFocus: false, refetchInterval: false }
    { refetchOnWindowFocus: false, refetchInterval: false },
  );

  // Get definitive import session status (handles Socket.IO events internally)
  const importSession = useImportSessionStatus();

  const { mutate: startIncrementalImport, isPending: isStartingImport } = useStartIncrementalImportMutation({
    onSuccess: (data) => {
      if (data.startIncrementalImport.success) {
        importJobQueue.setStatus("running");
        setImportError(null);
      }
    },
    onError: (error: any) => {
      console.error("Failed to start import:", error);
      setImportError(error?.message || "Failed to start import. Please try again.");
    },
  });

  const stats = importStats?.getImportStatistics?.stats;
  const hasNewFiles = stats && stats.newFiles > 0;
  const missingCount = stats?.missingFiles ?? 0;

  // Refetch filesystem stats when import completes
  useEffect(() => {
    if (importSession.isComplete && importSession.status === "completed") {
      console.log("[RealTimeImportStats] Import completed, refetching filesystem stats");
      refetchStats();
      importJobQueue.setStatus("drained");
    }
  }, [importSession.isComplete, importSession.status, refetchStats, importJobQueue]);
  const { data: missingComicsData } = useGetWantedComicsQuery(
    {
      paginationOptions: { limit: 3, page: 1 },
      predicate: { "importStatus.isRawFileMissing": true },
    },
    {
      refetchOnWindowFocus: false,
      refetchInterval: false,
      enabled: missingCount > 0,
    },
  );

  // Listen to filesystem change events to refetch stats
  useEffect(() => {
    const socket = getSocket("/");
  const missingDocs = missingComicsData?.getComicBooks?.docs ?? [];

    const handleFilesystemChange = () => {
      refetchStats();
    };

    // File system changes that affect import statistics
    socket.on("LS_FILE_ADDED", handleFilesystemChange);
    socket.on("LS_FILE_REMOVED", handleFilesystemChange);
    socket.on("LS_FILE_CHANGED", handleFilesystemChange);
    socket.on("LS_DIRECTORY_ADDED", handleFilesystemChange);
    socket.on("LS_DIRECTORY_REMOVED", handleFilesystemChange);
    socket.on("LS_LIBRARY_STATISTICS", handleFilesystemChange);

    return () => {
      socket.off("LS_FILE_ADDED", handleFilesystemChange);
      socket.off("LS_FILE_REMOVED", handleFilesystemChange);
      socket.off("LS_FILE_CHANGED", handleFilesystemChange);
      socket.off("LS_DIRECTORY_ADDED", handleFilesystemChange);
      socket.off("LS_DIRECTORY_REMOVED", handleFilesystemChange);
      socket.off("LS_LIBRARY_STATISTICS", handleFilesystemChange);
    };
  }, [getSocket, refetchStats]);
  const { mutate: startIncrementalImport, isPending: isStartingImport } =
    useStartIncrementalImportMutation({
      onSuccess: (data) => {
        if (data.startIncrementalImport.success) {
          importJobQueue.setStatus("running");
          setImportError(null);
        }
      },
      onError: (error: any) => {
        setImportError(error?.message || "Failed to start import. Please try again.");
      },
    });

  const handleStartImport = async () => {
    setImportError(null);

    // Check if import is already active using definitive status
    if (importSession.isActive) {
      setImportError(
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`,
      );
      return;
    }
@@ -108,122 +109,132 @@ export const RealTimeImportStats = (): ReactElement => {
    }
  };

  if (isLoading || !stats) {
  if (isLoading) {
    return <div className="text-gray-500 dark:text-gray-400">Loading...</div>;
  }

  // Determine button text based on whether there are already imported files
  if (isStatsError || !stats) {
    return (
      <AlertCard variant="error" title="Failed to Load Import Statistics">
        <p>Unable to retrieve import statistics from the server. Please check that the backend service is running.</p>
        {isStatsError && (
          <p className="mt-2">Error: {statsError instanceof Error ? statsError.message : "Unknown error"}</p>
        )}
      </AlertCard>
    );
  }

  const hasNewFiles = stats.newFiles > 0;
  const isFirstImport = stats.alreadyImported === 0;
  const buttonText = isFirstImport
    ? `Start Import (${stats.newFiles} files)`
    : `Start Incremental Import (${stats.newFiles} new files)`;

  // Calculate display statistics
  const displayStats = importSession.isActive && importSession.stats
    ? {
        totalFiles: importSession.stats.filesQueued + stats.alreadyImported,
        filesQueued: importSession.stats.filesQueued,
        filesSucceeded: importSession.stats.filesSucceeded,
      }
    : {
        totalFiles: stats.totalLocalFiles,
        filesQueued: stats.newFiles,
        filesSucceeded: stats.alreadyImported,
      };
  const sessionStats = importSession.stats;
  const hasSessionStats = importSession.isActive && sessionStats !== null;
  const failedCount = hasSessionStats ? sessionStats!.filesFailed : 0;

  const showProgressBar = socketImport !== null;
  const showFailedCard = hasSessionStats && failedCount > 0;
  const showMissingCard = missingCount > 0;

return (
<div className="space-y-6">
{/* Error Message */}
{importError && (
<div className="rounded-lg border-l-4 border-red-500 bg-red-50 dark:bg-red-900/20 p-4">
<div className="flex items-start gap-3">
<span className="w-6 h-6 text-red-600 dark:text-red-400 mt-0.5">
<i className="h-6 w-6 icon-[solar--danger-circle-bold]"></i>
</span>
<div className="flex-1">
<p className="font-semibold text-red-800 dark:text-red-300">Import Error</p>
<p className="text-sm text-red-700 dark:text-red-400 mt-1">{importError}</p>
</div>
<button
onClick={() => setImportError(null)}
className="text-red-600 dark:text-red-400 hover:text-red-800 dark:hover:text-red-200"
>
<span className="w-5 h-5">
<i className="h-5 w-5 icon-[solar--close-circle-bold]"></i>
</span>
</button>
</div>
<AlertCard variant="error" title="Import Error" onDismiss={() => setImportError(null)}>
{importError}
</AlertCard>
)}

{detectedFile && (
<div className="rounded-lg border-l-4 border-blue-500 bg-blue-50 dark:bg-blue-900/20 p-3 flex items-center gap-3">
<i className="h-5 w-5 text-blue-600 dark:text-blue-400 icon-[solar--document-add-bold-duotone] shrink-0"></i>
<p className="text-sm text-blue-800 dark:text-blue-300 font-mono truncate">
New file detected: {detectedFile}
</p>
</div>
)}

{/* Import Button - only show when there are new files and no active import */}
{hasNewFiles && !importSession.isActive && (
<button
onClick={handleStartImport}
disabled={isStartingImport}
className="w-full flex items-center justify-center gap-2 rounded-lg bg-green-500 hover:bg-green-600 disabled:bg-gray-400 px-6 py-3 text-white font-medium transition-colors disabled:cursor-not-allowed"
className="flex items-center gap-2 rounded-lg bg-green-500 hover:bg-green-600 disabled:bg-gray-400 px-6 py-3 text-white font-medium transition-colors disabled:cursor-not-allowed"
>
<span className="w-6 h-6">
<i className="h-6 w-6 icon-[solar--file-left-bold-duotone]"></i>
</span>
<i className="h-6 w-6 icon-[solar--file-left-bold-duotone]"></i>
<span>{isStartingImport ? "Starting Import..." : buttonText}</span>
</button>
)}

{/* Active Import Progress Bar */}
{importSession.isActive && (
<div className="space-y-2">
<div className="flex items-center justify-between">
<span className="text-sm font-medium text-gray-700 dark:text-gray-300">
Importing {importSession.stats?.filesSucceeded || 0} / {importSession.stats?.filesQueued || 0}...
</span>
<span className="text-sm font-semibold text-gray-900 dark:text-white">
{Math.round(importSession.progress)}%
</span>
</div>
<div className="w-full bg-gray-200 dark:bg-gray-700 rounded-full h-3 overflow-hidden">
<div
className="bg-blue-600 dark:bg-blue-500 h-3 rounded-full transition-all duration-300 relative"
style={{ width: `${importSession.progress}%` }}
>
<div className="absolute inset-0 bg-gradient-to-r from-transparent via-white/20 to-transparent animate-shimmer"></div>
</div>
</div>
</div>
{showProgressBar && (
<ProgressBar
current={socketImport!.completed}
total={socketImport!.total}
isActive={socketImport!.active}
activeLabel={`Importing ${socketImport!.completed} / ${socketImport!.total}`}
completeLabel={`${socketImport!.completed} / ${socketImport!.total} imported`}
/>
)}

{/* Stats Cards */}
<div className="grid grid-cols-1 sm:grid-cols-3 gap-4">
{/* Files Detected Card */}
<div className="rounded-lg p-6 text-center" style={{ backgroundColor: '#6b7280' }}>
<div className="text-4xl font-bold text-white mb-2">
{displayStats.totalFiles}
</div>
<div className="text-sm text-gray-200 font-medium">
files detected
</div>
</div>

{/* To Import Card */}
<div className="rounded-lg p-6 text-center" style={{ backgroundColor: '#60a5fa' }}>
<div className="text-4xl font-bold text-white mb-2">
{displayStats.filesQueued}
</div>
<div className="text-sm text-gray-100 font-medium">
to import
</div>
</div>

{/* Already Imported Card */}
<div className="rounded-lg p-6 text-center" style={{ backgroundColor: '#d8dab2' }}>
<div className="text-4xl font-bold text-gray-800 mb-2">
{displayStats.filesSucceeded}
</div>
<div className="text-sm text-gray-700 font-medium">
already imported
</div>
</div>
<div className="grid grid-cols-2 sm:grid-cols-4 gap-4">
<StatsCard
value={stats.totalLocalFiles}
label="in import folder"
backgroundColor="#6b7280"
/>
<StatsCard
value={stats.alreadyImported}
label={importSession.isActive ? "imported so far" : "imported in database"}
backgroundColor="#d8dab2"
valueColor="text-gray-800"
labelColor="text-gray-700"
/>
{showFailedCard && (
<StatsCard
value={failedCount}
label="failed"
backgroundColor="bg-red-500"
labelColor="text-red-100"
/>
)}
{showMissingCard && (
<StatsCard
value={missingCount}
label="missing"
backgroundColor="bg-card-missing"
valueColor="text-slate-700"
labelColor="text-slate-800"
/>
)}
</div>

{showMissingCard && (
<AlertCard variant="warning" title={`${missingCount} ${missingCount === 1 ? "file" : "files"} missing`}>
<p>These files were previously imported but can no longer be found on disk. Move them back to restore access.</p>
{missingDocs.length > 0 && (
<ul className="mt-2 space-y-1">
{missingDocs.map((comic, i) => (
<li key={i} className="text-xs truncate">
{getComicDisplayLabel(comic)} is missing
</li>
))}
{missingCount > 3 && (
<li className="text-xs text-amber-600 dark:text-amber-500">
and {missingCount - 3} more.
</li>
)}
</ul>
)}
<Link
to="/library?filter=missingFiles"
className="inline-flex items-center gap-1.5 mt-3 text-xs font-medium underline underline-offset-2 hover:opacity-70"
>
<i className="icon-[solar--file-corrupted-outline] w-4 h-4" />
View Missing Files In Library
<i className="icon-[solar--arrow-right-up-outline] w-3 h-3" />
</Link>
</AlertCard>
)}
</div>
);
};

43  src/client/components/Import/import.types.ts  Normal file
@@ -0,0 +1,43 @@
/**
* @fileoverview Type definitions for the Import module.
* @module components/Import/import.types
*/

/**
* Represents an issue with a configured directory.
*/
export type DirectoryIssue = {
/** Path to the directory with issues */
directory: string;
/** Description of the issue */
issue: string;
};

/**
* Result of directory status check from the backend.
*/
export type DirectoryStatus = {
/** Whether all required directories are accessible */
isValid: boolean;
/** List of specific issues found */
issues: DirectoryIssue[];
};

/**
* Statistics for a completed import job session.
*/
export type JobResultStatistics = {
/** Unique session identifier */
sessionId: string;
/** Timestamp of the earliest job in the session (as string for GraphQL compatibility) */
earliestTimestamp: string;
/** Number of successfully completed jobs */
completedJobs: number;
/** Number of failed jobs */
failedJobs: number;
};

/**
* Status of the import job queue.
*/
export type ImportQueueStatus = "running" | "drained" | undefined;
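The new `import.types.ts` file above only declares type aliases, so its shape is easy to check in isolation. The sketch below mirrors those types locally and shows one hypothetical way a consumer might use them (the helper names `describeDirectoryStatus` and `isQueueBusy` are illustrative, not from the diff):

```typescript
// Types mirrored from the import.types.ts diff above.
type DirectoryIssue = { directory: string; issue: string };
type DirectoryStatus = { isValid: boolean; issues: DirectoryIssue[] };
type ImportQueueStatus = "running" | "drained" | undefined;

// Hypothetical helper: flatten a directory check into a display string.
function describeDirectoryStatus(status: DirectoryStatus): string {
  if (status.isValid) return "All directories accessible";
  return status.issues.map((i) => `${i.directory}: ${i.issue}`).join("; ");
}

// ImportQueueStatus is a three-state union; `undefined` means no job has run yet.
function isQueueBusy(status: ImportQueueStatus): boolean {
  return status === "running";
}
```

Keeping `undefined` as an explicit member of the union forces callers to handle the "no session yet" case rather than treating it as a missing property.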
@@ -1,6 +1,5 @@
import React, { useMemo, ReactElement, useState, useEffect } from "react";
import PropTypes from "prop-types";
import { useNavigate } from "react-router-dom";
import React, { useMemo, ReactElement, useState } from "react";
import { useNavigate, useSearchParams } from "react-router-dom";
import { isEmpty, isNil, isUndefined } from "lodash";
import MetadataPanel from "../shared/MetadataPanel";
import T2Table from "../shared/T2Table";
@@ -12,79 +11,123 @@ import {
useQueryClient,
} from "@tanstack/react-query";
import axios from "axios";
import { format, fromUnixTime, parseISO } from "date-fns";
import { format, parseISO } from "date-fns";
import { useGetWantedComicsQuery } from "../../graphql/generated";

import type { LibrarySearchQuery, FilterOption } from "../../types";

const FILTER_OPTIONS: { value: FilterOption; label: string }[] = [
{ value: "all", label: "All Comics" },
{ value: "missingFiles", label: "Missing Files" },
];

/**
* Component that tabulates the contents of the user's ThreeTwo Library.
*
* @component
* @example
* <Library />
* Library page component. Displays a paginated, searchable table of all comics
* in the collection, with an optional filter for comics with missing raw files.
*/
export const Library = (): ReactElement => {
// Default page state
// offset: 0
const [offset, setOffset] = useState(0);
const [searchQuery, setSearchQuery] = useState({
const [searchParams] = useSearchParams();
const initialFilter = (searchParams.get("filter") as FilterOption) ?? "all";

const [activeFilter, setActiveFilter] = useState<FilterOption>(initialFilter);
const [searchQuery, setSearchQuery] = useState<LibrarySearchQuery>({
query: {},
pagination: {
size: 25,
from: offset,
},
pagination: { size: 25, from: 0 },
type: "all",
trigger: "libraryPage",
});

const queryClient = useQueryClient();

/**
* Method that queries the Elasticsearch index "comics" for issues specified by the query
* @param searchQuery - A searchQuery object that contains the search term, type, and pagination params.
*/
const fetchIssues = async (searchQuery) => {
const { pagination, query, type } = searchQuery;
/** Fetches a page of issues from the search API. */
const fetchIssues = async (q: LibrarySearchQuery) => {
const { pagination, query, type } = q;
return await axios({
method: "POST",
url: "http://localhost:3000/api/search/searchIssue",
data: {
query,
pagination,
type,
},
data: { query, pagination, type },
});
};

const searchIssues = (e) => {
const { data, isPlaceholderData } = useQuery({
queryKey: ["comics", searchQuery],
queryFn: () => fetchIssues(searchQuery),
placeholderData: keepPreviousData,
enabled: activeFilter === "all",
});

const { data: missingFilesData, isLoading: isMissingLoading } = useGetWantedComicsQuery(
{
paginationOptions: { limit: 25, page: 1 },
predicate: { "importStatus.isRawFileMissing": true },
},
{ enabled: activeFilter === "missingFiles" },
);

const { data: missingIdsData } = useGetWantedComicsQuery(
{
paginationOptions: { limit: 1000, page: 1 },
predicate: { "importStatus.isRawFileMissing": true },
},
{ enabled: activeFilter === "all" },
);

/** Set of comic IDs whose raw files are missing, used to highlight rows in the main table. */
const missingIdSet = useMemo(
() => new Set((missingIdsData?.getComicBooks?.docs ?? []).map((doc: any) => doc.id)),
[missingIdsData],
);

const searchResults = data?.data;
const navigate = useNavigate();

const navigateToComicDetail = (row: any) => navigate(`/comic/details/${row.original._id}`);
const navigateToMissingComicDetail = (row: any) => navigate(`/comic/details/${row.original.id}`);

/** Triggers a search by volume name and resets pagination. */
const searchIssues = (e: any) => {
queryClient.invalidateQueries({ queryKey: ["comics"] });
setSearchQuery({
query: {
volumeName: e.search,
},
pagination: {
size: 15,
from: 0,
},
query: { volumeName: e.search },
pagination: { size: 15, from: 0 },
type: "volumeName",
trigger: "libraryPage",
});
};

const { data, isLoading, isError, isPlaceholderData } = useQuery({
queryKey: ["comics", offset, searchQuery],
queryFn: () => fetchIssues(searchQuery),
placeholderData: keepPreviousData,
});

const searchResults = data?.data;
// Programmatically navigate to comic detail
const navigate = useNavigate();
const navigateToComicDetail = (row) => {
navigate(`/comic/details/${row.original._id}`);
/** Advances to the next page of results. */
const nextPage = (pageIndex: number, pageSize: number) => {
if (!isPlaceholderData) {
queryClient.invalidateQueries({ queryKey: ["comics"] });
setSearchQuery({
query: {},
pagination: { size: 15, from: pageSize * pageIndex + 1 },
type: "all",
trigger: "libraryPage",
});
}
};

const ComicInfoXML = (value) => {
return value.data ? (
/** Goes back to the previous page of results. */
const previousPage = (pageIndex: number, pageSize: number) => {
let from = 0;
if (pageIndex === 2) {
from = (pageIndex - 1) * pageSize + 2 - (pageSize + 2);
} else {
from = (pageIndex - 1) * pageSize + 2 - (pageSize + 1);
}
queryClient.invalidateQueries({ queryKey: ["comics"] });
setSearchQuery({
query: {},
pagination: { size: 15, from },
type: "all",
trigger: "libraryPage",
});
};

const ComicInfoXML = (value: any) =>
value.data ? (
<dl className="flex flex-col text-xs sm:text-md p-2 sm:p-3 ml-0 sm:ml-4 my-3 rounded-lg dark:bg-yellow-500 bg-yellow-300 w-full sm:w-max max-w-full">
{/* Series Name */}
<span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs font-medium px-1.5 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 max-w-full overflow-hidden">
<span className="pr-0.5 sm:pr-1 pt-1">
<i className="icon-[solar--bookmark-square-minimalistic-bold-duotone] w-4 h-4 sm:w-5 sm:h-5"></i>
@@ -94,7 +137,6 @@ export const Library = (): ReactElement => {
</span>
</span>
<div className="flex flex-row flex-wrap mt-1 sm:mt-2 gap-1 sm:gap-2">
{/* Pages */}
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-1 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-0.5 sm:pr-1 pt-1">
<i className="icon-[solar--notebook-minimalistic-bold-duotone] w-3.5 h-3.5 sm:w-5 sm:h-5"></i>
@@ -103,7 +145,6 @@ export const Library = (): ReactElement => {
Pages: {value.data.pagecount[0]}
</span>
</span>
{/* Issue number */}
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-1 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-0.5 sm:pr-1 pt-1">
<i className="icon-[solar--hashtag-outline] w-3 h-3 sm:w-3.5 sm:h-3.5"></i>
@@ -117,30 +158,62 @@ export const Library = (): ReactElement => {
</div>
</dl>
) : null;
};

const missingFilesColumns = useMemo(
() => [
{
header: "Missing Files",
columns: [
{
header: "Status",
id: "missingStatus",
cell: () => (
<div className="flex flex-col items-center gap-1.5 px-2 py-3 min-w-[80px]">
<i className="icon-[solar--file-corrupted-outline] w-8 h-8 text-red-500"></i>
<span className="inline-flex items-center rounded-md bg-red-100 px-2 py-1 text-xs font-semibold text-red-700 ring-1 ring-inset ring-red-600/20">
MISSING
</span>
</div>
),
},
{
header: "Comic",
id: "missingComic",
minWidth: 250,
accessorFn: (row: any) => row,
cell: (info: any) => <MetadataPanel data={info.getValue()} />,
},
],
},
],
[],
);

const columns = useMemo(
() => [
{
header: "Comic Metadata",
footer: 1,
columns: [
{
header: "File Details",
id: "fileDetails",
minWidth: 250,
accessorKey: "_source",
cell: (info) => {
return <MetadataPanel data={info.getValue()} />;
cell: (info: any) => {
const source = info.getValue();
return (
<MetadataPanel
data={source}
isMissing={missingIdSet.has(info.row.original._id)}
/>
);
},
},
{
header: "ComicInfo.xml",
accessorKey: "_source.sourcedMetadata.comicInfo",
cell: (info) =>
!isEmpty(info.getValue()) ? (
<ComicInfoXML data={info.getValue()} />
) : null,
cell: (info: any) =>
!isEmpty(info.getValue()) ? <ComicInfoXML data={info.getValue()} /> : null,
},
],
},
@@ -150,36 +223,30 @@ export const Library = (): ReactElement => {
{
header: "Date of Import",
accessorKey: "_source.createdAt",
cell: (info) => {
return !isNil(info.getValue()) ? (
cell: (info: any) =>
!isNil(info.getValue()) ? (
<div className="text-sm w-max ml-3 my-3 text-slate-600 dark:text-slate-900">
<p>{format(parseISO(info.getValue()), "dd MMMM, yyyy")} </p>
<p>{format(parseISO(info.getValue()), "dd MMMM, yyyy")}</p>
{format(parseISO(info.getValue()), "h aaaa")}
</div>
) : null;
},
) : null,
},
{
header: "Downloads",
accessorKey: "_source.acquisition",
cell: (info) => (
cell: (info: any) => (
<div className="flex flex-col gap-2 ml-3 my-3">
<span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 whitespace-nowrap">
<span className="pr-1 pt-1">
<i className="icon-[solar--folder-path-connect-bold-duotone] w-5 h-5"></i>
</span>
<span className="text-md text-slate-900 dark:text-slate-900">
DC++: {info.getValue().directconnect.downloads.length}
</span>
DC++: {info.getValue().directconnect.downloads.length}
</span>

<span className="inline-flex w-fit items-center bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 whitespace-nowrap">
<span className="pr-1 pt-1">
<i className="icon-[solar--magnet-bold-duotone] w-5 h-5"></i>
</span>
<span className="text-md text-slate-900 dark:text-slate-900">
Torrent: {info.getValue().torrent.length}
</span>
Torrent: {info.getValue().torrent.length}
</span>
</div>
),
@@ -187,130 +254,100 @@ export const Library = (): ReactElement => {
],
},
],
[],
[missingIdSet],
);

/**
* Pagination control that fetches the next x (pageSize) items
* based on the y (pageIndex) offset from the ThreeTwo Elasticsearch index
* @param {number} pageIndex
* @param {number} pageSize
* @returns void
*
**/
const nextPage = (pageIndex: number, pageSize: number) => {
if (!isPlaceholderData) {
queryClient.invalidateQueries({ queryKey: ["comics"] });
setSearchQuery({
query: {},
pagination: {
size: 15,
from: pageSize * pageIndex + 1,
},
type: "all",
trigger: "libraryPage",
});
// setOffset(pageSize * pageIndex + 1);
}
};

/**
* Pagination control that fetches the previous x (pageSize) items
* based on the y (pageIndex) offset from the ThreeTwo Elasticsearch index
* @param {number} pageIndex
* @param {number} pageSize
* @returns void
**/
const previousPage = (pageIndex: number, pageSize: number) => {
let from = 0;
if (pageIndex === 2) {
from = (pageIndex - 1) * pageSize + 2 - (pageSize + 2);
} else {
from = (pageIndex - 1) * pageSize + 2 - (pageSize + 1);
}
queryClient.invalidateQueries({ queryKey: ["comics"] });
setSearchQuery({
query: {},
pagination: {
size: 15,
from,
},
type: "all",
trigger: "libraryPage",
});
// setOffset(from);
};

// ImportStatus.propTypes = {
// value: PropTypes.bool.isRequired,
// };
return (
<div>
<section>
<header className="bg-slate-200 dark:bg-slate-500">
<div className="mx-auto max-w-screen-xl px-4 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
<div className="sm:flex sm:items-center sm:justify-between">
<div className="text-center sm:text-left">
<h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
Library
</h1>

<p className="mt-1.5 text-sm text-gray-500 dark:text-white">
Browse your comic book collection.
</p>
</div>
</div>
</div>
</header>
{!isUndefined(searchResults?.hits) ? (
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<div>
<T2Table
totalPages={searchResults.hits.total.value}
columns={columns}
sourceData={searchResults?.hits.hits}
rowClickHandler={navigateToComicDetail}
paginationHandlers={{
nextPage,
previousPage,
}}
>
<SearchBar searchHandler={(e) => searchIssues(e)} />
</T2Table>
</div>
</div>
) : (
<div className="mx-auto max-w-screen-xl mt-5">
<article
role="alert"
className="rounded-lg max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
>
<div>
<p>
No comics were found in the library, Elasticsearch reports no
indices. Try importing a few comics into the library and come
back.
</p>
</div>
</article>
<div className="block max-w-md p-6 bg-white border border-gray-200 my-3 rounded-lg shadow dark:bg-slate-400 dark:border-gray-700">
<pre className="text-sm font-hasklig text-slate-700 dark:text-slate-700">
{!isUndefined(searchResults?.data?.meta?.body) ? (
<p>
{JSON.stringify(
searchResults?.data.meta.body.error.root_cause,
null,
4,
)}
</p>
) : null}
</pre>
</div>
</div>
)}
</section>
const FilterDropdown = () => (
<div className="relative">
<select
value={activeFilter}
onChange={(e: React.ChangeEvent<HTMLSelectElement>) => setActiveFilter(e.target.value as FilterOption)}
className="appearance-none h-full rounded-lg border border-gray-300 dark:border-slate-600 bg-white dark:bg-slate-700 pl-3 pr-8 py-1.5 text-sm text-gray-700 dark:text-slate-200 cursor-pointer focus:outline-none focus:ring-2 focus:ring-blue-500"
>
{FILTER_OPTIONS.map((opt) => (
<option key={opt.value} value={opt.value}>
{opt.label}
</option>
))}
</select>
<i className="icon-[solar--alt-arrow-down-bold] absolute right-2 top-1/2 -translate-y-1/2 w-4 h-4 text-gray-500 dark:text-slate-400 pointer-events-none"></i>
</div>
);

const isMissingFilter = activeFilter === "missingFiles";

return (
<section>
<header className="bg-slate-200 dark:bg-slate-500">
<div className="mx-auto max-w-screen-xl px-4 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
<div className="sm:flex sm:items-center sm:justify-between">
<div className="text-center sm:text-left">
<h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
Library
</h1>
<p className="mt-1.5 text-sm text-gray-500 dark:text-white">
Browse your comic book collection.
</p>
</div>
</div>
</div>
</header>

{isMissingFilter ? (
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
{isMissingLoading ? (
<div className="text-gray-500 dark:text-gray-400">Loading...</div>
) : (
<T2Table
totalPages={missingFilesData?.getComicBooks?.totalDocs ?? 0}
columns={missingFilesColumns}
sourceData={missingFilesData?.getComicBooks?.docs ?? []}
rowClickHandler={navigateToMissingComicDetail}
getRowClassName={() => "bg-card-missing/40 hover:bg-card-missing/20"}
paginationHandlers={{ nextPage: () => {}, previousPage: () => {} }}
>
<FilterDropdown />
</T2Table>
)}
</div>
) : !isUndefined(searchResults?.hits) ? (
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<T2Table
totalPages={searchResults.hits.total.value}
columns={columns}
sourceData={searchResults?.hits.hits}
rowClickHandler={navigateToComicDetail}
getRowClassName={(row) =>
missingIdSet.has(row.original._id)
? "bg-card-missing/40 hover:bg-card-missing/20"
: "hover:bg-slate-100/30 dark:hover:bg-slate-700/20"
}
paginationHandlers={{ nextPage, previousPage }}
>
<div className="flex items-center gap-2">
<FilterDropdown />
<SearchBar searchHandler={(e: any) => searchIssues(e)} />
</div>
</T2Table>
</div>
) : (
<div className="mx-auto max-w-screen-xl mt-5">
<article
role="alert"
className="rounded-lg max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
>
<div>
<p>
No comics were found in the library, Elasticsearch reports no indices. Try
importing a few comics into the library and come back.
</p>
</div>
</article>
<FilterDropdown />
</div>
)}
</section>
);
};

export default Library;

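The `previousPage` handler carried over in the Library diff above computes its Elasticsearch `from` offset with a special case for page 2. A standalone sketch of just that arithmetic (the function name `previousPageFrom` is illustrative) makes the branch easier to verify with concrete numbers:

```typescript
// Reproduces the offset arithmetic from previousPage in the Library diff.
function previousPageFrom(pageIndex: number, pageSize: number): number {
  if (pageIndex === 2) {
    // (pageIndex - 1) * pageSize + 2 - (pageSize + 2) simplifies to 0:
    // going back from page 2 always returns to the start of the index.
    return (pageIndex - 1) * pageSize + 2 - (pageSize + 2);
  }
  // Otherwise this simplifies to (pageIndex - 2) * pageSize + 1,
  // i.e. the 1-based offset of the first row on the previous page.
  return (pageIndex - 1) * pageSize + 2 - (pageSize + 1);
}

console.log(previousPageFrom(2, 15)); // 0
console.log(previousPageFrom(3, 15)); // 16
```

Note the asymmetry with `nextPage`, which uses `pageSize * pageIndex + 1`; both offsets are 1-based except for the page-2 special case, which resets to 0.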
@@ -8,23 +8,52 @@ import {
import { useTable, usePagination } from "react-table";
import prettyBytes from "pretty-bytes";
import ellipsize from "ellipsize";
import { useDispatch, useSelector } from "react-redux";
import { getComicBooks } from "../../actions/fileops.actions";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { isNil, isEmpty, isUndefined } from "lodash";
import Masonry from "react-masonry-css";
import Card from "../shared/Carda";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import { Link } from "react-router-dom";
import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";
import { LIBRARY_SERVICE_HOST, LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { LibraryGridProps } from "../../types";

interface ILibraryGridProps {}
export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
const data = useSelector(
(state: RootState) => state.fileOps.recentComics.docs,
);
const pageTotal = useSelector(
(state: RootState) => state.fileOps.recentComics.totalDocs,
);
interface ComicDoc {
_id: string;
rawFileDetails?: {
cover?: {
filePath: string;
};
name?: string;
};
sourcedMetadata?: {
comicvine?: {
image?: {
small_url?: string;
};
name?: string;
volumeInformation?: {
description?: string;
};
};
};
}

export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
const { data: comicsData } = useQuery({
queryKey: ["recentComics"],
queryFn: async () =>
axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
method: "POST",
data: {
paginationOptions: { size: 25, from: 0 },
predicate: {},
},
}),
});
const data: ComicDoc[] = comicsData?.data?.docs ?? [];
const pageTotal: number = comicsData?.data?.totalDocs ?? 0;
const breakpointColumnsObj = {
default: 5,
1100: 4,
@@ -42,20 +71,20 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
className="my-masonry-grid"
columnClassName="my-masonry-grid_column"
>
{data.map(({ _id, rawFileDetails, sourcedMetadata }) => {
{data.map(({ _id, rawFileDetails, sourcedMetadata }: ComicDoc) => {
let imagePath = "";
let comicName = "";
if (!isEmpty(rawFileDetails.cover)) {
if (rawFileDetails && !isEmpty(rawFileDetails.cover)) {
const encodedFilePath = encodeURI(
`${LIBRARY_SERVICE_HOST}/${removeLeadingPeriod(
rawFileDetails.cover.filePath,
rawFileDetails.cover?.filePath || '',
)}`,
);
imagePath = escapePoundSymbol(encodedFilePath);
comicName = rawFileDetails.name;
} else if (!isNil(sourcedMetadata)) {
comicName = rawFileDetails.name || '';
} else if (!isNil(sourcedMetadata) && sourcedMetadata.comicvine?.image?.small_url) {
imagePath = sourcedMetadata.comicvine.image.small_url;
comicName = sourcedMetadata.comicvine.name;
comicName = sourcedMetadata.comicvine?.name || '';
}
const titleElement = (
<Link to={"/comic/details/" + _id}>
@@ -71,7 +100,7 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
title={comicName ? titleElement : null}
>
<div className="content is-flex is-flex-direction-row">
{!isEmpty(sourcedMetadata.comicvine) && (
{sourcedMetadata && !isEmpty(sourcedMetadata.comicvine) && (
<span className="icon cv-icon is-small inline-block w-6 h-6 md:w-7 md:h-7 flex-shrink-0">
<img
src="/src/client/assets/img/cvlogo.svg"
@@ -79,12 +108,13 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
/>
</span>
)}
{/* TODO: Switch to Solar icon */}
{isNil(rawFileDetails) && (
<span className="icon has-text-info">
<i className="fas fa-adjust" />
</span>
)}
{!isUndefined(sourcedMetadata.comicvine.volumeInformation) &&
{sourcedMetadata?.comicvine?.volumeInformation?.description &&
!isEmpty(
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation.description,
@@ -93,8 +123,7 @@ export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
<span className="tag is-warning ml-1">
{
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation
.description,
sourcedMetadata.comicvine.volumeInformation.description || '',
).displayName
}
</span>

```diff
@@ -3,7 +3,11 @@ import PropTypes from "prop-types";
 import { Form, Field } from "react-final-form";
 import { Link } from "react-router-dom";
 
-export const SearchBar = (props): ReactElement => {
+interface SearchBarProps {
+  searchHandler: (values: Record<string, unknown>) => void;
+}
+
+export const SearchBar = (props: SearchBarProps): ReactElement => {
   const { searchHandler } = props;
   return (
     <Form
```
```diff
@@ -3,10 +3,7 @@ import PullList from "../PullList/PullList";
 import { Volumes } from "../Volumes/Volumes";
 import WantedComics from "../WantedComics/WantedComics";
 import { Library } from "./Library";
 
-interface ITabulatedContentContainerProps {
-  category: string;
-}
+import type { TabulatedContentContainerProps } from "../../types";
 /**
  * Component to draw the contents of a category in a table.
  *
@@ -18,7 +15,7 @@ interface ITabulatedContentContainerProps {
  */
 
 const TabulatedContentContainer = (
-  props: ITabulatedContentContainerProps,
+  props: TabulatedContentContainerProps,
 ): ReactElement => {
   const { category } = props;
   const renderTabulatedContent = () => {
```
```diff
@@ -1,16 +1,27 @@
-import React, { ReactElement, useEffect, useMemo } from "react";
+import React, { ReactElement, useEffect, useMemo, useState } from "react";
 import T2Table from "../shared/T2Table";
 import { getWeeklyPullList } from "../../actions/comicinfo.actions";
 import Card from "../shared/Carda";
 import ellipsize from "ellipsize";
 import { isNil } from "lodash";
+import type { CellContext } from "@tanstack/react-table";
+
+interface PullListComic {
+  issue: {
+    cover: string;
+    name: string;
+    publisher: string;
+    description: string;
+    price: string;
+    pulls: number;
+  };
+}
 
 export const PullList = (): ReactElement => {
-  // const pullListComics = useSelector(
-  //   (state: RootState) => state.comicInfo.pullList,
-  // );
+  // Placeholder for pull list comics - would come from API/store
+  const [pullListComics, setPullListComics] = useState<PullListComic[] | null>(null);
 
   useEffect(() => {
+    // TODO: Implement pull list fetching
     // dispatch(
     //   getWeeklyPullList({
     //     startDate: "2023-7-28",
@@ -31,7 +42,7 @@ export const PullList = (): ReactElement => {
       id: "comicDetails",
       minWidth: 450,
       accessorKey: "issue",
-      cell: (row) => {
+      cell: (row: CellContext<PullListComic, PullListComic["issue"]>) => {
         const item = row.getValue();
         return (
           <div className="columns">
```
```diff
@@ -1,6 +1,5 @@
 import React, { ReactElement, useState } from "react";
 import { isNil, isEmpty, isUndefined } from "lodash";
-import { IExtractedComicBookCoverFile, RootState } from "threetwo-ui-typings";
 import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
 import { Form, Field } from "react-final-form";
 import Card from "../shared/Carda";
@@ -16,18 +15,35 @@ import {
   LIBRARY_SERVICE_BASE_URI,
 } from "../../constants/endpoints";
 import axios from "axios";
+import type { SearchPageProps, ComicVineSearchResult } from "../../types";
 
-interface ISearchProps {}
+interface ComicData {
+  id: number;
+  api_detail_url: string;
+  image: { small_url: string; thumb_url?: string };
+  cover_date?: string;
+  issue_number?: string;
+  name?: string;
+  description?: string;
+  volume?: { name: string; api_detail_url: string };
+  start_year?: string;
+  count_of_issues?: number;
+  publisher?: { name: string };
+  resource_type?: string;
+}
 
-export const Search = ({}: ISearchProps): ReactElement => {
+export const Search = ({}: SearchPageProps): ReactElement => {
   const queryClient = useQueryClient();
   const formData = {
     search: "",
   };
-  const [comicVineMetadata, setComicVineMetadata] = useState({});
+  const [comicVineMetadata, setComicVineMetadata] = useState<{
+    sourceName?: string;
+    comicData?: ComicData;
+  }>({});
   const [selectedResource, setSelectedResource] = useState("volume");
   const { t } = useTranslation();
-  const handleResourceChange = (value) => {
+  const handleResourceChange = (value: string) => {
     setSelectedResource(value);
   };
 
@@ -63,6 +79,11 @@ export const Search = ({}: ISearchProps): ReactElement => {
     comicObject,
     markEntireVolumeWanted,
     resourceType,
+  }: {
+    source: string;
+    comicObject: any;
+    markEntireVolumeWanted: boolean;
+    resourceType: string;
   }) => {
     let volumeInformation = {};
     let issues = [];
@@ -143,14 +164,14 @@ export const Search = ({}: ISearchProps): ReactElement => {
     },
   });
 
-  const addToLibrary = (sourceName: string, comicData) =>
+  const addToLibrary = (sourceName: string, comicData: ComicData) =>
     setComicVineMetadata({ sourceName, comicData });
 
-  const createDescriptionMarkup = (html) => {
+  const createDescriptionMarkup = (html: string) => {
     return { __html: html };
   };
 
-  const onSubmit = async (values) => {
+  const onSubmit = async (values: { search: string }) => {
     const formData = { ...values, resource: selectedResource };
     try {
       mutate(formData);
@@ -270,7 +291,7 @@ export const Search = ({}: ISearchProps): ReactElement => {
       )}
       {!isEmpty(comicVineSearchResults?.data?.results) ? (
         <div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
-          {comicVineSearchResults.data.results.map((result) => {
+          {comicVineSearchResults?.data?.results?.map((result: ComicData) => {
             return result.resource_type === "issue" ? (
               <div
                 key={result.id}
@@ -287,8 +308,8 @@ export const Search = ({}: ISearchProps): ReactElement => {
                 </div>
                 <div className="w-3/4">
                   <div className="text-xl">
-                    {!isEmpty(result.volume.name) ? (
-                      result.volume.name
+                    {!isEmpty(result.volume?.name) ? (
+                      result.volume?.name
                     ) : (
                       <span className="is-size-3">No Name</span>
                     )}
@@ -306,18 +327,18 @@ export const Search = ({}: ISearchProps): ReactElement => {
                     {result.api_detail_url}
                   </a>
                   <p className="text-sm">
-                    {ellipsize(
+                    {result.description ? ellipsize(
                       convert(result.description, {
                         baseElements: {
                           selectors: ["p", "div"],
                         },
                       }),
                       320,
-                    )}
+                    ) : ''}
                   </p>
                   <div className="mt-2">
                     <PopoverButton
-                      content={`This will add ${result.volume.name} to your wanted list.`}
+                      content={`This will add ${result.volume?.name || 'this issue'} to your wanted list.`}
                       clickHandler={() =>
                         addToWantedList({
                           source: "comicvine",
@@ -408,14 +429,14 @@ export const Search = ({}: ISearchProps): ReactElement => {
 
                   {/* description */}
                   <p className="text-sm">
-                    {ellipsize(
+                    {result.description ? ellipsize(
                       convert(result.description, {
                         baseElements: {
                           selectors: ["p", "div"],
                         },
                       }),
                       320,
-                    )}
+                    ) : ''}
                   </p>
                   <div className="mt-2">
                     <PopoverButton
```
```diff
@@ -1,16 +1,15 @@
 import React, { ReactElement } from "react";
-import { useDispatch, useSelector } from "react-redux";
-import { useEffect } from "react";
-import { getServiceStatus } from "../../actions/fileops.actions";
+import { useQuery } from "@tanstack/react-query";
+import axios from "axios";
+import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
 
 export const ServiceStatuses = (): ReactElement => {
-  const serviceStatus = useSelector(
-    (state: RootState) => state.fileOps.libraryServiceStatus,
-  );
-  const dispatch = useDispatch();
-  useEffect(() => {
-    dispatch(getServiceStatus());
-  }, []);
+  const { data } = useQuery({
+    queryKey: ["serviceStatus"],
+    queryFn: async () =>
+      axios({ url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`, method: "GET" }),
+  });
+  const serviceStatus = data?.data;
   return (
     <div className="is-clearfix">
       <div className="mt-4">
```
```diff
@@ -38,16 +38,21 @@ export const AirDCPPHubsForm = (): ReactElement => {
     enabled: !isEmpty(settings?.data.directConnect?.client?.host),
   });
 
-  let hubList: any[] = [];
+  interface HubOption {
+    value: string;
+    label: string;
+  }
+
+  let hubList: HubOption[] = [];
   if (!isNil(hubs)) {
-    hubList = hubs?.data.map(({ hub_url, identity }) => ({
+    hubList = hubs?.data.map(({ hub_url, identity }: { hub_url: string; identity: { name: string } }) => ({
       value: hub_url,
       label: identity.name,
     }));
   }
 
   const mutation = useMutation({
-    mutationFn: async (values) =>
+    mutationFn: async (values: Record<string, unknown>) =>
       await axios({
         url: `http://localhost:3000/api/settings/saveSettings`,
         method: "POST",
@@ -69,13 +74,24 @@ export const AirDCPPHubsForm = (): ReactElement => {
     },
   });
 
-  const validate = async (values) => {
-    const errors = {};
+  const validate = async (values: Record<string, unknown>) => {
+    const errors: Record<string, string> = {};
+    // Add any validation logic here if needed
     return errors;
   };
 
-  const SelectAdapter = ({ input, ...rest }) => {
+  interface SelectAdapterProps {
+    input: {
+      value: unknown;
+      onChange: (value: unknown) => void;
+      onBlur: () => void;
+      onFocus: () => void;
+      name: string;
+    };
+    [key: string]: unknown;
+  }
+
+  const SelectAdapter = ({ input, ...rest }: SelectAdapterProps) => {
     return <Select {...input} {...rest} isClearable isMulti />;
   };
 
@@ -155,7 +171,7 @@ export const AirDCPPHubsForm = (): ReactElement => {
               </span>
               <div className="block max-w-sm p-6 bg-white border border-gray-200 rounded-lg shadow dark:bg-slate-400 dark:border-gray-700">
                 {settings?.data.directConnect?.client.hubs.map(
-                  ({ value, label }) => (
+                  ({ value, label }: HubOption) => (
                     <div key={value}>
                       <div>{label}</div>
                       <span className="is-size-7">{value}</span>
```
```diff
@@ -1,7 +1,24 @@
 import React, { ReactElement } from "react";
 
-export const AirDCPPSettingsConfirmation = (settingsObject): ReactElement => {
-  const { settings } = settingsObject;
+interface AirDCPPSessionInfo {
+  _id: string;
+  system_info: {
+    client_version: string;
+    hostname: string;
+    platform: string;
+  };
+  user: {
+    username: string;
+    active_sessions: number;
+    permissions: string[];
+  };
+}
+
+interface AirDCPPSettingsConfirmationProps {
+  settings: AirDCPPSessionInfo;
+}
+
+export const AirDCPPSettingsConfirmation = ({ settings }: AirDCPPSettingsConfirmationProps): ReactElement => {
   return (
     <div>
       <span className="flex items-center mt-10 mb-4">
```
```diff
@@ -17,8 +17,16 @@ export const AirDCPPSettingsForm = () => {
     queryFn: () => axios.get(`${SETTINGS_SERVICE_BASE_URI}/getAllSettings`),
   });
 
+  interface HostConfig {
+    hostname: string;
+    port: string;
+    username: string;
+    password: string;
+    protocol: string;
+  }
+
   // Fetch session information
-  const fetchSessionInfo = (host) => {
+  const fetchSessionInfo = (host: HostConfig) => {
     return axios.post(`${AIRDCPP_SERVICE_BASE_URI}/initialize`, { host });
   };
 
@@ -34,7 +42,7 @@ export const AirDCPPSettingsForm = () => {
 
   // Handle setting update and subsequent AirDC++ initialization
   const { mutate } = useMutation({
-    mutationFn: (values) => {
+    mutationFn: (values: Record<string, unknown>) => {
       return axios.post("http://localhost:3000/api/settings/saveSettings", {
         settingsPayload: values,
         settingsKey: "directConnect",
@@ -50,12 +58,13 @@ export const AirDCPPSettingsForm = () => {
     },
   });
 
-  const deleteSettingsMutation = useMutation(() =>
-    axios.post("http://localhost:3000/api/settings/saveSettings", {
-      settingsPayload: {},
-      settingsKey: "directConnect",
-    }),
-  );
+  const deleteSettingsMutation = useMutation({
+    mutationFn: () =>
+      axios.post("http://localhost:3000/api/settings/saveSettings", {
+        settingsPayload: {},
+        settingsKey: "directConnect",
+      }),
+  });
 
   const initFormData = settingsData?.data?.directConnect?.client?.host ?? {};
```
```diff
@@ -4,9 +4,13 @@ import { Form, Field } from "react-final-form";
 import { PROWLARR_SERVICE_BASE_URI } from "../../../constants/endpoints";
 import axios from "axios";
 
-export const ProwlarrSettingsForm = (props) => {
+interface ProwlarrSettingsFormProps {
+  // Add props here if needed
+}
+
+export const ProwlarrSettingsForm = (_props: ProwlarrSettingsFormProps) => {
   const { data } = useQuery({
-    queryFn: async (): any => {
+    queryFn: async () => {
       return await axios({
         url: `${PROWLARR_SERVICE_BASE_URI}/getIndexers`,
         method: "POST",
```
```diff
@@ -3,7 +3,7 @@ import { ConnectionForm } from "../../shared/ConnectionForm/ConnectionForm";
 import { useQuery, useMutation, QueryClient } from "@tanstack/react-query";
 import axios from "axios";
 
-export const QbittorrentConnectionForm = (): ReactElement => {
+export const QbittorrentConnectionForm = (): ReactElement | null => {
   const queryClient = new QueryClient();
   // fetch settings
   const { data, isLoading, isError } = useQuery({
@@ -28,7 +28,7 @@ export const QbittorrentConnectionForm = (): ReactElement => {
   });
   // Update action using a mutation
   const { mutate } = useMutation({
-    mutationFn: async (values) =>
+    mutationFn: async (values: Record<string, unknown>) =>
       await axios({
         url: `http://localhost:3000/api/settings/saveSettings`,
         method: "POST",
@@ -77,6 +77,7 @@ export const QbittorrentConnectionForm = (): ReactElement => {
       </>
     );
   }
+  return null;
 };
 
 export default QbittorrentConnectionForm;
```
```diff
@@ -8,10 +8,22 @@ import DockerVars from "./DockerVars/DockerVars";
 import { ServiceStatuses } from "../ServiceStatuses/ServiceStatuses";
 import settingsObject from "../../constants/settings/settingsMenu.json";
 import { isUndefined, map } from "lodash";
+import type { SettingsProps } from "../../types";
 
-interface ISettingsProps {}
+interface SettingsMenuItem {
+  id: string | number;
+  displayName: string;
+  children?: SettingsMenuItem[];
+}
 
-export const Settings = (props: ISettingsProps): ReactElement => {
+interface SettingsCategory {
+  id: number;
+  category: string;
+  displayName: string;
+  children?: SettingsMenuItem[];
+}
+
+export const Settings = (props: SettingsProps): ReactElement => {
   const [active, setActive] = useState("gen-db");
   const [expanded, setExpanded] = useState<Record<string, boolean>>({});
 
@@ -63,70 +75,70 @@ export const Settings = (props: ISettingsProps): ReactElement => {
           overflow-hidden"
         >
           <div className="px-4 py-6 overflow-y-auto">
-            {map(settingsObject, (settingObject, idx) => (
-              <div
-                key={idx}
-                className="mb-6 text-slate-700 dark:text-slate-300"
-              >
-                <h3 className="text-xs font-semibold text-slate-500 dark:text-slate-400 tracking-wide mb-3">
-                  {settingObject.category.toUpperCase()}
-                </h3>
-
-                {!isUndefined(settingObject.children) && (
-                  <ul>
-                    {map(settingObject.children, (item, idx) => {
-                      const isOpen = expanded[item.id];
-
-                      return (
-                        <li key={idx} className="mb-1">
-                          <div
-                            onClick={() => toggleExpanded(item.id)}
-                            className={`cursor-pointer flex justify-between items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
-                              item.id === active
-                                ? "font-semibold text-blue-600 dark:text-blue-400"
-                                : ""
-                            }`}
-                          >
-                            <span
-                              onClick={() => setActive(item.id.toString())}
-                              className="flex-1"
-                            >
-                              {item.displayName}
-                            </span>
-                            {!isUndefined(item.children) && (
-                              <span className="text-xs opacity-60">
-                                {isOpen ? "−" : "+"}
-                              </span>
-                            )}
-                          </div>
-
-                          {!isUndefined(item.children) && isOpen && (
-                            <ul className="pl-4 mt-1">
-                              {map(item.children, (subItem) => (
-                                <li key={subItem.id} className="mb-1">
-                                  <a
-                                    onClick={() =>
-                                      setActive(subItem.id.toString())
-                                    }
-                                    className={`cursor-pointer flex items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
-                                      subItem.id.toString() === active
-                                        ? "font-semibold text-blue-600 dark:text-blue-400"
-                                        : ""
-                                    }`}
-                                  >
-                                    {subItem.displayName}
-                                  </a>
-                                </li>
-                              ))}
-                            </ul>
-                          )}
-                        </li>
-                      );
-                    })}
-                  </ul>
-                )}
-              </div>
-            ))}
+            {map(settingsObject as SettingsCategory[], (settingObject, idx) => (
+              <div
+                key={idx}
+                className="mb-6 text-slate-700 dark:text-slate-300"
+              >
+                <h3 className="text-xs font-semibold text-slate-500 dark:text-slate-400 tracking-wide mb-3">
+                  {settingObject.category.toUpperCase()}
+                </h3>
+
+                {!isUndefined(settingObject.children) && (
+                  <ul>
+                    {map(settingObject.children, (item: SettingsMenuItem, idx) => {
+                      const isOpen = expanded[String(item.id)];
+
+                      return (
+                        <li key={idx} className="mb-1">
+                          <div
+                            onClick={() => toggleExpanded(String(item.id))}
+                            className={`cursor-pointer flex justify-between items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
+                              String(item.id) === active
+                                ? "font-semibold text-blue-600 dark:text-blue-400"
+                                : ""
+                            }`}
+                          >
+                            <span
+                              onClick={() => setActive(String(item.id))}
+                              className="flex-1"
+                            >
+                              {item.displayName}
+                            </span>
+                            {!isUndefined(item.children) && (
+                              <span className="text-xs opacity-60">
+                                {isOpen ? "−" : "+"}
+                              </span>
+                            )}
+                          </div>
+
+                          {!isUndefined(item.children) && isOpen && (
+                            <ul className="pl-4 mt-1">
+                              {map(item.children, (subItem: SettingsMenuItem) => (
+                                <li key={String(subItem.id)} className="mb-1">
+                                  <a
+                                    onClick={() =>
+                                      setActive(String(subItem.id))
+                                    }
+                                    className={`cursor-pointer flex items-center px-1 py-1 rounded-md transition-colors hover:bg-white/50 dark:hover:bg-slate-700 ${
+                                      String(subItem.id) === active
+                                        ? "font-semibold text-blue-600 dark:text-blue-400"
+                                        : ""
+                                    }`}
+                                  >
+                                    {subItem.displayName}
+                                  </a>
+                                </li>
+                              ))}
+                            </ul>
+                          )}
+                        </li>
+                      );
+                    })}
+                  </ul>
+                )}
+              </div>
+            ))}
           </div>
         </aside>
       </div>
```
```diff
@@ -3,7 +3,7 @@ import { useMutation } from "@tanstack/react-query";
 import axios from "axios";
 
 export const SystemSettingsForm = (): ReactElement => {
-  const { mutate: flushDb, isLoading } = useMutation({
+  const { mutate: flushDb, isPending } = useMutation({
     mutationFn: async () => {
       await axios({
         url: `http://localhost:3000/api/library/flushDb`,
```
```diff
@@ -1,21 +1,41 @@
 import { isArray, map } from "lodash";
-import React, { useEffect, ReactElement } from "react";
-import { useDispatch, useSelector } from "react-redux";
-import { getComicBooksDetailsByIds } from "../../actions/comicinfo.actions";
+import React, { ReactElement } from "react";
+import { useQuery } from "@tanstack/react-query";
+import axios from "axios";
 import { Card } from "../shared/Carda";
 import ellipsize from "ellipsize";
-import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";
+import { LIBRARY_SERVICE_HOST, LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
 import { escapePoundSymbol } from "../../shared/utils/formatting.utils";
 import prettyBytes from "pretty-bytes";
 
-const PotentialLibraryMatches = (props): ReactElement => {
-  const dispatch = useDispatch();
-  const comicBooks = useSelector(
-    (state: RootState) => state.comicInfo.comicBooksDetails,
-  );
-  useEffect(() => {
-    dispatch(getComicBooksDetailsByIds(props.matches));
-  }, []);
+interface PotentialLibraryMatchesProps {
+  matches: string[];
+}
+
+interface ComicBookMatch {
+  rawFileDetails: {
+    cover: {
+      filePath: string;
+    };
+    name: string;
+    containedIn: string;
+    extension: string;
+    fileSize: number;
+  };
+}
+
+const PotentialLibraryMatches = (props: PotentialLibraryMatchesProps): ReactElement => {
+  const { data } = useQuery({
+    queryKey: ["comicBooksDetails", props.matches],
+    queryFn: async () =>
+      axios({
+        url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooksByIds`,
+        method: "POST",
+        data: { ids: props.matches },
+      }),
+    enabled: props.matches.length > 0,
+  });
+  const comicBooks: ComicBookMatch[] = data?.data ?? [];
   return (
     <div className="potential-matches-container mt-10">
       {isArray(comicBooks) ? (
```
```diff
@@ -1,8 +1,7 @@
 import { isEmpty, isNil, isUndefined, map, partialRight, pick } from "lodash";
 import React, { ReactElement, useState, useCallback } from "react";
 import { useParams } from "react-router";
-import { analyzeLibrary } from "../../actions/comicinfo.actions";
-import { useQuery, useMutation, QueryClient } from "@tanstack/react-query";
+import { useQuery, useMutation } from "@tanstack/react-query";
 import PotentialLibraryMatches from "./PotentialLibraryMatches";
 import { Card } from "../shared/Carda";
 import SlidingPane from "react-sliding-pane";
@@ -14,38 +13,87 @@ import {
 } from "../../constants/endpoints";
 import axios from "axios";
 
-const VolumeDetails = (props): ReactElement => {
+interface VolumeDetailsProps {
+  [key: string]: unknown;
+}
+
+interface ComicObjectData {
+  sourcedMetadata: {
+    comicvine: {
+      id?: string;
+      volumeInformation: {
+        id: string;
+        name: string;
+        description?: string;
+        image: {
+          small_url: string;
+        };
+        publisher: {
+          name: string;
+        };
+      };
+    };
+  };
+}
+
+interface IssueData {
+  id: string;
+  name: string;
+  issue_number: string;
+  description?: string;
+  matches?: unknown[];
+  image: {
+    small_url: string;
+    thumb_url: string;
+  };
+}
+
+interface StoryArc {
+  name?: string;
+}
+
+interface MatchItem {
+  _id?: string;
+  [key: string]: unknown;
+}
+
+interface ContentForSlidingPanel {
+  [key: string]: {
+    content: () => React.ReactNode;
+  };
+}
+
+const VolumeDetails = (_props: VolumeDetailsProps): ReactElement => {
   // sliding panel config
   const [visible, setVisible] = useState(false);
   const [slidingPanelContentId, setSlidingPanelContentId] = useState("");
-  const [matches, setMatches] = useState([]);
-  const [storyArcsData, setStoryArcsData] = useState([]);
+  const [matches, setMatches] = useState<MatchItem[]>([]);
+  const [storyArcsData, setStoryArcsData] = useState<StoryArc[]>([]);
   const [active, setActive] = useState(1);
 
   // sliding panel init
-  const contentForSlidingPanel = {
+  const contentForSlidingPanel: ContentForSlidingPanel = {
     potentialMatchesInLibrary: {
       content: () => {
         const ids = map(matches, partialRight(pick, "_id"));
-        const matchIds = ids.map((id: any) => id._id);
-        {
-          /* return <PotentialLibraryMatches matches={matchIds} />; */
-        }
+        const matchIds = ids.map((id: MatchItem) => id._id).filter((id): id is string => !!id);
+        return <PotentialLibraryMatches matches={matchIds} />;
       },
     },
   };
 
   // sliding panel handlers
-  const openPotentialLibraryMatchesPanel = useCallback((potentialMatches) => {
+  const openPotentialLibraryMatchesPanel = useCallback((potentialMatches: MatchItem[]) => {
     setSlidingPanelContentId("potentialMatchesInLibrary");
     setMatches(potentialMatches);
     setVisible(true);
   }, []);
 
-  // const analyzeIssues = useCallback((issues) => {
-  //   dispatch(analyzeLibrary(issues));
-  // }, []);
-  //
+  // Function to analyze issues (commented out but typed for future use)
+  const analyzeIssues = useCallback((issues: IssueData[]) => {
+    // dispatch(analyzeLibrary(issues));
+    console.log("Analyzing issues:", issues);
+  }, []);
 
   const { comicObjectId } = useParams<{ comicObjectId: string }>();
 
@@ -83,7 +131,7 @@ const VolumeDetails = (props): ReactElement => {
   // get story arcs
   const useGetStoryArcs = () => {
     return useMutation({
-      mutationFn: async (comicObject) =>
+      mutationFn: async (comicObject: ComicObjectData) =>
         axios({
           url: `${COMICVINE_SERVICE_URI}/getResource`,
           method: "POST",
@@ -93,7 +141,7 @@ const VolumeDetails = (props): ReactElement => {
             filter: `id:${comicObject?.sourcedMetadata.comicvine.id}`,
           },
         }),
-      onSuccess: (data) => {
+      onSuccess: (data: { data: { results: StoryArc[] } }) => {
         setStoryArcsData(data?.data.results);
       },
     });
@@ -111,13 +159,13 @@ const VolumeDetails = (props): ReactElement => {
   const IssuesInVolume = () => (
     <>
       {!isUndefined(issuesForSeries) ? (
-        <div className="button" onClick={() => analyzeIssues(issuesForSeries)}>
+        <div className="button" onClick={() => analyzeIssues(issuesForSeries?.data || [])}>
           Analyze Library
         </div>
       ) : null}
       <>
         {isSuccess &&
-          issuesForSeries.data.map((issue) => {
+          issuesForSeries.data.map((issue: IssueData) => {
             return (
               <>
                 <Card
@@ -129,6 +177,7 @@ const VolumeDetails = (props): ReactElement => {
                   <span className="tag is-warning mr-1">
                     {issue.issue_number}
                   </span>
+                  {/* TODO: Switch to Solar icon */}
                   {!isEmpty(issue.matches) ? (
                     <>
                       <span className="icon has-text-success">
@@ -156,7 +205,7 @@ const VolumeDetails = (props): ReactElement => {
       </article>
       <div className="flex flex-wrap">
         {isSuccess &&
-          issuesForSeries?.data.map((issue) => {
+          issuesForSeries?.data.map((issue: IssueData) => {
             return (
               <div className="my-3 dark:bg-slate-400 bg-slate-300 p-4 rounded-lg w-3/4">
                 <div className="flex flex-row gap-4 mb-2">
@@ -169,11 +218,11 @@ const VolumeDetails = (props): ReactElement => {
                   <div className="w-3/4">
                     <p className="text-xl">{issue.name}</p>
                     <p className="text-sm">
-                      {convert(issue.description, {
+                      {issue.description ? convert(issue.description, {
                         baseElements: {
                           selectors: ["p"],
                         },
-                      })}
+                      }) : ''}
                     </p>
                   </div>
                 </div>
@@ -215,9 +264,9 @@ const VolumeDetails = (props): ReactElement => {
       {!isEmpty(storyArcsData) && status === "success" && (
         <>
           <ul>
-            {storyArcsData.map((storyArc) => {
+            {storyArcsData.map((storyArc: StoryArc, idx: number) => {
               return (
-                <li>
+                <li key={idx}>
                   <span className="text-lg">{storyArc?.name}</span>
                 </li>
               );
@@ -354,7 +403,7 @@ const VolumeDetails = (props): ReactElement => {
         width={"600px"}
       >
         {slidingPanelContentId !== "" &&
-          contentForSlidingPanel[slidingPanelContentId].content()}
+          (contentForSlidingPanel as ContentForSlidingPanel)[slidingPanelContentId]?.content()}
       </SlidingPane>
     </div>
   </>
```
```diff
@@ -1,5 +1,4 @@
-import React, { ReactElement, useEffect, useMemo } from "react";
-import { searchIssue } from "../../actions/fileops.actions";
+import React, { ReactElement, useMemo } from "react";
 import Card from "../shared/Carda";
 import T2Table from "../shared/T2Table";
 import ellipsize from "ellipsize";
@@ -8,8 +7,45 @@ import { Link } from "react-router-dom";
 import { useQuery } from "@tanstack/react-query";
 import axios from "axios";
 import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
+import { CellContext, ColumnDef } from "@tanstack/react-table";
 
-export const Volumes = (props): ReactElement => {
+interface VolumesProps {
+  [key: string]: unknown;
+}
+
+interface VolumeSourceData {
+  _id: string;
+  _source: {
+    sourcedMetadata: {
+      comicvine: {
+        volumeInformation: {
+          name: string;
+          description?: string;
+          image: {
+            small_url: string;
+          };
+          publisher: {
+            name: string;
+          };
+          count_of_issues: number;
+        };
+      };
+    };
+    acquisition?: {
+      directconnect?: unknown[];
+    };
+  };
+}
+
+interface VolumeInformation {
+  name: string;
+  publisher: {
+    name: string;
+  };
+  count_of_issues?: number;
+}
+
+export const Volumes = (_props: VolumesProps): ReactElement => {
   // const volumes = useSelector((state: RootState) => state.fileOps.volumes);
   const {
     data: volumes,
@@ -34,17 +70,18 @@ export const Volumes = (props): ReactElement => {
     queryKey: ["volumes"],
   });
   const columnData = useMemo(
-    (): any => [
+    (): ColumnDef<VolumeSourceData, unknown>[] => [
      {
        header: "Volume Details",
        id: "volumeDetails",
-        minWidth: 450,
-        accessorFn: (row) => row,
-        cell: (row): any => {
-          const comicObject = row.getValue();
+        size: 450,
+        accessorFn: (row: VolumeSourceData) => row,
+        cell: (info: CellContext<VolumeSourceData, VolumeSourceData>) => {
+          const comicObject = info.getValue();
          const {
            _source: { sourcedMetadata },
          } = comicObject;
+          const description = sourcedMetadata.comicvine.volumeInformation.description || '';
          return (
            <div className="flex flex-row gap-3 mt-5">
              <Link to={`/volume/details/${comicObject._id}`}>
@@ -61,9 +98,9 @@ export const Volumes = (props): ReactElement => {
                {sourcedMetadata.comicvine.volumeInformation.name}
              </div>
              <p>
-                {ellipsize(
+                {description ? ellipsize(
                  convert(
-                    sourcedMetadata.comicvine.volumeInformation.description,
+                    description,
                    {
                      baseElements: {
                        selectors: ["p"],
@@ -71,7 +108,7 @@ export const Volumes = (props): ReactElement => {
                    },
                  ),
                  180,
-                )}
+                ) : ''}
              </p>
            </div>
          </div>
@@ -84,9 +121,8 @@ export const Volumes = (props): ReactElement => {
      {
        header: "Downloads",
```
|
||||
accessorKey: "_source.acquisition.directconnect",
|
||||
align: "right",
|
||||
cell: (props) => {
|
||||
const row = props.getValue();
|
||||
cell: (props: CellContext<VolumeSourceData, unknown[] | undefined>) => {
|
||||
const row = props.getValue() || [];
|
||||
return (
|
||||
<div
|
||||
style={{
|
||||
@@ -105,16 +141,16 @@ export const Volumes = (props): ReactElement => {
|
||||
{
|
||||
header: "Publisher",
|
||||
accessorKey: "_source.sourcedMetadata.comicvine.volumeInformation",
|
||||
cell: (props): any => {
|
||||
cell: (props: CellContext<VolumeSourceData, VolumeInformation>) => {
|
||||
const row = props.getValue();
|
||||
return <div className="mt-5 text-md">{row.publisher.name}</div>;
|
||||
return <div className="mt-5 text-md">{row?.publisher?.name}</div>;
|
||||
},
|
||||
},
|
||||
{
|
||||
header: "Issue Count",
|
||||
accessorKey:
|
||||
"_source.sourcedMetadata.comicvine.volumeInformation.count_of_issues",
|
||||
cell: (props): any => {
|
||||
cell: (props: CellContext<VolumeSourceData, number>) => {
|
||||
const row = props.getValue();
|
||||
return (
|
||||
<div className="mt-5">
|
||||
|
||||
@@ -1,12 +1,39 @@
|
||||
import React, { ReactElement, useCallback, useEffect, useMemo } from "react";
|
||||
import SearchBar from "../Library/SearchBar";
|
||||
import React, { ReactElement } from "react";
|
||||
import T2Table from "../shared/T2Table";
|
||||
import MetadataPanel from "../shared/MetadataPanel";
|
||||
import { useQuery } from "@tanstack/react-query";
|
||||
import axios from "axios";
|
||||
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
|
||||
import { CellContext } from "@tanstack/react-table";
|
||||
|
||||
export const WantedComics = (props): ReactElement => {
|
||||
interface WantedComicsProps {
|
||||
[key: string]: unknown;
|
||||
}
|
||||
|
||||
interface WantedSourceData {
|
||||
_id: string;
|
||||
_source: {
|
||||
acquisition?: {
|
||||
directconnect?: {
|
||||
downloads: DownloadItem[];
|
||||
};
|
||||
};
|
||||
[key: string]: unknown;
|
||||
};
|
||||
}
|
||||
|
||||
interface DownloadItem {
|
||||
name: string;
|
||||
[key: string]: unknown;
|
||||
}
|
||||
|
||||
interface AcquisitionData {
|
||||
directconnect?: {
|
||||
downloads: DownloadItem[];
|
||||
};
|
||||
}
|
||||
|
||||
export const WantedComics = (_props: WantedComicsProps): ReactElement => {
|
||||
const {
|
||||
data: wantedComics,
|
||||
isSuccess,
|
||||
@@ -39,9 +66,9 @@ export const WantedComics = (props): ReactElement => {
|
||||
{
|
||||
header: "Details",
|
||||
id: "comicDetails",
|
||||
minWidth: 350,
|
||||
accessorFn: (data) => data,
|
||||
cell: (value) => {
|
||||
size: 350,
|
||||
accessorFn: (data: WantedSourceData) => data,
|
||||
cell: (value: CellContext<WantedSourceData, WantedSourceData>) => {
|
||||
const row = value.getValue()._source;
|
||||
return row && <MetadataPanel data={row} />;
|
||||
},
|
||||
@@ -53,17 +80,14 @@ export const WantedComics = (props): ReactElement => {
|
||||
columns: [
|
||||
{
|
||||
header: "Files",
|
||||
align: "right",
|
||||
accessorKey: "_source.acquisition",
|
||||
cell: (props) => {
|
||||
const {
|
||||
directconnect: { downloads },
|
||||
} = props.getValue();
|
||||
cell: (props: CellContext<WantedSourceData, AcquisitionData | undefined>) => {
|
||||
const acquisition = props.getValue();
|
||||
const downloads = acquisition?.directconnect?.downloads || [];
|
||||
return (
|
||||
<div
|
||||
style={{
|
||||
display: "flex",
|
||||
// flexDirection: "column",
|
||||
justifyContent: "center",
|
||||
}}
|
||||
>
|
||||
@@ -78,17 +102,21 @@ export const WantedComics = (props): ReactElement => {
|
||||
header: "Download Details",
|
||||
id: "downloadDetails",
|
||||
accessorKey: "_source.acquisition",
|
||||
cell: (data) => (
|
||||
<ol>
|
||||
{data.getValue().directconnect.downloads.map((download, idx) => {
|
||||
return (
|
||||
<li className="is-size-7" key={idx}>
|
||||
{download.name}
|
||||
</li>
|
||||
);
|
||||
})}
|
||||
</ol>
|
||||
),
|
||||
cell: (data: CellContext<WantedSourceData, AcquisitionData | undefined>) => {
|
||||
const acquisition = data.getValue();
|
||||
const downloads = acquisition?.directconnect?.downloads || [];
|
||||
return (
|
||||
<ol>
|
||||
{downloads.map((download: DownloadItem, idx: number) => {
|
||||
return (
|
||||
<li className="is-size-7" key={idx}>
|
||||
{download.name}
|
||||
</li>
|
||||
);
|
||||
})}
|
||||
</ol>
|
||||
);
|
||||
},
|
||||
},
|
||||
{
|
||||
header: "Type",
|
||||
|
||||
Some files were not shown because too many files have changed in this diff.