Compare commits
8 commits: dep-hell ... qbittorren

| SHA1 |
|---|
| c6f719e78b |
| b4d1b678b1 |
| aa3192bc1a |
| 56ddfbd16e |
| 173735da45 |
| f4408cd493 |
| 41a9428729 |
| e1da6ddcef |
@@ -1,163 +0,0 @@
<p align="center">
<img src="https://em-content.zobj.net/source/apple/391/rock_1faa8.png" width="80" />
</p>

<h1 align="center">caveman-compress</h1>

<p align="center">
<strong>shrink memory file. save token every session.</strong>
</p>

---

A Claude Code skill that compresses your project memory files (`CLAUDE.md`, todos, preferences) into caveman format — so every session loads fewer tokens automatically.

Claude read `CLAUDE.md` on every session start. If file big, cost big. Caveman make file small. Cost go down forever.

## What It Do

```
/caveman:compress CLAUDE.md
```

```
CLAUDE.md           ← compressed (Claude reads this — fewer tokens every session)
CLAUDE.original.md  ← human-readable backup (you edit this)
```

Original never lost. You can read and edit `.original.md`. Run skill again to re-compress after edits.

## Benchmarks

Real results on real project files:

| File | Original (tokens) | Compressed (tokens) | Saved |
|------|------------------:|--------------------:|------:|
| `claude-md-preferences.md` | 706 | 285 | **59.6%** |
| `project-notes.md` | 1145 | 535 | **53.3%** |
| `claude-md-project.md` | 1122 | 636 | **43.3%** |
| `todo-list.md` | 627 | 388 | **38.1%** |
| `mixed-with-code.md` | 888 | 560 | **36.9%** |
| **Average** | **898** | **481** | **46%** |

All validations passed ✅ — headings, code blocks, URLs, file paths preserved exactly.
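The Saved column and the Average row follow directly from the per-file numbers; a quick sanity check:

```python
# Per-file (original_tokens, compressed_tokens) from the benchmark table above.
rows = {
    "claude-md-preferences.md": (706, 285),
    "project-notes.md": (1145, 535),
    "claude-md-project.md": (1122, 636),
    "todo-list.md": (627, 388),
    "mixed-with-code.md": (888, 560),
}

for name, (orig, comp) in rows.items():
    saved = 100 * (orig - comp) / orig
    print(f"{name}: {saved:.1f}% saved")

avg_orig = sum(o for o, _ in rows.values()) / len(rows)
avg_comp = sum(c for _, c in rows.values()) / len(rows)
print(f"average: {avg_orig:.0f} -> {avg_comp:.0f} "
      f"({100 * (avg_orig - avg_comp) / avg_orig:.0f}% saved)")
```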
## Before / After

<table>
<tr>
<td width="50%">

### 📄 Original (706 tokens)

> "I strongly prefer TypeScript with strict mode enabled for all new code. Please don't use `any` type unless there's genuinely no way around it, and if you do, leave a comment explaining the reasoning. I find that taking the time to properly type things catches a lot of bugs before they ever make it to runtime."

</td>
<td width="50%">

### 🪨 Caveman (285 tokens)

> "Prefer TypeScript strict mode always. No `any` unless unavoidable — comment why if used. Proper types catch bugs early."

</td>
</tr>
</table>

**Same instructions. 60% fewer tokens. Every. Single. Session.**

## Security

`caveman-compress` is flagged as Snyk High Risk due to subprocess and file I/O patterns detected by static analysis. This is a false positive — see [SECURITY.md](./SECURITY.md) for a full explanation of what the skill does and does not do.

## Install

Compression ships with the `caveman` plugin. Install `caveman` once, then use `/caveman:compress`.

If you need the local files, the compress skill lives at:

```bash
caveman-compress/
```

**Requires:** Python 3.10+

## Usage

```
/caveman:compress <filepath>
```

Examples:
```
/caveman:compress CLAUDE.md
/caveman:compress docs/preferences.md
/caveman:compress todos.md
```

### What files work

| Type | Compress? |
|------|-----------|
| `.md`, `.txt`, `.rst` | ✅ Yes |
| Extensionless natural language | ✅ Yes |
| `.py`, `.js`, `.ts`, `.json`, `.yaml` | ❌ Skip (code/config) |
| `*.original.md` | ❌ Skip (backup files) |

## How It Work

```
/caveman:compress CLAUDE.md
        ↓
detect file type (no tokens)
        ↓
Claude compresses (tokens — one call)
        ↓
validate output (no tokens)
  checks: headings, code blocks, URLs, file paths, bullets
        ↓
if errors: Claude fixes cherry-picked issues only (tokens — targeted fix)
  does NOT recompress — only patches broken parts
        ↓
retry up to 2 times
        ↓
write compressed → CLAUDE.md
write original  → CLAUDE.original.md
```

Only two things use tokens: the initial compression, plus a targeted fix if validation fails. Everything else is local Python.

## What Is Preserved

Caveman compress natural language. It never touch:

- Code blocks (` ``` ` fenced or indented)
- Inline code (`` `backtick content` ``)
- URLs and links
- File paths (`/src/components/...`)
- Commands (`npm install`, `git commit`)
- Technical terms, library names, API names
- Headings (exact text preserved)
- Tables (structure preserved, cell text compressed)
- Dates, version numbers, numeric values

## Why This Matter

`CLAUDE.md` loads on **every session start**. A 1000-token project memory file costs tokens every single time you open a project. Over 100 sessions that's 100,000 tokens of overhead — just for context you already wrote.

Caveman cut that by ~46% on average. Same instructions. Same accuracy. Less waste.
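The overhead arithmetic, as a one-screen sketch (the 46% rate is the benchmark average; the 1000-token file and 100 sessions are the illustrative numbers used above):

```python
file_tokens = 1000    # tokens loaded at every session start
sessions = 100
savings_rate = 0.46   # average compression from the benchmarks

overhead = file_tokens * sessions        # cumulative cost, uncompressed
saved = int(overhead * savings_rate)     # cumulative tokens avoided

print(f"uncompressed overhead: {overhead} tokens")
print(f"saved after compression: {saved} tokens")
```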
```
┌────────────────────────────────────────────┐
│ TOKEN SAVINGS PER FILE  █████        46%   │
│ SESSIONS THAT BENEFIT   ██████████  100%   │
│ INFORMATION PRESERVED   ██████████  100%   │
│ SETUP TIME              █             1x   │
└────────────────────────────────────────────┘
```

## Part of Caveman

This skill is part of the [caveman](https://github.com/JuliusBrussee/caveman) toolkit — making Claude use fewer tokens without losing accuracy.

- **caveman** — make Claude *speak* like caveman (cuts response tokens ~65%)
- **caveman-compress** — make Claude *read* less (cuts context tokens ~46%)
@@ -1,31 +0,0 @@
# Security

## Snyk High Risk Rating

`caveman-compress` receives a Snyk High Risk rating due to static analysis heuristics. This document explains what the skill does and does not do.

### What triggers the rating

1. **subprocess usage**: The skill calls the `claude` CLI via `subprocess.run()` as a fallback when `ANTHROPIC_API_KEY` is not set. The subprocess call uses a fixed argument list — no shell interpolation occurs. User file content is passed via stdin, not as a shell argument.

2. **File read/write**: The skill reads the file the user explicitly points it at, compresses it, and writes the result back to the same path. A `.original.md` backup is saved alongside it. No files outside the user-specified path are read or written.
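The safe-invocation pattern described in point 1 — fixed argv, content via stdin, no shell — is easy to demonstrate. A minimal sketch, using `cat` as a stand-in for the `claude` CLI so it runs anywhere:

```python
import subprocess

prompt = "compress this; $(rm -rf /) stays inert"  # hostile-looking content is just data

# Fixed argument list with shell=False (the default): nothing passed via
# `input` is ever parsed by a shell, so injection through file content is
# impossible. The real skill runs ["claude", "--print"] the same way.
result = subprocess.run(
    ["cat"],
    input=prompt,
    text=True,
    capture_output=True,
    check=True,
)
print(result.stdout)  # content comes back verbatim, unexecuted
```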
### What the skill does NOT do

- Does not execute user file content as code
- Does not make network requests except to Anthropic's API (via SDK or CLI)
- Does not access files outside the path the user provides
- Does not use `shell=True` or string interpolation in subprocess calls
- Does not collect or transmit any data beyond the file being compressed

### Auth behavior

If `ANTHROPIC_API_KEY` is set, the skill uses the Anthropic Python SDK directly (no subprocess). If not set, it falls back to the `claude` CLI, which uses the user's existing Claude desktop authentication.

### File size limit

Files larger than 500KB are rejected before any API call is made.

### Reporting a vulnerability

If you believe you've found a genuine security issue, please open a GitHub issue with the label `security`.
@@ -1,111 +0,0 @@
---
name: caveman-compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. The compression scripts live in `caveman-compress/scripts/` (adjacent to this SKILL.md). If the path is not immediately available, search for `caveman-compress/scripts/__main__.py`.

2. Run:

       cd caveman-compress && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user

## Compression Rules

### Remove
- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"

### Preserve EXACTLY (never modify)
- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure
- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress
- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern
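For illustration only, the Remove rules can be approximated as plain regex substitutions. The real skill applies them with an LLM, not regex, and this sketch assumes prose with no inline code to protect:

```python
import re

FILLERS = r"\b(just|really|basically|actually|simply|essentially|generally)\b"
ARTICLES = r"\b(a|an|the)\b"

def caveman_ish(prose: str) -> str:
    """Very rough approximation of the Remove rules above."""
    out = re.sub(r"\bin order to\b", "to", prose, flags=re.I)   # redundant phrasing
    out = re.sub(r"\bmake sure to\b", "ensure", out, flags=re.I)
    out = re.sub(FILLERS, "", out, flags=re.I)                   # filler words
    out = re.sub(ARTICLES, "", out, flags=re.I)                  # articles
    return re.sub(r"\s{2,}", " ", out).strip()                   # collapse gaps

print(caveman_ish("You just really need to run the tests in order to catch a bug."))
# → You need to run tests to catch bug.
```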
CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code

## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
@@ -1,9 +0,0 @@
"""Caveman compress scripts.

This package provides tools to compress natural language markdown files
into caveman format to save input tokens.
"""

__all__ = ["cli", "compress", "detect", "validate"]

__version__ = "1.0.0"
@@ -1,3 +0,0 @@
from .cli import main

main()
@@ -1,78 +0,0 @@
#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken

    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
@@ -1,73 +0,0 @@
#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)
    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original:   {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
@@ -1,227 +0,0 @@
#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

from .detect import should_compress
from .validate import validate

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)

MAX_RETRIES = 2


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. The lack of an
    # override flag is intentional: the user must rename the file if the
    # heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️ Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
@@ -1,121 +0,0 @@
#!/usr/bin/env python3
"""Detect whether a file is natural language (compressible) or code/config (skip)."""

import json
import re
from pathlib import Path

# Extensions that are natural language and compressible
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}

# Extensions that are code/config and should be skipped
SKIP_EXTENSIONS = {
    ".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
    ".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
    ".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
    ".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
    ".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
}

# Patterns that indicate a line is code
CODE_PATTERNS = [
    re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
    re.compile(r"^\s*(def |class |function |async function |export )"),
    re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
    re.compile(r"^\s*[\}\]\);]+\s*$"),  # closing braces/brackets
    re.compile(r"^\s*@\w+"),  # decorators/annotations
    re.compile(r'^\s*"[^"]+"\s*:\s*'),  # JSON-like key-value
    re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"),  # assignment with literal
]


def _is_code_line(line: str) -> bool:
    """Check if a line looks like code."""
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (like CLAUDE, TODO) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]

        if _is_json_content(text[:10000]):
            return "config"
        if _is_yaml_content(lines):
            return "config"

        code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
        non_empty = sum(1 for l in lines if l.strip())
        if non_empty > 0 and code_lines / non_empty > 0.4:
            return "code"

        return "natural_language"

    return "unknown"


def should_compress(filepath: Path) -> bool:
    """Return True if the file is natural language and should be compressed."""
    if not filepath.is_file():
        return False
    # Skip backup files
    if filepath.name.endswith(".original.md"):
        return False
    return detect_file_type(filepath) == "natural_language"


if __name__ == "__main__":
    import sys

    if len(sys.argv) < 2:
        print("Usage: python detect.py <file1> [file2] ...")
        sys.exit(1)

    for path_str in sys.argv[1:]:
        p = Path(path_str).resolve()
        file_type = detect_file_type(p)
        compress = should_compress(p)
        print(f"  {p.name:30s} type={file_type:20s} compress={compress}")
@@ -1,189 +0,0 @@
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# Crude but effective path detection.
# Requires either a path prefix (./ ../ / or drive letter) or a slash/backslash within the match.
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: closing
    fence must use same char and be at least as long as opening). Supports
    nested fences (e.g. an outer 4-backtick block wrapping inner 3-backtick
    content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
@@ -1,59 +0,0 @@
---
name: caveman-help
description: >
  Quick-reference card for all caveman modes, skills, and commands.
  One-shot display, not a persistent mode. Trigger: /caveman-help,
  "caveman help", "what caveman commands", "how do I use caveman".
---

# Caveman Help

Display this reference card when invoked. One-shot — do NOT change mode, write flag files, or persist anything. Output in caveman style.

## Modes

| Mode | Trigger | What change |
|------|---------|-------------|
| **Lite** | `/caveman lite` | Drop filler. Keep sentence structure. |
| **Full** | `/caveman` | Drop articles, filler, pleasantries, hedging. Fragments OK. Default. |
| **Ultra** | `/caveman ultra` | Extreme compression. Bare fragments. Tables over prose. |
| **Wenyan-Lite** | `/caveman wenyan-lite` | Classical Chinese style, light compression. |
| **Wenyan-Full** | `/caveman wenyan` | Full 文言文. Maximum classical terseness. |
| **Wenyan-Ultra** | `/caveman wenyan-ultra` | Extreme. Ancient scholar on a budget. |

Mode stick until changed or session end.

## Skills

| Skill | Trigger | What it do |
|-------|---------|-----------|
| **caveman-commit** | `/caveman-commit` | Terse commit messages. Conventional Commits. ≤50 char subject. |
| **caveman-review** | `/caveman-review` | One-line PR comments: `L42: bug: user null. Add guard.` |
| **caveman-compress** | `/caveman:compress <file>` | Compress .md files to caveman prose. Saves ~46% input tokens. |
| **caveman-help** | `/caveman-help` | This card. |

## Deactivate

Say "stop caveman" or "normal mode". Resume anytime with `/caveman`.

## Configure Default Mode

Default mode = `full`. Change it:

**Environment variable** (highest priority):
```bash
export CAVEMAN_DEFAULT_MODE=ultra
```

**Config file** (`~/.config/caveman/config.json`):
```json
{ "defaultMode": "lite" }
```

Set `"off"` to disable auto-activation on session start. User can still activate manually with `/caveman`.

Resolution: env var > config file > `full`.
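Resolution order above can be sketched in Python. Minimal sketch — `resolve_default_mode` and the `VALID_MODES` set are hypothetical names, not part of the shipped scripts; assumes the config format shown above:

```python
import json
import os
from pathlib import Path

# Assumed set of accepted modes, including "off" to disable auto-activation.
VALID_MODES = {"lite", "full", "ultra", "wenyan-lite", "wenyan", "wenyan-ultra", "off"}


def resolve_default_mode(env=None, config_path=None):
    """Resolve default caveman mode: env var > config file > 'full'."""
    env = os.environ if env is None else env
    mode = env.get("CAVEMAN_DEFAULT_MODE")
    if mode in VALID_MODES:
        return mode
    config_path = Path(config_path or Path.home() / ".config" / "caveman" / "config.json")
    try:
        mode = json.loads(config_path.read_text()).get("defaultMode")
        if mode in VALID_MODES:
            return mode
    except (OSError, ValueError):
        pass  # missing or malformed config falls through to the default
    return "full"
```

Env var win even when config file set; absent or broken config fall back to `full`.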

## More

Full docs: https://github.com/JuliusBrussee/caveman
@@ -1,67 +0,0 @@
---
name: caveman
description: >
  Ultra-compressed communication mode. Cuts token usage ~75% by speaking like caveman
  while keeping full technical accuracy. Supports intensity levels: lite, full (default), ultra,
  wenyan-lite, wenyan-full, wenyan-ultra.
  Use when user says "caveman mode", "talk like caveman", "use caveman", "less tokens",
  "be brief", or invokes /caveman. Also auto-triggers when token efficiency is requested.
---

Respond terse like smart caveman. All technical substance stay. Only fluff die.

## Persistence

ACTIVE EVERY RESPONSE. No revert after many turns. No filler drift. Still active if unsure. Off only: "stop caveman" / "normal mode".

Default: **full**. Switch: `/caveman lite|full|ultra`.

## Rules

Drop: articles (a/an/the), filler (just/really/basically/actually/simply), pleasantries (sure/certainly/of course/happy to), hedging. Fragments OK. Short synonyms (big not extensive, fix not "implement a solution for"). Technical terms exact. Code blocks unchanged. Errors quoted exact.

Pattern: `[thing] [action] [reason]. [next step].`

Not: "Sure! I'd be happy to help you with that. The issue you're experiencing is likely caused by..."
Yes: "Bug in auth middleware. Token expiry check use `<` not `<=`. Fix:"

## Intensity

| Level | What change |
|-------|------------|
| **lite** | No filler/hedging. Keep articles + full sentences. Professional but tight |
| **full** | Drop articles, fragments OK, short synonyms. Classic caveman |
| **ultra** | Abbreviate (DB/auth/config/req/res/fn/impl), strip conjunctions, arrows for causality (X → Y), one word when one word enough |
| **wenyan-lite** | Semi-classical. Drop filler/hedging but keep grammar structure, classical register |
| **wenyan-full** | Maximum classical terseness. Fully 文言文. 80-90% character reduction. Classical sentence patterns, verbs precede objects, subjects often omitted, classical particles (之/乃/為/其) |
| **wenyan-ultra** | Extreme abbreviation while keeping classical Chinese feel. Maximum compression, ultra terse |

Example — "Why React component re-render?"
- lite: "Your component re-renders because you create a new object reference each render. Wrap it in `useMemo`."
- full: "New object ref each render. Inline object prop = new ref = re-render. Wrap in `useMemo`."
- ultra: "Inline obj prop → new ref → re-render. `useMemo`."
- wenyan-lite: "組件頻重繪,以每繪新生對象參照故。以 useMemo 包之。"
- wenyan-full: "物出新參照,致重繪。useMemo Wrap之。"
- wenyan-ultra: "新參照→重繪。useMemo Wrap。"

Example — "Explain database connection pooling."
- lite: "Connection pooling reuses open connections instead of creating new ones per request. Avoids repeated handshake overhead."
- full: "Pool reuse open DB connections. No new connection per request. Skip handshake overhead."
- ultra: "Pool = reuse DB conn. Skip handshake → fast under load."
- wenyan-full: "池reuse open connection。不每req新開。skip handshake overhead。"
- wenyan-ultra: "池reuse conn。skip handshake → fast。"

## Auto-Clarity

Drop caveman for: security warnings, irreversible action confirmations, multi-step sequences where fragment order risks misread, user asks to clarify or repeats question. Resume caveman after clear part done.

Example — destructive op:
> **Warning:** This will permanently delete all rows in the `users` table and cannot be undone.
> ```sql
> DROP TABLE users;
> ```
> Caveman resume. Verify backup exist first.

## Boundaries

Code/commits/PRs: write normal. "stop caveman" or "normal mode": revert. Level persist until changed or session end.
@@ -1,111 +0,0 @@
---
name: compress
description: >
  Compress natural language memory files (CLAUDE.md, todos, preferences) into caveman format
  to save input tokens. Preserves all technical substance, code, URLs, and structure.
  Compressed version overwrites the original file. Human-readable backup saved as FILE.original.md.
  Trigger: /caveman:compress <filepath> or "compress memory file"
---

# Caveman Compress

## Purpose

Compress natural language files (CLAUDE.md, todos, preferences) into caveman-speak to reduce input tokens. Compressed version overwrites original. Human-readable backup saved as `<filename>.original.md`.

## Trigger

`/caveman:compress <filepath>` or when user asks to compress a memory file.

## Process

1. This SKILL.md lives alongside `scripts/` in the same directory. Find that directory.

2. Run:

   cd <directory_containing_this_SKILL.md> && python3 -m scripts <absolute_filepath>

3. The CLI will:
   - detect file type (no tokens)
   - call Claude to compress
   - validate output (no tokens)
   - if errors: cherry-pick fix with Claude (targeted fixes only, no recompression)
   - retry up to 2 times
   - if still failing after 2 retries: report error to user, leave original file untouched

4. Return result to user
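Compress-validate-fix loop of step 3 can be sketched as pure control flow. Minimal sketch — `compress_with_retries` and its injected callables are hypothetical names for illustration; the real implementation lives in `scripts/compress.py`:

```python
def compress_with_retries(original, compress_fn, validate_fn, fix_fn, max_retries=2):
    """Compress once, then validate and cherry-pick fixes up to max_retries
    times. Return compressed text, or None if validation still fails
    (caller then leaves the original file untouched).

    validate_fn returns a list of error strings (empty list = valid).
    fix_fn applies targeted fixes only — no recompression.
    """
    compressed = compress_fn(original)
    for _ in range(max_retries):
        errors = validate_fn(original, compressed)
        if not errors:
            return compressed
        compressed = fix_fn(original, compressed, errors)
    # Final check after the last fix attempt.
    return compressed if not validate_fn(original, compressed) else None
```

Key property: fix step see only the error list, so untouched sections never get rewritten twice.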

## Compression Rules

### Remove
- Articles: a, an, the
- Filler: just, really, basically, actually, simply, essentially, generally
- Pleasantries: "sure", "certainly", "of course", "happy to", "I'd recommend"
- Hedging: "it might be worth", "you could consider", "it would be good to"
- Redundant phrasing: "in order to" → "to", "make sure to" → "ensure", "the reason is because" → "because"
- Connective fluff: "however", "furthermore", "additionally", "in addition"

### Preserve EXACTLY (never modify)
- Code blocks (fenced ``` and indented)
- Inline code (`backtick content`)
- URLs and links (full URLs, markdown links)
- File paths (`/src/components/...`, `./config.yaml`)
- Commands (`npm install`, `git commit`, `docker build`)
- Technical terms (library names, API names, protocols, algorithms)
- Proper nouns (project names, people, companies)
- Dates, version numbers, numeric values
- Environment variables (`$HOME`, `NODE_ENV`)

### Preserve Structure
- All markdown headings (keep exact heading text, compress body below)
- Bullet point hierarchy (keep nesting level)
- Numbered lists (keep numbering)
- Tables (compress cell text, keep structure)
- Frontmatter/YAML headers in markdown files

### Compress
- Use short synonyms: "big" not "extensive", "fix" not "implement a solution for", "use" not "utilize"
- Fragments OK: "Run tests before commit" not "You should always run tests before committing"
- Drop "you should", "make sure to", "remember to" — just state the action
- Merge redundant bullets that say the same thing differently
- Keep one example where multiple examples show the same pattern

CRITICAL RULE:
Anything inside ``` ... ``` must be copied EXACTLY.
Do not:
- remove comments
- remove spacing
- reorder lines
- shorten commands
- simplify anything

Inline code (`...`) must be preserved EXACTLY.
Do not modify anything inside backticks.

If file contains code blocks:
- Treat code blocks as read-only regions
- Only compress text outside them
- Do not merge sections around code

## Pattern

Original:
> You should always make sure to run the test suite before pushing any changes to the main branch. This is important because it helps catch bugs early and prevents broken builds from being deployed to production.

Compressed:
> Run tests before push to main. Catch bugs early, prevent broken prod deploys.

Original:
> The application uses a microservices architecture with the following components. The API gateway handles all incoming requests and routes them to the appropriate service. The authentication service is responsible for managing user sessions and JWT tokens.

Compressed:
> Microservices architecture. API gateway route all requests to services. Auth service manage user sessions + JWT tokens.

## Boundaries

- ONLY compress natural language files (.md, .txt, extensionless)
- NEVER modify: .py, .js, .ts, .json, .yaml, .yml, .toml, .env, .lock, .css, .html, .xml, .sql, .sh
- If file has mixed content (prose + code), compress ONLY the prose sections
- If unsure whether something is code or prose, leave it unchanged
- Original file is backed up as FILE.original.md before overwriting
- Never compress FILE.original.md (skip it)
@@ -1,9 +0,0 @@
"""Caveman compress scripts.

This package provides tools to compress natural language markdown files
into caveman format to save input tokens.
"""

__all__ = ["cli", "compress", "detect", "validate"]

__version__ = "1.0.0"
@@ -1,3 +0,0 @@
from .cli import main

main()
@@ -1,78 +0,0 @@
#!/usr/bin/env python3
from pathlib import Path
import sys

# Support both direct execution and module import
try:
    from .validate import validate
except ImportError:
    sys.path.insert(0, str(Path(__file__).parent))
    from validate import validate

try:
    import tiktoken

    _enc = tiktoken.get_encoding("o200k_base")
except ImportError:
    _enc = None


def count_tokens(text):
    if _enc is None:
        return len(text.split())  # fallback: word count
    return len(_enc.encode(text))


def benchmark_pair(orig_path: Path, comp_path: Path):
    orig_text = orig_path.read_text()
    comp_text = comp_path.read_text()

    orig_tokens = count_tokens(orig_text)
    comp_tokens = count_tokens(comp_text)
    saved = 100 * (orig_tokens - comp_tokens) / orig_tokens if orig_tokens > 0 else 0.0
    result = validate(orig_path, comp_path)

    return (comp_path.name, orig_tokens, comp_tokens, saved, result.is_valid)


def print_table(rows):
    print("\n| File | Original | Compressed | Saved % | Valid |")
    print("|------|----------|------------|---------|-------|")
    for r in rows:
        print(f"| {r[0]} | {r[1]} | {r[2]} | {r[3]:.1f}% | {'✅' if r[4] else '❌'} |")


def main():
    # Direct file pair: python3 benchmark.py original.md compressed.md
    if len(sys.argv) == 3:
        orig = Path(sys.argv[1]).resolve()
        comp = Path(sys.argv[2]).resolve()
        if not orig.exists():
            print(f"❌ Not found: {orig}")
            sys.exit(1)
        if not comp.exists():
            print(f"❌ Not found: {comp}")
            sys.exit(1)
        print_table([benchmark_pair(orig, comp)])
        return

    # Glob mode: repo_root/tests/caveman-compress/
    tests_dir = Path(__file__).parent.parent.parent / "tests" / "caveman-compress"
    if not tests_dir.exists():
        print(f"❌ Tests dir not found: {tests_dir}")
        sys.exit(1)

    rows = []
    for orig in sorted(tests_dir.glob("*.original.md")):
        comp = orig.with_name(orig.stem.removesuffix(".original") + ".md")
        if comp.exists():
            rows.append(benchmark_pair(orig, comp))

    if not rows:
        print("No compressed file pairs found.")
        return

    print_table(rows)


if __name__ == "__main__":
    main()
@@ -1,73 +0,0 @@
#!/usr/bin/env python3
"""
Caveman Compress CLI

Usage:
    caveman <filepath>
"""

import sys
from pathlib import Path

from .compress import compress_file
from .detect import detect_file_type, should_compress


def print_usage():
    print("Usage: caveman <filepath>")


def main():
    if len(sys.argv) != 2:
        print_usage()
        sys.exit(1)

    filepath = Path(sys.argv[1])

    # Check file exists
    if not filepath.exists():
        print(f"❌ File not found: {filepath}")
        sys.exit(1)

    if not filepath.is_file():
        print(f"❌ Not a file: {filepath}")
        sys.exit(1)

    filepath = filepath.resolve()

    # Detect file type
    file_type = detect_file_type(filepath)
    print(f"Detected: {file_type}")

    # Check if compressible
    if not should_compress(filepath):
        print("Skipping: file is not natural language (code/config)")
        sys.exit(0)

    print("Starting caveman compression...\n")

    try:
        success = compress_file(filepath)

        if success:
            print("\nCompression completed successfully")
            backup_path = filepath.with_name(filepath.stem + ".original.md")
            print(f"Compressed: {filepath}")
            print(f"Original: {backup_path}")
            sys.exit(0)
        else:
            print("\n❌ Compression failed after retries")
            sys.exit(2)

    except KeyboardInterrupt:
        print("\nInterrupted by user")
        sys.exit(130)

    except Exception as e:
        print(f"\n❌ Error: {e}")
        sys.exit(1)


if __name__ == "__main__":
    main()
@@ -1,227 +0,0 @@
#!/usr/bin/env python3
"""
Caveman Memory Compression Orchestrator

Usage:
    python scripts/compress.py <filepath>
"""

import os
import re
import subprocess
from pathlib import Path
from typing import List

from .detect import should_compress
from .validate import validate

OUTER_FENCE_REGEX = re.compile(
    r"\A\s*(`{3,}|~{3,})[^\n]*\n(.*)\n\1\s*\Z", re.DOTALL
)

# Filenames and paths that almost certainly hold secrets or PII. Compressing
# them ships raw bytes to the Anthropic API — a third-party data boundary that
# developers on sensitive codebases cannot cross. detect.py already skips .env
# by extension, but credentials.md / secrets.txt / ~/.aws/credentials would
# slip through the natural-language filter. This is a hard refuse before read.
SENSITIVE_BASENAME_REGEX = re.compile(
    r"(?ix)^("
    r"\.env(\..+)?"
    r"|\.netrc"
    r"|credentials(\..+)?"
    r"|secrets?(\..+)?"
    r"|passwords?(\..+)?"
    r"|id_(rsa|dsa|ecdsa|ed25519)(\.pub)?"
    r"|authorized_keys"
    r"|known_hosts"
    r"|.*\.(pem|key|p12|pfx|crt|cer|jks|keystore|asc|gpg)"
    r")$"
)

SENSITIVE_PATH_COMPONENTS = frozenset({".ssh", ".aws", ".gnupg", ".kube", ".docker"})

SENSITIVE_NAME_TOKENS = (
    "secret", "credential", "password", "passwd",
    "apikey", "accesskey", "token", "privatekey",
)

MAX_RETRIES = 2


def is_sensitive_path(filepath: Path) -> bool:
    """Heuristic denylist for files that must never be shipped to a third-party API."""
    name = filepath.name
    if SENSITIVE_BASENAME_REGEX.match(name):
        return True
    lowered_parts = {p.lower() for p in filepath.parts}
    if lowered_parts & SENSITIVE_PATH_COMPONENTS:
        return True
    # Normalize separators so "api-key" and "api_key" both match "apikey".
    lower = re.sub(r"[_\-\s.]", "", name.lower())
    return any(tok in lower for tok in SENSITIVE_NAME_TOKENS)


def strip_llm_wrapper(text: str) -> str:
    """Strip outer ```markdown ... ``` fence when it wraps the entire output."""
    m = OUTER_FENCE_REGEX.match(text)
    if m:
        return m.group(2)
    return text


# ---------- Claude Calls ----------


def call_claude(prompt: str) -> str:
    api_key = os.environ.get("ANTHROPIC_API_KEY")
    if api_key:
        try:
            import anthropic

            client = anthropic.Anthropic(api_key=api_key)
            msg = client.messages.create(
                model=os.environ.get("CAVEMAN_MODEL", "claude-sonnet-4-5"),
                max_tokens=8192,
                messages=[{"role": "user", "content": prompt}],
            )
            return strip_llm_wrapper(msg.content[0].text.strip())
        except ImportError:
            pass  # anthropic not installed, fall back to CLI
    # Fallback: use claude CLI (handles desktop auth)
    try:
        result = subprocess.run(
            ["claude", "--print"],
            input=prompt,
            text=True,
            capture_output=True,
            check=True,
        )
        return strip_llm_wrapper(result.stdout.strip())
    except subprocess.CalledProcessError as e:
        raise RuntimeError(f"Claude call failed:\n{e.stderr}")


def build_compress_prompt(original: str) -> str:
    return f"""
Compress this markdown into caveman format.

STRICT RULES:
- Do NOT modify anything inside ``` code blocks
- Do NOT modify anything inside inline backticks
- Preserve ALL URLs exactly
- Preserve ALL headings exactly
- Preserve file paths and commands
- Return ONLY the compressed markdown body — do NOT wrap the entire output in a ```markdown fence or any other fence. Inner code blocks from the original stay as-is; do not add a new outer fence around the whole file.

Only compress natural language.

TEXT:
{original}
"""


def build_fix_prompt(original: str, compressed: str, errors: List[str]) -> str:
    errors_str = "\n".join(f"- {e}" for e in errors)
    return f"""You are fixing a caveman-compressed markdown file. Specific validation errors were found.

CRITICAL RULES:
- DO NOT recompress or rephrase the file
- ONLY fix the listed errors — leave everything else exactly as-is
- The ORIGINAL is provided as reference only (to restore missing content)
- Preserve caveman style in all untouched sections

ERRORS TO FIX:
{errors_str}

HOW TO FIX:
- Missing URL: find it in ORIGINAL, restore it exactly where it belongs in COMPRESSED
- Code block mismatch: find the exact code block in ORIGINAL, restore it in COMPRESSED
- Heading mismatch: restore the exact heading text from ORIGINAL into COMPRESSED
- Do not touch any section not mentioned in the errors

ORIGINAL (reference only):
{original}

COMPRESSED (fix this):
{compressed}

Return ONLY the fixed compressed file. No explanation.
"""


# ---------- Core Logic ----------


def compress_file(filepath: Path) -> bool:
    # Resolve and validate path
    filepath = filepath.resolve()
    MAX_FILE_SIZE = 500_000  # 500KB
    if not filepath.exists():
        raise FileNotFoundError(f"File not found: {filepath}")
    if filepath.stat().st_size > MAX_FILE_SIZE:
        raise ValueError(f"File too large to compress safely (max 500KB): {filepath}")

    # Refuse files that look like they contain secrets or PII. Compressing ships
    # the raw bytes to the Anthropic API — a third-party boundary — so we fail
    # loudly rather than silently exfiltrate credentials or keys. The lack of an
    # override flag is intentional: the user must rename the file if the
    # heuristic is wrong.
    if is_sensitive_path(filepath):
        raise ValueError(
            f"Refusing to compress {filepath}: filename looks sensitive "
            "(credentials, keys, secrets, or known private paths). "
            "Compression sends file contents to the Anthropic API. "
            "Rename the file if this is a false positive."
        )

    print(f"Processing: {filepath}")

    if not should_compress(filepath):
        print("Skipping (not natural language)")
        return False

    original_text = filepath.read_text(errors="ignore")
    backup_path = filepath.with_name(filepath.stem + ".original.md")

    # Check if backup already exists to prevent accidental overwriting
    if backup_path.exists():
        print(f"⚠️ Backup file already exists: {backup_path}")
        print("The original backup may contain important content.")
        print("Aborting to prevent data loss. Please remove or rename the backup file if you want to proceed.")
        return False

    # Step 1: Compress
    print("Compressing with Claude...")
    compressed = call_claude(build_compress_prompt(original_text))

    # Save original as backup, write compressed to original path
    backup_path.write_text(original_text)
    filepath.write_text(compressed)

    # Step 2: Validate + Retry
    for attempt in range(MAX_RETRIES):
        print(f"\nValidation attempt {attempt + 1}")

        result = validate(backup_path, filepath)

        if result.is_valid:
            print("Validation passed")
            break

        print("❌ Validation failed:")
        for err in result.errors:
            print(f"  - {err}")

        if attempt == MAX_RETRIES - 1:
            # Restore original on failure
            filepath.write_text(original_text)
            backup_path.unlink(missing_ok=True)
            print("❌ Failed after retries — original restored")
            return False

        print("Fixing with Claude...")
        compressed = call_claude(
            build_fix_prompt(original_text, compressed, result.errors)
        )
        filepath.write_text(compressed)

    return True
@@ -1,121 +0,0 @@
|
||||
#!/usr/bin/env python3
|
||||
"""Detect whether a file is natural language (compressible) or code/config (skip)."""
|
||||
|
||||
import json
|
||||
import re
|
||||
from pathlib import Path
|
||||
|
||||
# Extensions that are natural language and compressible
|
||||
COMPRESSIBLE_EXTENSIONS = {".md", ".txt", ".markdown", ".rst"}
|
||||
|
||||
# Extensions that are code/config and should be skipped
|
||||
SKIP_EXTENSIONS = {
|
||||
".py", ".js", ".ts", ".tsx", ".jsx", ".json", ".yaml", ".yml",
|
||||
".toml", ".env", ".lock", ".css", ".scss", ".html", ".xml",
|
||||
".sql", ".sh", ".bash", ".zsh", ".go", ".rs", ".java", ".c",
|
||||
".cpp", ".h", ".hpp", ".rb", ".php", ".swift", ".kt", ".lua",
|
||||
".dockerfile", ".makefile", ".csv", ".ini", ".cfg",
|
||||
}
|
||||
|
||||
# Patterns that indicate a line is code
|
||||
CODE_PATTERNS = [
|
||||
re.compile(r"^\s*(import |from .+ import |require\(|const |let |var )"),
|
||||
re.compile(r"^\s*(def |class |function |async function |export )"),
|
||||
re.compile(r"^\s*(if\s*\(|for\s*\(|while\s*\(|switch\s*\(|try\s*\{)"),
|
||||
re.compile(r"^\s*[\}\]\);]+\s*$"), # closing braces/brackets
|
||||
re.compile(r"^\s*@\w+"), # decorators/annotations
|
||||
re.compile(r'^\s*"[^"]+"\s*:\s*'), # JSON-like key-value
|
||||
re.compile(r"^\s*\w+\s*=\s*[{\[\(\"']"), # assignment with literal
|
||||
]
|
||||
|
||||
|
||||
def _is_code_line(line: str) -> bool:
|
||||
"""Check if a line looks like code."""
|
||||
    return any(p.match(line) for p in CODE_PATTERNS)


def _is_json_content(text: str) -> bool:
    """Check if content is valid JSON."""
    try:
        json.loads(text)
        return True
    except (json.JSONDecodeError, ValueError):
        return False


def _is_yaml_content(lines: list[str]) -> bool:
    """Heuristic: check if content looks like YAML."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    # If most non-empty lines look like YAML
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6


def detect_file_type(filepath: Path) -> str:
    """Classify a file as 'natural_language', 'code', 'config', or 'unknown'.

    Returns:
        One of: 'natural_language', 'code', 'config', 'unknown'
    """
    ext = filepath.suffix.lower()

    # Extension-based classification
    if ext in COMPRESSIBLE_EXTENSIONS:
        return "natural_language"
    if ext in SKIP_EXTENSIONS:
        return "code" if ext not in {".json", ".yaml", ".yml", ".toml", ".ini", ".cfg", ".env"} else "config"

    # Extensionless files (e.g. TODO) — check content
    if not ext:
        try:
            text = filepath.read_text(errors="ignore")
        except (OSError, PermissionError):
            return "unknown"

        lines = text.splitlines()[:50]

        if _is_json_content(text[:10000]):
            return "config"
        if _is_yaml_content(lines):
            return "config"

        code_lines = sum(1 for l in lines if l.strip() and _is_code_line(l))
        non_empty = sum(1 for l in lines if l.strip())
        if non_empty > 0 and code_lines / non_empty > 0.4:
            return "code"

        return "natural_language"

    return "unknown"


def should_compress(filepath: Path) -> bool:
    """Return True if the file is natural language and should be compressed."""
    if not filepath.is_file():
        return False
    # Skip backup files
    if filepath.name.endswith(".original.md"):
        return False
    return detect_file_type(filepath) == "natural_language"


if __name__ == "__main__":
    import sys

    if len(sys.argv) < 2:
        print("Usage: python detect.py <file1> [file2] ...")
        sys.exit(1)

    for path_str in sys.argv[1:]:
        p = Path(path_str).resolve()
        file_type = detect_file_type(p)
        compress = should_compress(p)
        print(f" {p.name:30s} type={file_type:20s} compress={compress}")
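The YAML heuristic above (count how many of the first 30 lines look like YAML, compress only if under 60%) can be exercised on its own. A minimal sketch, re-declaring the function since the module's constants are not shown in this hunk:

```python
import re

def is_yaml_content(lines):
    """Standalone port of the _is_yaml_content heuristic above."""
    yaml_indicators = 0
    for line in lines[:30]:
        stripped = line.strip()
        if stripped.startswith("---"):
            yaml_indicators += 1
        elif re.match(r"^\w[\w\s]*:\s", stripped):
            yaml_indicators += 1
        elif stripped.startswith("- ") and ":" in stripped:
            yaml_indicators += 1
    non_empty = sum(1 for l in lines[:30] if l.strip())
    return non_empty > 0 and yaml_indicators / non_empty > 0.6

yaml_doc = ["---", "name: demo", "items:", "- id: 1", "- id: 2"]
prose = ["This is a paragraph.", "", "Another paragraph of prose."]
print(is_yaml_content(yaml_doc), is_yaml_content(prose))
```

Note that a bare key line like `items:` (no trailing space) does not count as an indicator, so the 0.6 threshold is what keeps short prose with an occasional colon from being misclassified as config.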
@@ -1,189 +0,0 @@
#!/usr/bin/env python3
import re
from pathlib import Path

URL_REGEX = re.compile(r"https?://[^\s)]+")
FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")
HEADING_REGEX = re.compile(r"^(#{1,6})\s+(.*)", re.MULTILINE)
BULLET_REGEX = re.compile(r"^\s*[-*+]\s+", re.MULTILINE)

# crude but effective path detection
# Requires either a path prefix (./ ../ / or drive letter) or a slash/backslash within the match
PATH_REGEX = re.compile(r"(?:\./|\.\./|/|[A-Za-z]:\\)[\w\-/\\\.]+|[\w\-\.]+[/\\][\w\-/\\\.]+")


class ValidationResult:
    def __init__(self):
        self.is_valid = True
        self.errors = []
        self.warnings = []

    def add_error(self, msg):
        self.is_valid = False
        self.errors.append(msg)

    def add_warning(self, msg):
        self.warnings.append(msg)


def read_file(path: Path) -> str:
    return path.read_text(errors="ignore")


# ---------- Extractors ----------


def extract_headings(text):
    return [(level, title.strip()) for level, title in HEADING_REGEX.findall(text)]


def extract_code_blocks(text):
    """Line-based fenced code block extractor.

    Handles ``` and ~~~ fences with variable length (CommonMark: closing
    fence must use same char and be at least as long as opening). Supports
    nested fences (e.g. an outer 4-backtick block wrapping inner 3-backtick
    content).
    """
    blocks = []
    lines = text.split("\n")
    i = 0
    n = len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char = m.group(2)[0]
        fence_len = len(m.group(2))
        open_line = lines[i]
        block_lines = [open_line]
        i += 1
        closed = False
        while i < n:
            close_m = FENCE_OPEN_REGEX.match(lines[i])
            if (
                close_m
                and close_m.group(2)[0] == fence_char
                and len(close_m.group(2)) >= fence_len
                and close_m.group(3).strip() == ""
            ):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
        # Unclosed fences are silently skipped — they indicate malformed markdown
        # and including them would cause false-positive validation failures.
    return blocks


def extract_urls(text):
    return set(URL_REGEX.findall(text))


def extract_paths(text):
    return set(PATH_REGEX.findall(text))


def count_bullets(text):
    return len(BULLET_REGEX.findall(text))


# ---------- Validators ----------


def validate_headings(orig, comp, result):
    h1 = extract_headings(orig)
    h2 = extract_headings(comp)

    if len(h1) != len(h2):
        result.add_error(f"Heading count mismatch: {len(h1)} vs {len(h2)}")

    if h1 != h2:
        result.add_warning("Heading text/order changed")


def validate_code_blocks(orig, comp, result):
    c1 = extract_code_blocks(orig)
    c2 = extract_code_blocks(comp)

    if c1 != c2:
        result.add_error("Code blocks not preserved exactly")


def validate_urls(orig, comp, result):
    u1 = extract_urls(orig)
    u2 = extract_urls(comp)

    if u1 != u2:
        result.add_error(f"URL mismatch: lost={u1 - u2}, added={u2 - u1}")


def validate_paths(orig, comp, result):
    p1 = extract_paths(orig)
    p2 = extract_paths(comp)

    if p1 != p2:
        result.add_warning(f"Path mismatch: lost={p1 - p2}, added={p2 - p1}")


def validate_bullets(orig, comp, result):
    b1 = count_bullets(orig)
    b2 = count_bullets(comp)

    if b1 == 0:
        return

    diff = abs(b1 - b2) / b1

    if diff > 0.15:
        result.add_warning(f"Bullet count changed too much: {b1} -> {b2}")


# ---------- Main ----------


def validate(original_path: Path, compressed_path: Path) -> ValidationResult:
    result = ValidationResult()

    orig = read_file(original_path)
    comp = read_file(compressed_path)

    validate_headings(orig, comp, result)
    validate_code_blocks(orig, comp, result)
    validate_urls(orig, comp, result)
    validate_paths(orig, comp, result)
    validate_bullets(orig, comp, result)

    return result


# ---------- CLI ----------

if __name__ == "__main__":
    import sys

    if len(sys.argv) != 3:
        print("Usage: python validate.py <original> <compressed>")
        sys.exit(1)

    orig = Path(sys.argv[1]).resolve()
    comp = Path(sys.argv[2]).resolve()

    res = validate(orig, comp)

    print(f"\nValid: {res.is_valid}")

    if res.errors:
        print("\nErrors:")
        for e in res.errors:
            print(f"  - {e}")

    if res.warnings:
        print("\nWarnings:")
        for w in res.warnings:
            print(f"  - {w}")
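The docstring of `extract_code_blocks` promises nested-fence handling (a 4-backtick fence wrapping a 3-backtick one). A quick standalone check of that claim, re-declaring the extractor so the sketch runs on its own; the fence strings are built from `"`" * n` only to keep them out of this page's own formatting:

```python
import re

FENCE_OPEN_REGEX = re.compile(r"^(\s{0,3})(`{3,}|~{3,})(.*)$")

def extract_code_blocks(text):
    # Same algorithm as the file above: a closing fence must use the same
    # character, be at least as long as the opener, and carry no info string.
    blocks = []
    lines = text.split("\n")
    i, n = 0, len(lines)
    while i < n:
        m = FENCE_OPEN_REGEX.match(lines[i])
        if not m:
            i += 1
            continue
        fence_char, fence_len = m.group(2)[0], len(m.group(2))
        block_lines = [lines[i]]
        i += 1
        closed = False
        while i < n:
            cm = FENCE_OPEN_REGEX.match(lines[i])
            if (cm and cm.group(2)[0] == fence_char
                    and len(cm.group(2)) >= fence_len
                    and cm.group(3).strip() == ""):
                block_lines.append(lines[i])
                closed = True
                i += 1
                break
            block_lines.append(lines[i])
            i += 1
        if closed:
            blocks.append("\n".join(block_lines))
    return blocks

tb = "`" * 3  # inner fence
qb = "`" * 4  # outer fence
doc = f"{qb}md\n{tb}python\nprint('hi')\n{tb}\n{qb}\n"
blocks = extract_code_blocks(doc)
print(len(blocks))
```

The inner 3-backtick closer is too short to close the 4-backtick opener, so the whole nested document comes back as a single block, which is exactly the invariant the code-block validator relies on.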
@@ -1 +0,0 @@
../../.agents/skills/caveman
@@ -1 +0,0 @@
../../.agents/skills/caveman-compress
@@ -1 +0,0 @@
../../.agents/skills/caveman-help
@@ -1 +0,0 @@
../../.agents/skills/compress
@@ -1,379 +0,0 @@
---
name: jsdoc
description: Commenting and documentation guidelines. Auto-activate when the user discusses comments, documentation, docstrings, code clarity, API docs, JSDoc, or asks about commenting strategies.
---

Auto-activate when: User discusses comments, documentation, docstrings, code clarity, code quality, API docs, JSDoc, Python docstrings, or asks about commenting strategies.

## Core Principle

Write code that speaks for itself. Comment only when necessary to explain WHY, not WHAT.

Most code does not need comments. Well-written code with clear naming and structure is self-documenting.

The best comment is the one you don't need to write because the code is already obvious.

## The Commenting Philosophy

### When to Comment

✅ DO comment when explaining:

- WHY something is done (business logic, design decisions)
- Complex algorithms and their reasoning
- Non-obvious trade-offs or constraints
- Workarounds for bugs or limitations
- API contracts and public interfaces
- Regex patterns and what they match
- Performance considerations or optimizations
- Constants and magic numbers
- Gotchas or surprising behaviors

❌ DON'T comment when:

- The code is obvious and self-explanatory
- The comment repeats the code (redundant)
- Better naming would eliminate the need
- The comment would become outdated quickly
- It's decorative or organizational noise
- It states what a standard language construct does

## Comment Anti-Patterns

### ❌ 1. Obvious Comments

BAD:

```python
counter = 0  # Initialize counter to zero
counter += 1  # Increment counter by one
user_name = input("Enter name: ")  # Get user name from input
```

Better: No comment needed - the code is self-explanatory.

### ❌ 2. Redundant Comments

BAD:

```python
def get_user_name(user):
    return user.name  # Return the user's name

def calculate_total(items):
    # Loop through items and sum the prices
    total = 0
    for item in items:
        total += item.price
    return total
```

Better:

```python
def get_user_name(user):
    return user.name

def calculate_total(items):
    return sum(item.price for item in items)
```

### ❌ 3. Outdated Comments

BAD:

```python
# Calculate tax at 5% rate
tax = price * 0.08  # Actually 8%, comment is wrong

# DEPRECATED: Use new_api_function() instead
def old_function():  # Still being used, comment is misleading
    pass
```

Better: Keep comments in sync with code, or remove them entirely.

### ❌ 4. Noise Comments

BAD:

```python
# Start of function
def calculate():
    # Declare variable
    result = 0
    # Return result
    return result
# End of function
```

Better: Remove all of these comments.

### ❌ 5. Dead Code & Changelog Comments

BAD:

```python
# Don't comment out code - use version control
# def old_function():
#     return "deprecated"

# Don't maintain history in comments
# Modified by John on 2023-01-15
# Fixed bug reported by Sarah on 2023-02-03
```

Better: Delete the code. Git has the history.

## Good Comment Examples

### ✅ Complex Business Logic

```python
# Apply progressive tax brackets: 10% up to $10k, 20% above
# This matches IRS publication 501 for 2024
def calculate_progressive_tax(income):
    if income <= 10000:
        return income * 0.10
    else:
        return 1000 + (income - 10000) * 0.20
```

### ✅ Non-obvious Algorithms

```python
# Using Floyd-Warshall for all-pairs shortest paths
# because we need distances between all nodes.
# Time: O(n³), Space: O(n²)
for k in range(vertices):
    for i in range(vertices):
        for j in range(vertices):
            dist[i][j] = min(dist[i][j], dist[i][k] + dist[k][j])
```

### ✅ Regex Patterns

```python
# Match email format: username@domain.extension
# Allows letters, numbers, dots, hyphens in username
# Requires valid domain and 2+ char extension
email_pattern = r'^[a-zA-Z0-9._%+-]+@[a-zA-Z0-9.-]+\.[a-zA-Z]{2,}$'
```

### ✅ API Constraints or Gotchas

```python
# GitHub API rate limit: 5000 requests/hour for authenticated users
# We implement exponential backoff to handle rate limiting
await rate_limiter.wait()
response = await fetch(github_api_url)
```

### ✅ Workarounds for Bugs

```python
# HACK: Workaround for bug in library v2.1.0
# Remove after upgrading to v2.2.0
# See: https://github.com/library/issues/123
if library_version == "2.1.0":
    apply_workaround()
```

## Decision Framework

Before writing a comment, ask yourself:

Step 1: Is the code self-explanatory?
- If YES → No comment needed
- If NO → Continue to step 2

Step 2: Would a better variable/function name eliminate the need?
- If YES → Refactor the code instead
- If NO → Continue to step 3

Step 3: Does this explain WHY, not WHAT?
- If explaining WHAT → Refactor code to be clearer
- If explaining WHY → Good comment candidate

Step 4: Will this help future maintainers?
- If YES → Write the comment
- If NO → Skip it

## Special Cases for Comments

### Public APIs and Docstrings

Python docstrings:

```python
def calculate_compound_interest(
    principal: float,
    rate: float,
    time: int,
    compound_frequency: int = 1
) -> float:
    """
    Calculate compound interest using the standard formula.

    Args:
        principal: Initial amount invested
        rate: Annual interest rate as decimal (e.g., 0.05 for 5%)
        time: Time period in years
        compound_frequency: Times per year interest compounds (default: 1)

    Returns:
        Final amount after compound interest

    Raises:
        ValueError: If any parameter is negative

    Example:
        >>> calculate_compound_interest(1000, 0.05, 10)
        1628.89
    """
    if principal < 0 or rate < 0 or time < 0:
        raise ValueError("Parameters must be non-negative")

    # Compound interest formula: A = P(1 + r/n)^(nt)
    return principal * (1 + rate / compound_frequency) ** (compound_frequency * time)
```

JavaScript/TypeScript JSDoc:

```javascript
/**
 * Fetch user data from the API.
 *
 * @param {string} userId - The unique user identifier
 * @param {Object} options - Configuration options
 * @param {boolean} options.includeProfile - Include profile data (default: true)
 * @param {number} options.timeout - Request timeout in ms (default: 5000)
 *
 * @returns {Promise<User>} User object with requested fields
 *
 * @throws {Error} If userId is invalid or request fails
 *
 * @example
 * const user = await fetchUser('123', { includeProfile: true });
 */
async function fetchUser(userId, options = {}) {
  // Implementation
}
```

### Constants and Configuration

```python
# Based on network reliability studies (95th percentile)
MAX_RETRIES = 3

# AWS Lambda timeout is 15s, leaving 5s buffer for cleanup
API_TIMEOUT = 10000  # milliseconds

# Cache duration optimized for balance between freshness and load
# See: docs/performance-tuning.md
CACHE_TTL = 300  # 5 minutes
```

### Annotations for TODOs and Warnings

```python
# TODO: Replace with proper authentication after security review
# Issue: #456
def temporary_auth(user):
    return True

# WARNING: This function modifies the original array instead of creating a copy
def sort_in_place(arr):
    arr.sort()
    return arr

# FIXME: Memory leak in production - investigate connection pooling
# Ticket: JIRA-789
def get_connection():
    return create_connection()

# PERF: Consider caching this result if called frequently in hot path
def expensive_calculation(data):
    return complex_algorithm(data)

# SECURITY: Validate input to prevent SQL injection before using in query
def build_query(user_input):
    sanitized = escape_sql(user_input)
    return f"SELECT * FROM users WHERE name = '{sanitized}'"
```

### Common Annotation Keywords

- `TODO:` - Work that needs to be done
- `FIXME:` - Known bugs that need fixing
- `HACK:` - Temporary workarounds
- `NOTE:` - Important information or context
- `WARNING:` - Critical information about usage
- `PERF:` - Performance considerations
- `SECURITY:` - Security-related notes
- `BUG:` - Known bug documentation
- `REFACTOR:` - Code that needs refactoring
- `DEPRECATED:` - Soon-to-be-removed code

## Refactoring Over Commenting

### Instead of Commenting Complex Code...

BAD: Complex code with comment

```python
# Check if user is admin or has special permissions
if user.role == "admin" or (user.permissions and "special" in user.permissions):
    grant_access()
```

### ...Extract to Named Function

GOOD: Self-explanatory through naming

```python
def user_has_admin_access(user):
    return user.role == "admin" or has_special_permission(user)

def has_special_permission(user):
    return user.permissions and "special" in user.permissions

if user_has_admin_access(user):
    grant_access()
```

## Language-Specific Examples

### JavaScript

```javascript
// Good: Explains WHY we debounce
// Debounce search to reduce API calls (500ms wait after last keystroke)
const debouncedSearch = debounce(searchAPI, 500);

// Bad: Obvious
let count = 0; // Initialize count to zero
count++; // Increment count

// Good: Explains algorithm choice
// Using Set for O(1) lookup instead of Array.includes() which is O(n)
const seen = new Set(ids);
```

### Python

```python
# Good: Explains the algorithm choice
# Using binary search because data is sorted and we need O(log n) performance
index = bisect.bisect_left(sorted_list, target)

# Bad: Redundant
def get_total(items):
    return sum(items)  # Return the sum of items

# Good: Explains why we're doing this
# Extract to separate function for type checking in mypy
def validate_user(user):
    if not user or not user.id:
        raise ValueError("Invalid user")
    return user
```

### TypeScript

```typescript
// Good: Explains the type assertion
// TypeScript can't infer this is never null after the check
const element = document.getElementById('app') as HTMLElement;

// Bad: Obvious
const sum = a + b; // Add a and b

// Good: Explains non-obvious behavior
// spread operator creates shallow copy; use JSON for deep copy
const newConfig = { ...config };
```

## Comment Quality Checklist

Before committing, ensure your comments:

- Explain WHY, not WHAT
- Are grammatically correct and clear
- Will remain accurate as code evolves
- Add genuine value to code understanding
- Are placed appropriately (above the code they describe)
- Use proper spelling and professional language
- Follow team conventions for annotation keywords
- Could not be replaced by better naming or structure
- Are not obvious statements about language features
- Reference tickets/issues when applicable

## Summary

Priority order:

1. Clear code - Self-explanatory through naming and structure
2. Good comments - Explain WHY when necessary
3. Documentation - API docs, docstrings for public interfaces
4. No comments - Better than bad comments that lie or clutter

Remember: Comments are a failure to make the code self-explanatory. Use them sparingly and wisely.

## Key Takeaways

| Goal | Approach |
|------|----------|
| Reduce comments | Improve naming, extract functions, simplify logic |
| Improve clarity | Use self-explanatory code structure, clear variable names |
| Document APIs | Use docstrings/JSDoc for public interfaces |
| Explain WHY | Comment only business logic, algorithms, workarounds |
| Maintain accuracy | Update comments when code changes, or remove them |
@@ -1,360 +0,0 @@
---
name: typescript
description: TypeScript engineering guidelines based on Google's style guide. Use when writing, reviewing, or refactoring TypeScript code in this project.
---

Comprehensive guidelines for writing production-quality TypeScript based on Google's TypeScript Style Guide.

## Naming Conventions

| Type | Convention | Example |
|------|-----------|---------|
| Classes, Interfaces, Types, Enums | UpperCamelCase | `UserService`, `HttpClient` |
| Variables, Parameters, Functions | lowerCamelCase | `userName`, `processData` |
| Global Constants, Enum Values | CONSTANT_CASE | `MAX_RETRIES`, `Status.ACTIVE` |
| Type Parameters | Single letter or UpperCamelCase | `T`, `ResponseType` |

### Naming Principles

- Descriptive names, avoid ambiguous abbreviations
- Treat acronyms as words: `loadHttpUrl` not `loadHTTPURL`
- No prefixes like `opt_` for optional parameters
- No trailing underscores for private properties
- Single-letter variables only when scope is <10 lines

## Variable Declarations

```typescript
// Always use const by default
const users = getUsers();

// Use let only when reassignment is needed
let count = 0;
count++;

// Never use var
// var x = 1; // WRONG

// One variable per declaration
const a = 1;
const b = 2;
// const a = 1, b = 2; // WRONG
```

## Types and Interfaces

### Prefer Type Aliases Over Interfaces

```typescript
// Good: type alias for object shapes
type User = {
  id: string;
  name: string;
  email?: string;
};

// Avoid: interface for object shapes
// interface User {
//   id: string;
//   name: string;
// }

// Type aliases work for everything: objects, unions, intersections, mapped types
type Status = 'active' | 'inactive';
type Combined = TypeA & TypeB;
type Handler = (event: Event) => void;

// Benefits of types over interfaces:
// 1. Consistent syntax for all type definitions
// 2. Cannot be merged/extended unexpectedly (no declaration merging)
// 3. Better for union types and computed properties
// 4. Works with utility types more naturally
```

### Type Inference

Leverage inference for trivially inferred types:

```typescript
// Good: inference is clear
const name = 'Alice';
const items = [1, 2, 3];

// Good: explicit for complex expressions
const result: ProcessedData = complexTransformation(input);
```

### Array Types

```typescript
// Simple types: use T[]
const numbers: number[] = [];
const names: readonly string[] = [];

// Multi-dimensional: use T[][]
const matrix: number[][] = [];

// Complex types: use Array<T>
const handlers: Array<(event: Event) => void> = [];
```

### Null and Undefined

```typescript
// Prefer optional fields over union with undefined
interface Config {
  timeout?: number; // Good
  // timeout: number | undefined; // Avoid
}

// Type aliases must NOT include |null or |undefined
type UserId = string; // Good
// type UserId = string | null; // WRONG

// May use == for null comparison (catches both null and undefined)
if (value == null) {
  // handles both null and undefined
}
```

### Types to Avoid

```typescript
// Avoid any - use unknown instead
function parse(input: unknown): Data { }

// Avoid {} - use unknown, Record<string, T>, or object
function process(obj: Record<string, unknown>): void { }

// Use lowercase primitives
let name: string; // Good
// let name: String; // WRONG

// Never use wrapper objects
// new String('hello') // WRONG
```

## Classes

### Structure

```typescript
class UserService {
  // Fields first, initialized where declared
  private readonly cache = new Map<string, User>();
  private lastAccess: Date | null = null;

  // Constructor with parameter properties
  constructor(
    private readonly api: ApiClient,
    private readonly logger: Logger,
  ) {}

  // Methods separated by blank lines
  async getUser(id: string): Promise<User> {
    // ...
  }

  private validateId(id: string): boolean {
    // ...
  }
}
```

### Visibility

```typescript
class Example {
  // private by default, only use public when needed externally
  private internalState = 0;

  // readonly for properties never reassigned after construction
  readonly id: string;

  // Never use #private syntax - use TypeScript visibility
  // #field = 1; // WRONG
  private field = 1; // Good
}
```

### Avoid Arrow Functions as Properties

```typescript
class Handler {
  // Avoid: arrow function as property
  // handleClick = () => { ... };

  // Good: instance method
  handleClick(): void {
    // ...
  }
}

// Bind at call site if needed
element.addEventListener('click', () => handler.handleClick());
```

### Static Methods

- Never use `this` in static methods
- Call on defining class, not subclasses

## Functions

### Prefer Function Declarations

```typescript
// Good: function declaration for named functions
function processData(input: Data): Result {
  return transform(input);
}

// Arrow functions when type annotation needed
const handler: EventHandler = (event) => {
  // ...
};
```

### Arrow Function Bodies

```typescript
// Concise body only when return value is used
const double = (x: number) => x * 2;

// Block body when return should be void
const log = (msg: string) => {
  console.log(msg);
};
```

### Parameters

```typescript
// Use rest parameters, not arguments
function sum(...numbers: number[]): number {
  return numbers.reduce((a, b) => a + b, 0);
}

// Destructuring for multiple optional params
interface Options {
  timeout?: number;
  retries?: number;
}
function fetch(url: string, { timeout = 5000, retries = 3 }: Options = {}) {
  // ...
}

// Never name a parameter 'arguments'
```

## Imports and Exports

### Always Use Named Exports

```typescript
// Good: named exports
export function processData() { }
export class UserService { }
export interface Config { }

// Never use default exports
// export default class UserService { } // WRONG
```

### Import Styles

```typescript
// Module import for large APIs
import * as fs from 'fs';

// Named imports for frequently used symbols
import { readFile, writeFile } from 'fs/promises';

// Type-only imports when only used as types
import type { User, Config } from './types';
```

### Module Organization

- Use modules, never `namespace Foo { }`
- Never use `require()` - use ES6 imports
- Use relative imports within same project
- Avoid excessive `../../../`

## Control Structures

### Always Use Braces

```typescript
// Good
if (condition) {
  doSomething();
}

// Exception: single-line if
if (condition) return early;
```

### Loops

```typescript
// Prefer for...of for arrays
for (const item of items) {
  process(item);
}

// Use Object methods with for...of for objects
for (const [key, value] of Object.entries(obj)) {
  // ...
}

// Never use unfiltered for...in on arrays
```

### Equality

```typescript
// Always use === and !==
if (a === b) { }

// Exception: == null catches both null and undefined
if (value == null) { }
```

### Switch Statements

```typescript
switch (status) {
  case Status.Active:
    handleActive();
    break;
  case Status.Inactive:
    handleInactive();
    break;
  default:
    // Always include default, even if empty
    break;
}
```

## Exception Handling

```typescript
// Always throw Error instances
throw new Error('Something went wrong');
// throw 'error'; // WRONG

// Catch with unknown type
try {
  riskyOperation();
} catch (e: unknown) {
  if (e instanceof Error) {
    logger.error(e.message);
  }
  throw e;
}

// Empty catch needs justification comment
try {
  optional();
} catch {
  // Intentionally ignored: fallback behavior handles this
}
```

## Type Assertions

```typescript
// Use 'as' syntax, not angle brackets
const input = value as string;
// const input = <string>value; // WRONG in TSX, avoid everywhere

// Double assertion through unknown when needed
const config = (rawData as unknown) as Config;

// Add comment explaining why assertion is safe
const element = document.getElementById('app') as HTMLElement;
// Safe: element exists in index.html
```

## Strings

```typescript
// Use single quotes for string literals
const name = 'Alice';

// Template literals for interpolation or multiline
const message = `Hello, ${name}!`;
const query = `
  SELECT *
  FROM users
  WHERE id = ?
`;

// Never use backslash line continuations
```

## Disallowed Features

| Feature | Alternative |
|---------|-------------|
| `var` | `const` or `let` |
| `Array()` constructor | `[]` literal |
| `Object()` constructor | `{}` literal |
| `any` type | `unknown` |
| `namespace` | modules |
| `require()` | `import` |
| Default exports | Named exports |
| `#private` fields | `private` modifier |
| `eval()` | Never use |
| `const enum` | Regular `enum` |
| `debugger` | Remove before commit |
| `with` | Never use |
| Prototype modification | Never modify |
28 .eslintrc.js Normal file
@@ -0,0 +1,28 @@
module.exports = {
  extends: [
    "plugin:react/recommended",
    "plugin:@typescript-eslint/recommended",
    "plugin:prettier/recommended",
    "plugin:css-modules/recommended",
    "plugin:storybook/recommended",
  ],
  parser: "@typescript-eslint/parser",
  parserOptions: {
    sourceType: "module",
    ecmaVersion: 2020,
    ecmaFeatures: {
      jsx: true, // Allows for the parsing of JSX
    },
  },

  plugins: ["@typescript-eslint", "css-modules"],
  settings: {
    "import/resolver": {
      node: {
        extensions: [".js", ".jsx", ".ts", ".tsx"],
      },
    },
    react: {
      version: "detect", // Tells eslint-plugin-react to automatically detect the version of React to use
    },
  },

  // Fine tune rules
  rules: {
    "@typescript-eslint/no-var-requires": 0,
  },
};
@@ -1,4 +1,4 @@
-export default {
+module.exports = {
   semi: true,
   trailingComma: "all",
 };
35 Dockerfile
@@ -1,36 +1,19 @@
# Use Node.js 22 as the base image
FROM node:22-alpine

FROM node:18.15.0-alpine
LABEL maintainer="Rishi Ghan <rishi.ghan@gmail.com>"

# Set the working directory inside the container
WORKDIR /threetwo

# Copy package.json and yarn.lock to leverage Docker cache
COPY package.json yarn.lock ./
COPY package.json ./
COPY yarn.lock ./
COPY nodemon.json ./
COPY jsdoc.json ./

# Install build dependencies necessary for native modules (for node-sass)
RUN apk --no-cache add \
    g++ \
    make \
    python3 \
    autoconf \
    automake \
    libtool \
    nasm \
    git
# RUN apt-get update && apt-get install -y git python3 build-essential autoconf automake g++ libpng-dev make
RUN apk --no-cache add g++ make libpng-dev git python3 libc6-compat autoconf automake libjpeg-turbo-dev libpng-dev mesa-dev mesa libxi build-base gcc libtool nasm
RUN yarn --ignore-engines

# Install node modules
RUN yarn install --ignore-engines

# Explicitly install sass
RUN yarn add -D sass

# Copy the rest of the application files into the container
COPY . .

# Expose the application port (default for Vite)
EXPOSE 5173

# Start the application with yarn
ENTRYPOINT ["yarn", "start"]
ENTRYPOINT [ "npm", "start" ]
@@ -1 +0,0 @@
module.exports = 'test-file-stub';
16 codegen.yml
@@ -1,16 +0,0 @@
schema: http://localhost:3000/graphql
documents: 'src/client/graphql/**/*.graphql'
generates:
  src/client/graphql/generated.ts:
    plugins:
      - typescript
      - typescript-operations
      - typescript-react-query
    config:
      fetcher:
        func: './fetcher#fetcher'
      isReactHook: false
      exposeFetcher: true
      exposeQueryKeys: true
      addInfiniteQuery: true
      reactQueryVersion: 5
@@ -1,59 +0,0 @@
import js from "@eslint/js";
import typescript from "@typescript-eslint/eslint-plugin";
import typescriptParser from "@typescript-eslint/parser";
import react from "eslint-plugin-react";
import prettier from "eslint-plugin-prettier";
import cssModules from "eslint-plugin-css-modules";
import storybook from "eslint-plugin-storybook";

export default [
  js.configs.recommended,
  {
    files: ["**/*.{js,jsx,ts,tsx}"],
    languageOptions: {
      parser: typescriptParser,
      parserOptions: {
        sourceType: "module",
        ecmaVersion: 2020,
        ecmaFeatures: {
          jsx: true,
        },
      },
    },
    plugins: {
      "@typescript-eslint": typescript,
      react,
      prettier,
      "css-modules": cssModules,
      storybook,
    },
    settings: {
      "import/resolver": {
        node: {
          extensions: [".js", ".jsx", ".ts", ".tsx"],
        },
      },
      react: {
        version: "detect",
      },
    },
    rules: {
      ...typescript.configs.recommended.rules,
      ...react.configs.recommended.rules,
      ...prettier.configs.recommended.rules,
      "@typescript-eslint/no-var-requires": "off",
      "@typescript-eslint/no-explicit-any": "off",
      "react/react-in-jsx-scope": "off",
      "no-undef": "off",
    },
  },
  {
    files: ["**/*.stories.{js,jsx,ts,tsx}"],
    rules: {
      ...storybook.configs.recommended.rules,
    },
  },
  {
    ignores: ["dist/**", "node_modules/**", "build/**"],
  },
];
@@ -1,28 +0,0 @@
module.exports = {
  preset: 'ts-jest',
  testEnvironment: 'jsdom',
  setupFilesAfterEnv: ['<rootDir>/jest.setup.cjs'],
  moduleNameMapper: {
    '\\.(css|less|scss|sass)$': 'identity-obj-proxy',
    '\\.(jpg|jpeg|png|gif|svg)$': '<rootDir>/__mocks__/fileMock.cjs',
  },
  testMatch: [
    '**/__tests__/**/*.+(ts|tsx|js)',
    '**/?(*.)+(spec|test).+(ts|tsx|js)',
  ],
  transform: {
    '^.+\\.(ts|tsx)$': ['ts-jest', {
      tsconfig: {
        jsx: 'react',
        esModuleInterop: true,
        allowSyntheticDefaultImports: true,
      },
    }],
  },
  moduleFileExtensions: ['ts', 'tsx', 'js', 'jsx', 'json', 'node'],
  collectCoverageFrom: [
    'src/**/*.{ts,tsx}',
    '!src/**/*.d.ts',
    '!src/**/*.stories.tsx',
  ],
};
@@ -1,25 +0,0 @@
require('@testing-library/jest-dom');

// Mock window.matchMedia
Object.defineProperty(window, 'matchMedia', {
  writable: true,
  value: jest.fn().mockImplementation(query => ({
    matches: false,
    media: query,
    onchange: null,
    addListener: jest.fn(),
    removeListener: jest.fn(),
    addEventListener: jest.fn(),
    removeEventListener: jest.fn(),
    dispatchEvent: jest.fn(),
  })),
});

// Mock localStorage
const localStorageMock = {
  getItem: jest.fn(),
  setItem: jest.fn(),
  removeItem: jest.fn(),
  clear: jest.fn(),
};
global.localStorage = localStorageMock;
13 nodemon.json Normal file
@@ -0,0 +1,13 @@
{
  "ignore": [
    "**/*.test.ts",
    "**/*.spec.ts",
    "node_modules",
    "src/client"
  ],
  "watch": [
    "src/server"
  ],
  "exec": "tsc -p tsconfig.server.json && node server/",
  "ext": "ts"
}
20050 package-lock.json generated
File diff suppressed because it is too large. Load Diff
226 package.json
@@ -1,151 +1,127 @@
{
  "name": "threetwo",
  "version": "0.1.0",
  "type": "module",
  "description": "ThreeTwo! A good comic book curator.",
  "main": "server/index.js",
  "typings": "server/index.js",
  "scripts": {
    "build": "vite build",
    "dev": "rimraf dist && yarn build && vite",
    "start": "yarn build && vite",
    "dev": "rimraf dist && npm run build && vite",
    "start": "npm run build && vite",
    "docs": "jsdoc -c jsdoc.json",
    "test": "jest",
    "test:watch": "jest --watch",
    "test:coverage": "jest --coverage",
    "storybook": "storybook dev -p 6006",
    "build-storybook": "storybook build",
    "codegen": "wait-on http-get://localhost:3000/graphql/health && graphql-codegen",
    "codegen:watch": "graphql-codegen --config codegen.yml --watch",
    "knip": "knip"
    "build-storybook": "storybook build"
  },
  "author": "Rishi Ghan",
  "license": "MIT",
  "dependencies": {
    "@dnd-kit/core": "^6.3.1",
    "@dnd-kit/sortable": "^10.0.0",
    "@dnd-kit/utilities": "^3.2.2",
    "@floating-ui/react": "^0.27.18",
    "@floating-ui/react-dom": "^2.1.7",
    "@dnd-kit/core": "^6.0.8",
    "@dnd-kit/sortable": "^7.0.2",
    "@dnd-kit/utilities": "^3.2.1",
    "@fortawesome/fontawesome-free": "^6.3.0",
    "@popperjs/core": "^2.11.8",
    "@tailwindcss/vite": "^4.2.2",
    "@tanstack/react-query": "^5.90.21",
    "@tanstack/react-table": "^8.21.3",
    "@types/mime-types": "^3.0.1",
    "@rollup/plugin-node-resolve": "^15.0.1",
    "@tanstack/react-query": "^5.0.5",
    "@tanstack/react-table": "^8.9.3",
    "@types/mime-types": "^2.1.0",
    "@types/react-router-dom": "^5.3.3",
    "@vitejs/plugin-react": "^5.1.4",
    "airdcpp-apisocket": "^3.0.0-beta.14",
    "axios": "^1.13.5",
    "axios-cache-interceptor": "^1.11.4",
    "axios-rate-limit": "^1.6.2",
    "@vitejs/plugin-react": "^4.2.1",
    "airdcpp-apisocket": "^2.5.0-beta.2",
    "axios": "^1.3.4",
    "axios-cache-interceptor": "^1.0.1",
    "axios-rate-limit": "^1.3.0",
    "babel-plugin-styled-components": "^2.1.4",
    "date-fns": "^4.1.0",
    "dayjs": "^1.11.19",
    "ellipsize": "^0.7.0",
    "embla-carousel-react": "^8.6.0",
    "filename-parser": "^1.0.4",
    "final-form": "^5.0.0",
    "final-form-arrays": "^4.0.0",
    "focus-trap-react": "^12.0.0",
    "graphql": "^16.13.1",
    "date-fns": "^2.28.0",
    "dayjs": "^1.10.6",
    "ellipsize": "^0.5.1",
    "express": "^4.17.1",
    "filename-parser": "^1.0.2",
    "final-form": "^4.20.2",
    "final-form-arrays": "^3.0.2",
    "focus-trap-react": "^10.2.3",
    "history": "^5.3.0",
    "html-to-text": "^9.0.5",
    "i18next": "^25.8.13",
    "i18next-browser-languagedetector": "^8.2.1",
    "i18next-http-backend": "^3.0.2",
    "immer": "^11.1.4",
    "jsdoc": "^4.0.5",
    "lodash": "^4.17.23",
    "motion": "^12.38.0",
    "pretty-bytes": "^7.1.0",
    "html-to-text": "^8.1.0",
    "immer": "^10.0.3",
    "jsdoc": "^3.6.10",
    "keen-slider": "^6.8.6",
    "lodash": "^4.17.21",
    "pretty-bytes": "^5.6.0",
    "prop-types": "^15.8.1",
    "qs": "^6.15.0",
    "react": "^19.2.4",
    "react-collapsible": "^2.10.0",
    "react-comic-viewer": "^0.5.1",
    "react-day-picker": "^9.13.2",
    "react-dom": "^19.2.4",
    "react-fast-compare": "^3.2.2",
    "react-final-form": "^7.0.0",
    "react-final-form-arrays": "^4.0.0",
    "react-i18next": "^16.5.4",
    "react-loader-spinner": "^8.0.2",
    "react-modal": "^3.16.3",
    "react-router": "^7.13.1",
    "react-router-dom": "^7.13.1",
    "react-select": "^5.10.2",
    "react-select-async-paginate": "^0.7.11",
    "react-sliding-pane": "^7.3.0",
    "react-textarea-autosize": "^8.5.9",
    "react-toastify": "^11.0.5",
    "socket.io-client": "^4.8.3",
    "styled-components": "^6.3.11",
    "qs": "^6.10.5",
    "react": "^18.2.0",
    "react-collapsible": "^2.9.0",
    "react-comic-viewer": "^0.4.0",
    "react-day-picker": "^8.10.0",
    "react-dom": "^18.2.0",
    "react-fast-compare": "^3.2.0",
    "react-final-form": "^6.5.9",
    "react-final-form-arrays": "^3.1.4",
    "react-loader-spinner": "^4.0.0",
    "react-modal": "^3.15.1",
    "react-popper": "^2.3.0",
    "react-router": "^6.9.0",
    "react-router-dom": "^6.9.0",
    "react-select": "^5.8.0",
    "react-select-async-paginate": "^0.7.2",
    "react-sliding-pane": "^7.1.0",
    "react-textarea-autosize": "^8.3.4",
    "reapop": "^4.2.1",
    "socket.io-client": "^4.3.2",
    "styled-components": "^6.1.0",
    "threetwo-ui-typings": "^1.0.14",
    "vaul": "^1.1.2",
    "vite": "^7.3.1",
    "vite-plugin-html": "^3.2.2",
    "websocket": "^1.0.35",
    "zustand": "^5.0.11"
    "vite": "^5.0.5",
    "vite-plugin-html": "^3.2.0",
    "websocket": "^1.0.34",
    "zustand": "^4.4.6"
  },
  "devDependencies": {
    "@eslint/js": "^10.0.0",
    "@graphql-codegen/cli": "^6.1.2",
    "@graphql-codegen/typescript": "^5.0.8",
    "@graphql-codegen/typescript-operations": "^5.0.8",
    "@graphql-codegen/typescript-react-query": "^6.1.2",
    "@iconify-json/solar": "^1.2.5",
    "@iconify/json": "^2.2.443",
    "@iconify/tailwind": "^1.2.0",
    "@iconify/tailwind4": "^1.2.1",
    "@iconify/utils": "^3.1.0",
    "@storybook/addon-essentials": "^8.6.17",
    "@storybook/addon-interactions": "^8.6.17",
    "@storybook/addon-links": "^8.6.17",
    "@storybook/addon-onboarding": "^8.6.17",
    "@storybook/blocks": "^8.6.17",
    "@storybook/react": "^8.6.17",
    "@storybook/react-vite": "^8.6.17",
    "@storybook/testing-library": "^0.2.2",
    "@tailwindcss/postcss": "^4.2.1",
    "@tanstack/eslint-plugin-query": "^5.91.4",
    "@tanstack/react-query-devtools": "^5.91.3",
    "@testing-library/jest-dom": "^6.9.1",
    "@testing-library/react": "^16.3.2",
    "@testing-library/user-event": "^14.6.1",
    "@types/ellipsize": "^0.1.3",
    "@types/html-to-text": "^9.0.4",
    "@types/jest": "^30.0.0",
    "@types/lodash": "^4.17.24",
    "@types/node": "^25.6.0",
    "@types/prop-types": "^15.7.15",
    "@types/react": "^19.2.14",
    "@types/react-dom": "^19.2.3",
    "@types/react-table": "^7.7.20",
    "autoprefixer": "^10.4.27",
    "@iconify-json/solar": "^1.1.8",
    "@iconify/tailwind": "^0.1.4",
    "@storybook/addon-essentials": "^7.4.1",
    "@storybook/addon-interactions": "^7.4.1",
    "@storybook/addon-links": "^7.4.1",
    "@storybook/addon-onboarding": "^1.0.8",
    "@storybook/blocks": "^7.4.1",
    "@storybook/react": "^7.4.1",
    "@storybook/react-vite": "^7.4.1",
    "@storybook/testing-library": "^0.2.0",
    "@tanstack/eslint-plugin-query": "^5.0.5",
    "@tanstack/react-query-devtools": "^5.1.0",
    "@tsconfig/node14": "^1.0.0",
    "@types/ellipsize": "^0.1.1",
    "@types/express": "^4.17.8",
    "@types/jest": "^26.0.20",
    "@types/lodash": "^4.14.168",
    "@types/node": "^14.14.34",
    "@types/react": "^18.0.28",
    "@types/react-dom": "^18.0.11",
    "@types/react-redux": "^7.1.25",
    "autoprefixer": "^10.4.16",
    "body-parser": "^1.19.0",
    "docdash": "^2.0.2",
    "eslint": "^10.0.2",
    "eslint-config-prettier": "^10.1.8",
    "eslint-plugin-css-modules": "^2.12.0",
    "eslint-plugin-import": "^2.32.0",
    "eslint-plugin-jsdoc": "^62.7.1",
    "eslint-plugin-jsx-a11y": "^6.10.2",
    "eslint-plugin-prettier": "^5.5.5",
    "eslint-plugin-react": "^7.37.5",
    "eslint-plugin-storybook": "^0.11.1",
    "identity-obj-proxy": "^3.0.0",
    "eslint": "^8.49.0",
    "eslint-config-prettier": "^8.1.0",
    "eslint-plugin-css-modules": "^2.11.0",
    "eslint-plugin-import": "^2.22.1",
    "eslint-plugin-jsdoc": "^46.6.0",
    "eslint-plugin-jsx-a11y": "^6.0.3",
    "eslint-plugin-prettier": "^3.3.1",
    "eslint-plugin-react": "^7.22.0",
    "eslint-plugin-storybook": "^0.6.13",
    "express": "^4.17.1",
    "install": "^0.13.0",
    "jest": "^30.2.0",
    "jest-environment-jsdom": "^30.2.0",
    "postcss": "^8.5.6",
    "postcss-import": "^16.1.1",
    "prettier": "^3.8.1",
    "react-refresh": "^0.18.0",
    "rimraf": "^6.1.3",
    "sass": "^1.97.3",
    "storybook": "^8.6.17",
    "tailwindcss": "^4.2.2",
    "ts-jest": "^29.4.6",
    "jest": "^29.6.3",
    "nodemon": "^3.0.1",
    "postcss": "^8.4.32",
    "postcss-import": "^15.1.0",
    "prettier": "^2.2.1",
    "react-refresh": "^0.14.0",
    "rimraf": "^4.1.3",
    "sass": "^1.69.5",
    "storybook": "^7.3.2",
    "tailwindcss": "^3.4.1",
    "tui-jsdoc-template": "^1.2.2",
    "typescript": "^6.0.2",
    "wait-on": "^9.0.4"
    "typescript": "^5.1.6"
  },
  "resolutions": {
    "jackspeak": "2.1.1"
@@ -1,211 +0,0 @@
# Implementation Plan: Directory Status Check for Import.tsx

## Overview

Add functionality to `Import.tsx` that checks whether the required directories (`comics` and `userdata`) exist before allowing the import process to start. If either directory is missing, display a warning banner to the user and disable the import functionality.

## API Endpoint

- **Endpoint**: `GET /api/library/getDirectoryStatus`
- **Response Structure**:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

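For reference, the endpoint's behaviour can be reproduced with a small `fs`-based helper. This is a sketch only — the actual server implementation already exists and may differ; the base-path argument and directory names here are assumptions for illustration:

```typescript
import { existsSync } from "node:fs";
import { join } from "node:path";

interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}

// Build the same shape the endpoint returns, given a base path.
// Hypothetical helper; the real handler may resolve paths differently.
export function getDirectoryStatus(basePath: string): DirectoryStatus {
  return {
    comics: { exists: existsSync(join(basePath, "comics")) },
    userdata: { exists: existsSync(join(basePath, "userdata")) },
  };
}
```

In a Docker deployment, `basePath` would correspond to wherever the volumes are mounted inside the container.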
## Implementation Details

### 1. Add Directory Status Type

In [`Import.tsx`](src/client/components/Import/Import.tsx:1), add a type definition for the directory status response:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}
```

### 2. Create useQuery Hook for Directory Status

Use `@tanstack/react-query` (already imported) to fetch directory status on component mount:

```typescript
const { data: directoryStatus, isLoading: isCheckingDirectories, error: directoryError } = useQuery({
  queryKey: ['directoryStatus'],
  queryFn: async (): Promise<DirectoryStatus> => {
    const response = await axios.get('http://localhost:3000/api/library/getDirectoryStatus');
    return response.data;
  },
  refetchOnWindowFocus: false,
  staleTime: 30000, // Cache for 30 seconds
});
```

### 3. Derive Missing Directories State

Compute which directories are missing from the query result:

```typescript
const missingDirectories = useMemo(() => {
  if (!directoryStatus) return [];
  const missing: string[] = [];
  if (!directoryStatus.comics?.exists) missing.push('comics');
  if (!directoryStatus.userdata?.exists) missing.push('userdata');
  return missing;
}, [directoryStatus]);

const hasAllDirectories = missingDirectories.length === 0;
```

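The memoized derivation above is a pure function of the status object, so it can also be extracted and unit-tested without rendering the component — a sketch with a hypothetical helper name:

```typescript
interface DirectoryStatus {
  comics: { exists: boolean };
  userdata: { exists: boolean };
}

// Hypothetical extraction of the useMemo body: maps a (possibly
// undefined) status response to the list of missing directory names.
export function deriveMissingDirectories(
  status: DirectoryStatus | undefined,
): string[] {
  if (!status) return [];
  const missing: string[] = [];
  if (!status.comics?.exists) missing.push("comics");
  if (!status.userdata?.exists) missing.push("userdata");
  return missing;
}
```

Returning an empty array while the query has no data yet means the button is not disabled by a transient loading state.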
### 4. Create Warning Banner Component

Add a warning banner that displays when directories are missing, positioned above the import button. This uses the same styling patterns as the existing error banner:

```tsx
{/* Directory Status Warning */}
{!isCheckingDirectories && missingDirectories.length > 0 && (
  <div className="my-6 max-w-screen-lg rounded-lg border-s-4 border-amber-500 bg-amber-50 dark:bg-amber-900/20 p-4">
    <div className="flex items-start gap-3">
      <span className="w-6 h-6 text-amber-600 dark:text-amber-400 mt-0.5">
        <i className="h-6 w-6 icon-[solar--folder-error-bold]"></i>
      </span>
      <div className="flex-1">
        <p className="font-semibold text-amber-800 dark:text-amber-300">
          Required Directories Missing
        </p>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-1">
          The following directories do not exist and must be created before importing:
        </p>
        <ul className="list-disc list-inside text-sm text-amber-700 dark:text-amber-400 mt-2">
          {missingDirectories.map((dir) => (
            <li key={dir}>
              <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">{dir}</code>
            </li>
          ))}
        </ul>
        <p className="text-sm text-amber-700 dark:text-amber-400 mt-2">
          Please ensure these directories are mounted correctly in your Docker configuration.
        </p>
      </div>
    </div>
  </div>
)}
```

### 5. Disable Import Button When Directories Missing

Modify the button's `disabled` prop and click handler:

```tsx
<button
  className="..."
  onClick={handleForceReImport}
  disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
  title={!hasAllDirectories
    ? "Cannot import: Required directories are missing"
    : "Re-import all files to fix Elasticsearch indexing issues"}
>
```

### 6. Update handleForceReImport Guard

Add an early return to the handler when directories are missing:

```typescript
const handleForceReImport = async () => {
  setImportError(null);

  // Check for missing directories
  if (!hasAllDirectories) {
    setImportError(
      `Cannot start import: Required directories are missing (${missingDirectories.join(', ')}). Please check your Docker volume configuration.`
    );
    return;
  }

  // ... existing logic
};
```

## File Changes Summary

| File | Changes |
|------|---------|
| [`src/client/components/Import/Import.tsx`](src/client/components/Import/Import.tsx) | Add useQuery for directory status, warning banner UI, disable button logic |
| [`src/client/components/Import/Import.test.tsx`](src/client/components/Import/Import.test.tsx) | Add tests for directory status scenarios |

## Test Cases to Add

### Import.test.tsx Updates

1. **Should show warning banner when comics directory is missing**
2. **Should show warning banner when userdata directory is missing**
3. **Should show warning banner when both directories are missing**
4. **Should disable import button when directories are missing**
5. **Should enable import button when all directories exist**
6. **Should handle directory status API error gracefully**

Example test structure:

```typescript
describe('Import Component - Directory Status', () => {
  beforeEach(() => {
    jest.clearAllMocks();
    // Mock successful directory status by default
    (axios.get as jest.Mock) = jest.fn().mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });
  });

  test('should show warning when comics directory is missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
      expect(screen.getByText('comics')).toBeInTheDocument();
    });
  });

  test('should disable import button when directories are missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      const button = screen.getByRole('button', { name: /Force Re-Import/i });
      expect(button).toBeDisabled();
    });
  });
});
```

## Architecture Diagram

```mermaid
flowchart TD
    A[Import Component Mounts] --> B[Fetch Directory Status]
    B --> C{API Success?}
    C -->|Yes| D{All Directories Exist?}
    C -->|No| E[Show Error Banner]
    D -->|Yes| F[Enable Import Button]
    D -->|No| G[Show Warning Banner]
    G --> H[Disable Import Button]
    F --> I[User Clicks Import]
    I --> J[Proceed with Import]
```

## Notes

- The directory status is fetched once on mount with a 30-second stale time
- The warning uses amber/yellow colors to differentiate from error messages (red)
- The existing `importError` state and UI can remain unchanged
- No changes needed to the backend - the endpoint already exists
@@ -1,7 +1,7 @@
export default {
module.exports = {
  plugins: {
    "postcss-import": {},
    "@tailwindcss/postcss": {},
    tailwindcss: {},
    autoprefixer: {},
  },
};
@@ -1,25 +0,0 @@
{
  "version": 1,
  "skills": {
    "caveman": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "a818cdc41dcfaa50dd891c5cb5e5705968338de02e7e37949ca56e8c30ad4176"
    },
    "caveman-compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "300fb8578258161e1752a2a4142a7e9ff178c960bcb83b84422e2987421f33bf"
    },
    "caveman-help": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "3cd5f7d3f88c8ef7b16a6555dc61f5a11b14151386697609ab6887ab8b5f059d"
    },
    "compress": {
      "source": "JuliusBrussee/caveman",
      "sourceType": "github",
      "computedHash": "05c97bc3120108acd0b80bdef7fb4fa7c224ba83c8d384ccbc97f92e8a065918"
    }
  }
}
47 src/app.css
@@ -1,47 +0,0 @@
@import "tailwindcss";
@config "../tailwind.config.ts";

html, body {
  overflow-x: hidden;
}

/* Custom Project Fonts */
@font-face {
  font-family: "PP Object Sans Regular";
  src: url("/fonts/PPObjectSans-Regular.otf") format("opentype");
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans Heavy";
  src: url("/fonts/PPObjectSans-Heavy.otf") format("opentype");
  font-weight: 700;
  font-style: normal;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans Slanted";
  src: url("/fonts/PPObjectSans-Slanted.otf") format("opentype");
  font-weight: 400;
  font-style: italic;
  font-display: swap;
}

@font-face {
  font-family: "PP Object Sans HeavySlanted";
  src: url("/fonts/PPObjectSans-HeavySlanted.otf") format("opentype");
  font-weight: 700;
  font-style: italic;
  font-display: swap;
}

@font-face {
  font-family: "Hasklig Regular";
  src: url("/fonts/Hasklig-Regular.otf") format("opentype");
  font-weight: 400;
  font-style: normal;
  font-display: swap;
}
@@ -7,12 +7,11 @@ This folder houses all the components, utils and libraries that make up ThreeTwo

It is based on React 18, and uses:

1. _zustand_ for state management
1. _Redux_ for state management
2. _socket.io_ for transferring data in real-time
3. _React Router_ for routing
4. React DnD for drag-and-drop
5. @tanstack/react-table for all tables
6. @tanstack/react-query for API calls
178 src/client/actions/airdcpp.actions.tsx Normal file
@@ -0,0 +1,178 @@
import {
  SearchQuery,
  SearchInstance,
  PriorityEnum,
  SearchResponse,
} from "threetwo-ui-typings";
import {
  LIBRARY_SERVICE_BASE_URI,
  SEARCH_SERVICE_BASE_URI,
} from "../constants/endpoints";
import {
  AIRDCPP_SEARCH_RESULTS_ADDED,
  AIRDCPP_SEARCH_RESULTS_UPDATED,
  AIRDCPP_HUB_SEARCHES_SENT,
  AIRDCPP_RESULT_DOWNLOAD_INITIATED,
  AIRDCPP_DOWNLOAD_PROGRESS_TICK,
  AIRDCPP_BUNDLES_FETCHED,
  AIRDCPP_SEARCH_IN_PROGRESS,
  AIRDCPP_FILE_DOWNLOAD_COMPLETED,
  LS_SINGLE_IMPORT,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  AIRDCPP_TRANSFERS_FETCHED,
  LIBRARY_ISSUE_BUNDLES,
  AIRDCPP_SOCKET_CONNECTED,
  AIRDCPP_SOCKET_DISCONNECTED,
} from "../constants/action-types";
import { isNil } from "lodash";
import axios from "axios";

interface SearchData {
  query: Pick<SearchQuery, "pattern"> & Partial<Omit<SearchQuery, "pattern">>;
  hub_urls: string[] | undefined | null;
  priority: PriorityEnum;
}

export const sleep = (ms: number): Promise<NodeJS.Timeout> => {
  return new Promise((resolve) => setTimeout(resolve, ms));
};

export const toggleAirDCPPSocketConnectionStatus =
  (status: String, payload?: any) => async (dispatch) => {
    switch (status) {
      case "connected":
        dispatch({
          type: AIRDCPP_SOCKET_CONNECTED,
          data: payload,
        });
        break;

      case "disconnected":
        dispatch({
          type: AIRDCPP_SOCKET_DISCONNECTED,
          data: payload,
        });
        break;

      default:
        console.log("Can't set AirDC++ socket status.");
        break;
    }
  };
export const downloadAirDCPPItem =
  (
    searchInstanceId: Number,
    resultId: String,
    comicObjectId: String,
    name: String,
    size: Number,
    type: any,
    ADCPPSocket: any,
    credentials: any,
  ): void =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      let bundleDBImportResult = {};
      const downloadResult = await ADCPPSocket.post(
        `search/${searchInstanceId}/results/${resultId}/download`,
      );

      if (!isNil(downloadResult)) {
        bundleDBImportResult = await axios({
          method: "POST",
          url: `${LIBRARY_SERVICE_BASE_URI}/applyAirDCPPDownloadMetadata`,
          headers: {
            "Content-Type": "application/json; charset=utf-8",
          },
          data: {
            bundleId: downloadResult.bundle_info.id,
            comicObjectId,
            name,
            size,
            type,
          },
        });

        dispatch({
          type: AIRDCPP_RESULT_DOWNLOAD_INITIATED,
          downloadResult,
          bundleDBImportResult,
        });

        dispatch({
          type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
          comicBookDetail: bundleDBImportResult.data,
          IMS_inProgress: false,
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getBundlesForComic =
  (comicObjectId: string, ADCPPSocket: any, credentials: any) =>
  async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const comicObject = await axios({
        method: "POST",
        url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
        headers: {
          "Content-Type": "application/json; charset=utf-8",
        },
        data: {
          id: `${comicObjectId}`,
        },
      });
      // get only the bundles applicable for the comic
      if (comicObject.data.acquisition.directconnect) {
        const filteredBundles =
          comicObject.data.acquisition.directconnect.downloads.map(
            async ({ bundleId }) => {
              return await ADCPPSocket.get(`queue/bundles/${bundleId}`);
            },
          );
        dispatch({
          type: AIRDCPP_BUNDLES_FETCHED,
          bundles: await Promise.all(filteredBundles),
        });
      }
    } catch (error) {
      throw error;
    }
  };

export const getTransfers =
  (ADCPPSocket: any, credentials: any) => async (dispatch) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      const bundles = await ADCPPSocket.get("queue/bundles/1/85", {});
      if (!isNil(bundles)) {
        dispatch({
          type: AIRDCPP_TRANSFERS_FETCHED,
          bundles,
        });
        const bundleIds = bundles.map((bundle) => bundle.id);
        // get issues with matching bundleIds
        const issue_bundles = await axios({
          url: `${SEARCH_SERVICE_BASE_URI}/groupIssuesByBundles`,
          method: "POST",
          data: { bundleIds },
        });
        dispatch({
          type: LIBRARY_ISSUE_BUNDLES,
          issue_bundles,
        });
      }
    } catch (err) {
      throw err;
    }
  };
209 src/client/actions/comicinfo.actions.tsx Normal file
@@ -0,0 +1,209 @@
import axios from "axios";
import rateLimiter from "axios-rate-limit";
import { setupCache } from "axios-cache-interceptor";
import {
  CV_SEARCH_SUCCESS,
  CV_API_CALL_IN_PROGRESS,
  CV_API_GENERIC_FAILURE,
  IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
  IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
  CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  CV_CLEANUP,
  IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
  CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
  CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
  CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
  CV_WEEKLY_PULLLIST_FETCHED,
  LIBRARY_STATISTICS_CALL_IN_PROGRESS,
  LIBRARY_STATISTICS_FETCHED,
} from "../constants/action-types";
import {
  COMICVINE_SERVICE_URI,
  LIBRARY_SERVICE_BASE_URI,
} from "../constants/endpoints";

const http = rateLimiter(axios.create(), {
  maxRequests: 1,
  perMilliseconds: 1000,
  maxRPS: 1,
});
const cachedAxios = setupCache(axios);
export const getWeeklyPullList = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_WEEKLY_PULLLIST_CALL_IN_PROGRESS,
    });
    await cachedAxios(`${COMICVINE_SERVICE_URI}/getWeeklyPullList`, {
      method: "get",
      params: options,
    }).then((response) => {
      dispatch({
        type: CV_WEEKLY_PULLLIST_FETCHED,
        data: response.data.result,
      });
    });
  } catch (error) {
    console.log(error);
  }
};

export const comicinfoAPICall = (options) => async (dispatch) => {
  try {
    dispatch({
      type: CV_API_CALL_IN_PROGRESS,
      inProgress: true,
    });
    const serviceURI = `${COMICVINE_SERVICE_URI}/${options.callURIAction}`;
    const response = await http(serviceURI, {
      method: options.callMethod,
      params: options.callParams,
      data: options.data ? options.data : null,
      headers: {
        "Content-Type": "application/json",
        "Access-Control-Allow-Origin": "*",
      },
    });

    switch (options.callURIAction) {
      case "search":
        dispatch({
          type: CV_SEARCH_SUCCESS,
          searchResults: response.data,
        });
        break;

      default:
        console.log("Could not complete request.");
    }
  } catch (error) {
    console.log(error);
    dispatch({
      type: CV_API_GENERIC_FAILURE,
      error,
    });
  }
};
export const getIssuesForSeries =
  (comicObjectID: string) => async (dispatch) => {
    dispatch({
      type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
    });
    dispatch({
      type: CV_CLEANUP,
    });

    const issues = await axios({
      url: `${COMICVINE_SERVICE_URI}/getIssuesForSeries`,
      method: "POST",
      params: {
        comicObjectID,
      },
    });
    console.log(issues);
    dispatch({
      type: CV_ISSUES_FOR_VOLUME_IN_LIBRARY_SUCCESS,
      issues: issues.data.results,
    });
  };

export const analyzeLibrary = (issues) => async (dispatch) => {
  dispatch({
    type: CV_ISSUES_METADATA_CALL_IN_PROGRESS,
  });
  const queryObjects = issues.map((issue) => {
    const { id, name, issue_number } = issue;
    return {
      issueId: id,
      issueName: name,
      volumeName: issue.volume.name,
      issueNumber: issue_number,
    };
  });
  const foo = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/findIssueForSeries`,
    method: "POST",
    data: {
      queryObjects,
    },
  });

  dispatch({
|
||||
type: CV_ISSUES_MATCHES_IN_LIBRARY_FETCHED,
|
||||
matches: foo.data,
|
||||
});
|
||||
};
|
||||
|
||||
export const getLibraryStatistics = () => async (dispatch) => {
|
||||
dispatch({
|
||||
type: LIBRARY_STATISTICS_CALL_IN_PROGRESS,
|
||||
});
|
||||
const result = await axios({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/libraryStatistics`,
|
||||
method: "GET",
|
||||
});
|
||||
|
||||
dispatch({
|
||||
type: LIBRARY_STATISTICS_FETCHED,
|
||||
data: result.data,
|
||||
});
|
||||
};
|
||||
|
||||
export const getComicBookDetailById =
|
||||
(comicBookObjectId: string) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
|
||||
IMS_inProgress: true,
|
||||
});
|
||||
const result = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
|
||||
method: "POST",
|
||||
data: {
|
||||
id: comicBookObjectId,
|
||||
},
|
||||
});
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
|
||||
comicBookDetail: result.data,
|
||||
IMS_inProgress: false,
|
||||
});
|
||||
};
|
||||
|
||||
export const getComicBooksDetailsByIds =
|
||||
(comicBookObjectIds: Array<string>) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
|
||||
IMS_inProgress: true,
|
||||
});
|
||||
const result = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooksByIds`,
|
||||
method: "POST",
|
||||
data: {
|
||||
ids: comicBookObjectIds,
|
||||
},
|
||||
});
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOKS_DB_OBJECTS_FETCHED,
|
||||
comicBooks: result.data,
|
||||
});
|
||||
};
|
||||
|
||||
export const applyComicVineMatch =
|
||||
(match, comicObjectId) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_DB_OBJECT_CALL_IN_PROGRESS,
|
||||
IMS_inProgress: true,
|
||||
});
|
||||
const result = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
|
||||
method: "POST",
|
||||
data: {
|
||||
match,
|
||||
comicObjectId,
|
||||
},
|
||||
});
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
|
||||
comicBookDetail: result.data,
|
||||
IMS_inProgress: false,
|
||||
});
|
||||
};
|
||||
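Every thunk in the file above follows the same shape: the action creator returns an async function that receives `dispatch`, fires a `*_CALL_IN_PROGRESS` action, awaits an HTTP call, and dispatches the fetched payload. A minimal self-contained sketch of that pattern (the `fakeFetch` helper and the action names here are illustrative stand-ins, not part of this diff):

```typescript
type Action = { type: string; data?: unknown };
type Dispatch = (action: Action) => void;

// Illustrative stand-in for the axios calls above (not a real endpoint).
const fakeFetch = async (): Promise<{ data: string[] }> => ({
  data: ["Detective Comics #1027"],
});

// The shared thunk shape: fire an IN_PROGRESS action, await the call, dispatch the result.
const getIssues = () => async (dispatch: Dispatch): Promise<void> => {
  dispatch({ type: "ISSUES_CALL_IN_PROGRESS" });
  const result = await fakeFetch();
  dispatch({ type: "ISSUES_FETCHED", data: result.data });
};

// A hand-rolled recording dispatch, standing in for the Redux store.
const log: Action[] = [];
await getIssues()((action) => log.push(action));
console.log(log.map((a) => a.type)); // ["ISSUES_CALL_IN_PROGRESS", "ISSUES_FETCHED"]
```

Driving a thunk with a recording `dispatch` like this is also a convenient way to unit-test these action creators without constructing a store.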
385
src/client/actions/fileops.actions.tsx
Normal file
385
src/client/actions/fileops.actions.tsx
Normal file
@@ -0,0 +1,385 @@
|
||||
import axios from "axios";
|
||||
import { IFolderData } from "threetwo-ui-typings";
|
||||
import {
|
||||
COMICVINE_SERVICE_URI,
|
||||
IMAGETRANSFORMATION_SERVICE_BASE_URI,
|
||||
LIBRARY_SERVICE_BASE_URI,
|
||||
SEARCH_SERVICE_BASE_URI,
|
||||
JOB_QUEUE_SERVICE_BASE_URI,
|
||||
} from "../constants/endpoints";
|
||||
import {
|
||||
IMS_COMIC_BOOK_GROUPS_FETCHED,
|
||||
IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
|
||||
IMS_RECENT_COMICS_FETCHED,
|
||||
IMS_WANTED_COMICS_FETCHED,
|
||||
CV_API_CALL_IN_PROGRESS,
|
||||
CV_SEARCH_SUCCESS,
|
||||
CV_CLEANUP,
|
||||
IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
|
||||
IMS_CV_METADATA_IMPORT_SUCCESSFUL,
|
||||
IMS_CV_METADATA_IMPORT_FAILED,
|
||||
LS_IMPORT,
|
||||
IMG_ANALYSIS_CALL_IN_PROGRESS,
|
||||
IMG_ANALYSIS_DATA_FETCH_SUCCESS,
|
||||
IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
|
||||
SS_SEARCH_RESULTS_FETCHED,
|
||||
SS_SEARCH_IN_PROGRESS,
|
||||
FILEOPS_STATE_RESET,
|
||||
LS_IMPORT_CALL_IN_PROGRESS,
|
||||
SS_SEARCH_FAILED,
|
||||
SS_SEARCH_RESULTS_FETCHED_SPECIAL,
|
||||
WANTED_COMICS_FETCHED,
|
||||
VOLUMES_FETCHED,
|
||||
LIBRARY_SERVICE_HEALTH,
|
||||
LS_SET_QUEUE_STATUS,
|
||||
LS_IMPORT_JOB_STATISTICS_FETCHED,
|
||||
} from "../constants/action-types";
|
||||
import { success } from "react-notification-system-redux";
|
||||
|
||||
import { isNil } from "lodash";
|
||||
|
||||
export const getServiceStatus = (serviceName?: string) => async (dispatch) => {
|
||||
axios
|
||||
.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`,
|
||||
method: "GET",
|
||||
transformResponse: (r: string) => JSON.parse(r),
|
||||
})
|
||||
.then((response) => {
|
||||
const { data } = response;
|
||||
dispatch({
|
||||
type: LIBRARY_SERVICE_HEALTH,
|
||||
status: data,
|
||||
});
|
||||
});
|
||||
};
|
||||
export async function walkFolder(path: string): Promise<Array<IFolderData>> {
|
||||
return axios
|
||||
.request<Array<IFolderData>>({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/walkFolders`,
|
||||
method: "POST",
|
||||
data: {
|
||||
basePathToWalk: path,
|
||||
},
|
||||
transformResponse: (r: string) => JSON.parse(r),
|
||||
})
|
||||
.then((response) => {
|
||||
const { data } = response;
|
||||
return data;
|
||||
})
|
||||
.catch((error) => error);
|
||||
}
|
||||
/**
|
||||
* Fetches comic book covers along with some metadata
|
||||
* @return the comic book metadata
|
||||
*/
|
||||
export const fetchComicBookMetadata = () => async (dispatch) => {
|
||||
dispatch({
|
||||
type: LS_IMPORT_CALL_IN_PROGRESS,
|
||||
});
|
||||
|
||||
// dispatch(
|
||||
// success({
|
||||
// // uid: 'once-please', // you can specify your own uid if required
|
||||
// title: "Import Started",
|
||||
// message: `<span class="icon-text has-text-success"><i class="fas fa-plug"></i></span> Socket <span class="has-text-info">${socket.id}</span> connected. <strong>${walkedFolders.length}</strong> comics scanned.`,
|
||||
// dismissible: "click",
|
||||
// position: "tr",
|
||||
// autoDismiss: 0,
|
||||
// }),
|
||||
// );
|
||||
const sessionId = localStorage.getItem("sessionId");
|
||||
dispatch({
|
||||
type: LS_IMPORT,
|
||||
});
|
||||
|
||||
await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/newImport`,
|
||||
method: "POST",
|
||||
data: { sessionId },
|
||||
});
|
||||
};
|
||||
|
||||
export const getImportJobResultStatistics = () => async (dispatch) => {
|
||||
const result = await axios.request({
|
||||
url: `${JOB_QUEUE_SERVICE_BASE_URI}/getJobResultStatistics`,
|
||||
method: "GET",
|
||||
});
|
||||
dispatch({
|
||||
type: LS_IMPORT_JOB_STATISTICS_FETCHED,
|
||||
data: result.data,
|
||||
});
|
||||
};
|
||||
|
||||
export const setQueueControl =
|
||||
(queueAction: string, queueStatus: string) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: LS_SET_QUEUE_STATUS,
|
||||
meta: { remote: true },
|
||||
data: { queueAction, queueStatus },
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* Fetches comic book metadata for various types
|
||||
* @return metadata for the comic book object categories
|
||||
* @param options
|
||||
**/
|
||||
export const getComicBooks = (options) => async (dispatch) => {
|
||||
const { paginationOptions, predicate, comicStatus } = options;
|
||||
|
||||
const response = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
|
||||
method: "POST",
|
||||
data: {
|
||||
paginationOptions,
|
||||
predicate,
|
||||
},
|
||||
});
|
||||
|
||||
switch (comicStatus) {
|
||||
case "recent":
|
||||
dispatch({
|
||||
type: IMS_RECENT_COMICS_FETCHED,
|
||||
data: response.data,
|
||||
});
|
||||
break;
|
||||
case "wanted":
|
||||
dispatch({
|
||||
type: IMS_WANTED_COMICS_FETCHED,
|
||||
data: response.data.docs,
|
||||
});
|
||||
break;
|
||||
default:
|
||||
console.log("Unrecognized comic status.");
|
||||
}
|
||||
};
|
||||
|
||||
/**
|
||||
* Makes a call to library service to import the comic book metadata into the ThreeTwo data store.
|
||||
* @returns Nothing.
|
||||
* @param payload
|
||||
*/
|
||||
export const importToDB =
|
||||
(sourceName: string, metadata?: any) => (dispatch) => {
|
||||
try {
|
||||
const comicBookMetadata = {
|
||||
importType: "new",
|
||||
payload: {
|
||||
rawFileDetails: {
|
||||
name: "",
|
||||
},
|
||||
importStatus: {
|
||||
isImported: true,
|
||||
tagged: false,
|
||||
matchedResult: {
|
||||
score: "0",
|
||||
},
|
||||
},
|
||||
sourcedMetadata: metadata || null,
|
||||
acquisition: { source: { wanted: true, name: sourceName } },
|
||||
},
|
||||
};
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_CALL_IN_PROGRESS,
|
||||
});
|
||||
return axios
|
||||
.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/rawImportToDb`,
|
||||
method: "POST",
|
||||
data: comicBookMetadata,
|
||||
// transformResponse: (r: string) => JSON.parse(r),
|
||||
})
|
||||
.then((response) => {
|
||||
const { data } = response;
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_SUCCESSFUL,
|
||||
importResult: data,
|
||||
});
|
||||
});
|
||||
} catch (error) {
|
||||
dispatch({
|
||||
type: IMS_CV_METADATA_IMPORT_FAILED,
|
||||
importError: error,
|
||||
});
|
||||
}
|
||||
};
|
||||
|
||||
export const fetchVolumeGroups = () => async (dispatch) => {
|
||||
try {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_GROUPS_CALL_IN_PROGRESS,
|
||||
});
|
||||
const response = await axios.request({
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookGroups`,
|
||||
method: "GET",
|
||||
});
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_GROUPS_FETCHED,
|
||||
data: response.data,
|
||||
});
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
}
|
||||
};
|
||||
export const fetchComicVineMatches =
|
||||
(searchPayload, issueSearchQuery, seriesSearchQuery?) => async (dispatch) => {
|
||||
console.log(issueSearchQuery);
|
||||
try {
|
||||
dispatch({
|
||||
type: CV_API_CALL_IN_PROGRESS,
|
||||
});
|
||||
axios
|
||||
.request({
|
||||
url: `${COMICVINE_SERVICE_URI}/volumeBasedSearch`,
|
||||
method: "POST",
|
||||
data: {
|
||||
format: "json",
|
||||
// hack
|
||||
query: issueSearchQuery.inferredIssueDetails.name
|
||||
.replace(/[^a-zA-Z0-9 ]/g, "")
|
||||
.trim(),
|
||||
limit: "100",
|
||||
page: 1,
|
||||
resources: "volume",
|
||||
scorerConfiguration: {
|
||||
searchParams: issueSearchQuery.inferredIssueDetails,
|
||||
},
|
||||
rawFileDetails: searchPayload.rawFileDetails,
|
||||
},
|
||||
transformResponse: (r) => {
|
||||
const matches = JSON.parse(r);
|
||||
return matches;
|
||||
// return sortBy(matches, (match) => -match.score);
|
||||
},
|
||||
})
|
||||
.then((response) => {
|
||||
let matches: any = [];
|
||||
if (
|
||||
!isNil(response.data.results) &&
|
||||
response.data.results.length === 1
|
||||
) {
|
||||
matches = response.data.results;
|
||||
} else {
|
||||
matches = response.data.map((match) => match);
|
||||
}
|
||||
dispatch({
|
||||
type: CV_SEARCH_SUCCESS,
|
||||
searchResults: matches,
|
||||
searchQueryObject: {
|
||||
issue: issueSearchQuery,
|
||||
series: seriesSearchQuery,
|
||||
},
|
||||
});
|
||||
});
|
||||
} catch (error) {
|
||||
console.log(error);
|
||||
}
|
||||
|
||||
dispatch({
|
||||
type: CV_CLEANUP,
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* This method is a proxy to `uncompressFullArchive` which uncompresses complete `rar` or `zip` archives
|
||||
* @param {string} path The path to the compressed archive
|
||||
* @param {any} options Options object
|
||||
* @returns {any}
|
||||
*/
|
||||
export const extractComicArchive =
|
||||
(path: string, options: any): any =>
|
||||
async (dispatch) => {
|
||||
dispatch({
|
||||
type: IMS_COMIC_BOOK_ARCHIVE_EXTRACTION_CALL_IN_PROGRESS,
|
||||
});
|
||||
await axios({
|
||||
method: "POST",
|
||||
url: `${LIBRARY_SERVICE_BASE_URI}/uncompressFullArchive`,
|
||||
headers: {
|
||||
"Content-Type": "application/json; charset=utf-8",
|
||||
},
|
||||
data: {
|
||||
filePath: path,
|
||||
options,
|
||||
},
|
||||
});
|
||||
};
|
||||
|
||||
/**
|
||||
* Description
|
||||
* @param {any} query
|
||||
* @param {any} options
|
||||
* @returns {any}
|
||||
*/
|
||||
export const searchIssue = (query, options) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: SS_SEARCH_IN_PROGRESS,
|
||||
});
|
||||
|
||||
const response = await axios({
|
||||
url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
|
||||
method: "POST",
|
||||
data: { ...query, ...options },
|
||||
});
|
||||
|
||||
if (response.data.code === 404) {
|
||||
dispatch({
|
||||
type: SS_SEARCH_FAILED,
|
||||
data: response.data,
|
||||
});
|
||||
}
|
||||
|
||||
switch (options.trigger) {
|
||||
case "wantedComicsPage":
|
||||
dispatch({
|
||||
type: WANTED_COMICS_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
case "globalSearchBar":
|
||||
dispatch({
|
||||
type: SS_SEARCH_RESULTS_FETCHED_SPECIAL,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
|
||||
case "libraryPage":
|
||||
dispatch({
|
||||
type: SS_SEARCH_RESULTS_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
case "volumesPage":
|
||||
dispatch({
|
||||
type: VOLUMES_FETCHED,
|
||||
data: response.data.hits,
|
||||
});
|
||||
break;
|
||||
|
||||
default:
|
||||
break;
|
||||
}
|
||||
};
|
||||
export const analyzeImage =
|
||||
(imageFilePath: string | Buffer) => async (dispatch) => {
|
||||
dispatch({
|
||||
type: FILEOPS_STATE_RESET,
|
||||
});
|
||||
|
||||
dispatch({
|
||||
type: IMG_ANALYSIS_CALL_IN_PROGRESS,
|
||||
});
|
||||
|
||||
const foo = await axios({
|
||||
url: `${IMAGETRANSFORMATION_SERVICE_BASE_URI}/analyze`,
|
||||
method: "POST",
|
||||
data: {
|
||||
imageFilePath,
|
||||
},
|
||||
});
|
||||
dispatch({
|
||||
type: IMG_ANALYSIS_DATA_FETCH_SUCCESS,
|
||||
result: foo.data,
|
||||
});
|
||||
};
|
||||
`src/client/actions/metron.actions.tsx` (new file, 28 lines) `@@ -0,0 +1,28 @@`

```tsx
import axios from "axios";
import { isNil } from "lodash";
import { METRON_SERVICE_URI } from "../constants/endpoints";

export const fetchMetronResource = async (options) => {
  const metronResourceResults = await axios.post(
    `${METRON_SERVICE_URI}/fetchResource`,
    options,
  );
  console.log(metronResourceResults);
  console.log("has more? ", !isNil(metronResourceResults.data.next));
  const results = metronResourceResults.data.results.map((result) => {
    return {
      label: result.name || result.__str__,
      value: result.id,
    };
  });

  return {
    options: results,
    hasMore: !isNil(metronResourceResults.data.next),
    additional: {
      page: !isNil(metronResourceResults.data.next)
        ? options.query.page + 1
        : null,
    },
  };
};
```
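`fetchMetronResource` returns `{ options, hasMore, additional }`, the `loadOptions` contract used by paginated async select components (react-select-async-paginate follows this shape). A small sketch of the same mapping against a faked service response (the payload and `toSelectOptions` helper below are made up for illustration, not part of this diff):

```typescript
// Illustrative fake of a paginated metron service response (not a real payload).
const page1 = {
  results: [{ id: 7, name: "Moon Knight", __str__: "Moon Knight" }],
  next: "http://example.invalid/api/series/?page=2" as string | null,
};

// Same mapping as fetchMetronResource: label/value pairs plus pagination hints.
const toSelectOptions = (
  data: { results: { id: number; name?: string; __str__: string }[]; next: string | null },
  currentPage: number,
) => ({
  options: data.results.map((r) => ({ label: r.name || r.__str__, value: r.id })),
  hasMore: data.next !== null,
  // When another page exists, tell the select which page to request next.
  additional: { page: data.next !== null ? currentPage + 1 : null },
});

console.log(toSelectOptions(page1, 1));
// { options: [{ label: "Moon Knight", value: 7 }], hasMore: true, additional: { page: 2 } }
```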
`src/client/actions/settings.actions.tsx` (new file, 77 lines) `@@ -0,0 +1,77 @@`

```tsx
import axios from "axios";
import {
  SETTINGS_OBJECT_FETCHED,
  SETTINGS_CALL_IN_PROGRESS,
  SETTINGS_DB_FLUSH_SUCCESS,
  SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
} from "../reducers/settings.reducer";
import {
  LIBRARY_SERVICE_BASE_URI,
  SETTINGS_SERVICE_BASE_URI,
  QBITTORRENT_SERVICE_BASE_URI,
} from "../constants/endpoints";

export const getSettings = (settingsKey?) => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/getSettings`,
    method: "POST",
    data: settingsKey,
  });
  {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: result.data,
    });
  }
};

export const deleteSettings = () => async (dispatch) => {
  const result = await axios({
    url: `${SETTINGS_SERVICE_BASE_URI}/deleteSettings`,
    method: "POST",
  });

  if (result.data.ok === 1) {
    dispatch({
      type: SETTINGS_OBJECT_FETCHED,
      data: {},
    });
  }
};

export const flushDb = () => async (dispatch) => {
  dispatch({
    type: SETTINGS_CALL_IN_PROGRESS,
  });

  const flushDbResult = await axios({
    url: `${LIBRARY_SERVICE_BASE_URI}/flushDb`,
    method: "POST",
  });

  if (flushDbResult) {
    dispatch({
      type: SETTINGS_DB_FLUSH_SUCCESS,
      data: flushDbResult.data,
    });
  }
};

export const getQBitTorrentClientInfo = (hostInfo) => async (dispatch) => {
  await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/connect`,
    method: "POST",
    data: hostInfo,
  });
  const qBittorrentClientInfo = await axios.request({
    url: `${QBITTORRENT_SERVICE_BASE_URI}/getClientInfo`,
    method: "GET",
  });

  dispatch({
    type: SETTINGS_QBITTORRENT_TORRENTS_LIST_FETCHED,
    data: qBittorrentClientInfo.data,
  });
};

export const getProwlarrConnectionInfo = (hostInfo) => async (dispatch) => {};
```
`src/client/assets/scss/App.scss` (new file, 15 lines) `@@ -0,0 +1,15 @@`

```scss
@tailwind base;
@tailwind components;
@tailwind utilities;

@layer base {
  @font-face {
    font-family: "PP Object Sans Regular";
    src: url("/fonts/PPObjectSans-Regular.otf") format("opentype");
  }

  @font-face {
    font-family: "Hasklig Regular";
    src: url("/fonts/Hasklig-Regular.otf") format("opentype");
  }
}
```
Modified file (name not shown in this view):

```diff
@@ -1,47 +1,13 @@
-/**
- * @fileoverview Root application component.
- * Provides the main layout structure with navigation, content outlet,
- * and toast notifications. Initializes socket connection on mount.
- * @module components/App
- */
-
-import React, { ReactElement, useEffect } from "react";
+import React, { ReactElement } from "react";
 import { Outlet } from "react-router-dom";
 import { Navbar2 } from "./shared/Navbar2";
 import { ToastContainer } from "react-toastify";
 import "../../app.css";
-import { useStore } from "../store";
 import "../assets/scss/App.scss";
-
-/**
- * Root application component that provides the main layout structure.
- *
- * Features:
- * - Initializes WebSocket connection to the server on mount
- * - Renders the navigation bar across all routes
- * - Provides React Router outlet for child routes
- * - Includes toast notification container for app-wide notifications
- *
- * @returns {ReactElement} The root application layout
- * @example
- * // Used as the root element in React Router configuration
- * const router = createBrowserRouter([
- *   {
- *     path: "/",
- *     element: <App />,
- *     children: [...]
- *   }
- * ]);
- */
 export const App = (): ReactElement => {
-  useEffect(() => {
-    useStore.getState().getSocket("/"); // Connect to the base namespace
-  }, []);
-
   return (
     <>
       <Navbar2 />
       <Outlet />
       <ToastContainer stacked hideProgressBar />
     </>
   );
 };
```
Modified file (name not shown in this view), hunk `@@ -1,192 +1,239 @@`. The side-by-side view was flattened, so old and new lines appear interleaved without +/- markers:

```tsx
import React, {
  ReactElement,
  useEffect,
  useRef,
  useState,
} from "react";
import React, { useCallback, ReactElement, useEffect, useState } from "react";
import { getBundlesForComic, sleep } from "../../actions/airdcpp.actions";
import { SearchQuery, PriorityEnum, SearchResponse } from "threetwo-ui-typings";
import { RootState, SearchInstance } from "threetwo-ui-typings";
import ellipsize from "ellipsize";
import { Form, Field } from "react-final-form";
import { difference } from "../../shared/utils/object.utils";
import { isEmpty, isNil, map } from "lodash";
import { useStore } from "../../store";
import { useQuery } from "@tanstack/react-query";
import { useShallow } from "zustand/react/shallow";
import { useQuery, useQueryClient } from "@tanstack/react-query";
import axios from "axios";
import { AIRDCPP_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { Socket } from "socket.io-client";
import type { AcquisitionPanelProps } from "../../types";

interface HubData {
  hub_url: string;
  identity: { name: string };
  value: string;
}

interface AirDCPPSearchResult {
  id: string;
  dupe?: unknown;
  type: { id: string; str: string };
  name: string;
  slots: { total: number; free: number };
  users: { user: { nicks: string; flags: string[] } };
  size: number;
interface IAcquisitionPanelProps {
  query: any;
  comicObjectId: any;
  comicObject: any;
  settings: any;
}

export const AcquisitionPanel = (
  props: AcquisitionPanelProps,
  props: IAcquisitionPanelProps,
): ReactElement => {
  const socketRef = useRef<Socket | undefined>(undefined);
  const {
    airDCPPSocketInstance,
    airDCPPClientConfiguration,
    airDCPPSessionInformation,
    airDCPPDownloadTick,
  } = useStore(
    useShallow((state) => ({
      airDCPPSocketInstance: state.airDCPPSocketInstance,
      airDCPPClientConfiguration: state.airDCPPClientConfiguration,
      airDCPPSessionInformation: state.airDCPPSessionInformation,
      airDCPPDownloadTick: state.airDCPPDownloadTick,
    })),
  );

  const [dcppQuery, setDcppQuery] = useState({});
  const [airDCPPSearchResults, setAirDCPPSearchResults] = useState<AirDCPPSearchResult[]>([]);
  const [airDCPPSearchStatus, setAirDCPPSearchStatus] = useState(false);
  const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState<{ id?: string; owner?: string; expires_in?: number }>({});
  const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState<{ query?: { pattern: string; extensions: string[]; file_type: string } }>({});
  interface SearchData {
    query: Pick<SearchQuery, "pattern"> & Partial<Omit<SearchQuery, "pattern">>;
    hub_urls: string[] | undefined | null;
    priority: PriorityEnum;
  }

  /**
   * Get the hubs list from an AirDCPP Socket
   */
  const { data: hubs } = useQuery({
    queryKey: ["hubs"],
    queryFn: async () => await airDCPPSocketInstance.get(`hubs`),
  });
  const { comicObjectId } = props;
  const issueName = props.query.issue.name || "";
  const sanitizedIssueName = issueName.replace(/[^a-zA-Z0-9 ]/g, " ");

  const [dcppQuery, setDcppQuery] = useState({});
  const [airDCPPSearchResults, setAirDCPPSearchResults] = useState([]);
  const [airDCPPSearchStatus, setAirDCPPSearchStatus] = useState(false);
  const [airDCPPSearchInstance, setAirDCPPSearchInstance] = useState({});
  const [airDCPPSearchInfo, setAirDCPPSearchInfo] = useState({});
  const queryClient = useQueryClient();

  // Construct a AirDC++ query based on metadata inferred, upon component mount
  // Pre-populate the search input with the search string, so that
  // All the user has to do is hit "Search AirDC++"
  useEffect(() => {
    const socket = useStore.getState().getSocket("manual");
    socketRef.current = socket;

    // --- Handlers ---
    const handleResultAdded = ({ result }: any) => {
      setAirDCPPSearchResults((prev) =>
        prev.some((r) => r.id === result.id) ? prev : [...prev, result],
      );
    };

    const handleResultUpdated = ({ result }: any) => {
      setAirDCPPSearchResults((prev) => {
        const idx = prev.findIndex((r) => r.id === result.id);
        if (idx === -1) return prev;
        if (JSON.stringify(prev[idx]) === JSON.stringify(result)) return prev;
        const next = [...prev];
        next[idx] = result;
        return next;
      });
    };

    const handleSearchInitiated = (data: any) => {
      setAirDCPPSearchInstance(data.instance);
    };

    const handleSearchesSent = (data: any) => {
      setAirDCPPSearchInfo(data.searchInfo);
    };

    // --- Subscribe once ---
    socket.on("searchResultAdded", handleResultAdded);
    socket.on("searchResultUpdated", handleResultUpdated);
    socket.on("searchInitiated", handleSearchInitiated);
    socket.on("searchesSent", handleSearchesSent);

    return () => {
      socket.off("searchResultAdded", handleResultAdded);
      socket.off("searchResultUpdated", handleResultUpdated);
      socket.off("searchInitiated", handleSearchInitiated);
      socket.off("searchesSent", handleSearchesSent);
      // if you want to fully close the socket:
      // useStore.getState().disconnectSocket("/manual");
    };
  }, []);

  const {
    data: settings,
    isLoading,
    isError,
  } = useQuery({
    queryKey: ["settings"],
    queryFn: async () =>
      await axios({
        url: "http://localhost:3000/api/settings/getAllSettings",
        method: "GET",
      }),
  });

  const { data: hubs } = useQuery({
    queryKey: ["hubs"],
    queryFn: async () =>
      await axios({
        url: `${AIRDCPP_SERVICE_BASE_URI}/getHubs`,
        method: "POST",
        data: {
          host: settings?.data.directConnect?.client?.host,
        },
      }),
    enabled: !isEmpty(settings?.data.directConnect?.client?.host),
  });

  useEffect(() => {
    // AirDC++ search query
    const dcppSearchQuery = {
      query: {
        pattern: `${sanitizedIssueName.replace(/#/g, "")}`,
        extensions: ["cbz", "cbr", "cb7"],
      },
      hub_urls: map(hubs?.data, (item) => item.value),
      hub_urls: map(hubs, (item) => item.value),
      priority: 5,
    };
    setDcppQuery(dcppSearchQuery);
  }, [hubs, sanitizedIssueName]);
  }, []);

  const search = async (searchData: any) => {
    setAirDCPPSearchResults([]);
    socketRef.current?.emit("call", "socket.search", {
      query: searchData,
      namespace: "/manual",
      config: {
        protocol: `ws`,
        hostname: `192.168.1.119:5600`,
        username: `admin`,
        password: `password`,
      },
    });
  /**
   * Method to perform a search via an AirDC++ websocket
   * @param {SearchData} data - a SearchData query
   * @param {any} ADCPPSocket - an intialized AirDC++ socket instance
   */
  const search = async (data: SearchData, ADCPPSocket: any) => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket();
      }
      const instance: SearchInstance = await ADCPPSocket.post("search");
      setAirDCPPSearchStatus(true);

      // We want to get notified about every new result in order to make the user experience better
      await ADCPPSocket.addListener(
        `search`,
        "search_result_added",
        async (groupedResult) => {
          // ...add the received result in the UI
          // (it's probably a good idea to have some kind of throttling for the UI updates as there can be thousands of results)
          setAirDCPPSearchResults((state) => [...state, groupedResult]);
        },
        instance.id,
      );

      // We also want to update the existing items in our list when new hits arrive for the previously listed files/directories
      await ADCPPSocket.addListener(
        `search`,
        "search_result_updated",
        async (groupedResult) => {
          // ...update properties of the existing result in the UI
          const bundleToUpdateIndex = airDCPPSearchResults?.findIndex(
            (bundle) => bundle.result.id === groupedResult.result.id,
          );
          const updatedState = [...airDCPPSearchResults];
          if (
            !isNil(difference(updatedState[bundleToUpdateIndex], groupedResult))
          ) {
            updatedState[bundleToUpdateIndex] = groupedResult;
          }
          setAirDCPPSearchResults((state) => [...state, ...updatedState]);
        },
        instance.id,
      );

      // We need to show something to the user in case the search won't yield any results so that he won't be waiting forever)
      // Wait for 5 seconds for any results to arrive after the searches were sent to the hubs
      await ADCPPSocket.addListener(
        `search`,
        "search_hub_searches_sent",
        async (searchInfo) => {
          await sleep(5000);

          // Check the number of received results (in real use cases we should know that even without calling the API)
          const currentInstance = await ADCPPSocket.get(
            `search/${instance.id}`,
          );
          setAirDCPPSearchInstance(currentInstance);
          setAirDCPPSearchInfo(searchInfo);
          if (currentInstance.result_count === 0) {
            // ...nothing was received, show an informative message to the user
            console.log("No more search results.");
          }

          // The search can now be considered to be "complete"
          // If there's an "in progress" indicator in the UI, that could also be disabled here
          setAirDCPPSearchInstance(instance);
          setAirDCPPSearchStatus(false);
        },
        instance.id,
      );
      // Finally, perform the actual search
      await ADCPPSocket.post(`search/${instance.id}/hub_search`, data);
    } catch (error) {
      console.log(error);
      throw error;
    }
  };

  /**
   * Method to download a bundle associated with a search result from AirDC++
   * @param {Number} searchInstanceId - description
   * @param {String} resultId - description
   * @param {String} comicObjectId - description
   * @param {String} name - description
   * @param {Number} size - description
   * @param {any} type - description
   * @param {any} ADCPPSocket - description
   * @returns {void} - description
   */
  const download = async (
    searchInstanceId: string | number,
    resultId: string,
    comicObjectId: string,
    name: string,
    size: number,
    type: unknown,
    config: Record<string, unknown>,
  ): Promise<void> => {
    socketRef.current?.emit(
      "call",
      "socket.download",
      {
        searchInstanceId,
        resultId,
        comicObjectId,
        name,
        size,
        type,
        config,
      },
      (data: any) => {
        // Download initiated
      },
    );
  };
    searchInstanceId: Number,
    resultId: String,
    comicObjectId: String,
    name: String,
    size: Number,
    type: any,
    ADCPPSocket: any,
  ): void => {
    try {
      if (!ADCPPSocket.isConnected()) {
        await ADCPPSocket.connect();
      }
      let bundleDBImportResult = {};
      const downloadResult = await ADCPPSocket.post(
        `search/${searchInstanceId}/results/${resultId}/download`,
      );

  const getDCPPSearchResults = async (searchQuery: { issueName: string }) => {
      if (!isNil(downloadResult)) {
        bundleDBImportResult = await axios({
          method: "POST",
          url: `http://localhost:3000/api/library/applyAirDCPPDownloadMetadata`,
          headers: {
            "Content-Type": "application/json; charset=utf-8",
          },
          data: {
            bundleId: downloadResult.bundle_info.id,
            comicObjectId,
            name,
            size,
            type,
          },
        });
        console.log(bundleDBImportResult?.data);
        queryClient.invalidateQueries({ queryKey: ["comicBookMetadata"] });

        // dispatch({
        //   type: AIRDCPP_RESULT_DOWNLOAD_INITIATED,
        //   downloadResult,
        //   bundleDBImportResult,
        // });
        //
        // dispatch({
        //   type: IMS_COMIC_BOOK_DB_OBJECT_FETCHED,
        //   comicBookDetail: bundleDBImportResult.data,
        //   IMS_inProgress: false,
        // });
      }
    } catch (error) {
      throw error;
    }
  };
  const getDCPPSearchResults = async (searchQuery) => {
    const manualQuery = {
      query: {
        pattern: `${searchQuery.issueName}`,
        extensions: ["cbz", "cbr", "cb7"],
      },
      hub_urls: [hubs?.data[0].hub_url],
      hub_urls: map(hubs, (hub) => hub.hub_url),
      priority: 5,
    };

    search(manualQuery);
    search(manualQuery, airDCPPSocketInstance);
  };

  return (
    <>
      <div className="mt-5 mb-3">
```
|
||||
{!isEmpty(hubs?.data) ? (
|
||||
<div className="mt-5">
|
||||
{!isEmpty(airDCPPSocketInstance) ? (
|
||||
<Form
|
||||
onSubmit={getDCPPSearchResults}
|
||||
initialValues={{
|
||||
@@ -231,24 +278,16 @@ export const AcquisitionPanel = (
|
||||
)}
|
||||
/>
|
||||
) : (
|
||||
<article
|
||||
role="alert"
|
||||
className="mt-4 rounded-lg text-sm max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
|
||||
>
|
||||
No AirDC++ hub configured. Please configure it in{" "}
|
||||
<code>Settings > AirDC++ > Hubs</code>.
|
||||
</article>
|
||||
<div className="">
|
||||
<article className="">
|
||||
<div className="">
|
||||
AirDC++ is not configured. Please configure it in{" "}
|
||||
<code>Settings > AirDC++ > Connection</code>.
|
||||
</div>
|
||||
</article>
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
{/* configured hub */}
|
||||
{!isEmpty(hubs?.data) && (
|
||||
<span className="inline-flex items-center bg-green-50 text-slate-800 text-xs font-medium px-2.5 py-0.5 rounded-md dark:text-slate-900 dark:bg-green-300">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--server-2-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
{hubs && hubs?.data[0].hub_url}
|
||||
</span>
|
||||
)}
|
||||
|
||||
{/* AirDC++ search instance details */}
|
||||
{!isNil(airDCPPSearchInstance) &&
|
||||
@@ -259,7 +298,7 @@ export const AcquisitionPanel = (
|
||||
<dl>
|
||||
<dt>
|
||||
<div className="mb-1">
|
||||
{hubs?.data.map((value: HubData, idx: number) => (
|
||||
{hubs.map((value, idx) => (
|
||||
<span className="tag is-warning" key={idx}>
|
||||
{value.identity.name}
|
||||
</span>
|
||||
@@ -270,19 +309,19 @@ export const AcquisitionPanel = (
|
||||
<dt>
|
||||
Query:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query?.pattern}
|
||||
{airDCPPSearchInfo.query.pattern}
|
||||
</span>
|
||||
</dt>
|
||||
<dd>
|
||||
Extensions:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query?.extensions.join(", ")}
|
||||
{airDCPPSearchInfo.query.extensions.join(", ")}
|
||||
</span>
|
||||
</dd>
|
||||
<dd>
|
||||
File type:
|
||||
<span className="has-text-weight-semibold">
|
||||
{airDCPPSearchInfo.query?.file_type}
|
||||
{airDCPPSearchInfo.query.file_type}
|
||||
</span>
|
||||
</dd>
|
||||
</dl>
|
||||
@@ -298,118 +337,134 @@ export const AcquisitionPanel = (
|
||||
)}
|
||||
|
||||
{/* AirDC++ results */}
|
||||
<div className="">
|
||||
<div className="columns">
|
||||
{!isNil(airDCPPSearchResults) && !isEmpty(airDCPPSearchResults) ? (
|
||||
<div className="overflow-x-auto max-w-full mt-6">
|
||||
<table className="w-full table-auto text-sm text-gray-900 dark:text-slate-100">
|
||||
<div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200 dark:border-gray-500">
|
||||
<table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-500 text-md">
|
||||
<thead>
|
||||
<tr className="border-b border-gray-300 dark:border-slate-700">
|
||||
<th className="whitespace-nowrap px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
|
||||
<tr>
|
||||
<th className="whitespace-nowrap px-2 py-2 font-medium text-gray-900 dark:text-slate-200">
|
||||
Name
|
||||
</th>
|
||||
<th className="whitespace-nowrap px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
|
||||
<th className="whitespace-nowrap py-2 font-medium text-gray-900 dark:text-slate-200">
|
||||
Type
|
||||
</th>
|
||||
<th className="whitespace-nowrap px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
|
||||
<th className="whitespace-nowrap py-2 font-medium text-gray-900 dark:text-slate-200">
|
||||
Slots
|
||||
</th>
|
||||
<th className="whitespace-nowrap px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
|
||||
<th className="whitespace-nowrap py-2 font-medium text-gray-900 dark:text-slate-200">
|
||||
Actions
|
||||
</th>
|
||||
</tr>
|
||||
</thead>
|
||||
<tbody>
|
||||
{map(
|
||||
airDCPPSearchResults,
|
||||
({ dupe, type, name, id, slots, users, size }, idx) => (
|
||||
<tbody className="divide-y divide-slate-100 dark:divide-gray-500">
|
||||
{map(airDCPPSearchResults, ({ result }, idx) => {
|
||||
return (
|
||||
<tr
|
||||
key={idx}
|
||||
className={
|
||||
!isNil(dupe)
|
||||
? "border-b border-gray-200 dark:border-slate-700 bg-gray-100 dark:bg-gray-700"
|
||||
: "border-b border-gray-200 dark:border-slate-700 text-sm"
|
||||
!isNil(result.dupe)
|
||||
? "bg-gray-100 dark:bg-gray-700"
|
||||
: "w-fit text-sm"
|
||||
}
|
||||
>
|
||||
{/* NAME */}
|
||||
<td className="whitespace-nowrap px-3 py-3 text-gray-700 dark:text-slate-300 max-w-xs">
|
||||
<td className="whitespace-nowrap px-3 py-3 text-gray-700 dark:text-slate-300">
|
||||
<p className="mb-2">
|
||||
{/* TODO: Switch to Solar icon */}
|
||||
{type.id === "directory" && (
|
||||
<i className="fas fa-folder mr-1"></i>
|
||||
)}
|
||||
{ellipsize(name, 45)}
|
||||
{result.type.id === "directory" ? (
|
||||
<i className="fas fa-folder"></i>
|
||||
) : null}
|
||||
{ellipsize(result.name, 70)}
|
||||
</p>
|
||||
|
||||
<dl>
|
||||
<dd>
|
||||
<div className="inline-flex flex-wrap gap-1">
|
||||
{!isNil(dupe) && (
|
||||
<span className="inline-flex items-center gap-1 bg-slate-100 text-slate-800 text-xs font-medium py-0.5 px-2 rounded dark:bg-slate-400 dark:text-slate-900">
|
||||
<i className="icon-[solar--copy-bold-duotone] w-4 h-4"></i>
|
||||
Dupe
|
||||
<div className="inline-flex flex-row gap-2">
|
||||
{!isNil(result.dupe) ? (
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--copy-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
|
||||
<span className="text-md text-slate-500 dark:text-slate-900">
|
||||
Dupe
|
||||
</span>
|
||||
</span>
|
||||
) : null}
|
||||
|
||||
{/* Nicks */}
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--user-rounded-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
|
||||
<span className="text-md text-slate-500 dark:text-slate-900">
|
||||
{result.users.user.nicks}
|
||||
</span>
|
||||
)}
|
||||
<span className="inline-flex items-center gap-1 bg-slate-100 text-slate-800 text-xs font-medium py-0.5 px-2 rounded dark:bg-slate-400 dark:text-slate-900">
|
||||
<i className="icon-[solar--user-rounded-bold-duotone] w-4 h-4"></i>
|
||||
{users.user.nicks}
|
||||
</span>
|
||||
{users.user.flags.map((flag, idx) => (
|
||||
<span
|
||||
key={idx}
|
||||
className="inline-flex items-center gap-1 bg-slate-100 text-slate-800 text-xs font-medium py-0.5 px-2 rounded dark:bg-slate-400 dark:text-slate-900"
|
||||
>
|
||||
<i className="icon-[solar--tag-horizontal-bold-duotone] w-4 h-4"></i>
|
||||
{flag}
|
||||
{/* Flags */}
|
||||
{result.users.user.flags.map((flag, idx) => (
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--tag-horizontal-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
|
||||
<span className="text-md text-slate-500 dark:text-slate-900">
|
||||
{flag}
|
||||
</span>
|
||||
</span>
|
||||
))}
|
||||
</div>
|
||||
</dd>
|
||||
</dl>
|
||||
</td>
|
||||
<td>
|
||||
{/* Extension */}
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--zip-file-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
|
||||
{/* TYPE */}
|
||||
<td className="px-2 py-3">
|
||||
<span className="inline-flex items-center gap-1 bg-slate-100 text-slate-800 text-xs font-medium py-0.5 px-2 rounded dark:bg-slate-400 dark:text-slate-900">
|
||||
<i className="icon-[solar--zip-file-bold-duotone] w-4 h-4"></i>
|
||||
{type.str}
|
||||
<span className="text-md text-slate-500 dark:text-slate-900">
|
||||
{result.type.str}
|
||||
</span>
|
||||
</span>
|
||||
</td>
|
||||
<td className="px-2">
|
||||
{/* Slots */}
|
||||
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
|
||||
<span className="pr-1 pt-1">
|
||||
<i className="icon-[solar--settings-minimalistic-bold-duotone] w-5 h-5"></i>
|
||||
</span>
|
||||
|
||||
{/* SLOTS */}
|
||||
<td className="px-2 py-3">
|
||||
<span className="inline-flex items-center gap-1 bg-slate-100 text-slate-800 text-xs font-medium py-0.5 px-2 rounded dark:bg-slate-400 dark:text-slate-900">
|
||||
<i className="icon-[solar--settings-minimalistic-bold-duotone] w-4 h-4"></i>
|
||||
{slots.total} slots; {slots.free} free
|
||||
<span className="text-md text-slate-500 dark:text-slate-900">
|
||||
{result.slots.total} slots; {result.slots.free} free
|
||||
</span>
|
||||
</span>
|
||||
</td>
|
||||
|
||||
{/* ACTIONS */}
|
||||
<td className="px-2 py-3">
|
||||
<td className="px-2">
|
||||
<button
|
||||
className="inline-flex items-center gap-1 rounded border border-green-500 bg-green-500 px-2 py-1 text-xs font-medium text-white hover:bg-transparent hover:text-green-400 dark:border-green-300 dark:bg-green-300 dark:text-slate-900 dark:hover:bg-transparent"
|
||||
className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-3 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
|
||||
onClick={() =>
|
||||
download(
|
||||
airDCPPSearchInstance.id ?? "",
|
||||
id,
|
||||
airDCPPSearchInstance.id,
|
||||
result.id,
|
||||
comicObjectId,
|
||||
name,
|
||||
size,
|
||||
type,
|
||||
{
|
||||
protocol: `ws`,
|
||||
hostname: `192.168.1.119:5600`,
|
||||
username: `admin`,
|
||||
password: `password`,
|
||||
},
|
||||
result.name,
|
||||
result.size,
|
||||
result.type,
|
||||
airDCPPSocketInstance,
|
||||
)
|
||||
}
|
||||
>
|
||||
Download
|
||||
<i className="icon-[solar--download-bold-duotone] w-4 h-4"></i>
|
||||
<span className="text-xs">Download</span>
|
||||
<span className="w-5 h-5">
|
||||
<i className="h-5 w-5 icon-[solar--download-bold-duotone]"></i>
|
||||
</span>
|
||||
</button>
|
||||
</td>
|
||||
</tr>
|
||||
),
|
||||
)}
|
||||
);
|
||||
})}
|
||||
</tbody>
|
||||
</table>
|
||||
</div>
|
||||
|
||||
@@ -1,31 +1,17 @@
import React, { ReactElement } from "react";
import Select, { StylesConfig, SingleValue } from "react-select";
import { ActionOption } from "../actionMenuConfig";
import Select from "react-select";

interface MenuConfiguration {
filteredActionOptions: ActionOption[];
customStyles: StylesConfig<ActionOption, false>;
handleActionSelection: (action: SingleValue<ActionOption>) => void;
}

interface MenuProps {
data?: unknown;
handlers?: {
setSlidingPanelContentId: (id: string) => void;
setVisible: (visible: boolean) => void;
};
configuration: MenuConfiguration;
}

export const Menu = (props: MenuProps): ReactElement => {
export const Menu = (props): ReactElement => {
const {
filteredActionOptions,
customStyles,
handleActionSelection,
Placeholder,
} = props.configuration;

return (
<Select<ActionOption, false>
<Select
components={{ Placeholder }}
placeholder={
<span className="inline-flex flex-row items-center gap-2 pt-1">
<div className="w-6 h-6">

@@ -3,67 +3,44 @@ import prettyBytes from "pretty-bytes";
import dayjs from "dayjs";
import ellipsize from "ellipsize";
import { map } from "lodash";
import { DownloadProgressTick } from "./DownloadProgressTick";

interface BundleData {
id: string;
name: string;
target: string;
size: number;
}

interface AirDCPPBundlesProps {
data: BundleData[];
}

export const AirDCPPBundles = (props: AirDCPPBundlesProps) => {
export const AirDCPPBundles = (props) => {
return (
<div className="overflow-x-auto w-fit mt-6">
<table className="min-w-full text-sm text-gray-900 dark:text-slate-100">
<div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
<table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
<thead>
<tr className="border-b border-gray-300 dark:border-slate-700">
<th className="px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
<tr>
<th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
Filename
</th>
<th className="px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
<th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
Size
</th>
<th className="px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
Download Status
<th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
Download Time
</th>
<th className="px-3 py-2 text-left text-[11px] font-semibold tracking-wide text-gray-500 dark:text-slate-400 uppercase">
<th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
Bundle ID
</th>
</tr>
</thead>
<tbody>
{map(props.data, (bundle, index) => (
<tr
key={bundle.id}
className={
Number(index) !== props.data.length - 1
? "border-b border-gray-200 dark:border-slate-700"
: ""
}
>
<td className="px-3 py-2 align-top">
<h5 className="font-medium text-gray-800 dark:text-slate-200">
{ellipsize(bundle.name, 58)}
</h5>
<p className="text-xs text-gray-500 dark:text-slate-400">
{ellipsize(bundle.target, 88)}
</p>
<tbody className="divide-y divide-gray-200">
{map(props.data, (bundle) => (
<tr key={bundle.id} className="text-sm">
<td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
<h5>{ellipsize(bundle.name, 58)}</h5>
<span className="text-xs">{ellipsize(bundle.target, 88)}</span>
</td>
<td className="px-3 py-2 align-top">
<td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
{prettyBytes(bundle.size)}
</td>
<td className="px-3 py-2 align-top">
<DownloadProgressTick bundleId={bundle.id} />
<td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
{dayjs
.unix(bundle.time_finished)
.format("h:mm on ddd, D MMM, YYYY")}
</td>
<td className="px-3 py-2 align-top">
<span className="text-xs text-yellow-800 dark:text-yellow-300 font-medium">
{bundle.id}
</span>
<td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
<span className="tag is-warning">{bundle.id}</span>
</td>
</tr>
))}

@@ -1,70 +1,36 @@
import React, { ReactElement, useCallback, useState } from "react";
import axios from "axios";
import { isNil } from "lodash";
import PropTypes from "prop-types";
import { fetchMetronResource } from "../../../actions/metron.actions";
import Creatable from "react-select/creatable";
import { withAsyncPaginate } from "react-select-async-paginate";
import { METRON_SERVICE_URI } from "../../../constants/endpoints";

const CreatableAsyncPaginate = withAsyncPaginate(Creatable);

export interface AsyncSelectPaginateProps {
metronResource?: string;
placeholder?: string | React.ReactNode;
value?: object;
onChange?(...args: unknown[]): unknown;
meta?: Record<string, unknown>;
input?: Record<string, unknown>;
name?: string;
type?: string;
}

interface AdditionalType {
page: number | null;
}

interface MetronResultItem {
name?: string;
__str__?: string;
id: number;
}

export const AsyncSelectPaginate = (props: AsyncSelectPaginateProps): ReactElement => {
export const AsyncSelectPaginate = (props): ReactElement => {
const [value, setValue] = useState(null);
const [isAddingInProgress, setIsAddingInProgress] = useState(false);

const loadData = useCallback(async (
query: string,
_loadedOptions: unknown,
additional?: AdditionalType
) => {
const page = additional?.page ?? 1;
const options = {
const loadData = useCallback((query, loadedOptions, { page }) => {
return fetchMetronResource({
method: "GET",
resource: props.metronResource || "",
query: { name: query, page },
};
const response = await axios.post(`${METRON_SERVICE_URI}/fetchResource`, options);
const results = response.data.results.map((result: MetronResultItem) => ({
label: result.name || result.__str__,
value: result.id,
}));
return {
options: results,
hasMore: !isNil(response.data.next),
additional: {
page: !isNil(response.data.next) ? page + 1 : null,
resource: props.metronResource,
query: {
name: query,
page,
},
};
}, [props.metronResource]);
});
}, []);

return (
<CreatableAsyncPaginate
SelectComponent={Creatable}
debounceTimeout={200}
isDisabled={isAddingInProgress}
value={props.value}
// eslint-disable-next-line @typescript-eslint/no-explicit-any
loadOptions={loadData as any}
loadOptions={loadData}
placeholder={props.placeholder}
// onCreateOption={onCreateOption}
onChange={props.onChange}
// cacheUniqs={[cacheUniq]}
additional={{
page: 1,
}}
@@ -72,4 +38,11 @@ export const AsyncSelectPaginate = (props: AsyncSelectPaginateProps): ReactEleme
);
};

AsyncSelectPaginate.propTypes = {
metronResource: PropTypes.string.isRequired,
placeholder: PropTypes.string,
value: PropTypes.object,
onChange: PropTypes.func,
};

export default AsyncSelectPaginate;

@@ -1,37 +1,50 @@
import React, { useState, ReactElement, useCallback, useMemo } from "react";
import React, { useState, ReactElement, useCallback } from "react";
import { useParams } from "react-router-dom";
import Card from "../shared/Carda";
import { ComicVineMatchPanel } from "./ComicVineMatchPanel";

import { RawFileDetails } from "./RawFileDetails";
import { ComicVineSearchForm } from "./ComicVineSearchForm";

import TabControls from "./TabControls";
import { EditMetadataPanel } from "./EditMetadataPanel";
import { Menu } from "./ActionMenu/Menu";
import { ArchiveOperations } from "./Tabs/ArchiveOperations";
import { ComicInfoXML } from "./Tabs/ComicInfoXML";
import AcquisitionPanel from "./AcquisitionPanel";
import TorrentSearchPanel from "./TorrentSearchPanel";
import DownloadsPanel from "./DownloadsPanel";
import { VolumeInformation } from "./Tabs/VolumeInformation";

import { isEmpty, isUndefined, isNil, filter } from "lodash";
import { components } from "react-select";
import { RootState } from "threetwo-ui-typings";

import "react-sliding-pane/dist/react-sliding-pane.css";
import "react-loader-spinner/dist/loader/css/react-spinner-loader.css";
import Loader from "react-loader-spinner";
import SlidingPane from "react-sliding-pane";
import Modal from "react-modal";
import ComicViewer from "react-comic-viewer";

import { extractComicArchive } from "../../actions/fileops.actions";
import { determineCoverFile } from "../../shared/utils/metadata.utils";
import axios from "axios";
import { styled } from "styled-components";
import type { ComicDetailProps } from "../../types";

// Extracted modules
import { useComicVineMatching } from "./useComicVineMatching";
import { createTabConfig } from "./tabConfig";
import { actionOptions, customStyles, ActionOption } from "./actionMenuConfig";
import { CVMatchesPanel, EditMetadataPanelWrapper } from "./SlidingPanelContent";

// Styled component - moved outside to prevent recreation
const StyledSlidingPanel = styled(SlidingPane)`
background: #ccc;
`;
import { COMICVINE_SERVICE_URI } from "../../constants/endpoints";
import { refineQuery } from "filename-parser";

type ComicDetailProps = {};
/**
* Displays full comic detail: cover, file info, action menu, and tabbed panels
* for metadata, archive operations, and acquisition.
* Component for displaying the metadata for a comic in greater detail.
*
* @param data.queryClient - react-query client passed through to the CV match
* panel so it can invalidate queries after a match is applied.
* @param data.comicObjectId - optional override for the comic ID; used when the
* component is rendered outside a route that provides the ID via `useParams`.
* @component
* @example
* return (
*   <ComicDetail/>
* )
*/

export const ComicDetail = (data: ComicDetailProps): ReactElement => {
const {
data: {
@@ -41,22 +54,135 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
sourcedMetadata: { comicvine, locg, comicInfo },
acquisition,
createdAt,
updatedAt,
},
userSettings,
queryClient,
comicObjectId: comicObjectIdProp,
} = data;

const [activeTab, setActiveTab] = useState<number | undefined>(undefined);
const [page, setPage] = useState(1);
const [visible, setVisible] = useState(false);
const [slidingPanelContentId, setSlidingPanelContentId] = useState("");
const [modalIsOpen, setIsOpen] = useState(false);
const [comicVineMatches, setComicVineMatches] = useState([]);

const { comicObjectId } = useParams<{ comicObjectId: string }>();
const { comicVineMatches, prepareAndFetchMatches } = useComicVineMatching();

// const dispatch = useDispatch();

const openModal = useCallback((filePath) => {
setIsOpen(true);
// dispatch(
//   extractComicArchive(filePath, {
//     type: "full",
//     purpose: "reading",
//     imageResizeOptions: {
//       baseWidth: 1024,
//     },
//   }),
// );
}, []);

// overridden <SlidingPanel> with some styles
const StyledSlidingPanel = styled(SlidingPane)`
background: #ccc;
`;
const afterOpenModal = useCallback((things) => {
// references are now sync'd and can be accessed.
// subtitle.style.color = "#f00";
console.log("kolaveri", things);
}, []);

const closeModal = useCallback(() => {
setIsOpen(false);
}, []);

// sliding panel init
const contentForSlidingPanel = {
CVMatches: {
content: (props) => (
<>
<div>
<ComicVineSearchForm data={rawFileDetails} />
</div>

<div className="border-slate-500 border rounded-lg p-2 mt-3">
<p className="">Searching for:</p>
{inferredMetadata.issue ? (
<>
<span className="">{inferredMetadata.issue.name} </span>
<span className=""> # {inferredMetadata.issue.number} </span>
</>
) : null}
</div>
<ComicVineMatchPanel
props={{
comicVineMatches,
comicObjectId,
}}
/>
</>
),
},

editComicBookMetadata: {
content: () => <EditMetadataPanel data={rawFileDetails} />,
},
};

// Actions

const fetchComicVineMatches = async (
searchPayload,
issueSearchQuery,
seriesSearchQuery,
) => {
try {
const response = await axios({
url: `${COMICVINE_SERVICE_URI}/volumeBasedSearch`,
method: "POST",
data: {
format: "json",
// hack
query: issueSearchQuery.inferredIssueDetails.name
.replace(/[^a-zA-Z0-9 ]/g, "")
.trim(),
limit: "100",
page: 1,
resources: "volume",
scorerConfiguration: {
searchParams: issueSearchQuery.inferredIssueDetails,
},
rawFileDetails: searchPayload,
},
transformResponse: (r) => {
const matches = JSON.parse(r);
return matches;
// return sortBy(matches, (match) => -match.score);
},
});
let matches: any = [];
if (!isNil(response.data.results) && response.data.results.length === 1) {
matches = response.data.results;
} else {
matches = response.data.map((match) => match);
}
const scoredMatches = matches.sort((a, b) => b.score - a.score);
setComicVineMatches(scoredMatches);
} catch (err) {
console.log(err);
}
};

// Action event handlers
const openDrawerWithCVMatches = () => {
prepareAndFetchMatches(rawFileDetails, comicvine);
let seriesSearchQuery: IComicVineSearchQuery = {} as IComicVineSearchQuery;
let issueSearchQuery: IComicVineSearchQuery = {} as IComicVineSearchQuery;

if (!isUndefined(rawFileDetails)) {
issueSearchQuery = refineQuery(rawFileDetails.name);
} else if (!isEmpty(comicvine)) {
issueSearchQuery = refineQuery(comicvine.name);
}
fetchComicVineMatches(rawFileDetails, issueSearchQuery, seriesSearchQuery);
setSlidingPanelContentId("CVMatches");
setVisible(true);
};
@@ -66,17 +192,47 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
setVisible(true);
}, []);

// Hide "match on Comic Vine" when there are no raw file details — matching
// requires file metadata to seed the search query.
const filteredActionOptions: ActionOption[] = actionOptions.filter((item) => {
// Actions menu options and handler
const CVMatchLabel = (
<span className="inline-flex flex-row items-center gap-2">
<div className="w-6 h-6">
<i className="icon-[solar--magic-stick-3-bold-duotone] w-6 h-6"></i>
</div>
<div>Match on ComicVine</div>
</span>
);
const editLabel = (
<span className="inline-flex flex-row items-center gap-2">
<div className="w-6 h-6">
<i className="icon-[solar--pen-2-bold-duotone] w-6 h-6"></i>
</div>
<div>Edit Metadata</div>
</span>
);
const deleteLabel = (
<span className="inline-flex flex-row items-center gap-2">
<div className="w-6 h-6">
<i className="icon-[solar--trash-bin-trash-bold-duotone] w-6 h-6"></i>
</div>
<div>Delete Comic</div>
</span>
);
const Placeholder = (props) => {
return <components.Placeholder {...props} />;
};
const actionOptions = [
{ value: "match-on-comic-vine", label: CVMatchLabel },
{ value: "edit-metdata", label: editLabel },
{ value: "delete-comic", label: deleteLabel },
];

const filteredActionOptions = filter(actionOptions, (item) => {
if (isUndefined(rawFileDetails)) {
return item.value !== "match-on-comic-vine";
}
return true;
return item;
});

const handleActionSelection = (action: ActionOption | null) => {
if (!action) return;
const handleActionSelection = (action) => {
switch (action.value) {
case "match-on-comic-vine":
openDrawerWithCVMatches();
@@ -85,19 +241,40 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
openEditMetadataPanel();
break;
default:
console.log("No valid action selected.");
break;
}
};
const customStyles = {
menu: (base) => ({
...base,
backgroundColor: "rgb(156, 163, 175)",
}),
placeholder: (base) => ({
...base,
color: "black",
}),
option: (base, { data, isDisabled, isFocused, isSelected }) => ({
...base,
backgroundColor: isFocused ? "gray" : "rgb(156, 163, 175)",
}),
singleValue: (base) => ({
...base,
paddingTop: "0.4rem",
}),
control: (base) => ({
...base,
backgroundColor: "rgb(156, 163, 175)",
color: "black",
border: "1px solid rgb(156, 163, 175)",
}),
};

// Check for metadata availability
// check for the availability of CV metadata
const isComicBookMetadataAvailable =
!isUndefined(comicvine) && !isUndefined(comicvine?.volumeInformation);

const hasAnyMetadata =
isComicBookMetadataAvailable ||
!isEmpty(comicInfo) ||
!isNil(locg);
!isUndefined(comicvine) && !isUndefined(comicvine.volumeInformation);

// check for the availability of rawFileDetails
const areRawFileDetailsAvailable =
!isUndefined(rawFileDetails) && !isEmpty(rawFileDetails);

@@ -107,58 +284,110 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
locg,
});

// Query for airdc++
const airDCPPQuery = useMemo(() => ({
issue: { name: issueName },
}), [issueName]);

// Create tab configuration
const openReconcilePanel = useCallback(() => {
setSlidingPanelContentId("metadataReconciliation");
setVisible(true);
}, []);

const tabGroup = useMemo(() => createTabConfig({
data: data.data,
hasAnyMetadata,
areRawFileDetailsAvailable,
airDCPPQuery,
comicObjectId: _id,
userSettings,
issueName,
acquisition,
onReconcileMetadata: openReconcilePanel,
}), [data.data, hasAnyMetadata, areRawFileDetailsAvailable, airDCPPQuery, _id, userSettings, issueName, acquisition, openReconcilePanel]);

const filteredTabs = useMemo(() => tabGroup.filter((tab) => tab.shouldShow), [tabGroup]);

// Sliding panel content mapping
const renderSlidingPanelContent = () => {
switch (slidingPanelContentId) {
case "CVMatches":
return (
<CVMatchesPanel
rawFileDetails={rawFileDetails}
inferredMetadata={inferredMetadata}
comicVineMatches={comicVineMatches}
// Prefer the route param; fall back to the data ID when rendered outside a route.
comicObjectId={comicObjectId || _id}
queryClient={queryClient}
|
||||
onMatchApplied={() => {
|
||||
setVisible(false);
|
||||
setActiveTab(1);
|
||||
}}
|
||||
/>
|
||||
);
|
||||
case "editComicBookMetadata":
|
||||
return <EditMetadataPanelWrapper rawFileDetails={rawFileDetails} />;
|
||||
default:
|
||||
return null;
|
||||
}
|
||||
// query for airdc++
|
||||
const airDCPPQuery = {
|
||||
issue: {
|
||||
name: issueName,
|
||||
},
|
||||
};
|
||||
|
||||
// Tab content and header details
|
||||
const tabGroup = [
|
||||
{
|
||||
id: 1,
|
||||
name: "Volume Information",
|
||||
icon: (
|
||||
<i className="h-5 w-5 icon-[solar--book-2-bold] text-slate-500 dark:text-slate-300"></i>
|
||||
),
|
||||
content: isComicBookMetadataAvailable ? (
|
||||
<VolumeInformation data={data.data} key={1} />
|
||||
) : null,
|
||||
shouldShow: isComicBookMetadataAvailable,
|
||||
},
|
||||
{
|
||||
id: 2,
|
||||
name: "ComicInfo.xml",
|
||||
icon: (
|
||||
<i className="h-5 w-5 icon-[solar--code-file-bold-duotone] text-slate-500 dark:text-slate-300" />
|
||||
),
|
||||
content: (
|
||||
<div key={2}>
|
||||
{!isNil(comicInfo) && <ComicInfoXML json={comicInfo} />}
|
||||
</div>
|
||||
),
|
||||
shouldShow: !isEmpty(comicInfo),
|
||||
},
|
||||
{
|
||||
id: 3,
|
||||
icon: (
|
||||
<i className="h-5 w-5 icon-[solar--winrar-bold-duotone] text-slate-500 dark:text-slate-300" />
|
||||
),
|
||||
name: "Archive Operations",
|
||||
content: <ArchiveOperations data={data.data} key={3} />,
|
||||
shouldShow: areRawFileDetailsAvailable,
|
||||
},
|
||||
{
|
||||
id: 4,
|
||||
icon: (
|
||||
<i className="h-5 w-5 icon-[solar--folder-path-connect-bold-duotone] text-slate-500 dark:text-slate-300" />
|
||||
),
|
||||
name: "DC++ Search",
|
||||
content: (
|
||||
<AcquisitionPanel
|
||||
query={airDCPPQuery}
|
||||
comicObjectId={_id}
|
||||
comicObject={data.data}
|
||||
userSettings={userSettings}
|
||||
key={4}
|
||||
/>
|
||||
),
|
||||
shouldShow: true,
|
||||
},
|
||||
{
|
||||
id: 5,
|
||||
icon: (
|
||||
<span className="inline-flex flex-row">
|
||||
<i className="h-5 w-5 icon-[solar--magnet-bold-duotone] text-slate-500 dark:text-slate-300" />
|
||||
</span>
|
||||
),
|
||||
name: "Torrent Search",
|
||||
content: <TorrentSearchPanel comicObjectId={_id} issueName={issueName} />,
|
||||
shouldShow: true,
|
||||
},
|
||||
{
|
||||
id: 6,
|
||||
name: "Downloads",
|
||||
icon: (
|
||||
<>
|
||||
{acquisition?.directconnect?.downloads?.length +
|
||||
acquisition?.torrent.length}
|
||||
</>
|
||||
),
|
||||
content:
|
||||
!isNil(data.data) && !isEmpty(data.data) ? (
|
||||
<DownloadsPanel key={5} />
|
||||
) : (
|
||||
<div className="column is-three-fifths">
|
||||
<article className="message is-info">
|
||||
<div className="message-body is-size-6 is-family-secondary">
|
||||
AirDC++ is not configured. Please configure it in{" "}
|
||||
<code>Settings</code>.
|
||||
</div>
|
||||
</article>
|
||||
</div>
|
||||
),
|
||||
shouldShow: true,
|
||||
},
|
||||
];
|
||||
// filtered Tabs
|
||||
const filteredTabs = tabGroup.filter((tab) => tab.shouldShow);
|
||||
|
||||
// Determine which cover image to use:
|
||||
// 1. from the locally imported or
|
||||
// 2. from the CV-scraped version
|
||||
|
||||
return (
|
||||
<section className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
|
||||
<section className="container mx-auto">
|
||||
<div className="section">
|
||||
{!isNil(data) && !isEmpty(data) && (
|
||||
<>
|
||||
@@ -172,13 +401,14 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
|
||||
{/* raw file details */}
|
||||
{!isUndefined(rawFileDetails) &&
|
||||
!isEmpty(rawFileDetails?.cover) && (
|
||||
!isEmpty(rawFileDetails.cover) && (
|
||||
<div className="grid">
|
||||
<RawFileDetails
|
||||
data={{
|
||||
rawFileDetails,
|
||||
inferredMetadata,
|
||||
createdAt,
|
||||
rawFileDetails: rawFileDetails,
|
||||
inferredMetadata: inferredMetadata,
|
||||
created_at: createdAt,
|
||||
updated_at: updatedAt,
|
||||
}}
|
||||
>
|
||||
{/* action dropdown */}
|
||||
@@ -190,10 +420,30 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
filteredActionOptions,
|
||||
customStyles,
|
||||
handleActionSelection,
|
||||
Placeholder,
|
||||
}}
|
||||
/>
|
||||
</div>
|
||||
</RawFileDetails>
|
||||
|
||||
{/* <Modal
|
||||
style={{ content: { marginTop: "2rem" } }}
|
||||
isOpen={modalIsOpen}
|
||||
onAfterOpen={afterOpenModal}
|
||||
onRequestClose={closeModal}
|
||||
contentLabel="Example Modal"
|
||||
>
|
||||
<button onClick={closeModal}>close</button>
|
||||
{extractedComicBook && (
|
||||
<ComicViewer
|
||||
pages={extractedComicBook}
|
||||
direction="ltr"
|
||||
className={{
|
||||
closeButton: "border: 1px solid red;",
|
||||
}}
|
||||
/>
|
||||
)}
|
||||
</Modal> */}
|
||||
</div>
|
||||
)}
|
||||
</div>
|
||||
@@ -201,9 +451,7 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
|
||||
<TabControls
|
||||
filteredTabs={filteredTabs}
|
||||
downloadCount={acquisition?.directconnect?.downloads?.length || 0}
|
||||
activeTab={activeTab}
|
||||
setActiveTab={setActiveTab}
|
||||
downloadCount={acquisition?.directconnect?.downloads?.length}
|
||||
/>
|
||||
|
||||
<StyledSlidingPanel
|
||||
@@ -212,7 +460,8 @@ export const ComicDetail = (data: ComicDetailProps): ReactElement => {
|
||||
title={"Comic Vine Search Matches"}
|
||||
width={"600px"}
|
||||
>
|
||||
{renderSlidingPanelContent()}
|
||||
{slidingPanelContentId !== "" &&
|
||||
contentForSlidingPanel[slidingPanelContentId].content()}
|
||||
</StyledSlidingPanel>
|
||||
</>
|
||||
)}
|
||||
|
||||
@@ -1,40 +1,35 @@
import React, { ReactElement } from "react";
import { useParams } from "react-router-dom";
import { ComicDetail } from "../ComicDetail/ComicDetail";
import { useQueryClient } from "@tanstack/react-query";
import { useGetComicByIdQuery } from "../../graphql/generated";
import { adaptGraphQLComicToLegacy } from "../../graphql/adapters/comicAdapter";
import { useQuery } from "@tanstack/react-query";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import axios from "axios";

export const ComicDetailContainer = (): ReactElement | null => {
const { comicObjectId } = useParams<{ comicObjectId: string }>();
const queryClient = useQueryClient();

const {
data: comicBookDetailData,
isLoading,
isError,
} = useGetComicByIdQuery(
{ id: comicObjectId! },
{ enabled: !!comicObjectId }
} = useQuery({
queryKey: ["comicBookMetadata"],
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
method: "POST",
data: {
id: comicObjectId,
},
}),
});

{
isError && <>Error</>;
}
{
isLoading && <>Loading...</>;
}
return (
comicBookDetailData?.data && <ComicDetail data={comicBookDetailData.data} />
);

if (isError) {
return <div>Error loading comic details</div>;
}

if (isLoading) {
return <div>Loading...</div>;
}

const adaptedData = comicBookDetailData?.comic
? adaptGraphQLComicToLegacy(comicBookDetailData.comic)
: null;

return adaptedData ? (
<ComicDetail
data={adaptedData}
queryClient={queryClient}
comicObjectId={comicObjectId}
/>
) : null;
};

@@ -1,22 +1,13 @@
import React, { ReactElement } from "react";
import PropTypes from "prop-types";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import dayjs from "dayjs";
import { isEmpty, isUndefined } from "lodash";
import Card from "../shared/Carda";
import { convert } from "html-to-text";
import type { ComicVineDetailsProps } from "../../types";

export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement => {
export const ComicVineDetails = (props): ReactElement => {
const { data, updatedAt } = props;

if (!data || !data.volumeInformation) {
return <div className="text-slate-500 dark:text-gray-400">No ComicVine data available</div>;
}

const detectedIssueType = data.volumeInformation.description
? detectIssueTypes(data.volumeInformation.description)
: undefined;

return (
<div className="text-slate-500 dark:text-gray-400">
<div className="">
@@ -24,9 +15,10 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
<div className="flex flex-row gap-4">
<div className="min-w-fit">
<Card
imageUrl={data.volumeInformation.image?.thumb_url}
imageUrl={data.volumeInformation.image.thumb_url}
orientation={"cover-only"}
hasDetails={false}
// cardContainerStyle={{ maxWidth: 200 }}
/>
</div>
<div className="flex flex-col gap-5">
@@ -48,7 +40,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
<div className="text-md">ComicVine Metadata</div>
<div className="text-sm">
Last scraped on{" "}
{updatedAt ? dayjs(updatedAt).format("MMM D YYYY [at] h:mm a") : "Unknown"}
{dayjs(updatedAt).format("MMM D YYYY [at] h:mm a")}
</div>
<div className="text-sm">
ComicVine Issue ID
@@ -60,7 +52,7 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
{/* Publisher details */}
<div className="ml-8">
Published by{" "}
<span>{data.volumeInformation.publisher?.name}</span>
<span>{data.volumeInformation.publisher.name}</span>
<div>
Total issues in this volume{" "}
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
@@ -76,11 +68,16 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
<span>{data.issue_number}</span>
</div>
)}
{!isUndefined(detectedIssueType) ? (
{!isUndefined(
detectIssueTypes(data.volumeInformation.description),
) ? (
<div>
<span>Detected Type</span>
<span>
{detectedIssueType.displayName}
{
detectIssueTypes(data.volumeInformation.description)
.displayName
}
</span>
</div>
) : data.resource_type ? (
@@ -95,7 +92,6 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
{/* Description */}
<div className="mt-3 w-3/4">
{!isEmpty(data.description) &&
data.description &&
convert(data.description, {
baseElements: {
selectors: ["p"],
@@ -111,3 +107,13 @@ export const ComicVineDetails = (props: ComicVineDetailsProps): ReactElement =>
};

export default ComicVineDetails;

ComicVineDetails.propTypes = {
updatedAt: PropTypes.string,
data: PropTypes.shape({
name: PropTypes.string,
number: PropTypes.string,
resource_type: PropTypes.string,
id: PropTypes.number,
}),
};

@@ -1,13 +1,12 @@
import React, { ReactElement } from "react";
import { ComicVineSearchForm } from "../ComicVineSearchForm";
import MatchResult from "./MatchResult";
import { isEmpty } from "lodash";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import type { ComicVineMatchPanelProps } from "../../types";

/** Displays ComicVine search results or a status message while searching. */
export const ComicVineMatchPanel = ({ props: comicVineData }: ComicVineMatchPanelProps): ReactElement => {
const { comicObjectId, comicVineMatches, queryClient, onMatchApplied } = comicVineData;
export const ComicVineMatchPanel = (comicVineData): ReactElement => {
const { comicObjectId, comicVineMatches } = comicVineData.props;
const { comicvine } = useStore(
useShallow((state) => ({
comicvine: state.comicvine,
@@ -20,8 +19,6 @@ export const ComicVineMatchPanel = ({ props: comicVineData }: ComicVineMatchPane
<MatchResult
matchData={comicVineMatches}
comicObjectId={comicObjectId}
queryClient={queryClient}
onMatchApplied={onMatchApplied}
/>
) : (
<>

@@ -1,16 +1,7 @@
import React, { useCallback } from "react";
import { Form, Field } from "react-final-form";
import { ValidationErrors } from "final-form";

interface ComicVineSearchFormProps {
rawFileDetails?: Record<string, unknown>;
}

interface SearchFormValues {
issueName?: string;
issueNumber?: string;
issueYear?: string;
}
import Collapsible from "react-collapsible";
import { fetchComicVineMatches } from "../../actions/fileops.actions";

/**
* Component for performing search against ComicVine
@@ -21,8 +12,8 @@ interface SearchFormValues {
* <ComicVineSearchForm data={rawFileDetails} />
* )
*/
export const ComicVineSearchForm = (props: ComicVineSearchFormProps) => {
const onSubmit = useCallback((value: SearchFormValues) => {
export const ComicVineSearchForm = (data) => {
const onSubmit = useCallback((value) => {
const userInititatedQuery = {
inferredIssueDetails: {
name: value.issueName,
@@ -33,8 +24,8 @@ export const ComicVineSearchForm = (props: ComicVineSearchFormProps) => {
};
// dispatch(fetchComicVineMatches(data, userInititatedQuery));
}, []);
const validate = (_values: SearchFormValues): ValidationErrors | undefined => {
return undefined;
const validate = () => {
return true;
};

const MyForm = () => (
@@ -43,46 +34,52 @@ export const ComicVineSearchForm = (props: ComicVineSearchFormProps) => {
validate={validate}
render={({ handleSubmit }) => (
<form onSubmit={handleSubmit}>
<label className="block py-1 text-slate-700 dark:text-slate-200">Issue Name</label>
<span className="flex items-center">
<span className="text-md text-slate-500 dark:text-slate-500 pr-5">
Override Search Query
</span>
<span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
</span>
<label className="block py-1">Issue Name</label>
<Field name="issueName">
{(props) => (
<input
{...props.input}
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-full rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-full rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
placeholder="Type the issue name"
/>
)}
</Field>
<div className="flex flex-row gap-4 mt-2">
<div className="flex flex-row gap-4">
<div>
<label className="block py-1 text-slate-700 dark:text-slate-200">Number</label>
<label className="block py-1">Number</label>
<Field name="issueNumber">
{(props) => (
<input
{...props.input}
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-14 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-14 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
placeholder="#"
/>
)}
</Field>
</div>
<div>
<label className="block py-1 text-slate-700 dark:text-slate-200">Year</label>
<label className="block py-1">Year</label>
<Field name="issueYear">
{(props) => (
<input
{...props.input}
className="appearance-none bg-slate-100 dark:bg-slate-700 h-10 w-20 rounded-md border border-slate-300 dark:border-slate-600 text-slate-900 dark:text-slate-100 py-1 pr-2 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:ring-2 focus:ring-blue-500 focus:border-blue-300"
className="appearance-none dark:bg-slate-100 bg-slate-100 h-10 w-20 rounded-md border-none text-gray-700 dark:text-slate-200 py-1 pr-7 pl-3 sm:text-md sm:leading-5 focus:outline-none focus:shadow-outline-blue focus:border-blue-300"
placeholder="1984"
/>
)}
</Field>
</div>

<div className="flex items-end">
<div className="flex justify-end mt-5">
<button
type="submit"
className="flex h-10 items-center rounded-lg border border-green-500 dark:border-green-400 bg-green-500 dark:bg-green-600 px-4 py-2 text-white font-medium hover:bg-green-600 dark:hover:bg-green-500 focus:outline-none focus:ring-2 focus:ring-green-500 focus:ring-offset-2 active:bg-green-700"
className="flex h-10 sm:mt-3 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-4 py-2 text-gray-500 hover:bg-transparent hover:text-red-600 focus:outline-none focus:ring active:text-indigo-500"
>
Search
</button>

@@ -1,107 +1,32 @@
import prettyBytes from "pretty-bytes";
import React, { ReactElement, useEffect, useRef, useState } from "react";
import { useStore } from "../../store";
import type { Socket } from "socket.io-client";
import type { DownloadProgressTickProps } from "../../types";

/**
* Shape of the download tick data received over the socket.
*/
type DownloadTickData = {
id: number;
name: string;
downloaded_bytes: number;
size: number;
speed: number;
seconds_left: number;
status: {
id: string;
str: string;
completed: boolean;
downloaded: boolean;
failed: boolean;
hook_error: any;
};
sources: {
online: number;
total: number;
str: string;
};
target: string;
};

export const DownloadProgressTick: React.FC<DownloadProgressTickProps> = ({
bundleId,
}): ReactElement | null => {
const socketRef = useRef<Socket | undefined>(undefined);
const [tick, setTick] = useState<DownloadTickData | null>(null);
useEffect(() => {
const socket = useStore.getState().getSocket("manual");
socketRef.current = socket;

socket.emit("call", "socket.listenFileProgress", {
namespace: "/manual",
config: {
protocol: `ws`,
hostname: `192.168.1.119:5600`,
username: `admin`,
password: `password`,
},
});

/**
* Handler for each "downloadTick" event.
* Only update state if event.id matches bundleId.
*
* @param {DownloadTickData} data - Payload from the server
*/
const onDownloadTick = (data: DownloadTickData) => {
// Compare numeric data.id to string bundleId
if (data.id === parseInt(bundleId, 10)) {
setTick(data);
}
};

socket.on("downloadTick", onDownloadTick);
return () => {
socket.off("downloadTick", onDownloadTick);
};
}, [socketRef, bundleId]);

if (!tick) {
return <>Nothing detected.</>;
}

// Compute human-readable values and percentages
const downloaded = prettyBytes(tick.downloaded_bytes);
const total = prettyBytes(tick.size);
const percent = tick.size > 0
? Math.round((tick.downloaded_bytes / tick.size) * 100)
: 0;
const speed = prettyBytes(tick.speed) + "/s";
const minutesLeft = Math.round(tick.seconds_left / 60);
import React, { ReactElement } from "react";

export const DownloadProgressTick = (props): ReactElement => {
return (
<div className="mt-2 p-2 border rounded-md bg-white shadow-sm">
{/* Downloaded vs Total */}
<div className="mt-1 flex items-center space-x-2">
<span className="text-sm text-gray-700">{downloaded} of {total}</span>
<div>
<h4 className="is-size-5">{props.data.name}</h4>
<div>
<span className="is-size-4 has-text-weight-semibold">
{prettyBytes(props.data.downloaded_bytes)} of{" "}
{prettyBytes(props.data.size)}{" "}
</span>
<progress
className="progress is-small is-success"
value={props.data.downloaded_bytes}
max={props.data.size}
>
{(parseInt(props.data.downloaded_bytes) / parseInt(props.data.size)) *
100}
%
</progress>
</div>
<div className="is-size-6 mt-1 mb-2">
<p>{prettyBytes(props.data.speed)} per second.</p>
Time left:
{Math.round(parseInt(props.data.seconds_left) / 60)}
</div>

{/* Progress bar */}
<div className="relative mt-2 h-2 bg-gray-200 rounded overflow-hidden">
<div
className="absolute inset-y-0 left-0 bg-green-500"
style={{ width: `${percent}%` }}
/>
</div>
<div className="mt-1 text-xs text-gray-600">{percent}% complete</div>

{/* Speed and Time Left */}
<div className="mt-2 flex space-x-4 text-xs text-gray-600">
<span>Speed: {speed}</span>
<span>Time left: {minutesLeft} min</span>
</div>
<div>{props.data.target}</div>
</div>
);
};

@@ -1,7 +1,8 @@
import React, { useEffect, ReactElement, useState, useMemo } from "react";
import { isEmpty, isNil, isUndefined, map } from "lodash";
import React, { useEffect, useContext, ReactElement, useState } from "react";
import { RootState } from "threetwo-ui-typings";
import { isEmpty, map } from "lodash";
import { AirDCPPBundles } from "./AirDCPPBundles";
import { TorrentDownloads, TorrentData } from "./TorrentDownloads";
import { TorrentDownloads } from "./TorrentDownloads";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import {
@@ -13,142 +14,134 @@ import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import { useParams } from "react-router-dom";

export interface TorrentDetails {
infoHash: string;
progress: number;
downloadSpeed?: number;
uploadSpeed?: number;
interface IDownloadsPanelProps {
key: number;
}

/**
* DownloadsPanel displays two tabs of download information for a specific comic:
* - DC++ (AirDCPP) bundles
* - Torrent downloads
* It also listens for real-time torrent updates via a WebSocket.
*
* @component
* @returns {ReactElement | null} The rendered DownloadsPanel or null if no socket is available.
*/
export const DownloadsPanel = (): ReactElement | null => {
export const DownloadsPanel = (
props: IDownloadsPanelProps,
): ReactElement | null => {
const { comicObjectId } = useParams<{ comicObjectId: string }>();
const [bundles, setBundles] = useState([]);
const [infoHashes, setInfoHashes] = useState<string[]>([]);
const [torrentDetails, setTorrentDetails] = useState<TorrentData[]>([]);
const [activeTab, setActiveTab] = useState<"directconnect" | "torrents">(
"directconnect",
const [torrentDetails, setTorrentDetails] = useState([]);
const [activeTab, setActiveTab] = useState("torrents");
const { airDCPPSocketInstance, socketIOInstance } = useStore(
useShallow((state: any) => ({
airDCPPSocketInstance: state.airDCPPSocketInstance,
socketIOInstance: state.socketIOInstance,
})),
);

const { socketIOInstance } = useStore(
useShallow((state: any) => ({ socketIOInstance: state.socketIOInstance })),
);

/**
* Registers socket listeners on mount and cleans up on unmount.
*/
useEffect(() => {
if (!socketIOInstance) return;

/**
* Handler for incoming torrent data events.
* Merges new entries or updates existing ones by infoHash.
*
* @param {TorrentDetails} data - Payload from the socket event.
*/
const handleTorrentData = (data: TorrentDetails) => {
setTorrentDetails((prev) => {
const idx = prev.findIndex((t) => t.infoHash === data.infoHash);
if (idx === -1) {
return [...prev, data];
// React to torrent progress data sent over websockets
socketIOInstance.on("AS_TORRENT_DATA", (data) => {
const torrents = data.torrents
.flatMap(({ _id, details }) => {
if (_id === comicObjectId) {
return details;
}
const next = [...prev];
next[idx] = { ...next[idx], ...data };
return next;
});
};

socketIOInstance.on("AS_TORRENT_DATA", handleTorrentData);

return () => {
socketIOInstance.off("AS_TORRENT_DATA", handleTorrentData);
};
}, [socketIOInstance]);

// ————— DC++ Bundles (via REST) —————
const { data: bundles } = useQuery({
queryKey: ["bundles", comicObjectId],
})
.filter((item) => item !== undefined);
setTorrentDetails(torrents);
});
// Fetch the downloaded files and currently-downloading file(s) from AirDC++
const { data: comicObject, isSuccess } = useQuery({
queryKey: ["bundles"],
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getBundles`,
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookById`,
method: "POST",
headers: {
"Content-Type": "application/json; charset=utf-8",
},
data: {
comicObjectId,
config: {
protocol: `ws`,
hostname: `192.168.1.119:5600`,
username: `admin`,
password: `password`,
},
id: `${comicObjectId}`,
},
}),
});

// ————— Torrent Jobs (via REST) —————
const { data: rawJobs = [] } = useQuery<any[]>({
queryKey: ["torrents", comicObjectId],
queryFn: async () => {
const { data } = await axios.get(
`${TORRENT_JOB_SERVICE_BASE_URI}/getTorrentData`,
{ params: { trigger: activeTab } },
);
return Array.isArray(data) ? data : [];
},
initialData: [],
enabled: activeTab === "torrents",
const getBundles = async (comicObject) => {
if (comicObject?.data.acquisition.directconnect) {
const filteredBundles =
comicObject.data.acquisition.directconnect.downloads.map(
async ({ bundleId }) => {
return await airDCPPSocketInstance.get(`queue/bundles/${bundleId}`);
},
);
return await Promise.all(filteredBundles);
}
};

// Call the scheduled job for fetching torrent data
// triggered by the active tab been set to "torrents"
const { data: torrentData } = useQuery({
queryFn: () =>
axios({
url: `${TORRENT_JOB_SERVICE_BASE_URI}/getTorrentData`,
method: "GET",
params: {
trigger: activeTab,
},
}),
queryKey: [activeTab],
});

// Only when rawJobs changes *and* activeTab === "torrents" should we update infoHashes:
useEffect(() => {
if (activeTab !== "torrents") return;
setInfoHashes(rawJobs.map((j: any) => j.infoHash));
}, [activeTab]);
getBundles(comicObject).then((result) => {
setBundles(result);
});
}, [comicObject]);

return (
<>
<div className="mt-5 mb-3">
<nav className="flex space-x-2">
<button
onClick={() => setActiveTab("directconnect")}
className={`px-4 py-1 rounded-full text-sm font-medium transition-colors ${
activeTab === "directconnect"
? "bg-green-500 text-white"
: "bg-gray-200 text-gray-700 hover:bg-gray-300"
}`}
>
DC++
</button>
<button
onClick={() => setActiveTab("torrents")}
className={`px-4 py-1 rounded-full text-sm font-medium transition-colors ${
activeTab === "torrents"
? "bg-blue-500 text-white"
: "bg-gray-200 text-gray-700 hover:bg-gray-300"
}`}
>
Torrents
</button>
</nav>
<div className="columns is-multiline">
{!isEmpty(airDCPPSocketInstance) &&
!isEmpty(bundles) &&
activeTab === "directconnect" && <AirDCPPBundles data={bundles} />}

<div className="mt-4">
{activeTab === "torrents" ? (
<TorrentDownloads data={torrentDetails} />
) : !isNil(bundles?.data) && bundles.data.length > 0 ? (
<AirDCPPBundles data={bundles.data} />
) : (
<p>No DC++ bundles found.</p>
)}
<div>
<div className="sm:hidden">
<label htmlFor="Download Type" className="sr-only">
Download Type
</label>

<select id="Tab" className="w-full rounded-md border-gray-200">
<option>DC++ Downloads</option>
<option>Torrents</option>
</select>
</div>

<div className="hidden sm:block">
<nav className="flex gap-6" aria-label="Tabs">
<a
href="#"
className={`shrink-0 rounded-lg p-2 text-sm font-medium hover:bg-gray-50 hover:text-gray-700 ${
activeTab === "directconnect"
? "bg-slate-200 dark:text-slate-200 dark:bg-slate-400 text-slate-800"
: "dark:text-slate-400 text-slate-800"
}`}
aria-current="page"
onClick={() => setActiveTab("directconnect")}
>
DC++ Downloads
</a>

<a
href="#"
className={`shrink-0 rounded-lg p-2 text-sm font-medium hover:bg-gray-50 hover:text-gray-700 ${
activeTab === "torrents"
? "bg-slate-200 text-slate-800"
: "dark:text-slate-400 text-slate-800"
}`}
onClick={() => setActiveTab("torrents")}
>
Torrents
</a>
</nav>
</div>
</div>
</>

{activeTab === "torrents" && <TorrentDownloads data={torrentDetails} />}
</div>
);
};

export default DownloadsPanel;

@@ -1,41 +1,55 @@
import React, { ReactElement } from "react";
import { Form, Field, FieldRenderProps } from "react-final-form";
import React, { ReactElement, useCallback, useEffect, useState } from "react";
import { Form, Field } from "react-final-form";
import arrayMutators from "final-form-arrays";
import { FieldArray } from "react-final-form-arrays";
import AsyncSelectPaginate from "./AsyncSelectPaginate/AsyncSelectPaginate";
import TextareaAutosize from "react-textarea-autosize";

interface EditMetadataPanelProps {
  data: {
    name?: string | null;
    [key: string]: any;
  };
}

/** Adapts react-final-form's Field render prop to AsyncSelectPaginate. */
const AsyncSelectPaginateAdapter = ({ input, ...rest }: FieldRenderProps<any>) => (
  <AsyncSelectPaginate {...input} {...rest} onChange={(value) => input.onChange(value)} />
);

/** Adapts react-final-form's Field render prop to TextareaAutosize. */
const TextareaAutosizeAdapter = ({ input, ...rest }: FieldRenderProps<any>) => (
  <TextareaAutosize {...input} {...rest} onChange={(value) => input.onChange(value)} />
);

/** Sliding panel form for manually editing comic metadata fields. */
export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElement => {
export const EditMetadataPanel = (props): ReactElement => {
  const validate = async () => {};
  const onSubmit = async () => {};

  const { data } = props;

  const AsyncSelectPaginateAdapter = ({ input, ...rest }) => {
    return (
      <AsyncSelectPaginate
        {...input}
        {...rest}
        onChange={(value) => input.onChange(value)}
      />
    );
  };
  const TextareaAutosizeAdapter = ({ input, ...rest }) => {
    return (
      <TextareaAutosize
        {...input}
        {...rest}
        onChange={(value) => input.onChange(value)}
      />
    );
  };
  // const rawFileDetails = useSelector(
  //   (state: RootState) => state.comicInfo.comicBookDetail.rawFileDetails.name,
  // );

  return (
    <>
      <Form
        onSubmit={onSubmit}
        mutators={{ ...arrayMutators }}
        validate={validate}
        mutators={{
          ...arrayMutators,
        }}
        render={({
          handleSubmit,
          form: {
            mutators: { push, pop },
          },
        }, // injected from final-form-arrays above
          pristine,
          form,
          submitting,
          values,
        }) => (
          <form onSubmit={handleSubmit}>
            {/* Issue Name */}
@@ -66,6 +80,7 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  <p className="text-xs">Do not enter the first zero</p>
</div>
<div>
  {/* year */}
  <div className="text-sm">Issue Year</div>
  <Field
    name="issue_year"
@@ -85,6 +100,8 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  </div>
</div>

{/* page count */}

{/* Description */}
<div className="mt-2">
  <label className="text-sm">Description</label>
@@ -96,7 +113,7 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  />
</div>

<hr />
<hr size="1" />

<div className="field is-horizontal">
  <div className="field-label">
@@ -112,7 +129,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  className="input"
  placeholder="SKU"
/>
{/* TODO: Switch to Solar icon */}
<span className="icon is-small is-left">
  <i className="fa-solid fa-barcode"></i>
</span>
@@ -129,7 +145,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  className="input"
  placeholder="UPC Code"
/>
{/* TODO: Switch to Solar icon */}
<span className="icon is-small is-left">
  <i className="fa-solid fa-box"></i>
</span>
@@ -138,7 +153,7 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  </div>
</div>

<hr />
<hr size="1" />

{/* Publisher */}
<div className="field is-horizontal">
@@ -152,7 +167,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
name={"publisher"}
component={AsyncSelectPaginateAdapter}
placeholder={
  /* TODO: Switch to Solar icon */
  <div>
    <i className="fas fa-print mr-2"></i> Publisher
  </div>
@@ -176,7 +190,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
name={"story_arc"}
component={AsyncSelectPaginateAdapter}
placeholder={
  /* TODO: Switch to Solar icon */
  <div>
    <i className="fas fa-book-open mr-2"></i> Story Arc
  </div>
@@ -200,7 +213,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
name={"series"}
component={AsyncSelectPaginateAdapter}
placeholder={
  /* TODO: Switch to Solar icon */
  <div>
    <i className="fas fa-layer-group mr-2"></i> Series
  </div>
@@ -212,7 +224,7 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  </div>
</div>

<hr />
<hr size="1" />

{/* team credits */}
<div className="field is-horizontal">
@@ -255,7 +267,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
name={`${name}.creator`}
component={AsyncSelectPaginateAdapter}
placeholder={
  /* TODO: Switch to Solar icon */
  <div>
    <i className="fa-solid fa-ghost"></i> Creator
  </div>
@@ -271,7 +282,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
name={`${name}.role`}
metronResource={"role"}
placeholder={
  /* TODO: Switch to Solar icon */
  <div>
    <i className="fa-solid fa-key"></i> Role
  </div>
@@ -280,7 +290,6 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
  />
</p>
</div>
{/* TODO: Switch to Solar icon */}
<span
  className="icon is-danger mt-2"
  onClick={() => fields.remove(index)}
@@ -293,6 +302,7 @@ export const EditMetadataPanel = ({ data }: EditMetadataPanelProps): ReactElemen
      ))
    }
  </FieldArray>
  <pre>{JSON.stringify(values, undefined, 2)}</pre>
</form>
)}
/>

@@ -4,67 +4,26 @@ import { convert } from "html-to-text";
import ellipsize from "ellipsize";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import axios from "axios";
import { useGetComicByIdQuery } from "../../graphql/generated";
import type { MatchResultProps } from "../../types";

const handleBrokenImage = (e: React.SyntheticEvent<HTMLImageElement>) => {
  e.currentTarget.src = "http://localhost:3050/dist/img/noimage.svg";
};

interface ComicVineMatch {
  description?: string;
  name?: string;
  score: string | number;
  issue_number: string | number;
  cover_date: string;
  image: {
    thumb_url: string;
  };
  volume: {
    name: string;
  };
  volumeInformation: {
    results: {
      image: {
        icon_url: string;
      };
      count_of_issues: number;
      publisher: {
        name: string;
      };
    };
  };
interface MatchResultProps {
  matchData: any;
  comicObjectId: string;
}

const handleBrokenImage = (e) => {
  e.target.src = "http://localhost:3050/dist/img/noimage.svg";
};

export const MatchResult = (props: MatchResultProps) => {
  const applyCVMatch = async (match: ComicVineMatch, comicObjectId: string) => {
    try {
      const response = await axios.request({
        url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
        method: "POST",
        data: {
          match,
          comicObjectId,
        },
      });

      // Invalidate and refetch the comic book metadata
      if (props.queryClient) {
        await props.queryClient.invalidateQueries({
          queryKey: useGetComicByIdQuery.getKey({ id: comicObjectId }),
        });
      }

      // Call the callback to close panel and switch tabs
      if (props.onMatchApplied) {
        props.onMatchApplied();
      }

      return response;
    } catch (error) {
      console.error("Error applying ComicVine match:", error);
      throw error;
    }
  const applyCVMatch = async (match, comicObjectId) => {
    return await axios.request({
      url: `${LIBRARY_SERVICE_BASE_URI}/applyComicVineMetadata`,
      method: "POST",
      data: {
        match,
        comicObjectId,
      },
    });
  };
  return (
    <>

@@ -1,30 +1,18 @@
import React, { ReactElement, ReactNode } from "react";
import React, { ReactElement } from "react";
import PropTypes from "prop-types";
import prettyBytes from "pretty-bytes";
import { isEmpty } from "lodash";
import { format, parseISO, isValid } from "date-fns";
import {
  RawFileDetails as RawFileDetailsType,
  InferredMetadata,
} from "../../graphql/generated";
import { format, parseISO } from "date-fns";

type RawFileDetailsProps = {
  data?: {
    rawFileDetails?: RawFileDetailsType;
    inferredMetadata?: InferredMetadata;
    createdAt?: string;
  };
  children?: ReactNode;
};

/** Renders raw file info, inferred metadata, and import timestamp for a comic. */
export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
  const { rawFileDetails, inferredMetadata, createdAt } = props.data || {};
export const RawFileDetails = (props): ReactElement => {
  const { rawFileDetails, inferredMetadata, created_at, updated_at } =
    props.data;
  return (
    <>
      <div className="max-w-2xl ml-5">
        <div className="px-4 sm:px-6">
          <p className="text-gray-500 dark:text-gray-400">
            <span className="text-xl">{rawFileDetails?.name}</span>
            <span className="text-xl">{rawFileDetails.name}</span>
          </p>
        </div>
        <div className="px-4 py-5 sm:px-6">
@@ -34,10 +22,10 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
    Raw File Details
  </dt>
  <dd className="mt-1 text-sm text-gray-900 dark:text-gray-400">
    {rawFileDetails?.containedIn}
    {"/"}
    {rawFileDetails?.name}
    {rawFileDetails?.extension}
    {rawFileDetails.containedIn +
      "/" +
      rawFileDetails.name +
      rawFileDetails.extension}
  </dd>
</div>
<div className="sm:col-span-1">
@@ -45,10 +33,10 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
    Inferred Issue Metadata
  </dt>
  <dd className="mt-1 text-sm text-gray-900 dark:text-gray-400">
    Series Name: {inferredMetadata?.issue?.name}
    {!isEmpty(inferredMetadata?.issue?.number) ? (
    Series Name: {inferredMetadata.issue.name}
    {!isEmpty(inferredMetadata.issue.number) ? (
      <span className="tag is-primary is-light">
        {inferredMetadata?.issue?.number}
        {inferredMetadata.issue.number}
      </span>
    ) : null}
  </dd>
@@ -65,7 +53,7 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
    </span>

    <span className="text-md text-slate-500 dark:text-slate-900">
      {rawFileDetails?.mimeType}
      {rawFileDetails.mimeType}
    </span>
  </span>
</dd>
@@ -82,7 +70,7 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
    </span>

    <span className="text-md text-slate-500 dark:text-slate-900">
      {rawFileDetails?.fileSize ? prettyBytes(rawFileDetails.fileSize) : "N/A"}
      {prettyBytes(rawFileDetails.fileSize)}
    </span>
  </span>
</dd>
@@ -92,12 +80,8 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
    Import Details
  </dt>
  <dd className="mt-1 text-sm text-gray-900 dark:text-gray-400">
    {createdAt && isValid(parseISO(createdAt)) ? (
      <>
        {format(parseISO(createdAt), "dd MMMM, yyyy")},{" "}
        {format(parseISO(createdAt), "h aaaa")}
      </>
    ) : "N/A"}
    {format(parseISO(created_at), "dd MMMM, yyyy")},{" "}
    {format(parseISO(created_at), "h aaaa")}
  </dd>
</div>
<div className="sm:col-span-2">
@@ -114,3 +98,30 @@ export const RawFileDetails = (props: RawFileDetailsProps): ReactElement => {
};

export default RawFileDetails;

RawFileDetails.propTypes = {
  data: PropTypes.shape({
    rawFileDetails: PropTypes.shape({
      containedIn: PropTypes.string,
      name: PropTypes.string,
      fileSize: PropTypes.number,
      path: PropTypes.string,
      extension: PropTypes.string,
      mimeType: PropTypes.string,
      cover: PropTypes.shape({
        filePath: PropTypes.string,
      }),
    }),
    inferredMetadata: PropTypes.shape({
      issue: PropTypes.shape({
        year: PropTypes.string,
        name: PropTypes.string,
        number: PropTypes.number,
        subtitle: PropTypes.string,
      }),
    }),
    created_at: PropTypes.string,
    updated_at: PropTypes.string,
  }),
  children: PropTypes.any,
};

@@ -1,105 +0,0 @@
import React, { useState } from "react";
import { ComicVineSearchForm } from "./ComicVineSearchForm";
import { ComicVineMatchPanel } from "./ComicVineMatchPanel";
import { EditMetadataPanel } from "./EditMetadataPanel";
import type { RawFileDetails, InferredMetadata } from "../../graphql/generated";

interface CVMatchesPanelProps {
  rawFileDetails?: RawFileDetails;
  inferredMetadata: InferredMetadata;
  comicVineMatches: any[];
  comicObjectId: string;
  queryClient: any;
  onMatchApplied: () => void;
};

/**
 * Collapsible container for manual ComicVine search form.
 * Allows users to manually search when auto-match doesn't yield results.
 */
const CollapsibleSearchForm: React.FC<{ rawFileDetails?: RawFileDetails }> = ({
  rawFileDetails,
}) => {
  const [isExpanded, setIsExpanded] = useState(false);

  return (
    <div className="border border-slate-300 dark:border-slate-600 rounded-lg overflow-hidden">
      <button
        type="button"
        onClick={() => setIsExpanded(!isExpanded)}
        className="w-full flex items-center justify-between px-4 py-3 bg-slate-100 dark:bg-slate-700 hover:bg-slate-200 dark:hover:bg-slate-600 transition-colors text-left"
        aria-expanded={isExpanded}
      >
        <span className="flex items-center gap-2 text-slate-700 dark:text-slate-200 font-medium">
          <svg
            className={`w-4 h-4 transition-transform ${isExpanded ? "rotate-90" : ""}`}
            fill="none"
            stroke="currentColor"
            viewBox="0 0 24 24"
          >
            <path strokeLinecap="round" strokeLinejoin="round" strokeWidth={2} d="M9 5l7 7-7 7" />
          </svg>
          Manual Search
        </span>
        <span className="text-sm text-slate-500 dark:text-slate-400">
          {isExpanded ? "Click to collapse" : "No results? Search manually"}
        </span>
      </button>
      {isExpanded && (
        <div className="p-4 bg-white dark:bg-slate-800">
          <ComicVineSearchForm rawFileDetails={rawFileDetails} />
        </div>
      )}
    </div>
  );
};

/**
 * Sliding panel content for ComicVine match search.
 *
 * Renders a search form pre-populated from `rawFileDetails`, a preview of the
 * inferred issue being searched for, and a list of ComicVine match candidates
 * the user can apply to the comic.
 *
 * @param props.onMatchApplied - Called after the user selects and applies a match,
 * allowing the parent to close the panel and refresh state.
 */
export const CVMatchesPanel: React.FC<CVMatchesPanelProps> = ({
  rawFileDetails,
  inferredMetadata,
  comicVineMatches,
  comicObjectId,
  queryClient,
  onMatchApplied,
}) => (
  <>
    <div className="border-slate-500 border rounded-lg p-2 mb-3">
      <p className="text-slate-600 dark:text-slate-300">Searching for:</p>
      {inferredMetadata.issue ? (
        <>
          <span className="text-slate-800 dark:text-slate-100 font-medium">{inferredMetadata.issue?.name} </span>
          <span className="text-slate-600 dark:text-slate-300"> # {inferredMetadata.issue?.number} </span>
        </>
      ) : null}
    </div>

    <CollapsibleSearchForm rawFileDetails={rawFileDetails} />

    <ComicVineMatchPanel
      props={{
        comicVineMatches,
        comicObjectId,
        queryClient,
        onMatchApplied,
      }}
    />
  </>
);

type EditMetadataPanelWrapperProps = {
  rawFileDetails?: RawFileDetails;
};

export const EditMetadataPanelWrapper: React.FC<EditMetadataPanelWrapperProps> = ({
  rawFileDetails,
}) => <EditMetadataPanel data={rawFileDetails ?? {}} />;
@@ -1,50 +1,25 @@
import React, { ReactElement, Suspense, useState } from "react";
import React, { ReactElement, useState } from "react";
import { isNil } from "lodash";

interface TabItem {
  id: number;
  name: string;
  icon: React.ReactNode;
  content: React.ReactNode;
  shouldShow?: boolean;
}

interface TabControlsProps {
  filteredTabs: TabItem[];
  downloadCount: number;
  activeTab?: number;
  setActiveTab?: (id: number) => void;
}

export const TabControls = (props: TabControlsProps): ReactElement => {
  const { filteredTabs, downloadCount, activeTab, setActiveTab } = props;
export const TabControls = (props): ReactElement => {
  const { filteredTabs, downloadCount } = props;
  const [active, setActive] = useState(filteredTabs[0].id);

  // Use controlled state if provided, otherwise use internal state
  const currentActive = activeTab !== undefined ? activeTab : active;
  const handleSetActive = (id: number) => {
    if (setActiveTab) {
      setActiveTab(id);
    } else {
      setActive(id);
    }
  };

  return (
    <>
      <div className="hidden sm:block mt-7 mb-3 w-fit">
        <div className="border-b border-gray-200">
          <nav className="flex gap-6" aria-label="Tabs">
            {filteredTabs.map(({ id, name, icon }: TabItem) => (
            {filteredTabs.map(({ id, name, icon }) => (
              <a
                key={id}
                className={`inline-flex shrink-0 items-center gap-2 px-1 py-1 text-md font-medium text-gray-500 dark:text-gray-400 hover:border-gray-300 hover:border-b hover:dark:text-slate-200 ${
                  currentActive === id
                  active === id
                    ? "border-b border-cyan-50 dark:text-slate-200"
                    : "border-b border-transparent"
                }`}
                aria-current="page"
                onClick={() => handleSetActive(id)}
                onClick={() => setActive(id)}
              >
                {/* Downloads tab and count badge */}
                <>
@@ -68,13 +43,9 @@ export const TabControls = (props: TabControlsProps): ReactElement => {
          </nav>
        </div>
      </div>
      <Suspense fallback={null}>
        {filteredTabs.map(({ id, content }: TabItem) => (
          <React.Fragment key={id}>
            {currentActive === id ? content : null}
          </React.Fragment>
        ))}
      </Suspense>
      {filteredTabs.map(({ id, content }) => {
        return active === id ? content : null;
      })}
    </>
  );
};

@@ -14,41 +14,34 @@ import { useStore } from "../../../store";
import { useShallow } from "zustand/react/shallow";
import { escapePoundSymbol } from "../../../shared/utils/formatting.utils";

export const ArchiveOperations = (props: { data: any }): ReactElement => {
export const ArchiveOperations = (props): ReactElement => {
  const { data } = props;

  const getSocket = useStore((state) => state.getSocket);
  const { socketIOInstance } = useStore(
    useShallow((state) => ({
      socketIOInstance: state.socketIOInstance,
    })),
  );
  const queryClient = useQueryClient();
  // sliding panel config
  const [visible, setVisible] = useState(false);
  const [slidingPanelContentId, setSlidingPanelContentId] = useState("");
  // current image
  const [currentImage, setCurrentImage] = useState<string>("");
  const [uncompressedArchive, setUncompressedArchive] = useState<string[]>([]);
  const [imageAnalysisResult, setImageAnalysisResult] = useState<any>({});
  const [currentImage, setCurrentImage] = useState([]);
  const [uncompressedArchive, setUncompressedArchive] = useState([]);
  const [imageAnalysisResult, setImageAnalysisResult] = useState({});
  const [shouldRefetchComicBookData, setShouldRefetchComicBookData] =
    useState(false);
  const constructImagePaths = (data: string[]): Array<string> => {
  const constructImagePaths = (data): Array<string> => {
    return data?.map((path: string) =>
      escapePoundSymbol(encodeURI(`${LIBRARY_SERVICE_HOST}/${path}`)),
    );
  };

  // Listen to the uncompression complete event and orchestrate the final payload
  useEffect(() => {
    const socket = getSocket("/");
    if (!socket) return;

    const handleUncompressionComplete = (data: any) => {
      setUncompressedArchive(constructImagePaths(data?.uncompressedArchive));
    };

    socket.on("LS_UNCOMPRESSION_JOB_COMPLETE", handleUncompressionComplete);

    return () => {
      socket.off("LS_UNCOMPRESSION_JOB_COMPLETE", handleUncompressionComplete);
    };
  }, [getSocket]);
  socketIOInstance.on("LS_UNCOMPRESSION_JOB_COMPLETE", (data) => {
    setUncompressedArchive(constructImagePaths(data?.uncompressedArchive));
  });

  useEffect(() => {
    let isMounted = true;
@@ -65,7 +58,7 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
        },
        transformResponse: async (responseData) => {
          const parsedData = JSON.parse(responseData);
          const paths = parsedData.map((pathObject: any) => {
          const paths = parsedData.map((pathObject) => {
            return `${pathObject.containedIn}/${pathObject.name}${pathObject.extension}`;
          });
          const uncompressedArchive = constructImagePaths(paths);
@@ -77,7 +70,8 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
        },
      });
    } catch (error) {
      // Error handling could be added here if needed
      console.error("Error fetching uncompressed archive:", error);
      // Handle error if necessary
    }
  };
  fetchUncompressedArchive();
@@ -131,15 +125,13 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
    enabled: false,
  });

  useEffect(() => {
    if (isSuccess && shouldRefetchComicBookData) {
      queryClient.invalidateQueries({ queryKey: ["comicBookMetadata"] });
      setShouldRefetchComicBookData(false);
    }
  }, [isSuccess, shouldRefetchComicBookData, queryClient]);
  if (isSuccess && shouldRefetchComicBookData) {
    queryClient.invalidateQueries({ queryKey: ["comicBookMetadata"] });
    setShouldRefetchComicBookData(false);
  }

  // sliding panel init
  const contentForSlidingPanel: Record<string, { content: () => React.ReactElement }> = {
  const contentForSlidingPanel = {
    imageAnalysis: {
      content: () => {
        return (
@@ -151,7 +143,7 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
            </pre>
          ) : null}
          <pre className="font-hasklig mt-3 text-sm">
            {JSON.stringify(imageAnalysisResult?.analyzedData, null, 2)}
            {JSON.stringify(imageAnalysisResult.analyzedData, null, 2)}
          </pre>
        </div>
      );
@@ -160,7 +152,7 @@ export const ArchiveOperations = (props: { data: any }): ReactElement => {
  };

  // sliding panel handlers
  const openImageAnalysisPanel = useCallback((imageFilePath: string) => {
  const openImageAnalysisPanel = useCallback((imageFilePath) => {
    setSlidingPanelContentId("imageAnalysis");
    analyzeImage(imageFilePath);
    setCurrentImage(imageFilePath);

@@ -1,18 +1,18 @@
import { isUndefined } from "lodash";
import React, { ReactElement } from "react";

export const ComicInfoXML = (data: { json: any }): ReactElement => {
export const ComicInfoXML = (data): ReactElement => {
  const { json } = data;
  return (
    <div className="flex w-3/4">
      <dl className="dark:bg-yellow-600 bg-yellow-200 p-3 rounded-lg w-full">
    <div className="flex md:w-4/5 lg:w-78">
      <dl className="dark:bg-yellow-600 bg-yellow-200 p-3 rounded-lg">
        <dt>
          <p className="text-lg">{json.series?.[0]}</p>
          <p className="text-lg">{json.series[0]}</p>
        </dt>
        <dd className="text-sm">
          published by{" "}
          <span className="underline">
            {json.publisher?.[0]}
            {json.publisher[0]}
            <i className="icon-[solar--arrow-right-up-outline] w-4 h-4" />
          </span>
        </dd>
@@ -30,20 +30,18 @@ export const ComicInfoXML = (data: { json: any }): ReactElement => {
    </span>
  </dd>
)}
{/* Genre */}
{!isUndefined(json.genre) && (
  <dd className="my-2">
    <span className="inline-flex items-center bg-slate-50 text-slate-800 text-sm font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
      <span className="pr-1 pt-1">
        <i className="icon-[solar--sticker-smile-circle-bold-duotone] w-5 h-5"></i>
      </span>

      <span className="text-slate-500 dark:text-slate-900">
        {json.genre[0]}
      </span>
<dd className="my-2">
  {/* Genre */}
  <span className="inline-flex items-center bg-slate-50 text-slate-800 text-sm font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
    <span className="pr-1 pt-1">
      <i className="icon-[solar--sticker-smile-circle-bold-duotone] w-5 h-5"></i>
    </span>
  </dd>
)}

  <span className="text-slate-500 dark:text-slate-900">
    {json.genre[0]}
  </span>
  </span>
</dd>
</span>

<dd className="my-1">
@@ -54,14 +52,12 @@ export const ComicInfoXML = (data: { json: any }): ReactElement => {
    </span>
  )}
</dd>
{!isUndefined(json.notes) && (
  <dd>
    {/* Notes */}
    <span className="text-sm text-slate-500 dark:text-slate-900">
      {json.notes[0]}
    </span>
  </dd>
)}
<dd>
  {/* Notes */}
  <span className="text-sm text-slate-500 dark:text-slate-900">
    {json.notes[0]}
  </span>
</dd>
</dl>
</div>
);

@@ -1,522 +0,0 @@
|
||||
import React, { ReactElement, useMemo, useState } from "react"
|
||||
import { Drawer } from "vaul"
|
||||
import { FIELD_CONFIG, FIELD_GROUPS } from "./reconciler.fieldConfig"
|
||||
import {
|
||||
useReconciler,
|
||||
SourceKey,
|
||||
SOURCE_LABELS,
|
||||
RawSourcedMetadata,
|
||||
RawInferredMetadata,
|
||||
CanonicalRecord,
|
||||
} from "./useReconciler"
|
||||
|
||||
// ── Source styling ─────────────────────────────────────────────────────────────
|
||||
|
||||
const SOURCE_BADGE: Record<SourceKey, string> = {
|
||||
  comicvine: "bg-blue-100 text-blue-800 dark:bg-blue-900/40 dark:text-blue-300",
  metron: "bg-purple-100 text-purple-800 dark:bg-purple-900/40 dark:text-purple-300",
  gcd: "bg-orange-100 text-orange-800 dark:bg-orange-900/40 dark:text-orange-300",
  locg: "bg-teal-100 text-teal-800 dark:bg-teal-900/40 dark:text-teal-300",
  comicInfo: "bg-slate-100 text-slate-700 dark:bg-slate-700/60 dark:text-slate-300",
  inferredMetadata: "bg-gray-100 text-gray-700 dark:bg-gray-700/60 dark:text-gray-300",
}

const SOURCE_SELECTED: Record<SourceKey, string> = {
  comicvine: "ring-2 ring-blue-400 bg-blue-50 dark:bg-blue-900/20",
  metron: "ring-2 ring-purple-400 bg-purple-50 dark:bg-purple-900/20",
  gcd: "ring-2 ring-orange-400 bg-orange-50 dark:bg-orange-900/20",
  locg: "ring-2 ring-teal-400 bg-teal-50 dark:bg-teal-900/20",
  comicInfo: "ring-2 ring-slate-400 bg-slate-50 dark:bg-slate-700/40",
  inferredMetadata: "ring-2 ring-gray-400 bg-gray-50 dark:bg-gray-700/40",
}

/** Abbreviated source names for compact badge display. */
const SOURCE_SHORT: Record<SourceKey, string> = {
  comicvine: "CV",
  metron: "Metron",
  gcd: "GCD",
  locg: "LoCG",
  comicInfo: "XML",
  inferredMetadata: "Local",
}

const SOURCE_ORDER: SourceKey[] = [
  "comicvine", "metron", "gcd", "locg", "comicInfo", "inferredMetadata",
]

type FilterMode = "all" | "conflicts" | "unresolved"

// ── Props ──────────────────────────────────────────────────────────────────────

export interface ReconcilerDrawerProps {
  open: boolean
  onOpenChange: (open: boolean) => void
  sourcedMetadata: RawSourcedMetadata
  inferredMetadata?: RawInferredMetadata
  onSave: (record: CanonicalRecord) => void
}

// ── Scalar cell ────────────────────────────────────────────────────────────────

interface ScalarCellProps {
  value: string | null
  isSelected: boolean
  isImage: boolean
  isLongtext: boolean
  onClick: () => void
}

function ScalarCell({ value, isSelected, isImage, isLongtext, onClick }: ScalarCellProps): ReactElement {
  if (!value) {
    return <span className="text-slate-300 dark:text-slate-600 text-sm px-2 pt-1.5 block">—</span>
  }

  return (
    <button
      onClick={onClick}
      className={`w-full text-left text-sm px-2 py-1.5 rounded-md border transition-all ${
        isSelected
          ? `border-transparent ${SOURCE_SELECTED[/* filled by parent */ "comicvine"]}`
          : "border-slate-200 dark:border-slate-700 hover:border-slate-300 dark:hover:border-slate-600 bg-white dark:bg-slate-800 hover:bg-slate-50 dark:hover:bg-slate-750"
      }`}
    >
      {isImage ? (
        <img
          src={value}
          alt="cover"
          className="w-full h-24 object-cover rounded"
          onError={(e) => { (e.target as HTMLImageElement).style.display = "none" }}
        />
      ) : (
        <span className={`block text-slate-700 dark:text-slate-300 ${isLongtext ? "line-clamp-3 whitespace-normal" : "truncate"}`}>
          {value}
        </span>
      )}
      {isSelected && (
        <i className="icon-[solar--check-circle-bold] w-3.5 h-3.5 text-green-500 mt-0.5 block" />
      )}
    </button>
  )
}

// ── Main component ─────────────────────────────────────────────────────────────

export function ReconcilerDrawer({
  open,
  onOpenChange,
  sourcedMetadata,
  inferredMetadata,
  onSave,
}: ReconcilerDrawerProps): ReactElement {
  const [filter, setFilter] = useState<FilterMode>("all")

  const {
    state,
    unresolvedCount,
    canonicalRecord,
    selectScalar,
    toggleItem,
    setBaseSource,
    reset,
  } = useReconciler(sourcedMetadata, inferredMetadata)

  // Derive which sources actually contributed data
  const activeSources = useMemo<SourceKey[]>(() => {
    const seen = new Set<SourceKey>()
    for (const fieldState of Object.values(state)) {
      if (fieldState.kind === "scalar") {
        for (const c of fieldState.candidates) seen.add(c.source)
      } else if (fieldState.kind === "array" || fieldState.kind === "credits") {
        for (const item of fieldState.items) seen.add((item as { source: SourceKey }).source)
      }
    }
    return SOURCE_ORDER.filter((s) => seen.has(s))
  }, [state])

  // Grid: 180px label + one equal column per active source
  const gridCols = `180px repeat(${Math.max(activeSources.length, 1)}, minmax(0, 1fr))`

  function shouldShow(fieldKey: string): boolean {
    const fs = state[fieldKey]
    if (!fs) return false
    if (filter === "all") return true
    if (filter === "conflicts") {
      if (fs.kind === "scalar") return fs.candidates.length > 1
      if (fs.kind === "array" || fs.kind === "credits") {
        const srcs = new Set((fs.items as Array<{ source: SourceKey }>).map((i) => i.source))
        return srcs.size > 1
      }
      return false
    }
    // unresolved
    return (
      fs.kind === "scalar" &&
      fs.candidates.length > 1 &&
      fs.selectedSource === null &&
      fs.userValue === undefined
    )
  }

  const allResolved = unresolvedCount === 0

  return (
    <Drawer.Root open={open} onOpenChange={onOpenChange}>
      <Drawer.Portal>
        <Drawer.Overlay className="fixed inset-0 bg-black/50 z-40" />
        <Drawer.Content
          aria-describedby={undefined}
          className="fixed inset-0 z-50 flex flex-col bg-white dark:bg-slate-900 outline-none"
        >
          <Drawer.Title className="sr-only">Reconcile metadata sources</Drawer.Title>

          {/* ── Header ── */}
          <div className="flex-none border-b border-slate-200 dark:border-slate-700 shadow-sm">
            {/* Title + controls */}
            <div className="flex items-center justify-between px-4 py-3">
              <div className="flex items-center gap-3">
                <i className="icon-[solar--refresh-circle-outline] w-5 h-5 text-slate-500 dark:text-slate-400" />
                <span className="font-semibold text-slate-800 dark:text-slate-100 text-base">
                  Reconcile Metadata
                </span>
                {unresolvedCount > 0 && (
                  <span className="inline-flex items-center px-2 py-0.5 rounded-full text-xs font-medium bg-amber-100 text-amber-700 dark:bg-amber-900/40 dark:text-amber-300">
                    {unresolvedCount} unresolved
                  </span>
                )}
              </div>

              <div className="flex items-center gap-2">
                {/* Filter pill */}
                <div className="flex items-center bg-slate-100 dark:bg-slate-800 rounded-lg p-0.5 gap-0.5">
                  {(["all", "conflicts", "unresolved"] as FilterMode[]).map((mode) => (
                    <button
                      key={mode}
                      onClick={() => setFilter(mode)}
                      className={`px-3 py-1 rounded-md text-xs font-medium transition-colors capitalize ${
                        filter === mode
                          ? "bg-white dark:bg-slate-700 text-slate-800 dark:text-slate-100 shadow-sm"
                          : "text-slate-500 hover:text-slate-700 dark:hover:text-slate-300"
                      }`}
                    >
                      {mode}
                    </button>
                  ))}
                </div>

                <button
                  onClick={reset}
                  title="Reset all selections"
                  className="px-3 py-1.5 text-xs rounded-md border border-slate-200 dark:border-slate-600 text-slate-600 dark:text-slate-400 hover:bg-slate-50 dark:hover:bg-slate-800 transition-colors"
                >
                  Reset
                </button>

                <button
                  onClick={() => onOpenChange(false)}
                  title="Close"
                  className="p-1.5 rounded-md text-slate-400 hover:text-slate-600 dark:hover:text-slate-300 hover:bg-slate-100 dark:hover:bg-slate-800 transition-colors"
                >
                  <i className="icon-[solar--close-square-outline] w-5 h-5 block" />
                </button>
              </div>
            </div>

            {/* Source column headers */}
            <div
              className="px-4 pb-3"
              style={{ display: "grid", gridTemplateColumns: gridCols, gap: "8px" }}
            >
              <div className="text-xs font-medium text-slate-400 dark:text-slate-500 uppercase tracking-wider flex items-end pb-0.5">
                Field
              </div>
              {activeSources.map((src) => (
                <div key={src} className="flex flex-col gap-1.5">
                  <span className={`text-xs font-semibold px-2 py-0.5 rounded w-fit ${SOURCE_BADGE[src]}`}>
                    {SOURCE_LABELS[src]}
                  </span>
                  <button
                    onClick={() => setBaseSource(src)}
                    className="text-xs text-slate-400 hover:text-slate-600 dark:hover:text-slate-300 text-left transition-colors"
                  >
                    Use all ↓
                  </button>
                </div>
              ))}
            </div>
          </div>

          {/* ── Scrollable body ── */}
          <div className="flex-1 overflow-y-auto">
            {FIELD_GROUPS.map((group) => {
              const fieldsInGroup = Object.entries(FIELD_CONFIG)
                .filter(([, cfg]) => cfg.group === group)
                .filter(([key]) => shouldShow(key))

              if (fieldsInGroup.length === 0) return null

              return (
                <div key={group}>
                  {/* Group sticky header */}
                  <div className="sticky top-0 z-10 px-4 py-2 bg-slate-50 dark:bg-slate-800/90 backdrop-blur-sm border-b border-slate-200 dark:border-slate-700">
                    <span className="text-xs font-bold text-slate-400 dark:text-slate-500 uppercase tracking-widest">
                      {group}
                    </span>
                  </div>

                  {/* Field rows */}
                  {fieldsInGroup.map(([fieldKey, fieldCfg]) => {
                    const fs = state[fieldKey]
                    if (!fs) return null

                    const isUnresolved =
                      fs.kind === "scalar" &&
                      fs.candidates.length > 1 &&
                      fs.selectedSource === null &&
                      fs.userValue === undefined

                    return (
                      <div
                        key={fieldKey}
                        className={`border-b border-slate-100 dark:border-slate-800/60 transition-colors ${
                          isUnresolved ? "bg-amber-50/50 dark:bg-amber-950/20" : ""
                        }`}
                        style={{
                          display: "grid",
                          gridTemplateColumns: gridCols,
                          gap: "8px",
                          padding: "10px 16px",
                          alignItems: "start",
                        }}
                      >
                        {/* Label column */}
                        <div className="flex flex-col gap-0.5 pt-1.5 pr-2">
                          <span className="text-sm font-medium text-slate-700 dark:text-slate-300 leading-tight">
                            {fieldCfg.label}
                          </span>
                          {fieldCfg.comicInfoKey && (
                            <span className="text-xs text-slate-400 font-mono leading-none">
                              {fieldCfg.comicInfoKey}
                            </span>
                          )}
                          {isUnresolved && (
                            <span className="inline-flex items-center gap-0.5 text-xs text-amber-600 dark:text-amber-400 mt-0.5">
                              <i className="icon-[solar--danger-triangle-outline] w-3 h-3" />
                              conflict
                            </span>
                          )}
                        </div>

                        {/* Content — varies by kind */}
                        {fs.kind === "scalar" ? (
                          // One cell per active source
                          activeSources.map((src) => {
                            const candidate = fs.candidates.find((c) => c.source === src)
                            const isSelected = fs.selectedSource === src

                            // For selected state we need the source-specific color
                            const selectedClass = isSelected ? SOURCE_SELECTED[src] : ""

                            if (!candidate) {
                              return (
                                <span
                                  key={src}
                                  className="text-slate-300 dark:text-slate-600 text-sm px-2 pt-1.5 block"
                                >
                                  —
                                </span>
                              )
                            }

                            return (
                              <button
                                key={src}
                                onClick={() => selectScalar(fieldKey, src)}
                                className={`w-full text-left text-sm px-2 py-1.5 rounded-md border transition-all ${
                                  isSelected
                                    ? `border-transparent ${selectedClass}`
                                    : "border-slate-200 dark:border-slate-700 hover:border-slate-300 dark:hover:border-slate-600 bg-white dark:bg-slate-800 hover:bg-slate-50 dark:hover:bg-slate-750"
                                }`}
                              >
                                {fieldCfg.renderAs === "image" ? (
                                  <img
                                    src={candidate.value}
                                    alt="cover"
                                    className="w-full h-24 object-cover rounded"
                                    onError={(e) => {
                                      ;(e.target as HTMLImageElement).style.display = "none"
                                    }}
                                  />
                                ) : (
                                  <span
                                    className={`block text-slate-700 dark:text-slate-300 ${
                                      fieldCfg.renderAs === "longtext"
                                        ? "line-clamp-3 whitespace-normal text-xs leading-relaxed"
                                        : "truncate"
                                    }`}
                                  >
                                    {candidate.value}
                                  </span>
                                )}
                                {isSelected && (
                                  <i className="icon-[solar--check-circle-bold] w-3.5 h-3.5 text-green-500 mt-0.5 block" />
                                )}
                              </button>
                            )
                          })
                        ) : fs.kind === "array" ? (
                          // Merged list spanning all source columns
                          <div
                            className="flex flex-wrap gap-1.5"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            {fs.items.length === 0 ? (
                              <span className="text-slate-400 dark:text-slate-500 text-sm">No data</span>
                            ) : (
                              fs.items.map((item) => (
                                <label
                                  key={item.itemKey}
                                  className={`inline-flex items-center gap-1.5 px-2 py-1 rounded-md border cursor-pointer transition-all text-sm select-none ${
                                    item.selected
                                      ? "border-slate-200 dark:border-slate-600 bg-white dark:bg-slate-800"
                                      : "border-dashed border-slate-200 dark:border-slate-700 opacity-40"
                                  }`}
                                >
                                  <input
                                    type="checkbox"
                                    checked={item.selected}
                                    onChange={(e) =>
                                      toggleItem(fieldKey, item.itemKey, e.target.checked)
                                    }
                                    className="w-3 h-3 rounded accent-slate-600 flex-none"
                                  />
                                  <span className="text-slate-700 dark:text-slate-300">
                                    {item.displayValue}
                                  </span>
                                  <span
                                    className={`text-xs px-1.5 py-0.5 rounded font-medium ${SOURCE_BADGE[item.source]}`}
                                  >
                                    {SOURCE_SHORT[item.source]}
                                  </span>
                                </label>
                              ))
                            )}
                          </div>
                        ) : fs.kind === "credits" ? (
                          // Credits spanning all source columns
                          <div
                            className="flex flex-col gap-1"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            {fs.items.length === 0 ? (
                              <span className="text-slate-400 dark:text-slate-500 text-sm">No data</span>
                            ) : (
                              fs.items.map((item) => (
                                <label
                                  key={item.itemKey}
                                  className={`inline-flex items-center gap-2 px-2 py-1.5 rounded-md border cursor-pointer transition-all text-sm select-none ${
                                    item.selected
                                      ? "border-slate-200 dark:border-slate-600 bg-white dark:bg-slate-800"
                                      : "border-dashed border-slate-200 dark:border-slate-700 opacity-40"
                                  }`}
                                >
                                  <input
                                    type="checkbox"
                                    checked={item.selected}
                                    onChange={(e) =>
                                      toggleItem(fieldKey, item.itemKey, e.target.checked)
                                    }
                                    className="w-3 h-3 rounded accent-slate-600 flex-none"
                                  />
                                  <span className="font-medium text-slate-700 dark:text-slate-300">
                                    {item.name}
                                  </span>
                                  <span className="text-slate-400 dark:text-slate-500">·</span>
                                  <span className="text-slate-500 dark:text-slate-400 text-xs">
                                    {item.role}
                                  </span>
                                  <span
                                    className={`ml-auto text-xs px-1.5 py-0.5 rounded font-medium flex-none ${SOURCE_BADGE[item.source]}`}
                                  >
                                    {SOURCE_SHORT[item.source]}
                                  </span>
                                </label>
                              ))
                            )}
                          </div>
                        ) : (
                          // GTIN and other complex types
                          <div
                            className="pt-1.5"
                            style={{ gridColumn: "2 / -1" }}
                          >
                            <span className="text-slate-400 dark:text-slate-500 text-sm italic">
                              Structured field — editor coming soon
                            </span>
                          </div>
                        )}
                      </div>
                    )
                  })}
                </div>
              )
            })}

            {/* Empty state when filter hides everything */}
            {FIELD_GROUPS.every((group) =>
              Object.entries(FIELD_CONFIG)
                .filter(([, cfg]) => cfg.group === group)
                .every(([key]) => !shouldShow(key)),
            ) && (
              <div className="flex flex-col items-center justify-center py-24 gap-3 text-slate-400 dark:text-slate-500">
                <i className="icon-[solar--check-circle-bold] w-10 h-10 text-green-400" />
                <span className="text-sm">
                  {filter === "unresolved" ? "No unresolved conflicts" : "No fields match the current filter"}
                </span>
              </div>
            )}
          </div>

          {/* ── Footer ── */}
          <div className="flex-none border-t border-slate-200 dark:border-slate-700 px-4 py-3 flex items-center justify-between bg-white dark:bg-slate-900">
            <div className="text-sm">
              {allResolved ? (
                <span className="flex items-center gap-1.5 text-green-600 dark:text-green-400">
                  <i className="icon-[solar--check-circle-bold] w-4 h-4" />
                  All conflicts resolved
                </span>
              ) : (
                <span className="flex items-center gap-1.5 text-amber-600 dark:text-amber-400">
                  <i className="icon-[solar--danger-triangle-outline] w-4 h-4" />
                  {unresolvedCount} field{unresolvedCount !== 1 ? "s" : ""} still need{unresolvedCount === 1 ? "s" : ""} a value
                </span>
              )}
            </div>

            <div className="flex items-center gap-2">
              <button
                onClick={() => onOpenChange(false)}
                className="px-4 py-2 text-sm text-slate-600 dark:text-slate-400 hover:bg-slate-100 dark:hover:bg-slate-800 rounded-lg transition-colors"
              >
                Cancel
              </button>
              <button
                onClick={() => {
                  onSave(canonicalRecord)
                  onOpenChange(false)
                }}
                disabled={!allResolved}
                className={`px-4 py-2 text-sm rounded-lg font-medium transition-colors ${
                  allResolved
                    ? "bg-green-600 text-white hover:bg-green-700 dark:bg-green-700 dark:hover:bg-green-600"
                    : "bg-slate-100 text-slate-400 dark:bg-slate-800 dark:text-slate-600 cursor-not-allowed"
                }`}
              >
                Save Canonical Record
              </button>
            </div>
          </div>
        </Drawer.Content>
      </Drawer.Portal>
    </Drawer.Root>
  )
}

@@ -1,201 +1,14 @@
import React, { ReactElement, useMemo, useState } from "react";
import { isEmpty, isNil } from "lodash";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import ComicVineDetails from "../ComicVineDetails";
import { ReconcilerDrawer } from "./ReconcilerDrawer";
import { fetcher } from "../../../graphql/fetcher";
import { useGetComicByIdQuery } from "../../../graphql/generated";
import type { CanonicalRecord } from "./useReconciler";

interface ComicVineMetadata {
  volumeInformation?: Record<string, unknown>;
  name?: string;
  number?: string;
  resource_type?: string;
  id?: number;
}

interface SourcedMetadata {
  comicvine?: ComicVineMetadata;
  locg?: Record<string, unknown>;
  comicInfo?: unknown;
  metron?: unknown;
  gcd?: unknown;
  [key: string]: unknown;
}

interface VolumeInformationData {
  id?: string;
  sourcedMetadata?: SourcedMetadata;
  inferredMetadata?: { issue?: unknown };
  updatedAt?: string;
}

interface VolumeInformationProps {
  data: VolumeInformationData;
  onReconcile?: () => void;
}

const SET_METADATA_FIELD = `
  mutation SetMetadataField($comicId: ID!, $field: String!, $value: String!) {
    setMetadataField(comicId: $comicId, field: $field, value: $value) {
      id
    }
  }
`;

/** Sources stored under `sourcedMetadata` — excludes `inferredMetadata`, which is checked separately. */
const SOURCED_METADATA_KEYS = [
  "comicvine",
  "locg",
  "comicInfo",
  "metron",
  "gcd",
];

const SOURCE_LABELS: Record<string, string> = {
  comicvine: "ComicVine",
  locg: "League of Comic Geeks",
  comicInfo: "ComicInfo.xml",
  metron: "Metron",
  gcd: "Grand Comics Database",
  inferredMetadata: "Local File",
};

const SOURCE_ICONS: Record<string, string> = {
  comicvine: "icon-[solar--database-bold]",
  locg: "icon-[solar--users-group-rounded-outline]",
  comicInfo: "icon-[solar--file-text-outline]",
  metron: "icon-[solar--planet-outline]",
  gcd: "icon-[solar--book-outline]",
  inferredMetadata: "icon-[solar--folder-outline]",
};

const MetadataSourceChips = ({
  sources,
  onOpenReconciler,
}: {
  sources: string[];
  onOpenReconciler: () => void;
}): ReactElement => {
  return (
    <div className="flex flex-col gap-2 mb-5 p-3 w-fit">
      <div className="flex flex-row items-center justify-between">
        <span className="text-md text-slate-500 dark:text-slate-400">
          <i className="icon-[solar--database-outline] w-4 h-4 inline-block align-middle mr-1" />
          {sources.length} metadata sources detected
        </span>
      </div>
      <div className="flex flex-row flex-wrap gap-2">
        {sources.map((source) => (
          <span
            key={source}
            className="inline-flex items-center gap-1 bg-white dark:bg-slate-700 text-slate-700 dark:text-slate-300 text-xs font-medium px-2 py-1 rounded-md border border-slate-200 dark:border-slate-600"
          >
            <i
              className={`${SOURCE_ICONS[source] ?? "icon-[solar--check-circle-outline]"} w-3 h-3`}
            />
            {SOURCE_LABELS[source] ?? source}
          </span>
        ))}
      </div>
      <button
        className="flex space-x-1 mb-2 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-2 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
        onClick={onOpenReconciler}
      >
        <i className="icon-[solar--refresh-outline] w-4 h-4 px-3" />
        Reconcile sources
      </button>
    </div>
  );
};

/**
 * Displays volume metadata for a comic.
 *
 * - When multiple sources are present, renders a chip bar listing each source
 *   with a "Reconcile sources" action to merge them.
 * - When exactly one source is present and it is ComicVine, renders the full
 *   ComicVine detail panel directly.
 *
 * @param props.data - Comic data containing sourced and inferred metadata.
 * @param props.onReconcile - Called when the user triggers source reconciliation.
 */
export const VolumeInformation = (
  props: VolumeInformationProps,
): ReactElement => {
  const { data } = props;
  const [isReconcilerOpen, setReconcilerOpen] = useState(false);
  const queryClient = useQueryClient();

  const { mutate: saveCanonical } = useMutation({
    mutationFn: async (record: CanonicalRecord) => {
      const saves = Object.entries(record)
        .filter(([, fv]) => fv != null)
        .map(([field, fv]) => ({
          field,
          value:
            typeof fv!.value === "string"
              ? fv!.value
              : JSON.stringify(fv!.value),
        }));
      await Promise.all(
        saves.map(({ field, value }) =>
          fetcher<unknown, { comicId: string; field: string; value: string }>(
            SET_METADATA_FIELD,
            { comicId: data.id ?? "", field, value },
          )(),
        ),
      );
    },
    onSuccess: () => {
      queryClient.invalidateQueries({
        queryKey: useGetComicByIdQuery.getKey({ id: data.id ?? "" }),
      });
    },
  });

  const presentSources = useMemo(() => {
    const sources = SOURCED_METADATA_KEYS.filter((key) => {
      const val = (data?.sourcedMetadata ?? {})[key];
      if (isNil(val) || isEmpty(val)) return false;
      // locg returns an object even when empty; require at least one non-null value
      if (key === "locg")
        return Object.values(val as Record<string, unknown>).some(
          (v) => !isNil(v) && v !== "",
        );
      return true;
    });
    if (
      !isNil(data?.inferredMetadata?.issue) &&
      !isEmpty(data?.inferredMetadata?.issue)
    ) {
      sources.push("inferredMetadata");
    }
    return sources;
  }, [data?.sourcedMetadata, data?.inferredMetadata]);

  return (
    <div key={1}>
      {presentSources.length > 1 && (
        <MetadataSourceChips
          sources={presentSources}
          onOpenReconciler={() => setReconcilerOpen(true)}
        />
      )}
      {presentSources.length === 1 &&
        data.sourcedMetadata?.comicvine?.volumeInformation && (
          <ComicVineDetails
            data={data.sourcedMetadata.comicvine}
            updatedAt={data.updatedAt}
          />
        )}
      <ReconcilerDrawer
        open={isReconcilerOpen}
        onOpenChange={setReconcilerOpen}
        sourcedMetadata={(data.sourcedMetadata ?? {}) as import("./useReconciler").RawSourcedMetadata}
        inferredMetadata={data.inferredMetadata as import("./useReconciler").RawInferredMetadata | undefined}
        onSave={saveCanonical}
      />
    </div>
  );
};

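The save path above fans each non-null canonical field out into one `SetMetadataField` mutation, JSON-encoding structured values. A minimal standalone sketch of that serialization step, with simplified stand-in types (`FieldValue`, `toFieldSaves` are illustrative names, not the real `CanonicalRecord` API):

```typescript
// Simplified stand-in for a canonical field: either a wrapped value or absent.
type FieldValue = { value: unknown } | null | undefined;

// Mirror of the mutationFn's mapping: drop absent fields, pass strings
// through, JSON-encode everything else.
function toFieldSaves(
  record: Record<string, FieldValue>,
): Array<{ field: string; value: string }> {
  return Object.entries(record)
    .filter(([, fv]) => fv != null)
    .map(([field, fv]) => ({
      field,
      value: typeof fv!.value === "string" ? fv!.value : JSON.stringify(fv!.value),
    }));
}

// Strings pass through; arrays/objects are serialized; null fields are skipped.
const exampleSaves = toFieldSaves({
  title: { value: "Saga #1" },
  genres: { value: ["sci-fi", "fantasy"] },
  notes: null,
});
// exampleSaves[0] → { field: "title", value: "Saga #1" }
// exampleSaves[1] → { field: "genres", value: '["sci-fi","fantasy"]' }
```

Each resulting pair is then sent as its own GraphQL call via `Promise.all`, so one failed field save does not block the others from being issued.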
@@ -1,285 +0,0 @@
|
||||
/**
|
||||
* UI field configuration for the metadata reconciler.
|
||||
*
|
||||
* Each entry maps a CanonicalMetadata field key to:
|
||||
* - label Display name shown in the reconciler table
|
||||
* - group Which section the field belongs to
|
||||
* - renderAs How the field's cell is rendered (drives component selection)
|
||||
* - comicInfoKey The ComicInfo.xml v1 key this field exports to, or null if
|
||||
* the field has no v1 equivalent (shown with a badge in the UI)
|
||||
*
|
||||
* The order of entries within each group controls row order in the table.
|
||||
*/
|
||||
|
||||
export type RenderType =
|
||||
| "scalar" // Single string/number — click to select
|
||||
| "date" // ISO date string — click to select
|
||||
| "longtext" // Multi-line text — click to select, expandable preview
|
||||
| "image" // Cover image — thumbnail grid picker
|
||||
| "array" // Flat list of strings with source badges
|
||||
| "arcs" // [{name, number}] — arc name + position number
|
||||
| "universes" // [{name, designation}] — universe name + designation
|
||||
| "credits" // [{name, role}] — role-grouped, toggleable list
|
||||
| "seriesInfo" // Structured series object — rendered as sub-fields
|
||||
| "prices" // [{country, amount, currency}]
|
||||
| "gtin" // {isbn, upc}
|
||||
| "reprints" // [{description}]
|
||||
| "urls" // [{url, primary}]
|
||||
| "externalIDs" // [{source, externalId, primary}]
|
||||
|
||||
export type FieldGroup =
|
||||
| "Identity"
|
||||
| "Series"
|
||||
| "Publication"
|
||||
| "Content"
|
||||
| "Credits"
|
||||
| "Classification"
|
||||
| "Physical"
|
||||
| "Commercial"
|
||||
| "External"
|
||||
|
||||
/** Ordered list of groups — controls section order in the reconciler table. */
|
||||
export const FIELD_GROUPS: FieldGroup[] = [
|
||||
"Identity",
|
||||
"Series",
|
||||
"Publication",
|
||||
"Content",
|
||||
"Credits",
|
||||
"Classification",
|
||||
"Physical",
|
||||
"Commercial",
|
||||
"External",
|
||||
]
|
||||
|
||||
export interface FieldConfig {
  label: string
  group: FieldGroup
  renderAs: RenderType
  /**
   * ComicInfo.xml v1 key this field maps to on export.
   * null means the field is not exported to ComicInfo v1.
   */
  comicInfoKey: string | null
}

/**
 * Master field registry for the reconciler.
 * Keys match CanonicalMetadata field names from the core-service GraphQL schema.
 */
export const FIELD_CONFIG: Record<string, FieldConfig> = {
  // ── Identity ──────────────────────────────────────────────────────────────
  title: {
    label: "Title",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  series: {
    label: "Series",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: "series",
  },
  issueNumber: {
    label: "Issue Number",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: "number",
  },
  volume: {
    label: "Volume",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  collectionTitle: {
    label: "Collection Title",
    group: "Identity",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Series ────────────────────────────────────────────────────────────────
  seriesInfo: {
    label: "Series Info",
    group: "Series",
    renderAs: "seriesInfo",
    comicInfoKey: null,
  },

  // ── Publication ───────────────────────────────────────────────────────────
  publisher: {
    label: "Publisher",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: "publisher",
  },
  imprint: {
    label: "Imprint",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: null,
  },
  coverDate: {
    label: "Cover Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  storeDate: {
    label: "Store Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  publicationDate: {
    label: "Publication Date",
    group: "Publication",
    renderAs: "date",
    comicInfoKey: null,
  },
  language: {
    label: "Language",
    group: "Publication",
    renderAs: "scalar",
    comicInfoKey: "languageiso",
  },

  // ── Content ───────────────────────────────────────────────────────────────
  description: {
    label: "Description",
    group: "Content",
    renderAs: "longtext",
    comicInfoKey: "summary",
  },
  notes: {
    label: "Notes",
    group: "Content",
    renderAs: "longtext",
    comicInfoKey: "notes",
  },
  stories: {
    label: "Stories",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  storyArcs: {
    label: "Story Arcs",
    group: "Content",
    renderAs: "arcs",
    comicInfoKey: null,
  },
  characters: {
    label: "Characters",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  teams: {
    label: "Teams",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  locations: {
    label: "Locations",
    group: "Content",
    renderAs: "array",
    comicInfoKey: null,
  },
  universes: {
    label: "Universes",
    group: "Content",
    renderAs: "universes",
    comicInfoKey: null,
  },
  coverImage: {
    label: "Cover Image",
    group: "Content",
    renderAs: "image",
    comicInfoKey: null,
  },

  // ── Credits ───────────────────────────────────────────────────────────────
  creators: {
    label: "Credits",
    group: "Credits",
    renderAs: "credits",
    comicInfoKey: null,
  },

  // ── Classification ────────────────────────────────────────────────────────
  genres: {
    label: "Genres",
    group: "Classification",
    renderAs: "array",
    comicInfoKey: "genre",
  },
  tags: {
    label: "Tags",
    group: "Classification",
    renderAs: "array",
    comicInfoKey: null,
  },
  ageRating: {
    label: "Age Rating",
    group: "Classification",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Physical ──────────────────────────────────────────────────────────────
  pageCount: {
    label: "Page Count",
    group: "Physical",
    renderAs: "scalar",
    comicInfoKey: "pagecount",
  },
  format: {
    label: "Format",
    group: "Physical",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── Commercial ────────────────────────────────────────────────────────────
  prices: {
    label: "Prices",
    group: "Commercial",
    renderAs: "prices",
    comicInfoKey: null,
  },
  gtin: {
    label: "ISBN / UPC",
    group: "Commercial",
    renderAs: "gtin",
    comicInfoKey: null,
  },
  reprints: {
    label: "Reprints",
    group: "Commercial",
    renderAs: "reprints",
    comicInfoKey: null,
  },
  communityRating: {
    label: "Community Rating",
    group: "Commercial",
    renderAs: "scalar",
    comicInfoKey: null,
  },

  // ── External ──────────────────────────────────────────────────────────────
  externalIDs: {
    label: "Source IDs",
    group: "External",
    renderAs: "externalIDs",
    comicInfoKey: null,
  },
  urls: {
    label: "URLs",
    group: "External",
    renderAs: "urls",
    comicInfoKey: "web",
  },
} as const
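The drawer renders this registry section by section: for each entry in `FIELD_GROUPS`, it keeps the `FIELD_CONFIG` keys whose `group` matches, preserving registry order within the group. A minimal sketch of that bucketing, using a trimmed stand-in type and a hypothetical helper name (`fieldsByGroup` is not part of the codebase):

```typescript
// Trimmed stand-in for the registry's entry type.
interface FieldConfigLite {
  label: string
  group: string
}

// Bucket registry keys by group, in the order the groups are listed;
// within each group, keys keep the registry's insertion order.
function fieldsByGroup(
  config: Record<string, FieldConfigLite>,
  groups: string[],
): Map<string, string[]> {
  const out = new Map<string, string[]>()
  for (const group of groups) {
    out.set(
      group,
      Object.entries(config)
        .filter(([, cfg]) => cfg.group === group)
        .map(([key]) => key),
    )
  }
  return out
}

// Example with a three-entry registry:
const demoBuckets = fieldsByGroup(
  {
    title: { label: "Title", group: "Identity" },
    publisher: { label: "Publisher", group: "Publication" },
    series: { label: "Series", group: "Identity" },
  },
  ["Identity", "Publication"],
)
// demoBuckets.get("Identity") → ["title", "series"]
```

Groups whose bucket comes back empty (or is emptied by the active filter) are skipped entirely in the table, which is why filtering can collapse whole sections.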
@@ -1,745 +0,0 @@
import { useReducer, useMemo } from "react";
import { isNil, isEmpty } from "lodash";

// ── Source keys ────────────────────────────────────────────────────────────────

export type SourceKey =
  | "comicvine"
  | "metron"
  | "gcd"
  | "locg"
  | "comicInfo"
  | "inferredMetadata";

export const SOURCE_LABELS: Record<SourceKey, string> = {
  comicvine: "ComicVine",
  metron: "Metron",
  gcd: "Grand Comics Database",
  locg: "League of Comic Geeks",
  comicInfo: "ComicInfo.xml",
  inferredMetadata: "Local File",
};

// ── Candidate types ────────────────────────────────────────────────────────────

/** One source's value for a scalar field. Multiple candidates for the same field = conflict. */
export interface ScalarCandidate {
  source: SourceKey;
  value: string;
}

/** One item in an array field (characters, genres, arcs…). Pre-selected; user may deselect. */
export interface ArrayItem {
  /** Lowercase dedup key. */
  itemKey: string;
  displayValue: string;
  /** Raw value passed through to the canonical record. */
  rawValue: unknown;
  source: SourceKey;
  selected: boolean;
}

/** One person credit. Dedup key is `"${name}:${role}"` (lowercased). */
export interface CreditItem {
  itemKey: string;
  id?: string;
  name: string;
  role: string;
  source: SourceKey;
  selected: boolean;
}

// ── Per-field state ────────────────────────────────────────────────────────────

/** Unresolved when `selectedSource === null` and `userValue` is absent. */
interface ScalarFieldState {
  kind: "scalar";
  candidates: ScalarCandidate[];
  selectedSource: SourceKey | null;
  /** User-typed override; takes precedence over any source value. */
  userValue?: string;
}

interface ArrayFieldState {
  kind: "array";
  items: ArrayItem[];
}

interface CreditsFieldState {
  kind: "credits";
  items: CreditItem[];
}

interface GTINFieldState {
  kind: "gtin";
  candidates: Array<{ source: SourceKey; isbn?: string; upc?: string }>;
  selectedIsbnSource: SourceKey | null;
  selectedUpcSource: SourceKey | null;
}

type FieldState = ScalarFieldState | ArrayFieldState | CreditsFieldState | GTINFieldState;

/** Full reconciler state — one entry per field that has data from at least one source. */
export type ReconcilerState = Record<string, FieldState>;

// ── Raw source data ────────────────────────────────────────────────────────────

/** Raw metadata payloads keyed by source, as stored on the comic document. */
export interface RawSourcedMetadata {
  comicvine?: Record<string, unknown>;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  metron?: unknown;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  gcd?: unknown;
  locg?: Record<string, unknown>;
  /** May arrive as a JSON string; normalised by `ensureParsed`. */
  comicInfo?: Record<string, unknown>;
}

/** Metadata inferred from the local file name / path. */
export interface RawInferredMetadata {
  issue?: {
    name?: string;
    number?: number;
    year?: string;
    subtitle?: string;
  };
}

// ── Helpers ────────────────────────────────────────────────────────────────────

function safeString(v: unknown): string | null {
  if (isNil(v) || v === "") return null;
  return String(v);
}

/** xml2js with `normalizeTags` wraps every value in a single-element array. */
function xmlVal(obj: Record<string, unknown>, key: string): string | null {
  const arr = obj[key];
  if (!Array.isArray(arr) || arr.length === 0) return null;
  return safeString(arr[0]);
}

/** Parse a JSON string if it hasn't been parsed yet. */
function ensureParsed(v: unknown): Record<string, unknown> | null {
  if (isNil(v)) return null;
  if (typeof v === "string") {
    try {
      return JSON.parse(v);
    } catch {
      return null;
    }
  }
  if (typeof v === "object") return v as Record<string, unknown>;
  return null;
}

function makeScalarCandidate(
  source: SourceKey,
  value: unknown,
): ScalarCandidate | undefined {
  const val = safeString(value);
  return val ? { source, value: val } : undefined;
}

function makeArrayItem(
  source: SourceKey,
  rawValue: unknown,
  displayValue: string,
): ArrayItem {
  return {
    itemKey: displayValue.toLowerCase().trim(),
    displayValue,
    rawValue,
    source,
    selected: true,
  };
}

function makeCreditItem(
  source: SourceKey,
  name: string,
  role: string,
  id?: string,
): CreditItem {
  return {
    itemKey: `${name.toLowerCase().trim()}:${role.toLowerCase().trim()}`,
    id,
    name,
    role,
    source,
    selected: true,
  };
}

// ── Source adapters ────────────────────────────────────────────────────────────

type AdapterResult = Partial<Record<string, ScalarCandidate | ArrayItem[] | CreditItem[]>>;

/**
 * Extract canonical fields from a ComicVine issue payload.
 * Volume info lives under `volumeInformation`; credits under `person_credits` etc.
 */
function fromComicVine(cv: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "comicvine";
  const vi = cv.volumeInformation as Record<string, unknown> | undefined;
  const img = cv.image as Record<string, unknown> | undefined;
  const publisher = vi?.publisher as Record<string, unknown> | undefined;

  return {
    title: makeScalarCandidate(s, cv.name),
    series: makeScalarCandidate(s, vi?.name),
    issueNumber: makeScalarCandidate(s, cv.issue_number),
    volume: makeScalarCandidate(s, vi?.id),
    description: makeScalarCandidate(s, cv.description),
    publisher: makeScalarCandidate(s, publisher?.name),
    coverDate: makeScalarCandidate(s, cv.cover_date),
    storeDate: makeScalarCandidate(s, cv.store_date),
    coverImage: makeScalarCandidate(s, img?.super_url ?? img?.small_url),
    characters: ((cv.character_credits as unknown[]) ?? [])
      .filter((c): c is Record<string, unknown> => !isNil(c))
      .map((c) => makeArrayItem(s, c, safeString(c.name) ?? "")),
    teams: ((cv.team_credits as unknown[]) ?? [])
      .filter((t): t is Record<string, unknown> => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t.name) ?? "")),
    locations: ((cv.location_credits as unknown[]) ?? [])
      .filter((l): l is Record<string, unknown> => !isNil(l))
      .map((l) => makeArrayItem(s, l, safeString(l.name) ?? "")),
    storyArcs: ((cv.story_arc_credits as unknown[]) ?? [])
      .filter((a): a is Record<string, unknown> => !isNil(a))
      .map((a) => makeArrayItem(s, a, safeString(a.name) ?? "")),
    creators: ((cv.person_credits as unknown[]) ?? [])
      .filter((p): p is Record<string, unknown> => !isNil(p))
      .map((p) =>
        makeCreditItem(s, safeString(p.name) ?? "", safeString(p.role) ?? ""),
      ),
  };
}

/**
 * Extract canonical fields from a Metron / MetronInfo payload.
 * Keys are PascalCase mirroring the MetronInfo XSD schema.
 */
function fromMetron(raw: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "metron";
  const series = raw.Series as Record<string, unknown> | undefined;
  const pub = raw.Publisher as Record<string, unknown> | undefined;

  const nameList = (arr: unknown[]): ArrayItem[] =>
    arr
      .filter((x): x is Record<string, unknown> => !isNil(x))
      .map((x) => makeArrayItem(s, x, safeString(x.name) ?? ""));

  return {
    title: makeScalarCandidate(s, (raw.Stories as unknown[])?.[0]),
    series: makeScalarCandidate(s, series?.Name),
    issueNumber: makeScalarCandidate(s, raw.Number),
    collectionTitle: makeScalarCandidate(s, raw.CollectionTitle),
    publisher: makeScalarCandidate(s, pub?.Name),
    imprint: makeScalarCandidate(s, pub?.Imprint),
    coverDate: makeScalarCandidate(s, raw.CoverDate),
    storeDate: makeScalarCandidate(s, raw.StoreDate),
    description: makeScalarCandidate(s, raw.Summary),
    notes: makeScalarCandidate(s, raw.Notes),
    ageRating: makeScalarCandidate(s, raw.AgeRating),
    pageCount: makeScalarCandidate(s, raw.PageCount),
    format: makeScalarCandidate(s, series?.Format),
    language: makeScalarCandidate(s, series?.lang),
    genres: nameList((raw.Genres as unknown[]) ?? []),
    tags: ((raw.Tags as unknown[]) ?? [])
      .filter((t) => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t) ?? "")),
    characters: nameList((raw.Characters as unknown[]) ?? []),
    teams: nameList((raw.Teams as unknown[]) ?? []),
    locations: nameList((raw.Locations as unknown[]) ?? []),
    universes: ((raw.Universes as unknown[]) ?? [])
      .filter((u): u is Record<string, unknown> => !isNil(u))
      .map((u) =>
        makeArrayItem(
          s,
          u,
          [u.Name, u.Designation].filter(Boolean).join(" — "),
        ),
      ),
    storyArcs: ((raw.Arcs as unknown[]) ?? [])
      .filter((a): a is Record<string, unknown> => !isNil(a))
      .map((a) =>
        makeArrayItem(
          s,
          a,
          [a.Name, a.Number ? `#${a.Number}` : null].filter(Boolean).join(" "),
        ),
      ),
    stories: ((raw.Stories as unknown[]) ?? [])
      .filter((t) => !isNil(t))
      .map((t) => makeArrayItem(s, t, safeString(t) ?? "")),
    creators: ((raw.Credits as unknown[]) ?? [])
      .filter((c): c is Record<string, unknown> => !isNil(c))
      .flatMap((c) => {
        const creator = c.Creator as Record<string, unknown> | undefined;
        const roles = (c.Roles as unknown[]) ?? [];
        return roles
          .filter((r): r is Record<string, unknown> => !isNil(r))
          .map((r) =>
            makeCreditItem(
              s,
              safeString(creator?.name) ?? "",
              safeString(r.name ?? r) ?? "",
              safeString(creator?.id) ?? undefined,
            ),
          );
      }),
    reprints: ((raw.Reprints as unknown[]) ?? [])
      .filter((r) => !isNil(r))
      .map((r) => makeArrayItem(s, r, safeString(r) ?? "")),
    urls: ((raw.URLs as unknown[]) ?? [])
      .filter((u) => !isNil(u))
      .map((u) => makeArrayItem(s, u, safeString(u) ?? "")),
  };
}

/**
 * Extract canonical fields from a ComicInfo.xml payload.
 * Values are xml2js-parsed with `normalizeTags` (each key wraps its value in a single-element array).
 * Genre is a comma-separated string; the web URL maps to `urls`.
 */
function fromComicInfo(ci: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "comicInfo";
  const webUrl = xmlVal(ci, "web");
  const genreItems: ArrayItem[] = (xmlVal(ci, "genre") ?? "")
    .split(",")
    .map((g) => g.trim())
    .filter(Boolean)
    .map((g) => makeArrayItem(s, g, g));

  return {
    series: makeScalarCandidate(s, xmlVal(ci, "series")),
    issueNumber: makeScalarCandidate(s, xmlVal(ci, "number")),
    publisher: makeScalarCandidate(s, xmlVal(ci, "publisher")),
    description: makeScalarCandidate(s, xmlVal(ci, "summary")),
    notes: makeScalarCandidate(s, xmlVal(ci, "notes")),
    pageCount: makeScalarCandidate(s, xmlVal(ci, "pagecount")),
    language: makeScalarCandidate(s, xmlVal(ci, "languageiso")),
    urls: webUrl ? [makeArrayItem(s, webUrl, webUrl)] : [],
    genres: genreItems,
  };
}

/** GCD free-text credit fields: field key → role name. */
const GCD_CREDIT_FIELDS: Array<{ key: string; role: string }> = [
  { key: "script", role: "Writer" },
  { key: "pencils", role: "Penciller" },
  { key: "inks", role: "Inker" },
  { key: "colors", role: "Colorist" },
  { key: "letters", role: "Letterer" },
  { key: "editing", role: "Editor" },
];

/** Split a GCD free-text credit string (semicolon-separated; strips bracketed annotations). */
function splitGCDCreditString(raw: string): string[] {
  return raw
    .split(/;/)
    .map((name) => name.replace(/\[.*?\]/g, "").trim())
    .filter(Boolean);
}

/** Parse a GCD price string like "0.10 USD" or "10p". Returns null on failure. */
function parseGCDPrice(
  raw: string,
): { amount: number; currency: string } | null {
  const match = raw.trim().match(/^([\d.,]+)\s*([A-Z]{2,3}|p|¢|€|£|\$)?/);
  if (!match) return null;
  const amount = parseFloat(match[1].replace(",", "."));
  const currency = match[2] ?? "USD";
  if (isNaN(amount)) return null;
  return { amount, currency };
}

function fromGCD(raw: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "gcd";
  const series = raw.series as Record<string, unknown> | undefined;
  const language = series?.language as Record<string, unknown> | undefined;
  const publisher = series?.publisher as Record<string, unknown> | undefined;
  const indiciaPublisher = raw.indicia_publisher as
    | Record<string, unknown>
    | undefined;
  const stories = (raw.stories as Record<string, unknown>[]) ?? [];
  const primaryStory = stories[0] ?? {};

  const creditItems: CreditItem[] = [];
  if (raw.editing) {
    splitGCDCreditString(String(raw.editing)).forEach((name) =>
      creditItems.push(makeCreditItem(s, name, "Editor")),
    );
  }
  GCD_CREDIT_FIELDS.forEach(({ key, role }) => {
    const val = safeString(primaryStory[key]);
    if (!val) return;
    splitGCDCreditString(val).forEach((name) =>
      creditItems.push(makeCreditItem(s, name, role)),
    );
  });

  const genreItems: ArrayItem[] = (safeString(primaryStory.genre) ?? "")
    .split(",")
    .map((g) => g.trim())
    .filter(Boolean)
    .map((g) => makeArrayItem(s, g, g));

  const characterItems: ArrayItem[] = (
    safeString(primaryStory.characters) ?? ""
  )
    .split(/[;,]/)
    .map((c) => c.trim())
    .filter(Boolean)
    .map((c) => makeArrayItem(s, c, c));

  const storyTitles: ArrayItem[] = stories
    .map((st) => safeString(st.title))
    .filter((t): t is string => Boolean(t))
    .map((t) => makeArrayItem(s, t, t));

  const priceItems: ArrayItem[] = [];
  const priceStr = safeString(raw.price);
  if (priceStr) {
    const parsed = parseGCDPrice(priceStr);
    if (parsed) {
      priceItems.push(makeArrayItem(s, { ...parsed, country: "US" }, priceStr));
    }
  }

  return {
    series: makeScalarCandidate(s, series?.name),
    issueNumber: makeScalarCandidate(s, raw.number),
    title: makeScalarCandidate(s, raw.title ?? primaryStory.title),
    volume: makeScalarCandidate(s, raw.volume),
    // Prefer indicia publisher (as-printed) over series publisher
    publisher: makeScalarCandidate(s, indiciaPublisher?.name ?? publisher?.name),
    coverDate: makeScalarCandidate(s, raw.publication_date),
    storeDate: makeScalarCandidate(s, raw.on_sale_date ?? raw.key_date),
    pageCount: makeScalarCandidate(s, raw.page_count),
    notes: makeScalarCandidate(s, raw.notes),
    language: makeScalarCandidate(s, language?.code),
    ageRating: makeScalarCandidate(s, raw.rating),
    genres: genreItems,
    characters: characterItems,
    stories: storyTitles,
    creators: creditItems,
    prices: priceItems,
  };
}

function fromLocg(locg: Record<string, unknown>): AdapterResult {
  const s: SourceKey = "locg";
  return {
    title: makeScalarCandidate(s, locg.name),
    publisher: makeScalarCandidate(s, locg.publisher),
    description: makeScalarCandidate(s, locg.description),
    coverImage: makeScalarCandidate(s, locg.cover),
    communityRating: makeScalarCandidate(s, locg.rating),
    publicationDate: makeScalarCandidate(s, locg.publicationDate),
  };
}

function fromInferred(inf: RawInferredMetadata["issue"]): AdapterResult {
  if (!inf) return {};
  const s: SourceKey = "inferredMetadata";
  return {
    title: makeScalarCandidate(s, inf.name),
    issueNumber: makeScalarCandidate(s, inf.number),
    volume: makeScalarCandidate(s, inf.year),
  };
}

// ── State building ─────────────────────────────────────────────────────────────

/**
 * Merge all adapter results directly into a `ReconcilerState`.
 * Array and credit items are deduplicated by `itemKey` using a Set (O(n)).
 * Scalar conflicts are auto-resolved when all sources agree on the same value.
 */
function buildState(
  sources: Partial<Record<SourceKey, AdapterResult>>,
): ReconcilerState {
  const state: ReconcilerState = {};
  const scalarMap: Record<string, ScalarCandidate[]> = {};

  for (const adapterResult of Object.values(sources)) {
    if (!adapterResult) continue;
    for (const [field, value] of Object.entries(adapterResult)) {
      if (!value) continue;

      if (Array.isArray(value)) {
        // Presence of `role` distinguishes CreditItem[] from ArrayItem[].
        const isCredits = value.length > 0 && "role" in value[0];
        if (isCredits) {
          const prev = state[field];
          const existing: CreditItem[] =
            prev?.kind === "credits" ? prev.items : [];
          const seen = new Set(existing.map((i) => i.itemKey));
          const merged = [...existing];
          for (const item of value as CreditItem[]) {
            if (!seen.has(item.itemKey)) {
              seen.add(item.itemKey);
              merged.push(item);
            }
          }
          state[field] = { kind: "credits", items: merged };
        } else {
          const prev = state[field];
          const existing: ArrayItem[] =
            prev?.kind === "array" ? prev.items : [];
          const seen = new Set(existing.map((i) => i.itemKey));
          const merged = [...existing];
          for (const item of value as ArrayItem[]) {
            if (!seen.has(item.itemKey)) {
              seen.add(item.itemKey);
              merged.push(item);
            }
          }
          state[field] = { kind: "array", items: merged };
        }
      } else {
        (scalarMap[field] ??= []).push(value as ScalarCandidate);
      }
    }
  }

  for (const [field, candidates] of Object.entries(scalarMap)) {
    const allAgree =
      candidates.length === 1 ||
      candidates.every((c) => c.value === candidates[0].value);
    state[field] = {
      kind: "scalar",
      candidates,
      selectedSource: allAgree ? candidates[0].source : null,
    };
  }

  return state;
}

// ── Reducer ────────────────────────────────────────────────────────────────────

type Action =
  | { type: "SELECT_SCALAR"; field: string; source: SourceKey }
  | { type: "SET_USER_VALUE"; field: string; value: string }
  | { type: "TOGGLE_ITEM"; field: string; itemKey: string; selected: boolean }
  | { type: "SET_BASE_SOURCE"; source: SourceKey }
  | { type: "RESET"; initial: ReconcilerState };

function reducer(state: ReconcilerState, action: Action): ReconcilerState {
  switch (action.type) {
    case "SELECT_SCALAR": {
      const field = state[action.field];
      if (field?.kind !== "scalar") return state;
      return {
        ...state,
        [action.field]: {
          ...field,
          selectedSource: action.source,
          userValue: undefined,
        },
      };
    }

    case "SET_USER_VALUE": {
      const field = state[action.field];
      if (field?.kind !== "scalar") return state;
      return {
        ...state,
        [action.field]: {
          ...field,
          selectedSource: null,
          userValue: action.value,
        },
      };
    }

    case "TOGGLE_ITEM": {
      const field = state[action.field];
      if (field?.kind === "array" || field?.kind === "credits") {
        return {
          ...state,
          [action.field]: {
            ...field,
            items: field.items.map((item) =>
              item.itemKey === action.itemKey
                ? { ...item, selected: action.selected }
                : item,
            ),
          } as FieldState,
        };
      }
      return state;
    }

    case "SET_BASE_SOURCE": {
      const next = { ...state };
      for (const [field, fieldState] of Object.entries(next)) {
        if (fieldState.kind !== "scalar") continue;
        if (fieldState.candidates.some((c) => c.source === action.source)) {
          next[field] = {
            ...fieldState,
            selectedSource: action.source,
            userValue: undefined,
          };
        }
      }
      return next;
    }

    case "RESET":
      return action.initial;

    default:
      return state;
  }
}

// ── Canonical record ───────────────────────────────────────────────────────────

export interface CanonicalFieldValue {
  value: unknown;
  source: SourceKey | "user";
}

export type CanonicalRecord = Partial<Record<string, CanonicalFieldValue>>;

function deriveCanonicalRecord(state: ReconcilerState): CanonicalRecord {
  const record: CanonicalRecord = {};

  for (const [field, fieldState] of Object.entries(state)) {
    if (fieldState.kind === "scalar") {
      if (fieldState.userValue !== undefined) {
        record[field] = { value: fieldState.userValue, source: "user" };
      } else if (fieldState.selectedSource !== null) {
        const candidate = fieldState.candidates.find(
          (c) => c.source === fieldState.selectedSource,
        );
        if (candidate) {
          record[field] = { value: candidate.value, source: candidate.source };
        }
      }
    } else if (fieldState.kind === "array") {
      const selected = fieldState.items.filter((i) => i.selected);
      if (selected.length > 0) {
        const counts = selected.reduce<Record<string, number>>((acc, i) => {
          acc[i.source] = (acc[i.source] ?? 0) + 1;
          return acc;
        }, {});
        const dominant = Object.entries(counts).sort(
          ([, a], [, b]) => b - a,
        )[0][0] as SourceKey;
        record[field] = {
          value: selected.map((i) => i.rawValue),
          source: dominant,
        };
      }
    } else if (fieldState.kind === "credits") {
      const selected = fieldState.items.filter((i) => i.selected);
      if (selected.length > 0) {
        record[field] = { value: selected, source: selected[0].source };
      }
    }
  }

  return record;
}

// ── Hook ───────────────────────────────────────────────────────────────────────

export interface UseReconcilerResult {
  state: ReconcilerState;
  /** Number of scalar fields with a conflict that has no selection yet. */
  unresolvedCount: number;
  /** True if any field has candidates from more than one source. */
  hasConflicts: boolean;
  canonicalRecord: CanonicalRecord;
  selectScalar: (field: string, source: SourceKey) => void;
  /** Override a scalar field with a user-typed value. */
  setUserValue: (field: string, value: string) => void;
  toggleItem: (field: string, itemKey: string, selected: boolean) => void;
  /** Adopt all available fields from a single source. */
  setBaseSource: (source: SourceKey) => void;
  reset: () => void;
}

export function useReconciler(
  sourcedMetadata: RawSourcedMetadata,
  inferredMetadata?: RawInferredMetadata,
): UseReconcilerResult {
  const initial = useMemo(() => {
    const adapters: Partial<Record<SourceKey, AdapterResult>> = {};

    if (!isEmpty(sourcedMetadata.comicvine)) {
      adapters.comicvine = fromComicVine(
        sourcedMetadata.comicvine as Record<string, unknown>,
      );
    }
    const metron = ensureParsed(sourcedMetadata.metron);
    if (metron) adapters.metron = fromMetron(metron);

    const gcd = ensureParsed(sourcedMetadata.gcd);
    if (gcd) adapters.gcd = fromGCD(gcd);

    if (!isEmpty(sourcedMetadata.locg)) {
      adapters.locg = fromLocg(
        sourcedMetadata.locg as Record<string, unknown>,
      );
    }
    const ci = ensureParsed(sourcedMetadata.comicInfo);
    if (ci) adapters.comicInfo = fromComicInfo(ci);

    if (inferredMetadata?.issue) {
      adapters.inferredMetadata = fromInferred(inferredMetadata.issue);
    }

    return buildState(adapters);
  }, [sourcedMetadata, inferredMetadata]);

  const [state, dispatch] = useReducer(reducer, initial);

  const unresolvedCount = useMemo(
    () =>
      Object.values(state).filter(
        (f) =>
          f.kind === "scalar" &&
          f.selectedSource === null &&
          f.userValue === undefined &&
          f.candidates.length > 1,
      ).length,
    [state],
  );

  const hasConflicts = useMemo(
    () =>
      Object.values(state).some(
        (f) =>
          (f.kind === "scalar" && f.candidates.length > 1) ||
          ((f.kind === "array" || f.kind === "credits") &&
            new Set(
              (f.items as Array<ArrayItem | CreditItem>).map((i) => i.source),
            ).size > 1),
      ),
    [state],
  );

  const canonicalRecord = useMemo(() => deriveCanonicalRecord(state), [state]);

  return {
    state,
    unresolvedCount,
    hasConflicts,
    canonicalRecord,
    selectScalar: (field, source) =>
      dispatch({ type: "SELECT_SCALAR", field, source }),
    setUserValue: (field, value) =>
      dispatch({ type: "SET_USER_VALUE", field, value }),
    toggleItem: (field, itemKey, selected) =>
      dispatch({ type: "TOGGLE_ITEM", field, itemKey, selected }),
    setBaseSource: (source) =>
      dispatch({ type: "SET_BASE_SOURCE", source }),
    reset: () => dispatch({ type: "RESET", initial }),
  };
}
@@ -2,48 +2,12 @@ import React from "react";
 import dayjs from "dayjs";
 import prettyBytes from "pretty-bytes";
 
-interface TorrentInfo {
-  name: string;
-  hash: string;
-  added_on: number;
-  progress: number;
-  downloaded: number;
-  uploaded: number;
-  trackers_count: number;
-  total_size: number;
-}
-
-interface TorrentData {
-  torrent?: TorrentInfo;
-  // Support direct TorrentDetails format from socket events
-  infoHash?: string;
-  downloadSpeed?: number;
-  uploadSpeed?: number;
-  name?: string;
-}
-
-export interface TorrentDownloadsProps {
-  data: TorrentData[];
-}
-
-export type { TorrentData };
-
-export const TorrentDownloads = (props: TorrentDownloadsProps) => {
+export const TorrentDownloads = (props) => {
   const { data } = props;
+  console.log(Object.values(data));
   return (
     <>
-      {data.map((item: TorrentData, index: number) => {
-        // Support both wrapped format (item.torrent) and direct format
-        const torrent: TorrentInfo = item.torrent || {
-          name: item.name || 'Unknown',
-          hash: item.infoHash || '',
-          added_on: 0,
-          progress: (item as any).progress || 0,
-          downloaded: 0,
-          uploaded: 0,
-          trackers_count: 0,
-          total_size: 0,
-        };
+      {data.map(({ torrent }) => {
         return (
           <dl className="mt-5 dark:text-slate-200 text-slate-600">
             <dt className="text-lg">{torrent.name}</dt>
@@ -10,31 +10,7 @@ import { isEmpty, isNil } from "lodash";
 import ellipsize from "ellipsize";
 import prettyBytes from "pretty-bytes";
 
-interface TorrentSearchPanelProps {
-  issueName: string;
-  comicObjectId: string;
-}
-
-interface SearchFormValues {
-  issueName: string;
-}
-
-interface TorrentResult {
-  fileName: string;
-  seeders: number;
-  leechers: number;
-  size: number;
-  files: number;
-  indexer: string;
-  downloadUrl: string;
-}
-
-interface TorrentDownloadPayload {
-  comicObjectId: string;
-  torrentToDownload: string;
-}
-
-export const TorrentSearchPanel = (props: TorrentSearchPanelProps) => {
+export const TorrentSearchPanel = (props) => {
   const { issueName, comicObjectId } = props;
   // Initialize searchTerm with issueName from props
   const [searchTerm, setSearchTerm] = useState({ issueName });
@@ -47,36 +23,34 @@ export const TorrentSearchPanel = (props: TorrentSearchPanelProps) => {
         url: `${PROWLARR_SERVICE_BASE_URI}/search`,
         method: "POST",
         data: {
-          prowlarrQuery: {
-            port: "9696",
-            apiKey: "38c2656e8f5d4790962037b8c4416a8f",
-            offset: 0,
-            categories: [7030],
-            query: searchTerm.issueName,
-            host: "localhost",
-            limit: 100,
-            type: "search",
-            indexerIds: [2],
-          },
+          port: "9696",
+          apiKey: "c4f42e265fb044dc81f7e88bd41c3367",
+          offset: 0,
+          categories: [7030],
+          query: searchTerm.issueName,
+          host: "localhost",
+          limit: 100,
+          type: "search",
+          indexerIds: [2],
         },
       });
     },
     enabled: !isNil(searchTerm.issueName) && searchTerm.issueName.trim() !== "", // Make sure searchTerm is not empty
   });
   const mutation = useMutation({
-    mutationFn: async (newTorrent: TorrentDownloadPayload) =>
+    mutationFn: async (newTorrent) =>
       axios.post(`${QBITTORRENT_SERVICE_BASE_URI}/addTorrent`, newTorrent),
-    onSuccess: async () => {
-      // Torrent added successfully
+    onSuccess: async (data) => {
+      console.log(data);
     },
   });
-  const searchIndexer = (values: SearchFormValues) => {
+  const searchIndexer = (values) => {
     setSearchTerm({ issueName: values.issueName }); // Update searchTerm based on the form submission
   };
-  const downloadTorrent = (downloadUrl: string) => {
-    const newTorrent: TorrentDownloadPayload = {
+  const downloadTorrent = (evt) => {
+    const newTorrent = {
       comicObjectId,
-      torrentToDownload: downloadUrl,
+      torrentToDownload: evt,
    };
    mutation.mutate(newTorrent);
  };
@@ -149,7 +123,7 @@ export const TorrentSearchPanel = (props: TorrentSearchPanelProps) => {
           </tr>
         </thead>
         <tbody className="divide-y divide-slate-100 dark:divide-gray-500">
-          {data?.data.map((result: TorrentResult, idx: number) => (
+          {data?.data.map((result, idx) => (
             <tr key={idx}>
               <td className="px-3 py-3 text-gray-700 dark:text-slate-300 text-md">
                 <p>{ellipsize(result.fileName, 90)}</p>
@@ -1,65 +0,0 @@
import React from "react";
import { StylesConfig } from "react-select";

export interface ActionOption {
  value: string;
  label: React.ReactElement;
}

export const CVMatchLabel = (
  <span className="inline-flex flex-row items-center gap-2">
    <div className="w-6 h-6">
      <i className="icon-[solar--magic-stick-3-bold-duotone] w-6 h-6"></i>
    </div>
    <div>Match on ComicVine</div>
  </span>
);

export const editLabel = (
  <span className="inline-flex flex-row items-center gap-2">
    <div className="w-6 h-6">
      <i className="icon-[solar--pen-2-bold-duotone] w-6 h-6"></i>
    </div>
    <div>Edit Metadata</div>
  </span>
);

export const deleteLabel = (
  <span className="inline-flex flex-row items-center gap-2">
    <div className="w-6 h-6">
      <i className="icon-[solar--trash-bin-trash-bold-duotone] w-6 h-6"></i>
    </div>
    <div>Delete Comic</div>
  </span>
);

export const actionOptions: ActionOption[] = [
  { value: "match-on-comic-vine", label: CVMatchLabel },
  { value: "edit-metdata", label: editLabel },
  { value: "delete-comic", label: deleteLabel },
];

export const customStyles: StylesConfig<ActionOption, false> = {
  menu: (base: any) => ({
    ...base,
    backgroundColor: "rgb(156, 163, 175)",
  }),
  placeholder: (base: any) => ({
    ...base,
    color: "black",
  }),
  option: (base: any, { isFocused }: any) => ({
    ...base,
    backgroundColor: isFocused ? "gray" : "rgb(156, 163, 175)",
  }),
  singleValue: (base: any) => ({
    ...base,
    paddingTop: "0.4rem",
  }),
  control: (base: any) => ({
    ...base,
    backgroundColor: "rgb(156, 163, 175)",
    color: "black",
    border: "1px solid rgb(156, 163, 175)",
  }),
};
@@ -1,95 +0,0 @@
import React, { lazy } from "react";
import { isNil, isEmpty } from "lodash";
import type { TabConfig, TabConfigParams } from "../../types";

const VolumeInformation = lazy(() => import("./Tabs/VolumeInformation").then(m => ({ default: m.VolumeInformation })));
const ArchiveOperations = lazy(() => import("./Tabs/ArchiveOperations").then(m => ({ default: m.ArchiveOperations })));
const AcquisitionPanel = lazy(() => import("./AcquisitionPanel"));
const TorrentSearchPanel = lazy(() => import("./TorrentSearchPanel"));
const DownloadsPanel = lazy(() => import("./DownloadsPanel"));

export const createTabConfig = ({
data,
hasAnyMetadata,
areRawFileDetailsAvailable,
airDCPPQuery,
comicObjectId,
userSettings,
issueName,
acquisition,
onReconcileMetadata,
}: TabConfigParams): TabConfig[] => {
return [
{
id: 1,
name: "Volume Information",
icon: (
<i className="h-5 w-5 icon-[solar--book-2-bold] text-slate-500 dark:text-slate-300"></i>
),
content: hasAnyMetadata ? (
<VolumeInformation data={data} onReconcile={onReconcileMetadata} />
) : null,
shouldShow: hasAnyMetadata,
},
{
id: 3,
icon: (
<i className="h-5 w-5 icon-[solar--winrar-bold-duotone] text-slate-500 dark:text-slate-300" />
),
name: "Archive Operations",
content: <ArchiveOperations data={data} />,
shouldShow: areRawFileDetailsAvailable,
},
{
id: 4,
icon: (
<i className="h-5 w-5 icon-[solar--folder-path-connect-bold-duotone] text-slate-500 dark:text-slate-300" />
),
name: "DC++ Search",
content: (
<AcquisitionPanel
query={airDCPPQuery}
comicObjectId={comicObjectId}
comicObject={data}
settings={userSettings}
/>
),
shouldShow: true,
},
{
id: 5,
icon: (
<span className="inline-flex flex-row">
<i className="h-5 w-5 icon-[solar--magnet-bold-duotone] text-slate-500 dark:text-slate-300" />
</span>
),
name: "Torrent Search",
content: <TorrentSearchPanel comicObjectId={comicObjectId} issueName={issueName} />,
shouldShow: true,
},
{
id: 6,
name: "Downloads",
icon: (
<>
{(acquisition?.directconnect?.downloads?.length || 0) +
(acquisition?.torrent?.length || 0)}
</>
),
content:
!isNil(data) && !isEmpty(data) ? (
<DownloadsPanel />
) : (
<div className="column is-three-fifths">
<article className="message is-info">
<div className="message-body is-size-6 is-family-secondary">
AirDC++ is not configured. Please configure it in{" "}
<code>Settings</code>.
</div>
</article>
</div>
),
shouldShow: true,
},
];
};
@@ -1,89 +0,0 @@
import { useState } from "react";
import axios from "axios";
import { isNil, isUndefined, isEmpty } from "lodash";
import { refineQuery } from "filename-parser";
import { COMICVINE_SERVICE_URI } from "../../constants/endpoints";
import { RawFileDetails as RawFileDetailsType } from "../../graphql/generated";

type ComicVineMatch = {
score: number;
[key: string]: any;
};

type ComicVineSearchQuery = {
inferredIssueDetails: {
name: string;
[key: string]: any;
};
[key: string]: any;
};

type ComicVineMetadata = {
name?: string;
[key: string]: any;
};

export const useComicVineMatching = () => {
const [comicVineMatches, setComicVineMatches] = useState<ComicVineMatch[]>([]);

const fetchComicVineMatches = async (
searchPayload: any,
issueSearchQuery: ComicVineSearchQuery,
seriesSearchQuery: ComicVineSearchQuery,
) => {
try {
const response = await axios({
url: `${COMICVINE_SERVICE_URI}/volumeBasedSearch`,
method: "POST",
data: {
format: "json",
// hack
query: issueSearchQuery.inferredIssueDetails.name
.replace(/[^a-zA-Z0-9 ]/g, "")
.trim(),
limit: "100",
page: 1,
resources: "volume",
scorerConfiguration: {
searchParams: issueSearchQuery.inferredIssueDetails,
},
rawFileDetails: searchPayload,
},
transformResponse: (r) => {
const matches = JSON.parse(r);
return matches;
},
});
let matches: ComicVineMatch[] = [];
if (!isNil(response.data.results) && response.data.results.length === 1) {
matches = response.data.results;
} else {
matches = response.data.map((match: ComicVineMatch) => match);
}
const scoredMatches = matches.sort((a: ComicVineMatch, b: ComicVineMatch) => b.score - a.score);
setComicVineMatches(scoredMatches);
} catch (err) {
// Error handling could be added here if needed
}
};

const prepareAndFetchMatches = (
rawFileDetails: RawFileDetailsType | undefined,
comicvine: ComicVineMetadata | undefined,
) => {
let seriesSearchQuery: ComicVineSearchQuery = {} as ComicVineSearchQuery;
let issueSearchQuery: ComicVineSearchQuery = {} as ComicVineSearchQuery;

if (!isUndefined(rawFileDetails) && rawFileDetails.name) {
issueSearchQuery = refineQuery(rawFileDetails.name) as ComicVineSearchQuery;
} else if (!isEmpty(comicvine) && comicvine?.name) {
issueSearchQuery = refineQuery(comicvine.name) as ComicVineSearchQuery;
}
fetchComicVineMatches(rawFileDetails, issueSearchQuery, seriesSearchQuery);
};

return {
comicVineMatches,
prepareAndFetchMatches,
};
};
@@ -1,72 +1,78 @@
import React, { ReactElement } from "react";
import ZeroState from "./ZeroState";
import { RecentlyImported } from "./RecentlyImported";
import { WantedComicsList } from "./WantedComicsList";
import { VolumeGroups } from "./VolumeGroups";
import { LibraryStatistics } from "./LibraryStatistics";
import { PullList } from "./PullList";
import {
useGetRecentComicsQuery,
useGetWantedComicsQuery,
useGetVolumeGroupsQuery,
useGetLibraryStatisticsQuery
} from "../../graphql/generated";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";

export const Dashboard = (): ReactElement => {
// Use GraphQL for recent comics
const { data: recentComicsData, error: recentComicsError } = useGetRecentComicsQuery(
{ limit: 5 },
{ refetchOnWindowFocus: false }
);
const { data: recentComics } = useQuery({
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
method: "POST",
data: {
paginationOptions: {
page: 0,
limit: 5,
sort: { updatedAt: "-1" },
},
predicate: { "acquisition.source.wanted": false },
comicStatus: "recent",
},
}),
queryKey: ["recentComics"],
});

// Wanted Comics - using GraphQL
const { data: wantedComicsData, error: wantedComicsError } = useGetWantedComicsQuery(
{
paginationOptions: {
page: 1,
limit: 5,
sort: '{"updatedAt": -1}'
},
predicate: '{"acquisition.source.wanted": true}'
},
{
refetchOnWindowFocus: false,
retry: false
}
);
const { data: wantedComics } = useQuery({
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
method: "POST",
data: {
paginationOptions: {
page: 0,
limit: 5,
sort: { updatedAt: "-1" },
},
predicate: { "acquisition.source.wanted": true },
},
}),
queryKey: ["wantedComics"],
});
const { data: volumeGroups } = useQuery({
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBookGroups`,
method: "GET",
}),
queryKey: ["volumeGroups"],
});

// Volume Groups - using GraphQL
const { data: volumeGroupsData, error: volumeGroupsError } = useGetVolumeGroupsQuery(
undefined,
{ refetchOnWindowFocus: false }
);

// Library Statistics - using GraphQL
const { data: statisticsData, error: statisticsError } = useGetLibraryStatisticsQuery(
undefined,
{
refetchOnWindowFocus: false,
retry: false
}
);

const recentComics = recentComicsData?.comics?.comics || [];
const wantedComics = !wantedComicsError ? (wantedComicsData?.getComicBooks?.docs || []) : [];
const volumeGroups = volumeGroupsData?.getComicBookGroups || [];
const statistics = !statisticsError ? statisticsData?.getLibraryStatistics : undefined;
const { data: statistics } = useQuery({
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/libraryStatistics`,
method: "GET",
}),
queryKey: ["libraryStatistics"],
});

return (
<>
<div className="mx-auto max-w-7xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<PullList />
{recentComics.length > 0 && <RecentlyImported comics={recentComics} />}
{/* Wanted comics */}
<WantedComicsList comics={wantedComics} />
{/* Library Statistics */}
{statistics && <LibraryStatistics stats={statistics} />}
{/* Volume groups */}
<VolumeGroups volumeGroups={volumeGroups} />
</div>
</>
<div className="container mx-auto max-w-full">
<PullList />
{recentComics && <RecentlyImported comics={recentComics?.data.docs} />}
{/* Wanted comics */}
<WantedComicsList comics={wantedComics?.data?.docs} />
{/* Library Statistics */}
{statistics && <LibraryStatistics stats={statistics?.data} />}
{/* Volume groups */}
<VolumeGroups volumeGroups={volumeGroups?.data} />
</div>
);
};

@@ -1,97 +1,100 @@
import React, { ReactElement } from "react";
import React, { ReactElement, useEffect } from "react";
import prettyBytes from "pretty-bytes";
import { isEmpty, isUndefined, map } from "lodash";
import Header from "../shared/Header";
import { GetLibraryStatisticsQuery, DirectorySize } from "../../graphql/generated";
import type { LibraryStatisticsProps } from "../../types";

/**
* Displays a snapshot of library metrics: total comic files, tagging coverage,
* file-type breakdown, and the publisher with the most issues.
*
* Returns `null` when `stats` is absent or the statistics array is empty.
*/
export const LibraryStatistics = ({ stats }: LibraryStatisticsProps): ReactElement | null => {
if (!stats || !stats.totalDocuments) return null;

const facet = stats.statistics?.[0];
if (!facet) return null;

const { issues, issuesWithComicInfoXML, fileTypes, publisherWithMostComicsInLibrary } = facet;
const topPublisher = publisherWithMostComicsInLibrary?.[0];

export const LibraryStatistics = (
props: ILibraryStatisticsProps,
): ReactElement => {
const { stats } = props;
return (
<div className="mt-5">
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
headerContent="Your Library In Numbers"
subHeaderContent={<span className="text-md">A brief snapshot of your library.</span>}
subHeaderContent={
<span className="text-md">A brief snapshot of your library.</span>
}
iconClassNames="fa-solid fa-binoculars mr-2"
/>

<div className="mt-3 flex flex-row gap-5">
{/* Total records in database */}
<div className="flex flex-col rounded-lg bg-card-info px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">In database</dt>
<dd className="text-3xl text-gray-700 md:text-5xl">
{stats.totalDocuments} comics
</dd>
</div>

{/* Missing files */}
<div className="flex flex-col rounded-lg bg-card-missing px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Missing files</dt>
<dd className="text-3xl text-red-600 md:text-5xl">
{stats.comicsMissingFiles}
</dd>
</div>

{/* Disk space consumed */}
{stats.comicDirectorySize.totalSizeInGB != null && (
<div className="flex flex-col rounded-lg bg-card-info px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Size on disk</dt>
<dd className="text-3xl text-gray-700 md:text-5xl">
{stats.comicDirectorySize.totalSizeInGB.toFixed(2)} GB
<div className="mt-3">
<div className="flex flex-row gap-5">
<div className="flex flex-col rounded-lg bg-green-100 dark:bg-green-200 px-4 py-6 text-center">
<dt className="text-lg font-medium text-gray-500">Library size</dt>
<dd className="text-3xl text-green-600 md:text-5xl">
{props.stats.totalDocuments} files
</dd>
<dd>
<span className="text-2xl text-green-600">
{props.stats.comicDirectorySize &&
prettyBytes(props.stats.comicDirectorySize)}
</span>
</dd>
</div>
)}
{/* comicinfo and comicvine tagged issues */}
<div className="flex flex-col gap-4">
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics[0].issues) && (
<div className="flex flex-col h-fit rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center">
<span className="text-xl">
{props.stats.statistics[0].issues.length}
</span>{" "}
tagged with ComicVine
</div>
)}
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics[0].issuesWithComicInfoXML) && (
<div className="flex flex-col h-fit rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center">
<span className="text-xl">
{props.stats.statistics[0].issuesWithComicInfoXML.length}
</span>{" "}
<span className="tag is-warning has-text-weight-bold mr-2 ml-1">
with ComicInfo.xml
</span>
</div>
)}
</div>

{/* Tagging coverage */}
<div className="flex flex-col gap-4">
{issues && issues.length > 0 && (
<div className="flex flex-col h-fit rounded-lg bg-card-info px-4 py-3 text-center">
<span className="text-xl text-gray-700">{issues.length}</span>
tagged with ComicVine
</div>
)}
{issuesWithComicInfoXML && issuesWithComicInfoXML.length > 0 && (
<div className="flex flex-col h-fit rounded-lg bg-card-info px-4 py-3 text-center">
<span className="text-xl text-gray-700">{issuesWithComicInfoXML.length}</span>
with ComicInfo.xml
</div>
)}
<div className="">
{!isUndefined(props.stats.statistics) &&
!isEmpty(props.stats.statistics[0].fileTypes) &&
map(props.stats.statistics[0].fileTypes, (fileType, idx) => {
return (
<span
key={idx}
className="flex flex-col mb-4 h-fit text-xl rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3 text-center"
>
{fileType.data.length} {fileType._id}
</span>
);
})}
</div>

{/* file types */}
<div className="flex flex-col h-fit text-lg rounded-lg bg-green-100 dark:bg-green-200 px-4 py-3">
{/* publisher with most issues */}
{!isUndefined(props.stats.statistics) &&
!isEmpty(
props.stats.statistics[0].publisherWithMostComicsInLibrary[0],
) && (
<>
<span className="">
{
props.stats.statistics[0]
.publisherWithMostComicsInLibrary[0]._id
}
</span>
{" has the most issues "}
<span className="">
{
props.stats.statistics[0]
.publisherWithMostComicsInLibrary[0].count
}
</span>
</>
)}
</div>
</div>

{/* File-type breakdown */}
{fileTypes && fileTypes.length > 0 && (
<div>
{fileTypes.map((ft) => (
<span
key={ft.id}
className="flex flex-col mb-4 h-fit text-xl rounded-lg bg-card-info px-4 py-3 text-center text-gray-700"
>
{ft.data.length} {ft.id}
</span>
))}
</div>
)}

{/* Publisher with most issues */}
{topPublisher && (
<div className="flex flex-col h-fit text-lg rounded-lg bg-card-info px-4 py-3 text-gray-700">
<span>{topPublisher.id}</span>
{" has the most issues "}
<span>{topPublisher.count}</span>
</div>
)}
</div>
</div>
);
@@ -2,83 +2,73 @@ import React, { ReactElement, useState } from "react";
import { map } from "lodash";
import Card from "../shared/Carda";
import Header from "../shared/Header";
import { importToDB } from "../../actions/fileops.actions";
import ellipsize from "ellipsize";
import { Link } from "react-router-dom";

import axios from "axios";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import useEmblaCarousel from "embla-carousel-react";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import { Form } from "react-final-form";
import rateLimiter from "axios-rate-limit";
import { setupCache } from "axios-cache-interceptor";
import { useQuery } from "@tanstack/react-query";
import "keen-slider/keen-slider.min.css";
import { useKeenSlider } from "keen-slider/react";
import { COMICVINE_SERVICE_URI } from "../../constants/endpoints";
import { Field, Form } from "react-final-form";
import DatePickerDialog from "../shared/DatePicker";
import { format } from "date-fns";
import { LocgMetadata, useGetWeeklyPullListQuery } from "../../graphql/generated";
import type { PullListProps } from "../../types";

type PullListProps = {
issues: any;
};

const http = rateLimiter(axios.create(), {
maxRequests: 1,
perMilliseconds: 1000,
maxRPS: 1,
});
const cachedAxios = setupCache(axios);
export const PullList = (): ReactElement => {
const queryClient = useQueryClient();

// datepicker
const date = new Date();
const [inputValue, setInputValue] = useState<string>(
format(date, "yyyy/M/dd"),
format(date, "M-dd-yyyy"),
);

// embla carousel
const [emblaRef, emblaApi] = useEmblaCarousel({
loop: false,
align: "start",
containScroll: "trimSnaps",
slidesToScroll: 1,
});
// keen slider
const [sliderRef, instanceRef] = useKeenSlider(
{
loop: true,
slides: {
origin: "auto",
number: 15,
perView: 5,
spacing: 15,
},
slideChanged() {
console.log("slide changed");
},
},
[
// add plugins here
],
);

const {
data: pullListData,
data: pullList,
refetch,
isSuccess,
isLoading,
isError,
} = useGetWeeklyPullListQuery({
input: {
startDate: inputValue,
pageSize: 15,
currentPage: 1,
},
});

// Transform the data to match the old structure
const pullList = pullListData ? { data: pullListData.getWeeklyPullList } : undefined;

const { mutate: addToLibrary } = useMutation({
mutationFn: async ({ sourceName, metadata }: { sourceName: string; metadata: any }) => {
const comicBookMetadata = {
importType: "new",
payload: {
rawFileDetails: {
name: "",
},
importStatus: {
isImported: true,
tagged: false,
matchedResult: {
score: "0",
},
},
sourcedMetadata: metadata || null,
acquisition: { source: { wanted: true, name: sourceName } },
},
};

return await axios.request({
url: `${LIBRARY_SERVICE_BASE_URI}/rawImportToDb`,
method: "POST",
data: comicBookMetadata,
});
},
onSuccess: () => {
// Invalidate and refetch wanted comics queries
queryClient.invalidateQueries({ queryKey: ["wantedComics"] });
},
} = useQuery({
queryFn: async (): any =>
await cachedAxios(`${COMICVINE_SERVICE_URI}/getWeeklyPullList`, {
method: "get",
params: { startDate: inputValue, pageSize: "15", currentPage: "1" },
}),
queryKey: ["pullList", inputValue],
});
const addToLibrary = (sourceName: string, locgMetadata) =>
importToDB(sourceName, { locg: locgMetadata });

const next = () => {
// sliderRef.slickNext();
@@ -89,14 +79,14 @@ export const PullList = (): ReactElement => {

return (
<>
{/* TODO: Switch iconClassNames to Solar icon */}
<Header
<div className="content">
<Header
headerContent="Discover"
subHeaderContent={
<span className="text-md">
Pull List aggregated for the week from{" "}
<span className="underline">
<a href="https://leagueofcomicgeeks.com">
<a href="https://leagueofcomicgeeks.com/comics/new-comics">
League Of Comic Geeks
</a>
<i className="icon-[solar--arrow-right-up-outline] w-4 h-4" />
@@ -133,46 +123,43 @@ export const PullList = (): ReactElement => {
/>
</div>
</div>
<div className="mr-[calc(-1*(1rem+max(0px,(100vw-80rem)/2)))] sm:mr-[calc(-1*(1.5rem+max(0px,(100vw-80rem)/2)))] lg:mr-[calc(-1*(2rem+max(0px,(100vw-80rem)/2)))]">
{isSuccess && !isLoading && (
<div className="overflow-hidden" ref={emblaRef}>
<div className="flex">
{map(pullList?.data.result, (issue: LocgMetadata, idx: number) => {
return (
<div
key={idx}
className="flex-[0_0_200px] min-w-0 sm:flex-[0_0_220px] md:flex-[0_0_240px] lg:flex-[0_0_260px] xl:flex-[0_0_280px] pr-[15px]"
>
<Card
orientation={"vertical-2"}
imageUrl={issue.cover || undefined}
hasDetails
title={ellipsize(issue.name || 'Unknown', 25)}
>
<div className="px-1">
<span className="inline-flex mb-2 items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-1 rounded-md dark:text-slate-900 dark:bg-slate-400">
{issue.publisher || 'Unknown Publisher'}
</span>
<div className="flex flex-row justify-end">
<button
className="flex space-x-1 mb-2 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-2 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
onClick={() => addToLibrary({ sourceName: "locg", metadata: { locg: issue } })}
>
<i className="icon-[solar--add-square-bold-duotone] w-5 h-5 mr-2"></i>{" "}
Want
</button>
</div>
</div>
</Card>
</div>
);
})}
</div>
</div>
)}
{isLoading && <div>Loading...</div>}
{isError && <div>An error occurred while retrieving the pull list.</div>}
</div>

{isSuccess && !isLoading && (
<div ref={sliderRef} className="keen-slider flex flex-row">
{map(pullList?.data.result, (issue, idx) => {
return (
<div key={idx} className="keen-slider__slide">
<Card
orientation={"vertical-2"}
imageUrl={issue.cover}
hasDetails
title={ellipsize(issue.name, 25)}
>
<div className="px-1">
<span className="inline-flex mb-2 items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-1 rounded-md dark:text-slate-900 dark:bg-slate-400">
{issue.publisher}
</span>
<div className="flex flex-row justify-end">
<button
className="flex space-x-1 mb-2 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-2 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
onClick={() => addToLibrary("locg", issue)}
>
<i className="icon-[solar--add-square-bold-duotone] w-5 h-5 mr-2"></i>{" "}
Want
</button>
</div>
</div>
</Card>
</div>
);
})}
</div>
)}
{isLoading ? <div>Loading...</div> : null}
{isError ? (
<div>An error occurred while retrieving the pull list.</div>
) : null}
</>
);
};

@@ -4,83 +4,64 @@ import { Link } from "react-router-dom";
|
||||
import ellipsize from "ellipsize";
|
||||
import { isEmpty, isNil, isUndefined, map } from "lodash";
|
||||
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
|
||||
import { determineCoverFile } from "../../shared/utils/metadata.utils";
|
||||
import {
|
||||
determineCoverFile,
|
||||
determineExternalMetadata,
|
||||
} from "../../shared/utils/metadata.utils";
|
||||
import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";
|
||||
import Header from "../shared/Header";
|
||||
import useEmblaCarousel from "embla-carousel-react";
|
||||
import { GetRecentComicsQuery } from "../../graphql/generated";
|
||||
|
||||
type RecentlyImportedProps = {
|
||||
comics: GetRecentComicsQuery['comics']['comics'];
|
||||
comics: any;
|
||||
};
|
||||
|
||||
export const RecentlyImported = (
|
||||
{ comics }: RecentlyImportedProps,
|
||||
comics: RecentlyImportedProps,
|
||||
): ReactElement => {
|
||||
// embla carousel
|
||||
const [emblaRef, emblaApi] = useEmblaCarousel({
|
||||
loop: false,
|
||||
align: "start",
|
||||
containScroll: "trimSnaps",
|
||||
slidesToScroll: 1,
|
||||
});
|
||||
|
||||
return (
|
||||
<div>
|
||||
{/* TODO: Switch iconClassNames to Solar icon */}
|
||||
<Header
|
||||
headerContent="Recently Imported"
|
||||
subHeaderContent="Recent Library activity such as imports, tagging, etc."
|
||||
iconClassNames="fa-solid fa-binoculars mr-2"
|
||||
/>
|
||||
<div className="-mr-10 sm:-mr-17 lg:-mr-29 xl:-mr-36 2xl:-mr-42 mt-3">
|
||||
<div className="overflow-hidden" ref={emblaRef}>
|
||||
<div className="flex">
|
||||
{comics?.map((comic, idx) => {
|
||||
const {
|
||||
id,
|
||||
<div className="grid grid-cols-5 gap-6 mt-3">
|
||||
{comics?.comics.map(
|
||||
(
|
||||
{
|
||||
_id,
|
||||
rawFileDetails,
|
||||
sourcedMetadata,
|
||||
canonicalMetadata,
|
||||
sourcedMetadata: { comicvine, comicInfo, locg },
|
||||
inferredMetadata,
|
||||
importStatus,
|
||||
} = comic;
|
||||
|
||||
// Parse sourced metadata (GraphQL returns as strings)
|
||||
const comicvine = typeof sourcedMetadata?.comicvine === 'string'
|
||||
? JSON.parse(sourcedMetadata.comicvine)
|
||||
: sourcedMetadata?.comicvine;
|
||||
const comicInfo = typeof sourcedMetadata?.comicInfo === 'string'
|
||||
? JSON.parse(sourcedMetadata.comicInfo)
|
||||
: sourcedMetadata?.comicInfo;
|
||||
const locg = sourcedMetadata?.locg;
|
||||
|
||||
acquisition: {
|
||||
source: { name },
|
||||
},
|
||||
},
|
||||
idx,
|
||||
) => {
|
||||
const { issueName, url } = determineCoverFile({
|
||||
rawFileDetails,
|
||||
comicvine,
|
||||
comicInfo,
|
||||
locg,
|
||||
});
|
||||
const { issue, coverURL, icon } = determineExternalMetadata(name, {
|
||||
comicvine,
|
||||
comicInfo,
|
||||
locg,
|
||||
});
|
||||
const isComicVineMetadataAvailable =
|
||||
!isUndefined(comicvine) &&
|
||||
!isUndefined(comicvine.volumeInformation);
|
||||
const hasComicInfo = !isNil(comicInfo) && !isEmpty(comicInfo);
const isMissingFile = importStatus?.isRawFileMissing === true;
const cardState = isMissingFile
? "missing"
: (hasComicInfo || isComicVineMetadataAvailable) ? "scraped" : "imported";

return (
<div
<Card
orientation="vertical-2"
key={idx}
className="flex-[0_0_200px] min-w-0 sm:flex-[0_0_220px] md:flex-[0_0_240px] lg:flex-[0_0_260px] xl:flex-[0_0_280px] pr-[15px]"
imageUrl={`${LIBRARY_SERVICE_HOST}/${rawFileDetails.cover.filePath}`}
title={inferredMetadata.issue.name}
hasDetails
>
<Card
orientation="vertical-2"
imageUrl={url}
title={inferredMetadata?.issue?.name}
hasDetails
cardState={cardState}
>
<div>
<dd className="text-sm my-1 flex flex-row gap-1">
{/* Issue number */}
@@ -89,7 +70,7 @@ export const RecentlyImported = (
<i className="icon-[solar--hashtag-outline]"></i>
</span>
<span className="text-md text-slate-900">
{inferredMetadata?.issue?.number}
{inferredMetadata.issue.number}
</span>
</span>
{/* File extension */}
@@ -99,7 +80,7 @@ export const RecentlyImported = (
</span>

<span className="text-md text-slate-500 dark:text-slate-900">
{rawFileDetails?.extension}
{rawFileDetails.extension}
</span>
</span>
{/* Uncompressed status */}
@@ -117,28 +98,31 @@ export const RecentlyImported = (
<div className="sm:inline-flex sm:shrink-0 sm:items-center sm:gap-2">
{/* ComicInfo.xml presence */}
{!isNil(comicInfo) && !isEmpty(comicInfo) && (
<div className="mt-1">
<i className="h-7 w-7 icon-[solar--code-file-bold-duotone] text-gray-500 dark:text-white-300"></i>
<div mt-1>
<i className="h-7 w-7 icon-[solar--code-file-bold-duotone] text-yellow-500 dark:text-yellow-300"></i>
</div>
)}
{/* ComicVine metadata presence */}
{isComicVineMetadataAvailable && (
<span className="inline-block w-6 h-6 md:w-7 md:h-7 flex-shrink-0">
<span className="w-7 h-7">
<img
src="/src/client/assets/img/cvlogo.svg"
alt={"ComicVine metadata detected."}
className="w-full h-full object-contain"
/>
</span>
)}
</div>
{/* Raw file presence */}
{isNil(rawFileDetails) && (
<span className="h-6 w-5 sm:shrink-0 sm:items-center sm:gap-2">
<i className="icon-[solar--file-corrupted-outline] h-5 w-5" />
</span>
)}
</div>
</Card>
</div>
);
})}
</div>
</div>
},
)}
</div>
</div>
);

@@ -4,82 +4,59 @@ import ellipsize from "ellipsize";
import { Link, useNavigate } from "react-router-dom";
import Card from "../shared/Carda";
import Header from "../shared/Header";
import useEmblaCarousel from "embla-carousel-react";
import { GetVolumeGroupsQuery } from "../../graphql/generated";

type VolumeGroupsProps = {
volumeGroups?: GetVolumeGroupsQuery['getComicBookGroups'];
};

export const VolumeGroups = (props: VolumeGroupsProps): ReactElement | null => {
export const VolumeGroups = (props): ReactElement => {
// Till mongo gives us back the deduplicated results with the ObjectId
const deduplicatedGroups = unionBy(props.volumeGroups, "volumes.id");
if (!deduplicatedGroups || deduplicatedGroups.length === 0) return null;

const navigate = useNavigate();
const navigateToVolumes = (row: any) => {
const navigateToVolumes = (row) => {
navigate(`/volumes/all`);
};

// embla carousel
const [emblaRef, emblaApi] = useEmblaCarousel({
loop: false,
align: "start",
containScroll: "trimSnaps",
slidesToScroll: 1,
});

return (
<div>
{/* TODO: Switch iconClassNames to Solar icon */}
<section>
<Header
headerContent="Volumes"
subHeaderContent={<>Based on ComicVine Volume information</>}
subHeaderContent="Based on ComicVine Volume information"
iconClassNames="fa-solid fa-binoculars mr-2"
link={"/volumes"}
/>
<div className="-mr-10 sm:-mr-17 lg:-mr-29 xl:-mr-36 2xl:-mr-42 mt-3">
<div className="overflow-hidden" ref={emblaRef}>
<div className="flex">
{map(deduplicatedGroups, (data) => {
return (
<div
key={data.id}
className="flex-[0_0_200px] min-w-0 sm:flex-[0_0_220px] md:flex-[0_0_240px] lg:flex-[0_0_260px] xl:flex-[0_0_280px] pr-[15px]"
>
<Card
orientation="vertical-2"
imageUrl={data.volumes?.image?.small_url || undefined}
hasDetails
>
<div className="py-3">
<div className="text-sm">
<Link to={`/volume/details/${data.id}`}>
{ellipsize(data.volumes?.name || 'Unknown', 48)}
</Link>
</div>
{/* issue count */}
<span className="inline-flex mt-1 items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-0.5 rounded-md dark:text-slate-600 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--documents-minimalistic-bold-duotone] w-5 h-5"></i>
</span>
<div className="grid grid-cols-5 gap-6 mt-3">
{map(deduplicatedGroups, (data) => {
return (
<div className="max-w-sm mx-auto" key={data._id}>
<Card
orientation="vertical-2"
key={data._id}
imageUrl={data.volumes.image.small_url}
hasDetails
>
<div className="py-3">
<div className="text-sm">
<Link to={`/volume/details/${data._id}`}>
{ellipsize(data.volumes.name, 48)}
</Link>
</div>
{/* issue count */}
<span className="inline-flex mt-1 items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-0.5 rounded-md dark:text-slate-600 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--documents-minimalistic-bold-duotone] w-5 h-5"></i>
</span>

<span className="text-md text-slate-500 dark:text-slate-900">
{data.volumes?.count_of_issues || 0} issues
</span>
</span>
</div>
</Card>
<div className="w-11/12 h-2 mx-auto bg-slate-900 rounded-b opacity-75"></div>
<div className="w-10/12 h-2 mx-auto bg-slate-900 rounded-b opacity-50"></div>
<div className="w-9/12 h-2 mx-auto bg-slate-900 rounded-b opacity-25"></div>
<span className="text-md text-slate-500 dark:text-slate-900">
{data.volumes.count_of_issues} issues
</span>
</span>
</div>
);
})}
</div>
</div>
</Card>
<div className="w-11/12 h-2 mx-auto bg-slate-900 rounded-b opacity-75"></div>
<div className="w-10/12 h-2 mx-auto bg-slate-900 rounded-b opacity-50"></div>
<div className="w-9/12 h-2 mx-auto bg-slate-900 rounded-b opacity-25"></div>
</div>
);
})}
</div>
</div>
</section>
);
};

@@ -6,134 +6,101 @@ import { isEmpty, isNil, isUndefined, map } from "lodash";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import { determineCoverFile } from "../../shared/utils/metadata.utils";
import Header from "../shared/Header";
import useEmblaCarousel from "embla-carousel-react";
import { GetWantedComicsQuery } from "../../graphql/generated";

type WantedComicsListProps = {
comics?: GetWantedComicsQuery['getComicBooks']['docs'];
comics: any;
};

export const WantedComicsList = ({
comics,
}: WantedComicsListProps): ReactElement | null => {
if (!comics || comics.length === 0) return null;

}: WantedComicsListProps): ReactElement => {
const navigate = useNavigate();

// embla carousel
const [emblaRef, emblaApi] = useEmblaCarousel({
loop: false,
align: "start",
containScroll: "trimSnaps",
slidesToScroll: 1,
});

return (
<div>
{/* TODO: Switch iconClassNames to Solar icon */}
<>
<Header
headerContent="Wanted Comics"
subHeaderContent={<>Comics marked as wanted from various sources</>}
subHeaderContent="Comics marked as wanted from various sources"
iconClassNames="fa-solid fa-binoculars mr-2"
link={"/wanted"}
/>
<div className="-mr-10 sm:-mr-17 lg:-mr-29 xl:-mr-36 2xl:-mr-42 mt-3">
<div className="overflow-hidden" ref={emblaRef}>
<div className="flex">
{map(
comics,
(comic) => {
const {
id,
rawFileDetails,
sourcedMetadata,
} = comic;
<div className="grid grid-cols-5 gap-6 mt-3">
{map(
comics,
({
_id,
rawFileDetails,
sourcedMetadata: { comicvine, comicInfo, locg },
}) => {
const isComicBookMetadataAvailable =
!isUndefined(comicvine) &&
!isUndefined(comicvine.volumeInformation);
const consolidatedComicMetadata = {
rawFileDetails,
comicvine,
comicInfo,
locg,
};

// Parse sourced metadata (GraphQL returns as strings)
const comicvine = typeof sourcedMetadata?.comicvine === 'string'
? JSON.parse(sourcedMetadata.comicvine)
: sourcedMetadata?.comicvine;
const comicInfo = typeof sourcedMetadata?.comicInfo === 'string'
? JSON.parse(sourcedMetadata.comicInfo)
: sourcedMetadata?.comicInfo;
const locg = sourcedMetadata?.locg;
const { issueName, url } = determineCoverFile(
consolidatedComicMetadata,
);
const titleElement = (
<Link to={"/comic/details/" + _id}>
{ellipsize(issueName, 20)}
</Link>
);
return (
<Card
key={_id}
orientation={"vertical-2"}
imageUrl={url}
hasDetails
title={issueName ? titleElement : <span>No Name</span>}
>
<div className="pb-1">
{/* Issue type */}
{isComicBookMetadataAvailable &&
!isNil(
detectIssueTypes(comicvine.volumeInformation.description),
) ? (
<div className="my-2">
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-0.5 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--book-2-line-duotone] w-5 h-5"></i>
</span>

const isComicBookMetadataAvailable = !isUndefined(comicvine);
const consolidatedComicMetadata = {
rawFileDetails,
comicvine,
comicInfo,
locg,
};

const {
issueName,
url,
publisher = null,
} = determineCoverFile(consolidatedComicMetadata);
const titleElement = (
<Link to={"/comic/details/" + id}>
{ellipsize(issueName, 20)}
<p>{publisher}</p>
</Link>
);
return (
<div
key={id}
className="flex-[0_0_200px] min-w-0 sm:flex-[0_0_220px] md:flex-[0_0_240px] lg:flex-[0_0_260px] xl:flex-[0_0_280px] pr-[15px]"
>
<Card
orientation={"vertical-2"}
imageUrl={url}
hasDetails
title={issueName ? titleElement : <span>No Name</span>}
cardState="wanted"
>
<div className="pb-1">
<div className="flex flex-row gap-2">
{/* Issue type */}
{isComicBookMetadataAvailable &&
!isNil(detectIssueTypes(comicvine?.description)) ? (
<div className="my-2">
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2.5 py-0.5 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--book-2-line-duotone] w-5 h-5"></i>
</span>

<span className="text-md text-slate-500 dark:text-slate-900">
{
detectIssueTypes(comicvine?.description)
?.displayName
}
</span>
</span>
</div>
) : null}
{/* Wanted comics - info not available in current GraphQL query */}
</div>
{/* comicVine metadata presence */}
{isComicBookMetadataAvailable && (
<img
src="/src/client/assets/img/cvlogo.svg"
alt={"ComicVine metadata detected."}
className="inline-block w-6 h-6 md:w-7 md:h-7 flex-shrink-0 object-contain"
/>
)}
{!isEmpty(locg) && (
<img
src="/src/client/assets/img/locglogo.svg"
className="w-7 h-7"
/>
)}
</div>
</Card>
</div>
);
},
)}
</div>
</div>
</div>
</div>
<span className="text-md text-slate-500 dark:text-slate-900">
{
detectIssueTypes(
comicvine.volumeInformation.description,
).displayName
}
</span>
</span>
</div>
) : null}

{/* comicVine metadata presence */}
{isComicBookMetadataAvailable && (
<img
src="/src/client/assets/img/cvlogo.svg"
alt={"ComicVine metadata detected."}
className="w-7 h-7"
/>
)}
{!isEmpty(locg) && (
<img
src="/src/client/assets/img/locglogo.svg"
className="w-7 h-7"
/>
)}
</div>
</Card>
);
},
)}
</div>
</>
);
};

@@ -1,6 +1,9 @@
import * as React from "react";
import type { ZeroStateProps } from "../../types";

interface ZeroStateProps {
header: string;
message: string;
}
const ZeroState: React.FunctionComponent<ZeroStateProps> = (props) => {
return (
<article className="">

@@ -1,55 +1,69 @@
import React, { ReactElement, useEffect, useState } from "react";
import { isEmpty, isNil } from "lodash";
import { getTransfers } from "../../actions/airdcpp.actions";
import { isEmpty, isNil, isUndefined } from "lodash";
import { determineCoverFile } from "../../shared/utils/metadata.utils";
import MetadataPanel from "../shared/MetadataPanel";
import type { DownloadsProps } from "../../types";
import { useStore } from "../../store";

interface BundleData {
rawFileDetails?: Record<string, unknown>;
inferredMetadata?: Record<string, unknown>;
acquisition?: {
directconnect?: {
downloads?: Array<{
name: string;
size: number;
type: { str: string };
bundleId: string;
}>;
};
};
sourcedMetadata?: {
locg?: unknown;
comicvine?: unknown;
};
issueName?: string;
url?: string;
interface IDownloadsProps {
data: any;
}

export const Downloads = (_props: DownloadsProps): ReactElement => {
// Using Zustand store for socket management
const getSocket = useStore((state) => state.getSocket);

const [bundles, setBundles] = useState<BundleData[]>([]);
const [isLoading, setIsLoading] = useState(true);

// Initialize socket connection and load data
useEffect(() => {
const socket = getSocket();
if (socket) {
// Socket is connected, we could fetch transfers here
// For now, just set loading to false since we don't have direct access to Redux state
setIsLoading(false);
}
}, [getSocket]);
export const Downloads = (props: IDownloadsProps): ReactElement => {
// const airDCPPConfiguration = useContext(AirDCPPSocketContext);
const {
airDCPPState: { settings, socket },
} = airDCPPConfiguration;
// const dispatch = useDispatch();

return !isNil(bundles) && bundles.length > 0 ? (
<div className="container mx-auto px-4 sm:px-6 lg:px-8">
// const airDCPPTransfers = useSelector(
// (state: RootState) => state.airdcpp.transfers,
// );
// const issueBundles = useSelector(
// (state: RootState) => state.airdcpp.issue_bundles,
// );
const [bundles, setBundles] = useState([]);
// Make the call to get all transfers from AirDC++
useEffect(() => {
if (!isUndefined(socket) && !isEmpty(settings)) {
dispatch(
getTransfers(socket, {
username: `${settings.directConnect.client.host.username}`,
password: `${settings.directConnect.client.host.password}`,
}),
);
}
}, [socket]);

useEffect(() => {
if (!isUndefined(issueBundles)) {
const foo = issueBundles.data.map((bundle) => {
const {
rawFileDetails,
inferredMetadata,
acquisition: {
directconnect: { downloads },
},
sourcedMetadata: { locg, comicvine },
} = bundle;
const { issueName, url } = determineCoverFile({
rawFileDetails,
comicvine,
locg,
});
return { ...bundle, issueName, url };
});
setBundles(foo);
}
}, [issueBundles]);

return !isNil(bundles) ? (
<div className="container">
<section className="section">
<h1 className="title">Downloads</h1>
<div className="columns">
<div className="column is-half">
{bundles.map((bundle, idx) => {
console.log(bundle);
return (
<div key={idx}>
<MetadataPanel
@@ -74,16 +88,16 @@ export const Downloads = (_props: DownloadsProps): ReactElement => {
</tr>
</thead>
<tbody>
{bundle.acquisition?.directconnect?.downloads?.map(
(download, idx: number) => {
{bundle.acquisition.directconnect.downloads.map(
(bundle, idx) => {
return (
<tr key={idx}>
<td>{download.name}</td>
<td>{download.size}</td>
<td>{download.type.str}</td>
<td>{bundle.name}</td>
<td>{bundle.size}</td>
<td>{bundle.type.str}</td>
<td>
<span className="tag is-warning">
{download.bundleId}
{bundle.bundleId}
</span>
</td>
</tr>

@@ -1,28 +1,40 @@
import { debounce, isEmpty, map } from "lodash";
import React, { ReactElement, useCallback, useState } from "react";
import axios from "axios";
import { useDispatch, useSelector } from "react-redux";
import Card from "../shared/Carda";

import { searchIssue } from "../../actions/fileops.actions";
import MetadataPanel from "../shared/MetadataPanel";
import { SEARCH_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { GlobalSearchBarProps } from "../../types";

export const SearchBar = (data: GlobalSearchBarProps): ReactElement => {
const [searchResults, setSearchResults] = useState<Record<string, unknown>[]>([]);
interface ISearchBarProps {
data: any;
}

export const SearchBar = (data: ISearchBarProps): ReactElement => {
const dispatch = useDispatch();
const searchResults = useSelector(
(state: RootState) => state.fileOps.librarySearchResultsFormatted,
);

const performSearch = useCallback(
debounce(async (e) => {
const response = await axios({
url: `${SEARCH_SERVICE_BASE_URI}/searchIssue`,
method: "POST",
data: {
query: { volumeName: e.target.value },
pagination: { size: 25, from: 0 },
type: "volumeName",
trigger: "globalSearchBar",
},
});
setSearchResults(response.data?.hits ?? []);
debounce((e) => {
dispatch(
searchIssue(
{
query: {
volumeName: e.target.value,
},
},
{
pagination: {
size: 25,
from: 0,
},
type: "volumeName",
trigger: "globalSearchBar",
},
),
);
}, 500),
[data],
);
@@ -35,7 +47,6 @@ export const SearchBar = (data: GlobalSearchBarProps): ReactElement => {
onChange={(e) => performSearch(e)}
/>

{/* TODO: Switch to Solar icon */}
<span className="icon is-right mt-2">
<i className="fa-solid fa-magnifying-glass"></i>
</span>

@@ -1,677 +0,0 @@
|
||||
import React from 'react';
|
||||
import { render, screen, waitFor, fireEvent, act } from '@testing-library/react';
|
||||
import '@testing-library/jest-dom';
|
||||
import { QueryClient, QueryClientProvider } from '@tanstack/react-query';
|
||||
import axios from 'axios';
|
||||
import { Import } from './Import';
|
||||
|
||||
// Mock axios
|
||||
jest.mock('axios');
|
||||
const mockedAxios = axios as jest.MockedFunction<any>;
|
||||
|
||||
// Mock zustand store
|
||||
const mockGetSocket = jest.fn();
|
||||
const mockDisconnectSocket = jest.fn();
|
||||
const mockSetStatus = jest.fn();
|
||||
|
||||
jest.mock('../../store', () => ({
|
||||
useStore: jest.fn((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'drained',
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
),
|
||||
}));
|
||||
|
||||
// Mock socket.io-client
|
||||
const mockSocket = {
|
||||
on: jest.fn(),
|
||||
off: jest.fn(),
|
||||
emit: jest.fn(),
|
||||
};
|
||||
|
||||
mockGetSocket.mockReturnValue(mockSocket);
|
||||
|
||||
// Helper function to create a wrapper with QueryClient
|
||||
const createWrapper = () => {
|
||||
const queryClient = new QueryClient({
|
||||
defaultOptions: {
|
||||
queries: {
|
||||
retry: false,
|
||||
},
|
||||
},
|
||||
});
|
||||
return ({ children }: { children: React.ReactNode }) => (
|
||||
<QueryClientProvider client={queryClient}>{children}</QueryClientProvider>
|
||||
);
|
||||
};
|
||||
|
||||
describe('Import Component - Numerical Indices', () => {
|
||||
beforeEach(() => {
|
||||
jest.clearAllMocks();
|
||||
});
|
||||
|
||||
test('should display numerical indices in the Past Imports table', async () => {
|
||||
// Mock API response with 3 import sessions
|
||||
const mockData = [
|
||||
{
|
||||
sessionId: 'session-1',
|
||||
earliestTimestamp: '2024-01-01T10:00:00Z',
|
||||
completedJobs: 5,
|
||||
failedJobs: 0
|
||||
},
|
||||
{
|
||||
sessionId: 'session-2',
|
||||
earliestTimestamp: '2024-01-02T10:00:00Z',
|
||||
completedJobs: 3,
|
||||
failedJobs: 1
|
||||
},
|
||||
{
|
||||
sessionId: 'session-3',
|
||||
earliestTimestamp: '2024-01-03T10:00:00Z',
|
||||
completedJobs: 8,
|
||||
failedJobs: 2
|
||||
},
|
||||
];
|
||||
|
||||
(axios as any).mockResolvedValue({ data: mockData });
|
||||
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
// Wait for the "Past Imports" heading to appear
|
||||
await waitFor(() => {
|
||||
expect(screen.getByText('Past Imports')).toBeInTheDocument();
|
||||
});
|
||||
|
||||
// Verify that the "#" column header exists
|
||||
expect(screen.getByText('#')).toBeInTheDocument();
|
||||
|
||||
// Verify that numerical indices (1, 2, 3) are displayed in the first column of each row
|
||||
const rows = screen.getAllByRole('row');
|
||||
// Skip header row (index 0), check data rows
|
||||
expect(rows[1].querySelectorAll('td')[0]).toHaveTextContent('1');
|
||||
expect(rows[2].querySelectorAll('td')[0]).toHaveTextContent('2');
|
||||
expect(rows[3].querySelectorAll('td')[0]).toHaveTextContent('3');
|
||||
});
|
||||
|
||||
test('should display correct indices for larger datasets', async () => {
|
||||
// Mock API response with 10 import sessions
|
||||
const mockData = Array.from({ length: 10 }, (_, i) => ({
|
||||
sessionId: `session-${i + 1}`,
|
||||
earliestTimestamp: `2024-01-${String(i + 1).padStart(2, '0')}T10:00:00Z`,
|
||||
completedJobs: i + 1,
|
||||
failedJobs: 0,
|
||||
}));
|
||||
|
||||
(axios as any).mockResolvedValue({ data: mockData });
|
||||
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
expect(screen.getByText('Past Imports')).toBeInTheDocument();
|
||||
});
|
||||
|
||||
// Verify indices 1 through 10 are present in the first column
|
||||
const rows = screen.getAllByRole('row');
|
||||
// Skip header row (index 0)
|
||||
for (let i = 1; i <= 10; i++) {
|
||||
const row = rows[i];
|
||||
const cells = row.querySelectorAll('td');
|
||||
expect(cells[0]).toHaveTextContent(i.toString());
|
||||
}
|
||||
});
|
||||
});
|
||||
|
||||
describe('Import Component - Button Visibility', () => {
|
||||
beforeEach(() => {
|
||||
jest.clearAllMocks();
|
||||
(axios as any).mockResolvedValue({ data: [] });
|
||||
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
|
||||
});
|
||||
|
||||
test('should show Start Import button when queue status is drained', async () => {
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'drained',
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
expect(screen.getByText('Start Import')).toBeInTheDocument();
|
||||
});
|
||||
|
||||
// Verify Pause and Resume buttons are NOT visible
|
||||
expect(screen.queryByText('Pause')).not.toBeInTheDocument();
|
||||
expect(screen.queryByText('Resume')).not.toBeInTheDocument();
|
||||
});
|
||||
|
||||
test('should show Start Import button when queue status is undefined', async () => {
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: undefined,
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
expect(screen.getByText('Start Import')).toBeInTheDocument();
|
||||
});
|
||||
});
|
||||
|
||||
test('should hide Start Import button and show Pause button when queue is running', async () => {
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'running',
|
||||
successfulJobCount: 5,
|
||||
failedJobCount: 1,
|
||||
mostRecentImport: 'Comic #123',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
expect(screen.queryByText('Start Import')).not.toBeInTheDocument();
|
||||
expect(screen.getByText('Pause')).toBeInTheDocument();
|
||||
});
|
||||
|
||||
// Verify Import Activity section is visible
|
||||
expect(screen.getByText('Import Activity')).toBeInTheDocument();
|
||||
expect(screen.getByText('5')).toBeInTheDocument(); // successful count
|
||||
expect(screen.getByText('1')).toBeInTheDocument(); // failed count
|
||||
});
|
||||
|
||||
test('should hide Start Import button and show Resume button when queue is paused', async () => {
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'paused',
|
||||
successfulJobCount: 3,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: 'Comic #456',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
await waitFor(() => {
|
||||
expect(screen.queryByText('Start Import')).not.toBeInTheDocument();
|
||||
expect(screen.getByText('Resume')).toBeInTheDocument();
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('Import Component - SessionId and Socket Reconnection', () => {
|
||||
beforeEach(() => {
|
||||
jest.clearAllMocks();
|
||||
jest.useFakeTimers();
|
||||
localStorage.clear();
|
||||
(axios as any).mockResolvedValue({ data: [] });
|
||||
(axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
|
||||
});
|
||||
|
||||
afterEach(() => {
|
||||
jest.useRealTimers();
|
||||
});
|
||||
|
||||
test('should clear sessionId and reconnect socket when starting import after queue is drained', async () => {
|
||||
// Setup: Set old sessionId in localStorage
|
||||
localStorage.setItem('sessionId', 'old-session-id');
|
||||
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: 'drained',
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
// Click the "Start Import" button
|
||||
const startButton = await screen.findByText('Start Import');
|
||||
fireEvent.click(startButton);
|
||||
|
||||
// Verify sessionId is cleared immediately
|
||||
expect(localStorage.getItem('sessionId')).toBeNull();
|
||||
|
||||
// Verify disconnectSocket is called
|
||||
expect(mockDisconnectSocket).toHaveBeenCalledWith('/');
|
||||
|
||||
// Fast-forward 100ms
|
||||
await act(async () => {
|
||||
jest.advanceTimersByTime(100);
|
||||
});
|
||||
|
||||
// Verify getSocket is called after 100ms
|
||||
await waitFor(() => {
|
||||
expect(mockGetSocket).toHaveBeenCalledWith('/');
|
||||
});
|
||||
|
||||
// Fast-forward another 500ms
|
||||
await act(async () => {
|
||||
jest.advanceTimersByTime(500);
|
||||
});
|
||||
|
||||
// Verify initiateImport is called and status is set to running
|
||||
await waitFor(() => {
|
||||
expect(axios.request).toHaveBeenCalledWith({
|
||||
url: 'http://localhost:3000/api/library/newImport',
|
||||
method: 'POST',
|
||||
data: { sessionId: null },
|
||||
});
|
||||
expect(mockSetStatus).toHaveBeenCalledWith('running');
|
||||
});
|
||||
});
|
||||
|
||||
test('should NOT clear sessionId when starting import with undefined status', async () => {
|
||||
// Setup: Set existing sessionId in localStorage
|
||||
localStorage.setItem('sessionId', 'existing-session-id');
|
||||
|
||||
const { useStore } = require('../../store');
|
||||
useStore.mockImplementation((selector: any) =>
|
||||
selector({
|
||||
importJobQueue: {
|
||||
status: undefined,
|
||||
successfulJobCount: 0,
|
||||
failedJobCount: 0,
|
||||
mostRecentImport: '',
|
||||
setStatus: mockSetStatus,
|
||||
},
|
||||
getSocket: mockGetSocket,
|
||||
disconnectSocket: mockDisconnectSocket,
|
||||
})
|
||||
);
|
||||
|
||||
render(<Import path="/test" />, { wrapper: createWrapper() });
|
||||
|
||||
// Click the "Start Import" button
|
||||
const startButton = await screen.findByText('Start Import');
|
||||
fireEvent.click(startButton);
|
||||
|
||||
// Verify sessionId is NOT cleared
|
||||
expect(localStorage.getItem('sessionId')).toBe('existing-session-id');
|
||||
|
||||
// Verify disconnectSocket is NOT called
|
||||
expect(mockDisconnectSocket).not.toHaveBeenCalled();
|
||||
|
||||
// Verify status is set to running immediately
|
||||
expect(mockSetStatus).toHaveBeenCalledWith('running');
|
||||
|
||||
// Verify initiateImport is called immediately (no delay)
|
||||
await waitFor(() => {
|
||||
expect(axios.request).toHaveBeenCalledWith({
|
||||
url: 'http://localhost:3000/api/library/newImport',
|
||||
method: 'POST',
|
||||
data: { sessionId: 'existing-session-id' },
|
||||
});
|
||||
});
|
||||
});
|
||||
});
|
||||
|
||||
describe('Import Component - Real-time Updates', () => {
  beforeEach(() => {
    jest.clearAllMocks();
    (axios as any).mockResolvedValue({ data: [] });
    (axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
  });

  test('should refetch table data when LS_COVER_EXTRACTED event is received', async () => {
    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'running',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import path="/test" />, { wrapper: createWrapper() });

    // Wait for component to mount and socket listeners to be attached
    await waitFor(() => {
      expect(mockSocket.on).toHaveBeenCalledWith('LS_COVER_EXTRACTED', expect.any(Function));
    });

    // Get the event handler that was registered
    const coverExtractedHandler = mockSocket.on.mock.calls.find(
      (call) => call[0] === 'LS_COVER_EXTRACTED'
    )?.[1];

    // Clear previous axios calls
    (axios as any).mockClear();

    // Simulate the socket event
    if (coverExtractedHandler) {
      coverExtractedHandler();
    }

    // Verify that the API is called again (refetch)
    await waitFor(() => {
      expect(axios).toHaveBeenCalledWith(
        expect.objectContaining({
          method: 'GET',
          url: 'http://localhost:3000/api/jobqueue/getJobResultStatistics',
        })
      );
    });
  });

  test('should refetch table data when LS_IMPORT_QUEUE_DRAINED event is received', async () => {
    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'running',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import path="/test" />, { wrapper: createWrapper() });

    // Wait for component to mount and socket listeners to be attached
    await waitFor(() => {
      expect(mockSocket.on).toHaveBeenCalledWith('LS_IMPORT_QUEUE_DRAINED', expect.any(Function));
    });

    // Get the event handler that was registered
    const queueDrainedHandler = mockSocket.on.mock.calls.find(
      (call) => call[0] === 'LS_IMPORT_QUEUE_DRAINED'
    )?.[1];

    // Clear previous axios calls
    (axios as any).mockClear();

    // Simulate the socket event
    if (queueDrainedHandler) {
      queueDrainedHandler();
    }

    // Verify that the API is called again (refetch)
    await waitFor(() => {
      expect(axios).toHaveBeenCalledWith(
        expect.objectContaining({
          method: 'GET',
          url: 'http://localhost:3000/api/jobqueue/getJobResultStatistics',
        })
      );
    });
  });

  test('should cleanup socket listeners on unmount', async () => {
    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    const { unmount } = render(<Import path="/test" />, { wrapper: createWrapper() });

    // Wait for socket listeners to be attached
    await waitFor(() => {
      expect(mockSocket.on).toHaveBeenCalled();
    });

    // Unmount the component
    unmount();

    // Verify that socket listeners are removed
    expect(mockSocket.off).toHaveBeenCalledWith('LS_COVER_EXTRACTED', expect.any(Function));
    expect(mockSocket.off).toHaveBeenCalledWith('LS_IMPORT_QUEUE_DRAINED', expect.any(Function));
  });
});

describe('Import Component - Directory Status', () => {
  beforeEach(() => {
    jest.clearAllMocks();
    (axios as any).mockResolvedValue({ data: [] });
    (axios.request as jest.Mock) = jest.fn().mockResolvedValue({ data: {} });
    // Mock successful directory status by default
    (axios.get as jest.Mock) = jest.fn().mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });
  });

  test('should show warning banner when comics directory is missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
    });
    expect(screen.getByText('comics')).toBeInTheDocument();
  });

  test('should show warning banner when userdata directory is missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: false } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
    });
    expect(screen.getByText('userdata')).toBeInTheDocument();
  });

  test('should show warning banner when both directories are missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: false } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      expect(screen.getByText('Required Directories Missing')).toBeInTheDocument();
    });
    expect(screen.getByText('comics')).toBeInTheDocument();
    expect(screen.getByText('userdata')).toBeInTheDocument();
  });

  test('should disable import button when directories are missing', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: false }, userdata: { exists: true } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      const button = screen.getByRole('button', { name: /Force Re-Import/i });
      expect(button).toBeDisabled();
    });
  });

  test('should enable import button when all directories exist', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    await waitFor(() => {
      const button = screen.getByRole('button', { name: /Force Re-Import/i });
      expect(button).not.toBeDisabled();
    });
  });

  test('should not show warning banner when all directories exist', async () => {
    (axios.get as jest.Mock).mockResolvedValue({
      data: { comics: { exists: true }, userdata: { exists: true } }
    });

    const { useStore } = require('../../store');
    useStore.mockImplementation((selector: any) =>
      selector({
        importJobQueue: {
          status: 'drained',
          successfulJobCount: 0,
          failedJobCount: 0,
          mostRecentImport: '',
          setStatus: mockSetStatus,
        },
        getSocket: mockGetSocket,
        disconnectSocket: mockDisconnectSocket,
      })
    );

    render(<Import />, { wrapper: createWrapper() });

    // Wait for the component to finish loading
    await waitFor(() => {
      expect(screen.getByRole('button', { name: /Force Re-Import/i })).toBeInTheDocument();
    });

    // The warning banner should not be present
    expect(screen.queryByText('Required Directories Missing')).not.toBeInTheDocument();
  });
});

export {};
@@ -1,213 +1,131 @@
/**
 * @fileoverview Import page component for managing comic library imports.
 * Provides UI for starting imports, monitoring progress, viewing history,
 * and handling directory configuration issues.
 * @module components/Import/Import
 */

import { ReactElement, useEffect, useRef, useState } from "react";
import { isEmpty } from "lodash";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import React, { ReactElement, useCallback, useEffect } from "react";
import "react-loader-spinner/dist/loader/css/react-spinner-loader.css";
import { format } from "date-fns";
import Loader from "react-loader-spinner";
import { isEmpty, isNil, isUndefined } from "lodash";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import axios from "axios";
import { useGetJobResultStatisticsQuery } from "../../graphql/generated";
import { RealTimeImportStats } from "./RealTimeImportStats";
import { PastImportsTable } from "./PastImportsTable";
import { AlertBanner } from "../shared/AlertBanner";
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";
import { SETTINGS_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { DirectoryStatus, DirectoryIssue } from "./import.types";

interface IProps {
  matches?: unknown;
  fetchComicMetadata?: any;
  path: string;
  covers?: any;
}

/**
 * Import page component for managing comic library imports.
 * Component to facilitate the import of comics to the ThreeTwo library
 *
 * Features:
 * - Real-time import progress tracking via WebSocket
 * - Directory status validation before import
 * - Force re-import functionality for fixing indexing issues
 * - Past import history table
 * - Session management for import tracking
 *
 * @returns {ReactElement} The import page UI
 * @beta
 */
export const Import = (): ReactElement => {
  const [importError, setImportError] = useState<string | null>(null);

export const Import = (props: IProps): ReactElement => {
  const queryClient = useQueryClient();
  const { importJobQueue, getSocket, disconnectSocket } = useStore(
  const { importJobQueue, socketIOInstance } = useStore(
    useShallow((state) => ({
      importJobQueue: state.importJobQueue,
      getSocket: state.getSocket,
      disconnectSocket: state.disconnectSocket,
    }))
      socketIOInstance: state.socketIOInstance,
    })),
  );

  // Check if required directories exist
  const {
    data: directoryStatus,
    isLoading: isCheckingDirectories,
    isError: isDirectoryCheckError,
    error: directoryError,
  } = useQuery({
    queryKey: ["directoryStatus"],
    queryFn: async (): Promise<DirectoryStatus> => {
      const response = await axios.get(
        `${SETTINGS_SERVICE_BASE_URI}/getDirectoryStatus`
      );
      return response.data;
    },
    refetchOnWindowFocus: false,
    staleTime: 30000,
    retry: false,
  });

  // Use isValid for quick check, issues array for detailed display
  const directoryCheckFailed = isDirectoryCheckError;
  const hasAllDirectories = directoryCheckFailed
    ? false
    : (directoryStatus?.isValid ?? true);
  const directoryIssues = directoryStatus?.issues ?? [];

  // Force re-import mutation
  const { mutate: forceReImport, isPending: isForceReImporting } = useMutation({
    mutationFn: async () => {
      const sessionId = localStorage.getItem("sessionId") || "";
      return await axios.request({
        url: `http://localhost:3000/api/library/forceReImport`,
  const sessionId = localStorage.getItem("sessionId");
  const { mutate: initiateImport } = useMutation({
    mutationFn: async () =>
      await axios.request({
        url: `http://localhost:3000/api/library/newImport`,
        method: "POST",
        data: { sessionId },
      });
    },
    onSuccess: (response) => {
      console.log("Force re-import initiated:", response.data);
      importJobQueue.setStatus("running");
      setImportError(null);
    },
    onError: (error: any) => {
      console.error("Failed to start force re-import:", error);
      setImportError(
        error?.response?.data?.message ||
          error?.message ||
          "Failed to start force re-import. Please try again."
      );
    },
      }),
  });

  const { data, isLoading, refetch } = useGetJobResultStatisticsQuery();

  const importSession = useImportSessionStatus();
  const hasActiveSession = importSession.isActive;
  const wasComplete = useRef(false);

  // React to importSession.isComplete for state updates
  useEffect(() => {
    if (importSession.isComplete && !wasComplete.current) {
      wasComplete.current = true;
      setTimeout(() => {
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
        refetch();
      }, 1500);
      importJobQueue.setStatus("drained");
    } else if (!importSession.isComplete) {
      wasComplete.current = false;
    }
  }, [importSession.isComplete, refetch, importJobQueue, queryClient]);

  // Listen to socket events to update Past Imports table
  useEffect(() => {
    const socket = getSocket("/");

    const handleImportCompleted = () => {
      console.log(
        "[Import] IMPORT_SESSION_COMPLETED event - refreshing Past Imports"
      );
      setTimeout(() => {
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
      }, 1500);
    };

    const handleQueueDrained = () => {
      console.log(
        "[Import] LS_IMPORT_QUEUE_DRAINED event - refreshing Past Imports"
      );
      setTimeout(() => {
        queryClient.invalidateQueries({ queryKey: ["GetJobResultStatistics"] });
      }, 1500);
    };

    socket.on("IMPORT_SESSION_COMPLETED", handleImportCompleted);
    socket.on("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);

    return () => {
      socket.off("IMPORT_SESSION_COMPLETED", handleImportCompleted);
      socket.off("LS_IMPORT_QUEUE_DRAINED", handleQueueDrained);
    };
  }, [getSocket, queryClient]);
  const { data, isError, isLoading } = useQuery({
    queryKey: ["allImportJobResults"],
    queryFn: async () =>
      await axios({
        method: "GET",
        url: "http://localhost:3000/api/jobqueue/getJobResultStatistics",
      }),
  });

  const toggleQueue = (queueAction: string, queueStatus: string) => {
    socketIOInstance.emit(
      "call",
      "socket.setQueueStatus",
      {
        queueAction,
        queueStatus,
      },
      (data) => console.log(data),
    );
  };
  /**
   * Handles force re-import - re-imports all files to fix indexing issues
   * Method to render import job queue pause/resume controls on the UI
   *
   * @param status The `string` status (either `"pause"` or `"resume"`)
   * @returns ReactElement A `<button/>` that toggles queue status
   * @remarks Sets the global `importJobQueue.status` state upon toggling
   */
  const handleForceReImport = async () => {
    setImportError(null);

    if (!hasAllDirectories) {
      if (directoryCheckFailed) {
        setImportError(
          "Cannot start import: Failed to verify directory status. Please check that the backend service is running."
  const renderQueueControls = (status: string): ReactElement | null => {
    switch (status) {
      case "running":
        return (
          <div>
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-3 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
              onClick={() => {
                toggleQueue("pause", "paused");
                importJobQueue.setStatus("paused");
              }}
            >
              <span className="text-md">Pause</span>
              <span className="w-5 h-5">
                <i className="h-5 w-5 icon-[solar--pause-bold]"></i>
              </span>
            </button>
          </div>
        );
      } else {
        const issueDetails = directoryIssues
          .map((i) => `${i.directory}: ${i.issue}`)
          .join(", ");
        setImportError(
          `Cannot start import: ${issueDetails || "Required directories are missing"}. Please check your Docker volume configuration.`
      case "paused":
        return (
          <div>
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-3 py-1 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
              onClick={() => {
                toggleQueue("resume", "running");
                importJobQueue.setStatus("running");
              }}
            >
              <span className="text-md">Resume</span>
              <span className="w-5 h-5">
                <i className="h-5 w-5 icon-[solar--play-bold]"></i>
              </span>
            </button>
          </div>
        );
      }
      return;
    }

    if (hasActiveSession) {
      setImportError(
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`
      );
      return;
    }
      case "drained":
        return null;

    if (
      window.confirm(
        "This will re-import ALL files in your library folder, even those already imported. " +
          "This can help fix Elasticsearch indexing issues. Continue?"
      )
    ) {
      if (importJobQueue.status === "drained") {
        localStorage.removeItem("sessionId");
        disconnectSocket("/");
        setTimeout(() => {
          getSocket("/");
          setTimeout(() => {
            forceReImport();
          }, 500);
        }, 100);
      } else {
        forceReImport();
      }
      default:
        return null;
    }
  };

  const canStartImport =
    !hasActiveSession &&
    (importJobQueue.status === "drained" || importJobQueue.status === undefined);

  return (
    <div>
      <section>
        <header className="bg-slate-200 dark:bg-slate-500">
          <div className="mx-auto max-w-screen-xl px-4 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
          <div className="mx-auto max-w-screen-xl px-2 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
            <div className="sm:flex sm:items-center sm:justify-between">
              <div className="text-center sm:text-left">
                <h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
                  Import
                </h1>

                <p className="mt-1.5 text-sm text-gray-500 dark:text-white">
                  Import comics into the ThreeTwo library.
                </p>
@@ -236,90 +154,140 @@ export const Import = (): ReactElement => {
          </div>
        </article>

        {/* Import Statistics */}
        <div className="my-6 max-w-screen-lg">
          <RealTimeImportStats />
        <div className="my-4">
          {importJobQueue.status === "drained" ||
            (importJobQueue.status === undefined && (
              <button
                className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-5 py-3 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
                onClick={() => {
                  initiateImport();
                  importJobQueue.setStatus("running");
                }}
              >
                <span className="text-md">Start Import</span>
                <span className="w-6 h-6">
                  <i className="h-6 w-6 icon-[solar--file-left-bold-duotone]"></i>
                </span>
              </button>
            ))}
        </div>

        {/* Error Message */}
        {importError && (
          <div className="my-6 max-w-screen-lg">
            <AlertBanner
              severity="error"
              title="Import Error"
              onClose={() => setImportError(null)}
            >
              {importError}
            </AlertBanner>
          </div>
        )}
        {/* Activity */}
        {(importJobQueue.status === "running" ||
          importJobQueue.status === "paused") && (
          <>
            <span className="flex items-center my-5 max-w-screen-lg">
              <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
                Import Activity
              </span>
              <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
            </span>
            <div className="mt-5 flex flex-col gap-4 sm:mt-0 sm:flex-row sm:items-center">
              <dl className="grid grid-cols-2 gap-4 sm:grid-cols-2">
                {/* Successful import counts */}
                <div className="flex flex-col rounded-lg bg-green-100 dark:bg-green-200 px-4 py-6 text-center">
                  <dd className="text-3xl text-green-600 md:text-5xl">
                    {importJobQueue.successfulJobCount}
                  </dd>
                  <dt className="text-lg font-medium text-gray-500">
                    imported
                  </dt>
                </div>
                {/* Failed job counts */}
                <div className="flex flex-col rounded-lg bg-red-100 dark:bg-red-200 px-4 py-6 text-center">
                  <dd className="text-3xl text-red-600 md:text-5xl">
                    {importJobQueue.failedJobCount}
                  </dd>
                  <dt className="text-lg font-medium text-gray-500">
                    failed
                  </dt>
                </div>

        {/* Directory Check Error */}
        {!isCheckingDirectories && directoryCheckFailed && (
          <div className="my-6 max-w-screen-lg">
            <AlertBanner severity="error" title="Failed to Check Directory Status">
              <p>
                Unable to verify if required directories exist. Import
                functionality has been disabled.
              </p>
              <p className="mt-2">
                Error: {(directoryError as Error)?.message || "Unknown error"}
              </p>
            </AlertBanner>
          </div>
        )}

        {/* Directory Status Warning */}
        {!isCheckingDirectories &&
          !directoryCheckFailed &&
          directoryIssues.length > 0 && (
            <div className="my-6 max-w-screen-lg">
              <AlertBanner
                severity="warning"
                title="Directory Configuration Issues"
                iconClass="icon-[solar--folder-error-bold]"
              >
                <p>
                  The following issues were detected with your directory
                  configuration:
                </p>
                <DirectoryIssuesList issues={directoryIssues} />
                <p className="mt-2">
                  Please ensure these directories are mounted correctly in your
                  Docker configuration.
                </p>
              </AlertBanner>
                <div className="flex flex-col dark:text-slate-200 text-slate-400">
                  <dd>{renderQueueControls(importJobQueue.status)}</dd>
                </div>
              </dl>
            </div>
        )}

        {/* Force Re-Import Button */}
        {canStartImport && (
          <div className="my-6 max-w-screen-lg">
            <button
              className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-orange-400 dark:border-orange-200 bg-orange-200 px-5 py-3 text-gray-700 hover:bg-transparent hover:text-orange-600 focus:outline-none focus:ring active:text-orange-500 disabled:opacity-50 disabled:cursor-not-allowed"
              onClick={handleForceReImport}
              disabled={isForceReImporting || hasActiveSession || !hasAllDirectories}
              title={
                !hasAllDirectories
                  ? "Cannot import: Required directories are missing"
                  : "Re-import all files to fix Elasticsearch indexing issues"
              }
            >
              <span className="text-md font-medium">
                {isForceReImporting
                  ? "Starting Re-Import..."
                  : "Force Re-Import All Files"}
              <div className="flex">
                <span className="mt-2 dark:text-slate-200 text-slate-400">
                  Imported: <span>{importJobQueue.mostRecentImport}</span>
                </span>
              <span className="w-6 h-6">
                <i className="h-6 w-6 icon-[solar--refresh-bold-duotone]"></i>
              </span>
            </button>
          </div>
        </div>
      </>
        )}

        {/* Past Imports Table */}
        {!isLoading && !isEmpty(data?.getJobResultStatistics) && (
          <PastImportsTable data={data!.getJobResultStatistics as any} />
        {/* Past imports */}
        {!isLoading && !isEmpty(data?.data) && (
          <div className="max-w-screen-lg">
            <span className="flex items-center mt-6">
              <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
                Past Imports
              </span>
              <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
            </span>

            <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
              <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
                <thead className="ltr:text-left rtl:text-right">
                  <tr>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Time Started
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Session Id
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Imported
                    </th>
                    <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                      Failed
                    </th>
                  </tr>
                </thead>

                <tbody className="divide-y divide-gray-200">
                  {data?.data.map((jobResult, id) => {
                    return (
                      <tr key={id}>
                        <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                          {format(
                            new Date(jobResult.earliestTimestamp),
                            "EEEE, hh:mma, do LLLL Y",
                          )}
                        </td>
                        <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                          <span className="tag is-warning">
                            {jobResult.sessionId}
                          </span>
                        </td>
                        <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                          <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                            <span className="h-5 w-6">
                              <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                            </span>
                            <p className="whitespace-nowrap text-sm">
                              {jobResult.completedJobs}
                            </p>
                          </span>
                        </td>
                        <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                          <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                            <span className="h-5 w-6">
                              <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                            </span>

                            <p className="whitespace-nowrap text-sm">
                              {jobResult.failedJobs}
                            </p>
                          </span>
                        </td>
                      </tr>
                    );
                  })}
                </tbody>
              </table>
            </div>
          </div>
        )}
      </div>
    </section>
@@ -327,20 +295,4 @@ export const Import = (): ReactElement => {
  );
};

/**
 * Helper component to render directory issues list.
 */
const DirectoryIssuesList = ({ issues }: { issues: DirectoryIssue[] }): ReactElement => (
  <ul className="list-disc list-inside mt-2">
    {issues.map((item) => (
      <li key={item.directory}>
        <code className="bg-amber-100 dark:bg-amber-900/50 px-1 rounded">
          {item.directory}
        </code>
        <span className="ml-1">— {item.issue}</span>
      </li>
    ))}
  </ul>
);

export default Import;

@@ -1,103 +0,0 @@
/**
 * @fileoverview Table component displaying historical import sessions.
 * @module components/Import/PastImportsTable
 */

import { ReactElement } from "react";
import { format } from "date-fns";
import type { JobResultStatistics } from "./import.types";

/**
 * Props for the PastImportsTable component.
 */
export type PastImportsTableProps = {
  /** Array of job result statistics from past imports */
  data: JobResultStatistics[];
};

/**
 * Displays a table of past import sessions with their statistics.
 *
 * @param props - Component props
 * @returns Table element showing import history
 */
export const PastImportsTable = ({ data }: PastImportsTableProps): ReactElement => {
  return (
    <div className="max-w-screen-lg">
      <span className="flex items-center mt-6">
        <span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
          Past Imports
        </span>
        <span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
      </span>

      <div className="overflow-x-auto w-fit mt-4 rounded-lg border border-gray-200">
        <table className="min-w-full divide-y-2 divide-gray-200 dark:divide-gray-200 text-md">
          <thead className="ltr:text-left rtl:text-right">
            <tr>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                #
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Time Started
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Session Id
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Imported
              </th>
              <th className="whitespace-nowrap px-4 py-2 font-medium text-gray-900 dark:text-slate-200">
                Failed
              </th>
            </tr>
          </thead>

          <tbody className="divide-y divide-gray-200">
            {data.map((jobResult, index) => (
              <tr key={jobResult.sessionId || index}>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300 font-medium">
                  {index + 1}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  {jobResult.earliestTimestamp &&
                  !isNaN(parseInt(jobResult.earliestTimestamp))
                    ? format(
                        new Date(parseInt(jobResult.earliestTimestamp)),
                        "EEEE, hh:mma, do LLLL y"
                      )
                    : "N/A"}
                </td>
                <td className="whitespace-nowrap px-2 py-2 text-gray-700 dark:text-slate-300">
                  <span className="tag is-warning">{jobResult.sessionId}</span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-emerald-100 px-2 py-0.5 text-emerald-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--check-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.completedJobs}
                    </p>
                  </span>
                </td>
                <td className="whitespace-nowrap px-4 py-2 text-gray-700 dark:text-slate-300">
                  <span className="inline-flex items-center justify-center rounded-full bg-red-100 px-2 py-0.5 text-red-700">
                    <span className="h-5 w-6">
                      <i className="icon-[solar--close-circle-line-duotone] h-5 w-5"></i>
                    </span>
                    <p className="whitespace-nowrap text-sm">
                      {jobResult.failedJobs}
                    </p>
                  </span>
                </td>
              </tr>
            ))}
          </tbody>
        </table>
      </div>
    </div>
  );
};

export default PastImportsTable;
@@ -1,242 +0,0 @@
/**
 * @fileoverview Real-time import statistics component with live progress tracking.
 * Displays import statistics, progress bars, and file detection notifications
 * using WebSocket events for real-time updates.
 * @module components/Import/RealTimeImportStats
 */

import { ReactElement, useState } from "react";
import { Link } from "react-router-dom";
import {
  useGetImportStatisticsQuery,
  useGetWantedComicsQuery,
  useStartIncrementalImportMutation,
} from "../../graphql/generated";
import { useStore } from "../../store";
import { useShallow } from "zustand/react/shallow";
import { useImportSessionStatus } from "../../hooks/useImportSessionStatus";
import { useImportSocketEvents } from "../../hooks/useImportSocketEvents";
import { getComicDisplayLabel } from "../../shared/utils/formatting.utils";
import { AlertCard } from "../shared/AlertCard";
import { StatsCard } from "../shared/StatsCard";
import { ProgressBar } from "../shared/ProgressBar";

/**
 * Real-time import statistics component with card-based layout and progress tracking.
 *
 * This component manages three distinct states:
 * - **Pre-import (idle)**: Shows current file counts and "Start Import" button when new files exist
 * - **Importing (active)**: Displays real-time progress bar with completed/total counts
 * - **Post-import (complete)**: Shows final statistics including failed imports
 *
 * Additionally, it surfaces missing files detected by the file watcher, allowing users
 * to see which previously-imported files are no longer found on disk.
 *
 * @returns {ReactElement} The rendered import statistics component
 */
export const RealTimeImportStats = (): ReactElement => {
  const [importError, setImportError] = useState<string | null>(null);

  const { socketImport, detectedFile } = useImportSocketEvents();
  const importSession = useImportSessionStatus();

  const { getSocket, disconnectSocket, importJobQueue } = useStore(
    useShallow((state) => ({
      getSocket: state.getSocket,
      disconnectSocket: state.disconnectSocket,
      importJobQueue: state.importJobQueue,
    })),
  );

  const { data: importStats, isLoading, isError: isStatsError, error: statsError } = useGetImportStatisticsQuery(
    {},
    { refetchOnWindowFocus: false, refetchInterval: false },
  );

  const stats = importStats?.getImportStatistics?.stats;
  const missingCount = stats?.missingFiles ?? 0;

  const { data: missingComicsData } = useGetWantedComicsQuery(
    {
      paginationOptions: { limit: 3, page: 1 },
      predicate: { "importStatus.isRawFileMissing": true },
    },
    {
      refetchOnWindowFocus: false,
      refetchInterval: false,
      enabled: missingCount > 0,
    },
  );

  const missingDocs = missingComicsData?.getComicBooks?.docs ?? [];

  const { mutate: startIncrementalImport, isPending: isStartingImport } =
    useStartIncrementalImportMutation({
      onSuccess: (data) => {
        if (data.startIncrementalImport.success) {
          importJobQueue.setStatus("running");
          setImportError(null);
        }
      },
      onError: (error: any) => {
        setImportError(error?.message || "Failed to start import. Please try again.");
      },
    });

  const handleStartImport = async () => {
    setImportError(null);

    if (importSession.isActive) {
      setImportError(
        `Cannot start import: An import session "${importSession.sessionId}" is already active. Please wait for it to complete.`,
      );
      return;
    }

    if (importJobQueue.status === "drained") {
      localStorage.removeItem("sessionId");
      disconnectSocket("/");
      setTimeout(() => {
        getSocket("/");
        setTimeout(() => {
          const sessionId = localStorage.getItem("sessionId") || "";
          startIncrementalImport({ sessionId });
        }, 500);
      }, 100);
    } else {
      const sessionId = localStorage.getItem("sessionId") || "";
      startIncrementalImport({ sessionId });
    }
  };

  if (isLoading) {
    return <div className="text-gray-500 dark:text-gray-400">Loading...</div>;
  }

  if (isStatsError || !stats) {
    return (
      <AlertCard variant="error" title="Failed to Load Import Statistics">
        <p>Unable to retrieve import statistics from the server. Please check that the backend service is running.</p>
        {isStatsError && (
          <p className="mt-2">Error: {statsError instanceof Error ? statsError.message : "Unknown error"}</p>
        )}
      </AlertCard>
    );
  }

  const hasNewFiles = stats.newFiles > 0;
  const isFirstImport = stats.alreadyImported === 0;
  const buttonText = isFirstImport
    ? `Start Import (${stats.newFiles} files)`
    : `Start Incremental Import (${stats.newFiles} new files)`;

  const sessionStats = importSession.stats;
  const hasSessionStats = importSession.isActive && sessionStats !== null;
  const failedCount = hasSessionStats ? sessionStats!.filesFailed : 0;

  const showProgressBar = socketImport !== null;
  const showFailedCard = hasSessionStats && failedCount > 0;
  const showMissingCard = missingCount > 0;

  return (
    <div className="space-y-6">
      {importError && (
        <AlertCard variant="error" title="Import Error" onDismiss={() => setImportError(null)}>
          {importError}
        </AlertCard>
      )}

      {detectedFile && (
        <div className="rounded-lg border-l-4 border-blue-500 bg-blue-50 dark:bg-blue-900/20 p-3 flex items-center gap-3">
          <i className="h-5 w-5 text-blue-600 dark:text-blue-400 icon-[solar--document-add-bold-duotone] shrink-0"></i>
          <p className="text-sm text-blue-800 dark:text-blue-300 font-mono truncate">
            New file detected: {detectedFile}
          </p>
        </div>
      )}

      {hasNewFiles && !importSession.isActive && (
        <button
          onClick={handleStartImport}
          disabled={isStartingImport}
          className="flex items-center gap-2 rounded-lg bg-green-500 hover:bg-green-600 disabled:bg-gray-400 px-6 py-3 text-white font-medium transition-colors disabled:cursor-not-allowed"
        >
          <i className="h-6 w-6 icon-[solar--file-left-bold-duotone]"></i>
          <span>{isStartingImport ? "Starting Import..." : buttonText}</span>
        </button>
      )}

      {showProgressBar && (
        <ProgressBar
          current={socketImport!.completed}
          total={socketImport!.total}
          isActive={socketImport!.active}
          activeLabel={`Importing ${socketImport!.completed} / ${socketImport!.total}`}
          completeLabel={`${socketImport!.completed} / ${socketImport!.total} imported`}
        />
      )}

      <div className="grid grid-cols-2 sm:grid-cols-4 gap-4">
        <StatsCard
          value={stats.totalLocalFiles}
          label="in import folder"
          backgroundColor="#6b7280"
        />
        <StatsCard
          value={stats.alreadyImported}
          label={importSession.isActive ? "imported so far" : "imported in database"}
          backgroundColor="#d8dab2"
          valueColor="text-gray-800"
          labelColor="text-gray-700"
        />
        {showFailedCard && (
          <StatsCard
            value={failedCount}
            label="failed"
            backgroundColor="bg-red-500"
            labelColor="text-red-100"
          />
        )}
        {showMissingCard && (
          <StatsCard
            value={missingCount}
            label="missing"
            backgroundColor="bg-card-missing"
            valueColor="text-slate-700"
            labelColor="text-slate-800"
          />
        )}
      </div>

      {showMissingCard && (
        <AlertCard variant="warning" title={`${missingCount} ${missingCount === 1 ? "file" : "files"} missing`}>
          <p>These files were previously imported but can no longer be found on disk. Move them back to restore access.</p>
          {missingDocs.length > 0 && (
            <ul className="mt-2 space-y-1">
              {missingDocs.map((comic, i) => (
                <li key={i} className="text-xs truncate">
                  {getComicDisplayLabel(comic)} is missing
                </li>
              ))}
              {missingCount > 3 && (
                <li className="text-xs text-amber-600 dark:text-amber-500">
                  and {missingCount - 3} more.
                </li>
              )}
            </ul>
          )}
          <Link
            to="/library?filter=missingFiles"
            className="inline-flex items-center gap-1.5 mt-3 text-xs font-medium underline underline-offset-2 hover:opacity-70"
          >
            <i className="icon-[solar--file-corrupted-outline] w-4 h-4" />
            View Missing Files In Library
            <i className="icon-[solar--arrow-right-up-outline] w-3 h-3" />
          </Link>
        </AlertCard>
      )}
    </div>
  );
};

export default RealTimeImportStats;
@@ -1,43 +0,0 @@
/**
 * @fileoverview Type definitions for the Import module.
 * @module components/Import/import.types
 */

/**
 * Represents an issue with a configured directory.
 */
export type DirectoryIssue = {
  /** Path to the directory with issues */
  directory: string;
  /** Description of the issue */
  issue: string;
};

/**
 * Result of directory status check from the backend.
 */
export type DirectoryStatus = {
  /** Whether all required directories are accessible */
  isValid: boolean;
  /** List of specific issues found */
  issues: DirectoryIssue[];
};

/**
 * Statistics for a completed import job session.
 */
export type JobResultStatistics = {
  /** Unique session identifier */
  sessionId: string;
  /** Timestamp of the earliest job in the session (as string for GraphQL compatibility) */
  earliestTimestamp: string;
  /** Number of successfully completed jobs */
  completedJobs: number;
  /** Number of failed jobs */
  failedJobs: number;
};

/**
 * Status of the import job queue.
 */
export type ImportQueueStatus = "running" | "drained" | undefined;
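Aside from the diff itself, a minimal sketch of how a `JobResultStatistics` value might be consumed: `PastImportsTable` (above) guards `earliestTimestamp` with `parseInt`/`isNaN` before formatting, since the backend delivers it as a string for GraphQL compatibility. The helper name below is hypothetical, purely illustrative, and not part of this repository.

```typescript
// Illustrative only — not part of the diff. Mirrors the parseInt/isNaN
// guard PastImportsTable applies before formatting `earliestTimestamp`.
type JobResultStatistics = {
  sessionId: string;
  earliestTimestamp: string; // epoch millis as a string (GraphQL compatibility)
  completedJobs: number;
  failedJobs: number;
};

// Returns a Date when the timestamp parses as epoch millis, otherwise null
// (the table renders "N/A" in that case).
function parseEarliestTimestamp(stats: JobResultStatistics): Date | null {
  const millis = parseInt(stats.earliestTimestamp, 10);
  return Number.isNaN(millis) ? null : new Date(millis);
}
```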
@@ -1,5 +1,6 @@
import React, { useMemo, ReactElement, useState } from "react";
import { useNavigate, useSearchParams } from "react-router-dom";
import React, { useMemo, ReactElement, useState, useEffect } from "react";
import PropTypes from "prop-types";
import { useNavigate } from "react-router-dom";
import { isEmpty, isNil, isUndefined } from "lodash";
import MetadataPanel from "../shared/MetadataPanel";
import T2Table from "../shared/T2Table";
@@ -11,143 +12,101 @@ import {
  useQueryClient,
} from "@tanstack/react-query";
import axios from "axios";
import { format, parseISO } from "date-fns";
import { useGetWantedComicsQuery } from "../../graphql/generated";

import type { LibrarySearchQuery, FilterOption } from "../../types";

const FILTER_OPTIONS: { value: FilterOption; label: string }[] = [
  { value: "all", label: "All Comics" },
  { value: "missingFiles", label: "Missing Files" },
];
import { format, fromUnixTime, parseISO } from "date-fns";

/**
 * Library page component. Displays a paginated, searchable table of all comics
 * in the collection, with an optional filter for comics with missing raw files.
 * Component that tabulates the contents of the user's ThreeTwo Library.
 *
 * @component
 * @example
 * <Library />
 */
export const Library = (): ReactElement => {
  const [searchParams] = useSearchParams();
  const initialFilter = (searchParams.get("filter") as FilterOption) ?? "all";

  const [activeFilter, setActiveFilter] = useState<FilterOption>(initialFilter);
  const [searchQuery, setSearchQuery] = useState<LibrarySearchQuery>({
  // Default page state
  // offset: 0
  const [offset, setOffset] = useState(0);
  const [searchQuery, setSearchQuery] = useState({
    query: {},
    pagination: { size: 25, from: 0 },
    pagination: {
      size: 25,
      from: offset,
    },
    type: "all",
    trigger: "libraryPage",
  });

  const queryClient = useQueryClient();

  /** Fetches a page of issues from the search API. */
  const fetchIssues = async (q: LibrarySearchQuery) => {
    const { pagination, query, type } = q;
  /**
   * Method that queries the Elasticsearch index "comics" for issues specified by the query
   * @param searchQuery - A searchQuery object that contains the search term, type, and pagination params.
   */
  const fetchIssues = async (searchQuery) => {
    const { pagination, query, type } = searchQuery;
    return await axios({
      method: "POST",
      url: "http://localhost:3000/api/search/searchIssue",
      data: { query, pagination, type },
      data: {
        query,
        pagination,
        type,
      },
    });
  };

  const { data, isPlaceholderData } = useQuery({
    queryKey: ["comics", searchQuery],
    queryFn: () => fetchIssues(searchQuery),
    placeholderData: keepPreviousData,
    enabled: activeFilter === "all",
  });

  const { data: missingFilesData, isLoading: isMissingLoading } = useGetWantedComicsQuery(
    {
      paginationOptions: { limit: 25, page: 1 },
      predicate: { "importStatus.isRawFileMissing": true },
    },
    { enabled: activeFilter === "missingFiles" },
  );

  const { data: missingIdsData } = useGetWantedComicsQuery(
    {
      paginationOptions: { limit: 1000, page: 1 },
      predicate: { "importStatus.isRawFileMissing": true },
    },
    { enabled: activeFilter === "all" },
  );

  /** Set of comic IDs whose raw files are missing, used to highlight rows in the main table. */
  const missingIdSet = useMemo(
    () => new Set((missingIdsData?.getComicBooks?.docs ?? []).map((doc: any) => doc.id)),
    [missingIdsData],
  );

  const searchResults = data?.data;
  const navigate = useNavigate();

  const navigateToComicDetail = (row: any) => navigate(`/comic/details/${row.original._id}`);
  const navigateToMissingComicDetail = (row: any) => navigate(`/comic/details/${row.original.id}`);

  /** Triggers a search by volume name and resets pagination. */
  const searchIssues = (e: any) => {
  const searchIssues = (e) => {
    queryClient.invalidateQueries({ queryKey: ["comics"] });
    setSearchQuery({
      query: { volumeName: e.search },
      pagination: { size: 15, from: 0 },
      query: {
        volumeName: e.search,
      },
      pagination: {
        size: 15,
        from: 0,
      },
      type: "volumeName",
      trigger: "libraryPage",
    });
  };

  /** Advances to the next page of results. */
  const nextPage = (pageIndex: number, pageSize: number) => {
    if (!isPlaceholderData) {
      queryClient.invalidateQueries({ queryKey: ["comics"] });
      setSearchQuery({
        query: {},
        pagination: { size: 15, from: pageSize * pageIndex + 1 },
        type: "all",
        trigger: "libraryPage",
      });
    }
  const { data, isLoading, isError, isPlaceholderData } = useQuery({
    queryKey: ["comics", offset, searchQuery],
    queryFn: () => fetchIssues(searchQuery),
    placeholderData: keepPreviousData,
  });

  const searchResults = data?.data;
  // Programmatically navigate to comic detail
  const navigate = useNavigate();
  const navigateToComicDetail = (row) => {
    navigate(`/comic/details/${row.original._id}`);
  };

  /** Goes back to the previous page of results. */
  const previousPage = (pageIndex: number, pageSize: number) => {
    let from = 0;
    if (pageIndex === 2) {
      from = (pageIndex - 1) * pageSize + 2 - (pageSize + 2);
    } else {
      from = (pageIndex - 1) * pageSize + 2 - (pageSize + 1);
    }
    queryClient.invalidateQueries({ queryKey: ["comics"] });
    setSearchQuery({
      query: {},
      pagination: { size: 15, from },
      type: "all",
      trigger: "libraryPage",
    });
  };

  const ComicInfoXML = (value: any) =>
    value.data ? (
      <dl className="flex flex-col text-xs sm:text-md p-2 sm:p-3 ml-0 sm:ml-4 my-3 rounded-lg dark:bg-yellow-500 bg-yellow-300 w-full sm:w-max max-w-full">
        <span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs font-medium px-1.5 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 max-w-full overflow-hidden">
          <span className="pr-0.5 sm:pr-1 pt-1">
            <i className="icon-[solar--bookmark-square-minimalistic-bold-duotone] w-4 h-4 sm:w-5 sm:h-5"></i>
  const ComicInfoXML = (value) => {
    return value.data ? (
      <dl className="flex flex-col text-md p-3 ml-4 my-3 rounded-lg dark:bg-yellow-500 bg-yellow-300 w-max">
        {/* Series Name */}
        <span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
          <span className="pr-1 pt-1">
            <i className="icon-[solar--bookmark-square-minimalistic-bold-duotone] w-5 h-5"></i>
          </span>
          <span className="text-xs sm:text-md text-slate-900 dark:text-slate-900 truncate">
            {ellipsize(value.data.series[0], 25)}
          <span className="text-md text-slate-900 dark:text-slate-900">
            {ellipsize(value.data.series[0], 45)}
          </span>
        </span>
        <div className="flex flex-row flex-wrap mt-1 sm:mt-2 gap-1 sm:gap-2">
          <span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-1 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
            <span className="pr-0.5 sm:pr-1 pt-1">
              <i className="icon-[solar--notebook-minimalistic-bold-duotone] w-3.5 h-3.5 sm:w-5 sm:h-5"></i>
        <div className="flex flex-row mt-2 gap-2">
          {/* Pages */}
          <span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
            <span className="pr-1 pt-1">
              <i className="icon-[solar--notebook-minimalistic-bold-duotone] w-5 h-5"></i>
            </span>
            <span className="text-xs sm:text-md text-slate-900 dark:text-slate-900">
            <span className="text-md text-slate-900 dark:text-slate-900">
              Pages: {value.data.pagecount[0]}
            </span>
          </span>
          <span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-1 sm:px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
            <span className="pr-0.5 sm:pr-1 pt-1">
              <i className="icon-[solar--hashtag-outline] w-3 h-3 sm:w-3.5 sm:h-3.5"></i>
          {/* Issue number */}
          <span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
            <span className="pr-1 pt-1">
              <i className="icon-[solar--hashtag-outline] w-3.5 h-3.5"></i>
            </span>
            <span className="text-slate-900 dark:text-slate-900">
              {!isNil(value.data.number) && (
@@ -158,62 +117,30 @@ export const Library = (): ReactElement => {
        </div>
      </dl>
    ) : null;

  const missingFilesColumns = useMemo(
    () => [
      {
        header: "Missing Files",
        columns: [
          {
            header: "Status",
            id: "missingStatus",
            cell: () => (
              <div className="flex flex-col items-center gap-1.5 px-2 py-3 min-w-[80px]">
                <i className="icon-[solar--file-corrupted-outline] w-8 h-8 text-red-500"></i>
                <span className="inline-flex items-center rounded-md bg-red-100 px-2 py-1 text-xs font-semibold text-red-700 ring-1 ring-inset ring-red-600/20">
                  MISSING
                </span>
              </div>
            ),
          },
          {
            header: "Comic",
            id: "missingComic",
            minWidth: 250,
            accessorFn: (row: any) => row,
            cell: (info: any) => <MetadataPanel data={info.getValue()} />,
          },
        ],
      },
    ],
    [],
  );
  };

  const columns = useMemo(
    () => [
      {
        header: "Comic Metadata",
        footer: 1,
        columns: [
          {
            header: "File Details",
            id: "fileDetails",
            minWidth: 250,
            minWidth: 400,
            accessorKey: "_source",
            cell: (info: any) => {
              const source = info.getValue();
              return (
                <MetadataPanel
                  data={source}
                  isMissing={missingIdSet.has(info.row.original._id)}
                />
              );
            cell: (info) => {
              return <MetadataPanel data={info.getValue()} />;
            },
          },
          {
            header: "ComicInfo.xml",
            accessorKey: "_source.sourcedMetadata.comicInfo",
            cell: (info: any) =>
              !isEmpty(info.getValue()) ? <ComicInfoXML data={info.getValue()} /> : null,
            cell: (info) =>
              !isEmpty(info.getValue()) ? (
                <ComicInfoXML data={info.getValue()} />
              ) : null,
          },
        ],
      },
@@ -223,30 +150,36 @@ export const Library = (): ReactElement => {
      {
        header: "Date of Import",
        accessorKey: "_source.createdAt",
        cell: (info: any) =>
          !isNil(info.getValue()) ? (
        cell: (info) => {
          return !isNil(info.getValue()) ? (
            <div className="text-sm w-max ml-3 my-3 text-slate-600 dark:text-slate-900">
              <p>{format(parseISO(info.getValue()), "dd MMMM, yyyy")}</p>
              <p>{format(parseISO(info.getValue()), "dd MMMM, yyyy")} </p>
              {format(parseISO(info.getValue()), "h aaaa")}
            </div>
          ) : null,
          ) : null;
        },
      },
      {
        header: "Downloads",
        accessorKey: "_source.acquisition",
        cell: (info: any) => (
        cell: (info) => (
          <div className="flex flex-col gap-2 ml-3 my-3">
            <span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 whitespace-nowrap">
            <span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
              <span className="pr-1 pt-1">
                <i className="icon-[solar--folder-path-connect-bold-duotone] w-5 h-5"></i>
              </span>
              DC++: {info.getValue().directconnect.downloads.length}
              <span className="text-md text-slate-900 dark:text-slate-900">
                DC++: {info.getValue().directconnect.downloads.length}
              </span>
            </span>
            <span className="inline-flex items-center w-fit bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400 whitespace-nowrap">

            <span className="inline-flex w-fit items-center bg-slate-50 text-slate-800 text-xs px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
              <span className="pr-1 pt-1">
                <i className="icon-[solar--magnet-bold-duotone] w-5 h-5"></i>
              </span>
              Torrent: {info.getValue().torrent.length}
              <span className="text-md text-slate-900 dark:text-slate-900">
                Torrent: {info.getValue().torrent.length}
              </span>
            </span>
          </div>
        ),
@@ -254,99 +187,129 @@ export const Library = (): ReactElement => {
        ],
      },
    ],
    [missingIdSet],
    [],
  );

  const FilterDropdown = () => (
    <div className="relative">
      <select
        value={activeFilter}
        onChange={(e: React.ChangeEvent<HTMLSelectElement>) => setActiveFilter(e.target.value as FilterOption)}
        className="appearance-none h-full rounded-lg border border-gray-300 dark:border-slate-600 bg-white dark:bg-slate-700 pl-3 pr-8 py-1.5 text-sm text-gray-700 dark:text-slate-200 cursor-pointer focus:outline-none focus:ring-2 focus:ring-blue-500"
      >
        {FILTER_OPTIONS.map((opt) => (
          <option key={opt.value} value={opt.value}>
            {opt.label}
          </option>
        ))}
      </select>
      <i className="icon-[solar--alt-arrow-down-bold] absolute right-2 top-1/2 -translate-y-1/2 w-4 h-4 text-gray-500 dark:text-slate-400 pointer-events-none"></i>
    </div>
  );
  /**
   * Pagination control that fetches the next x (pageSize) items
   * based on the y (pageIndex) offset from the ThreeTwo Elasticsearch index
   * @param {number} pageIndex
   * @param {number} pageSize
   * @returns void
   *
   **/
  const nextPage = (pageIndex: number, pageSize: number) => {
    if (!isPlaceholderData) {
      queryClient.invalidateQueries({ queryKey: ["comics"] });
      setSearchQuery({
        query: {},
        pagination: {
          size: 15,
          from: pageSize * pageIndex + 1,
        },
        type: "all",
        trigger: "libraryPage",
      });
      // setOffset(pageSize * pageIndex + 1);
    }
  };

  const isMissingFilter = activeFilter === "missingFiles";
  /**
   * Pagination control that fetches the previous x (pageSize) items
   * based on the y (pageIndex) offset from the ThreeTwo Elasticsearch index
   * @param {number} pageIndex
   * @param {number} pageSize
   * @returns void
   **/
  const previousPage = (pageIndex: number, pageSize: number) => {
    let from = 0;
    if (pageIndex === 2) {
      from = (pageIndex - 1) * pageSize + 2 - (pageSize + 2);
    } else {
      from = (pageIndex - 1) * pageSize + 2 - (pageSize + 1);
    }
    queryClient.invalidateQueries({ queryKey: ["comics"] });
    setSearchQuery({
      query: {},
      pagination: {
        size: 15,
        from,
      },
      type: "all",
      trigger: "libraryPage",
    });
    // setOffset(from);
  };

  // ImportStatus.propTypes = {
  //   value: PropTypes.bool.isRequired,
  // };
  return (
    <section>
      <header className="bg-slate-200 dark:bg-slate-500">
        <div className="mx-auto max-w-screen-xl px-4 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
          <div className="sm:flex sm:items-center sm:justify-between">
            <div className="text-center sm:text-left">
              <h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
                Library
              </h1>
              <p className="mt-1.5 text-sm text-gray-500 dark:text-white">
                Browse your comic book collection.
              </p>
    <div>
      <section>
        <header className="bg-slate-200 dark:bg-slate-500">
          <div className="mx-auto max-w-screen-xl px-2 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
            <div className="sm:flex sm:items-center sm:justify-between">
              <div className="text-center sm:text-left">
                <h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
                  Library
                </h1>

                <p className="mt-1.5 text-sm text-gray-500 dark:text-white">
                  Browse your comic book collection.
                </p>
              </div>
            </div>
          </div>
        </div>
      </header>

      {isMissingFilter ? (
        <div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
          {isMissingLoading ? (
            <div className="text-gray-500 dark:text-gray-400">Loading...</div>
          ) : (
            <T2Table
              totalPages={missingFilesData?.getComicBooks?.totalDocs ?? 0}
              columns={missingFilesColumns}
              sourceData={missingFilesData?.getComicBooks?.docs ?? []}
              rowClickHandler={navigateToMissingComicDetail}
              getRowClassName={() => "bg-card-missing/40 hover:bg-card-missing/20"}
              paginationHandlers={{ nextPage: () => {}, previousPage: () => {} }}
            >
              <FilterDropdown />
            </T2Table>
          )}
        </div>
      ) : !isUndefined(searchResults?.hits) ? (
        <div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
          <T2Table
            totalPages={searchResults.hits.total.value}
            columns={columns}
            sourceData={searchResults?.hits.hits}
            rowClickHandler={navigateToComicDetail}
            getRowClassName={(row) =>
              missingIdSet.has(row.original._id)
                ? "bg-card-missing/40 hover:bg-card-missing/20"
                : "hover:bg-slate-100/30 dark:hover:bg-slate-700/20"
            }
            paginationHandlers={{ nextPage, previousPage }}
          >
            <div className="flex items-center gap-2">
              <FilterDropdown />
              <SearchBar searchHandler={(e: any) => searchIssues(e)} />
            </div>
          </T2Table>
        </div>
      ) : (
        <div className="mx-auto max-w-screen-xl mt-5">
          <article
            role="alert"
            className="rounded-lg max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
          >
        </header>
        {!isUndefined(searchResults?.hits) ? (
          <div>
            <div>
              <p>
                No comics were found in the library, Elasticsearch reports no indices. Try
                importing a few comics into the library and come back.
              </p>
              <T2Table
                totalPages={searchResults.hits.total.value}
                columns={columns}
                sourceData={searchResults?.hits.hits}
                rowClickHandler={navigateToComicDetail}
                paginationHandlers={{
                  nextPage,
                  previousPage,
                }}
              >
                <SearchBar searchHandler={(e) => searchIssues(e)} />
              </T2Table>
            </div>
          </article>
          <FilterDropdown />
          </div>
        )}
      </section>
      </div>
        ) : (
          <div className="mx-auto max-w-screen-xl mt-5">
            <article
              role="alert"
              className="rounded-lg max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
            >
              <div>
                <p>
                  No comics were found in the library, Elasticsearch reports no
                  indices. Try importing a few comics into the library and come
                  back.
                </p>
              </div>
            </article>
            <div className="block max-w-md p-6 bg-white border border-gray-200 my-3 rounded-lg shadow dark:bg-slate-400 dark:border-gray-700">
              <pre className="text-sm font-hasklig text-slate-700 dark:text-slate-700">
                {!isUndefined(searchResults?.data?.meta?.body) ? (
                  <p>
                    {JSON.stringify(
                      searchResults?.data.meta.body.error.root_cause,
                      null,
                      4,
                    )}
                  </p>
                ) : null}
              </pre>
            </div>
          </div>
        )}
      </section>
    </div>
  );
};
@@ -8,52 +8,23 @@ import {
import { useTable, usePagination } from "react-table";
import prettyBytes from "pretty-bytes";
import ellipsize from "ellipsize";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { useDispatch, useSelector } from "react-redux";
import { getComicBooks } from "../../actions/fileops.actions";
import { isNil, isEmpty, isUndefined } from "lodash";
import Masonry from "react-masonry-css";
import Card from "../shared/Carda";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import { Link } from "react-router-dom";
import { LIBRARY_SERVICE_HOST, LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import type { LibraryGridProps } from "../../types";
import { LIBRARY_SERVICE_HOST } from "../../constants/endpoints";

interface ComicDoc {
_id: string;
rawFileDetails?: {
cover?: {
filePath: string;
};
name?: string;
};
sourcedMetadata?: {
comicvine?: {
image?: {
small_url?: string;
};
name?: string;
volumeInformation?: {
description?: string;
};
};
};
}

export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
const { data: comicsData } = useQuery({
queryKey: ["recentComics"],
queryFn: async () =>
axios({
url: `${LIBRARY_SERVICE_BASE_URI}/getComicBooks`,
method: "POST",
data: {
paginationOptions: { size: 25, from: 0 },
predicate: {},
},
}),
});
const data: ComicDoc[] = comicsData?.data?.docs ?? [];
const pageTotal: number = comicsData?.data?.totalDocs ?? 0;
interface ILibraryGridProps {}
export const LibraryGrid = (libraryGridProps: ILibraryGridProps) => {
const data = useSelector(
(state: RootState) => state.fileOps.recentComics.docs,
);
const pageTotal = useSelector(
(state: RootState) => state.fileOps.recentComics.totalDocs,
);
const breakpointColumnsObj = {
default: 5,
1100: 4,
@@ -71,20 +42,20 @@ export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
className="my-masonry-grid"
columnClassName="my-masonry-grid_column"
>
{data.map(({ _id, rawFileDetails, sourcedMetadata }: ComicDoc) => {
{data.map(({ _id, rawFileDetails, sourcedMetadata }) => {
let imagePath = "";
let comicName = "";
if (rawFileDetails && !isEmpty(rawFileDetails.cover)) {
if (!isEmpty(rawFileDetails.cover)) {
const encodedFilePath = encodeURI(
`${LIBRARY_SERVICE_HOST}/${removeLeadingPeriod(
rawFileDetails.cover?.filePath || '',
rawFileDetails.cover.filePath,
)}`,
);
imagePath = escapePoundSymbol(encodedFilePath);
comicName = rawFileDetails.name || '';
} else if (!isNil(sourcedMetadata) && sourcedMetadata.comicvine?.image?.small_url) {
comicName = rawFileDetails.name;
} else if (!isNil(sourcedMetadata)) {
imagePath = sourcedMetadata.comicvine.image.small_url;
comicName = sourcedMetadata.comicvine?.name || '';
comicName = sourcedMetadata.comicvine.name;
}
const titleElement = (
<Link to={"/comic/details/" + _id}>
@@ -100,21 +71,17 @@ export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
title={comicName ? titleElement : null}
>
<div className="content is-flex is-flex-direction-row">
{sourcedMetadata && !isEmpty(sourcedMetadata.comicvine) && (
<span className="icon cv-icon is-small inline-block w-6 h-6 md:w-7 md:h-7 flex-shrink-0">
<img
src="/src/client/assets/img/cvlogo.svg"
className="w-full h-full object-contain"
/>
{!isEmpty(sourcedMetadata.comicvine) && (
<span className="icon cv-icon is-small">
<img src="/src/client/assets/img/cvlogo.svg" />
</span>
)}
{/* TODO: Switch to Solar icon */}
{isNil(rawFileDetails) && (
<span className="icon has-text-info">
<i className="fas fa-adjust" />
</span>
)}
{sourcedMetadata?.comicvine?.volumeInformation?.description &&
{!isUndefined(sourcedMetadata.comicvine.volumeInformation) &&
!isEmpty(
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation.description,
@@ -123,7 +90,8 @@ export const LibraryGrid = (libraryGridProps: LibraryGridProps) => {
<span className="tag is-warning ml-1">
{
detectIssueTypes(
sourcedMetadata.comicvine.volumeInformation.description || '',
sourcedMetadata.comicvine.volumeInformation
.description,
).displayName
}
</span>

@@ -3,11 +3,7 @@ import PropTypes from "prop-types";
import { Form, Field } from "react-final-form";
import { Link } from "react-router-dom";

interface SearchBarProps {
searchHandler: (values: Record<string, unknown>) => void;
}

export const SearchBar = (props: SearchBarProps): ReactElement => {
export const SearchBar = (props): ReactElement => {
const { searchHandler } = props;
return (
<Form

@@ -3,7 +3,10 @@ import PullList from "../PullList/PullList";
import { Volumes } from "../Volumes/Volumes";
import WantedComics from "../WantedComics/WantedComics";
import { Library } from "./Library";
import type { TabulatedContentContainerProps } from "../../types";

interface ITabulatedContentContainerProps {
category: string;
}
/**
 * Component to draw the contents of a category in a table.
 *
@@ -15,7 +18,7 @@ import type { TabulatedContentContainerProps } from "../../types";
 */

const TabulatedContentContainer = (
props: TabulatedContentContainerProps,
props: ITabulatedContentContainerProps,
): ReactElement => {
const { category } = props;
const renderTabulatedContent = () => {

@@ -1,27 +1,16 @@
import React, { ReactElement, useEffect, useMemo, useState } from "react";
import React, { ReactElement, useEffect, useMemo } from "react";
import T2Table from "../shared/T2Table";
import { getWeeklyPullList } from "../../actions/comicinfo.actions";
import Card from "../shared/Carda";
import ellipsize from "ellipsize";
import { isNil } from "lodash";
import type { CellContext } from "@tanstack/react-table";

interface PullListComic {
issue: {
cover: string;
name: string;
publisher: string;
description: string;
price: string;
pulls: number;
};
}

export const PullList = (): ReactElement => {
// Placeholder for pull list comics - would come from API/store
const [pullListComics, setPullListComics] = useState<PullListComic[] | null>(null);
// const pullListComics = useSelector(
// (state: RootState) => state.comicInfo.pullList,
// );

useEffect(() => {
// TODO: Implement pull list fetching
// dispatch(
// getWeeklyPullList({
// startDate: "2023-7-28",
@@ -42,7 +31,7 @@ export const PullList = (): ReactElement => {
id: "comicDetails",
minWidth: 450,
accessorKey: "issue",
cell: (row: CellContext<PullListComic, PullListComic["issue"]>) => {
cell: (row) => {
const item = row.getValue();
return (
<div className="columns">
@@ -110,7 +99,7 @@ export const PullList = (): ReactElement => {
[],
);
return (
<section className="container mx-auto px-4 sm:px-6 lg:px-8">
<section className="container">
<div className="section">
<div className="header-area">
<h1 className="title">Weekly Pull List</h1>

@@ -1,190 +1,97 @@
import React, { ReactElement, useState } from "react";
import { isNil, isEmpty, isUndefined } from "lodash";
import { detectIssueTypes } from "../../shared/utils/tradepaperback.utils";
import React, { useCallback, ReactElement, useState } from "react";
import { isNil, isEmpty } from "lodash";
import { IExtractedComicBookCoverFile, RootState } from "threetwo-ui-typings";

import { Form, Field } from "react-final-form";
import Card from "../shared/Carda";
import ellipsize from "ellipsize";
import { convert } from "html-to-text";
import { useTranslation } from "react-i18next";
import "../../shared/utils/i18n.util";
import PopoverButton from "../shared/PopoverButton";
import dayjs from "dayjs";
import { useMutation, useQueryClient } from "@tanstack/react-query";
import { useQuery, useQueryClient } from "@tanstack/react-query";
import {
COMICVINE_SERVICE_URI,
LIBRARY_SERVICE_BASE_URI,
} from "../../constants/endpoints";
import axios from "axios";
import type { SearchPageProps, ComicVineSearchResult } from "../../types";

interface ComicData {
id: number;
api_detail_url: string;
image: { small_url: string; thumb_url?: string };
cover_date?: string;
issue_number?: string;
name?: string;
description?: string;
volume?: { name: string; api_detail_url: string };
start_year?: string;
count_of_issues?: number;
publisher?: { name: string };
resource_type?: string;
}
interface ISearchProps {}

export const Search = ({}: SearchPageProps): ReactElement => {
const queryClient = useQueryClient();
export const Search = ({}: ISearchProps): ReactElement => {
const formData = {
search: "",
};
const [comicVineMetadata, setComicVineMetadata] = useState<{
sourceName?: string;
comicData?: ComicData;
}>({});
const [selectedResource, setSelectedResource] = useState("volume");
const { t } = useTranslation();
const handleResourceChange = (value: string) => {
setSelectedResource(value);
const queryClient = useQueryClient();
const [searchQuery, setSearchQuery] = useState("");
const [comicVineMetadata, setComicVineMetadata] = useState({});
const getCVSearchResults = (searchQuery) => {
setSearchQuery(searchQuery.search);
};

const {
mutate,
data: comicVineSearchResults,
isPending,
isLoading,
isSuccess,
} = useMutation({
mutationFn: async (data: { search: string; resource: string }) => {
const { search, resource } = data;
return await axios({
} = useQuery({
queryFn: async () =>
await axios({
url: `${COMICVINE_SERVICE_URI}/search`,
method: "GET",
params: {
api_key: "a5fa0663683df8145a85d694b5da4b87e1c92c69",
query: search,
query: searchQuery,
format: "json",
limit: "10",
offset: "0",
field_list:
"id,name,deck,api_detail_url,image,description,volume,cover_date,start_year,count_of_issues,publisher,issue_number",
resources: resource,
"id,name,deck,api_detail_url,image,description,volume,cover_date",
resources: "issue",
},
});
},
}),
queryKey: ["comicvineSearchResults", searchQuery],
enabled: !isNil(searchQuery),
});

// add to library
const { data: additionResult, mutate: addToWantedList } = useMutation({
mutationFn: async ({
source,
comicObject,
markEntireVolumeWanted,
resourceType,
}: {
source: string;
comicObject: any;
markEntireVolumeWanted: boolean;
resourceType: string;
}) => {
let volumeInformation = {};
let issues = [];
switch (resourceType) {
case "issue":
const { id, api_detail_url, image, cover_date, issue_number } =
comicObject;
// Add issue metadata
issues.push({
id,
url: api_detail_url,
image,
coverDate: cover_date,
issueNumber: issue_number,
});
// Get volume metadata from CV
const response = await axios({
url: `${COMICVINE_SERVICE_URI}/getVolumes`,
method: "POST",
data: {
volumeURI: comicObject.volume.api_detail_url,
fieldList:
"id,name,deck,api_detail_url,image,description,start_year,year,count_of_issues,publisher,first_issue,last_issue",
},
});
// set volume metadata key
volumeInformation = response.data?.results;
break;

case "volume":
const {
id: volumeId,
api_detail_url: apiUrl,
image: volumeImage,
name,
publisher,
} = comicObject;
volumeInformation = {
id: volumeId,
url: apiUrl,
image: volumeImage,
name,
publisher,
};
break;

default:
break;
}
// Add to wanted list
return await axios({
const { data: additionResult } = useQuery({
queryFn: async () =>
await axios({
url: `${LIBRARY_SERVICE_BASE_URI}/rawImportToDb`,
method: "POST",
data: {
importType: "new",
payload: {
rawFileDetails: {
name: "",
},
importStatus: {
isImported: false, // wanted, but not acquired yet.
isImported: true,
tagged: false,
matchedResult: {
score: "0",
},
},
wanted: {
source,
markEntireVolumeWanted,
issues,
volume: volumeInformation,
},
sourcedMetadata: { comicvine: volumeInformation },
sourcedMetadata:
{ comicvine: comicVineMetadata?.comicData } || null,
acquisition: { source: { wanted: true, name: "comicvine" } },
},
},
});
},
onSuccess: () => {
// Invalidate and refetch wanted comics queries
queryClient.invalidateQueries({ queryKey: ["wantedComics"] });
},
}),
queryKey: ["additionResult"],
enabled: !isNil(comicVineMetadata.comicData),
});

const addToLibrary = (sourceName: string, comicData: ComicData) =>
const addToLibrary = (sourceName: string, comicData) =>
setComicVineMetadata({ sourceName, comicData });

const createDescriptionMarkup = (html: string) => {
const createDescriptionMarkup = (html) => {
return { __html: html };
};

const onSubmit = async (values: { search: string }) => {
const formData = { ...values, resource: selectedResource };
try {
mutate(formData);
} catch (error) {
// Handle error
}
};

return (
<div>
<section>
<header className="bg-slate-200 dark:bg-slate-500">
<div className="mx-auto max-w-screen-xl px-4 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
<div className="px-2 py-2 sm:px-6 sm:py-8 lg:px-8 lg:py-4">
<div className="sm:flex sm:items-center sm:justify-between">
<div className="text-center sm:text-left">
<h1 className="text-2xl font-bold text-gray-900 dark:text-white sm:text-3xl">
@@ -200,7 +107,7 @@ export const Search = ({}: SearchPageProps): ReactElement => {
</header>
<div className="mx-auto max-w-screen-sm px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
<Form
onSubmit={onSubmit}
onSubmit={getCVSearchResults}
initialValues={{
...formData,
}}
@@ -232,73 +139,19 @@ export const Search = ({}: SearchPageProps): ReactElement => {
Search
</button>
</div>
{/* resource type selection: volume, issue etc. */}
<div className="flex flex-row gap-3 mt-4">
<Field name="resource" type="radio" value="volume">
{({ input: volumesInput, meta }) => (
<div className="w-fit rounded-xl">
<div>
<input
{...volumesInput}
type="radio"
id="volume"
checked={selectedResource === "volume"}
onChange={() => handleResourceChange("volume")}
className="peer hidden"
/>
<label
htmlFor="volume"
className="block cursor-pointer select-none rounded-xl p-2 text-center peer-checked:bg-blue-500 peer-checked:font-bold peer-checked:text-white"
>
Volumes
</label>
</div>
</div>
)}
</Field>

<Field name="resource" type="radio" value="issue">
{({ input: issuesInput, meta }) => (
<div className="w-fit rounded-xl">
<div>
<input
{...issuesInput}
type="radio"
id="issue"
checked={selectedResource === "issue"}
onChange={() => handleResourceChange("issue")}
className="peer hidden"
/>
<label
htmlFor="issue"
className="block cursor-pointer select-none rounded-xl p-2 text-center peer-checked:bg-blue-500 peer-checked:font-bold peer-checked:text-white"
>
Issues
</label>
</div>
</div>
)}
</Field>
</div>
</form>
)}
/>
</div>
{isPending && (
<div className="max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
Loading results...
</div>
)}
{!isEmpty(comicVineSearchResults?.data?.results) ? (
{isLoading && <>Loading kaka...</>}
{!isNil(comicVineSearchResults?.data.results) &&
!isEmpty(comicVineSearchResults?.data.results) ? (
<div className="mx-auto max-w-screen-xl px-4 py-4 sm:px-6 sm:py-8 lg:px-8">
{comicVineSearchResults?.data?.results?.map((result: ComicData) => {
return result.resource_type === "issue" ? (
<div
key={result.id}
className="mb-5 dark:bg-slate-400 p-4 rounded-lg"
>
{comicVineSearchResults.data.results.map((result) => {
return isSuccess ? (
<div key={result.id} className="mb-5">
<div className="flex flex-row">
<div className="mr-5 min-w-[80px] max-w-[13%]">
<div className="mr-5">
<Card
key={result.id}
orientation={"cover-only"}
@@ -306,160 +159,58 @@ export const Search = ({}: SearchPageProps): ReactElement => {
hasDetails={false}
/>
</div>
<div className="w-3/4">
<div className="column">
<div className="text-xl">
{!isEmpty(result.volume?.name) ? (
result.volume?.name
{!isEmpty(result.volume.name) ? (
result.volume.name
) : (
<span className="is-size-3">No Name</span>
)}
</div>
{result.cover_date && (
<p>
<span className="tag is-light">Cover date</span>
{dayjs(result.cover_date).format("MMM D, YYYY")}
</p>
)}
<div className="field is-grouped mt-1">
<div className="control">
<div className="tags has-addons">
<span className="tag is-light">Cover date</span>
<span className="tag is-info is-light">
{dayjs(result.cover_date).format("MMM D, YYYY")}
</span>
</div>
</div>

<p className="tag is-warning">{result.id}</p>
<div className="control">
<div className="tags has-addons">
<span className="tag is-warning">{result.id}</span>
</div>
</div>
</div>

<a href={result.api_detail_url}>
{result.api_detail_url}
</a>
<p className="text-sm">
{result.description ? ellipsize(
<p>
{ellipsize(
convert(result.description, {
baseElements: {
selectors: ["p", "div"],
},
}),
320,
) : ''}
)}
</p>
<div className="mt-2">
<PopoverButton
content={`This will add ${result.volume?.name || 'this issue'} to your wanted list.`}
clickHandler={() =>
addToWantedList({
source: "comicvine",
comicObject: result,
markEntireVolumeWanted: false,
resourceType: "issue",
})
}
/>
<button
className="flex space-x-1 sm:mt-0 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-2 py-2 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
onClick={() => addToLibrary("comicvine", result)}
>
<i className="icon-[solar--add-square-bold-duotone] w-6 h-6 mr-2"></i>{" "}
Mark as Wanted
</button>
</div>
</div>
</div>
</div>
) : (
result.resource_type === "volume" && (
<div
key={result.id}
className="mb-5 dark:bg-slate-500 p-4 rounded-lg"
>
<div className="flex flex-row">
<div className="mr-5 min-w-[80px] max-w-[13%]">
<Card
key={result.id}
orientation={"cover-only"}
imageUrl={result.image.small_url}
hasDetails={false}
/>
</div>
<div className="w-3/4">
<div className="text-xl">
{!isEmpty(result.name) ? (
result.name
) : (
<span className="text-xl">No Name</span>
)}
{result.start_year && <> ({result.start_year})</>}
</div>

<div className="flex flex-row gap-2">
{/* issue count */}
{result.count_of_issues && (
<div className="my-2">
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--documents-minimalistic-bold-duotone] w-5 h-5"></i>
</span>

<span className="text-md text-slate-500 dark:text-slate-900">
{t("issueWithCount", {
count: result.count_of_issues,
})}
</span>
</span>
</div>
)}
{/* type: TPB, one-shot, graphic novel etc. */}
{!isNil(result.description) &&
!isUndefined(result.description) && (
<>
{!isEmpty(
detectIssueTypes(result.description),
) && (
<div className="my-2">
<span className="inline-flex items-center bg-slate-50 text-slate-800 text-xs font-medium px-2 rounded-md dark:text-slate-900 dark:bg-slate-400">
<span className="pr-1 pt-1">
<i className="icon-[solar--book-2-line-duotone] w-5 h-5"></i>
</span>

<span className="text-md text-slate-500 dark:text-slate-900">
{
detectIssueTypes(result.description)
.displayName
}
</span>
</span>
</div>
)}
</>
)}
</div>

<span className="tag is-warning">{result.id}</span>
<p>
<a href={result.api_detail_url}>
{result.api_detail_url}
</a>
</p>

{/* description */}
<p className="text-sm">
{result.description ? ellipsize(
convert(result.description, {
baseElements: {
selectors: ["p", "div"],
},
}),
320,
) : ''}
</p>
<div className="mt-2">
<PopoverButton
content={`Adding this volume will add ${t(
"issueWithCount",
{
count: result.count_of_issues,
},
)} to your wanted list.`}
clickHandler={() =>
addToWantedList({
source: "comicvine",
comicObject: result,
markEntireVolumeWanted: true,
resourceType: "volume",
})
}
/>
</div>
</div>
</div>
</div>
)
<div>Loading</div>
);
})}
</div>

@@ -1,15 +1,16 @@
import React, { ReactElement } from "react";
import { useQuery } from "@tanstack/react-query";
import axios from "axios";
import { LIBRARY_SERVICE_BASE_URI } from "../../constants/endpoints";
import { useDispatch, useSelector } from "react-redux";
import { useEffect } from "react";
import { getServiceStatus } from "../../actions/fileops.actions";

export const ServiceStatuses = (): ReactElement => {
const { data } = useQuery({
queryKey: ["serviceStatus"],
queryFn: async () =>
axios({ url: `${LIBRARY_SERVICE_BASE_URI}/getHealthInformation`, method: "GET" }),
});
const serviceStatus = data?.data;
const serviceStatus = useSelector(
(state: RootState) => state.fileOps.libraryServiceStatus,
);
const dispatch = useDispatch();
useEffect(() => {
dispatch(getServiceStatus());
}, []);
return (
<div className="is-clearfix">
<div className="mt-4">

@@ -1,20 +1,27 @@
import React, { ReactElement, useState } from "react";
import React, { ReactElement, useEffect, useState, useContext } from "react";
import { Form, Field } from "react-final-form";
import { isEmpty, isNil, isUndefined } from "lodash";
import Select from "react-select";
import { useQuery, useMutation, useQueryClient } from "@tanstack/react-query";
import { useStore } from "../../../store";
import axios from "axios";
import { produce } from "immer";
import { AIRDCPP_SERVICE_BASE_URI } from "../../../constants/endpoints";

export const AirDCPPHubsForm = (): ReactElement => {
const queryClient = useQueryClient();
const {
airDCPPSocketInstance,
airDCPPClientConfiguration,
airDCPPSessionInformation,
} = useStore((state) => ({
airDCPPSocketInstance: state.airDCPPSocketInstance,
airDCPPClientConfiguration: state.airDCPPClientConfiguration,
airDCPPSessionInformation: state.airDCPPSessionInformation,
}));

const {
data: settings,
isLoading,
isError,
refetch,
} = useQuery({
queryKey: ["settings"],
queryFn: async () =>
@@ -22,37 +29,24 @@ export const AirDCPPHubsForm = (): ReactElement => {
url: "http://localhost:3000/api/settings/getAllSettings",
method: "GET",
}),
staleTime: Infinity,
});

/**
 * Get the hubs list from an AirDCPP Socket
 */
const { data: hubs } = useQuery({
queryKey: ["hubs"],
queryFn: async () =>
await axios({
url: `${AIRDCPP_SERVICE_BASE_URI}/getHubs`,
method: "POST",
data: {
host: settings?.data.directConnect?.client?.host,
},
}),
enabled: !isEmpty(settings?.data.directConnect?.client?.host),
queryFn: async () => await airDCPPSocketInstance.get(`hubs`),
});

interface HubOption {
value: string;
label: string;
}

let hubList: HubOption[] = [];
let hubList = {};
if (!isNil(hubs)) {
hubList = hubs?.data.map(({ hub_url, identity }: { hub_url: string; identity: { name: string } }) => ({
hubList = hubs.map(({ hub_url, identity }) => ({
value: hub_url,
label: identity.name,
}));
}

const mutation = useMutation({
mutationFn: async (values: Record<string, unknown>) =>
const { mutate } = useMutation({
mutationFn: async (values) =>
await axios({
url: `http://localhost:3000/api/settings/saveSettings`,
method: "POST",
@@ -62,123 +56,79 @@ export const AirDCPPHubsForm = (): ReactElement => {
settingsKey: "directConnect",
},
}),
onSuccess: (data) => {
queryClient.setQueryData(["settings"], (oldData: any) =>
produce(oldData, (draft: any) => {
draft.data.directConnect.client = {
...draft.data.directConnect.client,
...data.data.directConnect.client,
};
}),
);
onSuccess: () => {
queryClient.invalidateQueries({ queryKey: ["settings"] });
},
});
const validate = async () => {};

const validate = async (values: Record<string, unknown>) => {
const errors: Record<string, string> = {};
// Add any validation logic here if needed
return errors;
};

interface SelectAdapterProps {
input: {
value: unknown;
onChange: (value: unknown) => void;
onBlur: () => void;
onFocus: () => void;
name: string;
};
[key: string]: unknown;
}

const SelectAdapter = ({ input, ...rest }: SelectAdapterProps) => {
const SelectAdapter = ({ input, ...rest }) => {
return <Select {...input} {...rest} isClearable isMulti />;
};

if (isLoading) {
return <div>Loading...</div>;
}

if (isError) {
return <div>Error loading settings.</div>;
}

return (
<>
{!isEmpty(hubList) && !isUndefined(hubs) ? (
<Form
onSubmit={(values) => {
mutation.mutate(values);
}}
onSubmit={mutate}
validate={validate}
render={({ handleSubmit }) => (
<form onSubmit={handleSubmit} className="mt-10">
<h2 className="text-xl">Configure DC++ Hubs</h2>
<article
role="alert"
className="mt-4 rounded-lg max-w-screen-md border-s-4 border-blue-500 bg-blue-50 p-4 dark:border-s-4 dark:border-blue-600 dark:bg-blue-300 dark:text-slate-600"
>
<form onSubmit={handleSubmit}>
<div>
<h3 className="title">Hubs</h3>
<h6 className="subtitle has-text-grey-light">
Select the hubs you want to perform searches against. Your
selection in the dropdown <strong>will replace</strong> the
existing selection.
Select the hubs you want to perform searches against.
</h6>
</article>

<div className="field">
<label className="block py-1 mt-3">AirDC++ Host</label>
<Field
name="hubs"
component={SelectAdapter}
className="basic-multi-select"
placeholder="Select Hubs to Search Against"
options={hubList}
/>
</div>
<button
type="submit"
className="flex space-x-1 sm:mt-5 sm:flex-row sm:items-center rounded-lg border border-green-400 dark:border-green-200 bg-green-200 px-4 py-2 text-gray-500 hover:bg-transparent hover:text-green-600 focus:outline-none focus:ring active:text-indigo-500"
>
<div className="field">
<label className="label">AirDC++ Host</label>
<div className="control">
<Field
name="hubs"
component={SelectAdapter}
className="basic-multi-select"
placeholder="Select Hubs to Search Against"
options={hubList}
/>
</div>
</div>

<button type="submit" className="button is-primary">
Submit
</button>
</form>
)}
/>
) : (
<article
role="alert"
className="mt-4 rounded-lg max-w-screen-md border-s-4 border-yellow-500 bg-yellow-50 p-4 dark:border-s-4 dark:border-yellow-600 dark:bg-yellow-300 dark:text-slate-600"
>
<div className="message-body">
No configured hubs detected in AirDC++. <br />
Configure to a hub in AirDC++ and then select a default hub here.
</div>
</article>
<>
<article className="message">
<div className="message-body">
No configured hubs detected in AirDC++. <br />
Configure to a hub in AirDC++ and then select a default hub here.
</div>
</article>
</>
)}
{!isEmpty(settings?.data.directConnect?.client.hubs) ? (
<>
<div className="mt-4">
<article className="message is-warning">
<div className="message-body is-size-6 is-family-secondary"></div>
<div className="message-body is-size-6 is-family-secondary">
Your selection in the dropdown <strong>will replace</strong> the
existing selection.
</div>
</article>
</div>
<div>
<span className="flex items-center mt-10 mb-4">
<span className="text-xl text-slate-500 dark:text-slate-200 pr-5">
Default Hub for Searches
</span>
<span className="h-px flex-1 bg-slate-200 dark:bg-slate-400"></span>
</span>
<div className="block max-w-sm p-6 bg-white border border-gray-200 rounded-lg shadow dark:bg-slate-400 dark:border-gray-700">
{settings?.data.directConnect?.client.hubs.map(
({ value, label }: HubOption) => (
<div key={value}>
<div>{label}</div>
<span className="is-size-7">{value}</span>
</div>
),
)}
</div>
<div className="box mt-3">
<h6>Default Hub For Searches:</h6>
{settings?.data.directConnect?.client.hubs.map(
({ value, label }) => (
<div key={value}>
<div>{label}</div>
<span className="is-size-7">{value}</span>
</div>
),
)}
</div>
</>
) : null}

```diff
@@ -1,24 +1,7 @@
import React, { ReactElement } from "react";

interface AirDCPPSessionInfo {
  _id: string;
  system_info: {
    client_version: string;
    hostname: string;
    platform: string;
  };
  user: {
    username: string;
    active_sessions: number;
    permissions: string[];
  };
}

interface AirDCPPSettingsConfirmationProps {
  settings: AirDCPPSessionInfo;
}

export const AirDCPPSettingsConfirmation = ({ settings }: AirDCPPSettingsConfirmationProps): ReactElement => {
export const AirDCPPSettingsConfirmation = (settingsObject): ReactElement => {
  const { settings } = settingsObject;
  return (
    <div>
      <span className="flex items-center mt-10 mb-4">
```

```diff
@@ -1,73 +1,76 @@
import React, { useState, useEffect } from "react";
import React, { ReactElement, useCallback } from "react";
import { AirDCPPSettingsConfirmation } from "./AirDCPPSettingsConfirmation";
import { isUndefined, isEmpty } from "lodash";
import { ConnectionForm } from "../../shared/ConnectionForm/ConnectionForm";
import { useMutation, useQuery, useQueryClient } from "@tanstack/react-query";
import { initializeAirDCPPSocket, useStore } from "../../../store/index";
import { useShallow } from "zustand/react/shallow";
import { useMutation } from "@tanstack/react-query";
import axios from "axios";
import {
  AIRDCPP_SERVICE_BASE_URI,
  SETTINGS_SERVICE_BASE_URI,
} from "../../../constants/endpoints";

export const AirDCPPSettingsForm = () => {
  const [airDCPPSessionInformation, setAirDCPPSessionInformation] =
    useState(null);
  // Fetching all settings
  const { data: settingsData, isSuccess: settingsSuccess } = useQuery({
    queryKey: ["airDCPPSettings"],
    queryFn: () => axios.get(`${SETTINGS_SERVICE_BASE_URI}/getAllSettings`),
  });
export const AirDCPPSettingsForm = (): ReactElement => {
  // cherry-picking selectors for:
  // 1. initial values for the form
  // 2. If initial values are present, get the socket information to display
  const { setState } = useStore;
  const {
    airDCPPSocketConnected,
    airDCPPDisconnectionInfo,
    airDCPPSessionInformation,
    airDCPPClientConfiguration,
    airDCPPSocketInstance,
    setAirDCPPSocketInstance,
  } = useStore(
    useShallow((state) => ({
      airDCPPSocketConnected: state.airDCPPSocketConnected,
      airDCPPDisconnectionInfo: state.airDCPPDisconnectionInfo,
      airDCPPClientConfiguration: state.airDCPPClientConfiguration,
      airDCPPSessionInformation: state.airDCPPSessionInformation,
      airDCPPSocketInstance: state.airDCPPSocketInstance,
      setAirDCPPSocketInstance: state.setAirDCPPSocketInstance,
    })),
  );

interface HostConfig {
  hostname: string;
  port: string;
  username: string;
  password: string;
  protocol: string;
}

  // Fetch session information
  const fetchSessionInfo = (host: HostConfig) => {
    return axios.post(`${AIRDCPP_SERVICE_BASE_URI}/initialize`, { host });
  };

  // Use effect to trigger side effects on settings fetch success
  useEffect(() => {
    if (settingsSuccess && settingsData?.data?.directConnect?.client?.host) {
      const host = settingsData.data.directConnect.client.host;
      fetchSessionInfo(host).then((response) => {
        setAirDCPPSessionInformation(response.data);
      });
    }
  }, [settingsSuccess, settingsData]);

  // Handle setting update and subsequent AirDC++ initialization
  /**
   * Mutation to update settings and subsequently initialize
   * AirDC++ socket with those settings
   */
  const { mutate } = useMutation({
    mutationFn: (values: Record<string, unknown>) => {
      return axios.post("http://localhost:3000/api/settings/saveSettings", {
        settingsPayload: values,
        settingsKey: "directConnect",
    mutationFn: async (values) =>
      await axios({
        url: `http://localhost:3000/api/settings/saveSettings`,
        method: "POST",
        data: { settingsPayload: values, settingsKey: "directConnect" },
      }),
    onSuccess: async (values) => {
      const {
        data: {
          directConnect: {
            client: { host },
          },
        },
      } = values;
      const dcppSocketInstance = await initializeAirDCPPSocket(host);
      setState({
        airDCPPClientConfiguration: host,
        airDCPPSocketInstance: dcppSocketInstance,
      });
    },
    onSuccess: async (response) => {
      const host = response?.data?.directConnect?.client?.host;
      if (host) {
        const response = await fetchSessionInfo(host);
        setAirDCPPSessionInformation(response.data);
        // setState({ airDCPPClientConfiguration: host });
      }
    },
  });

  const deleteSettingsMutation = useMutation({
    mutationFn: () =>
      axios.post("http://localhost:3000/api/settings/saveSettings", {
  const deleteSettingsMutation = useMutation(
    async () =>
      await axios.post("http://localhost:3000/api/settings/saveSettings", {
        settingsPayload: {},
        settingsKey: "directConnect",
      }),
  });

  const initFormData = settingsData?.data?.directConnect?.client?.host ?? {};
  );

  // const removeSettings = useCallback(async () => {
  //   // airDCPPSettings.setSettings({});
  // }, []);
  //
  const initFormData = !isUndefined(airDCPPClientConfiguration)
    ? airDCPPClientConfiguration
    : {};
  return (
    <>
      <ConnectionForm
@@ -76,12 +79,13 @@ export const AirDCPPSettingsForm = () => {
        formHeading={"Configure AirDC++"}
      />

      {airDCPPSessionInformation && (
      {!isEmpty(airDCPPSessionInformation) ? (
        <AirDCPPSettingsConfirmation settings={airDCPPSessionInformation} />
      )}
      ) : null}

      {settingsData?.data && (
      {!isEmpty(airDCPPClientConfiguration) ? (
        <p className="control mt-4">
          <button
            className="button is-danger"
            onClick={() => deleteSettingsMutation.mutate()}
@@ -89,7 +93,7 @@ export const AirDCPPSettingsForm = () => {
            Delete
          </button>
        </p>
      )}
      ) : null}
    </>
  );
};
```

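The `AirDCPPSettingsForm` diff above moves session initialization out of a `useEffect` and into the save mutation's `onSuccess` handler: persist the settings first, then initialize an AirDC++ session only if a host was actually saved. A minimal sketch of that control flow, with the network calls replaced by injected stand-ins (`saveSettings` and `fetchSessionInfo` here are hypothetical parameters, not the component's real axios calls) so it runs without a backend:

```typescript
interface HostConfig {
  hostname: string;
  port: string;
  username: string;
  password: string;
  protocol: string;
}

interface SaveResponse {
  data: { directConnect?: { client?: { host?: HostConfig } } };
}

// Mirrors the mutation's onSuccess: save first, then only initialize
// a session when the response actually contains a saved host.
async function saveAndInitialize<T>(
  values: Record<string, unknown>,
  saveSettings: (v: Record<string, unknown>) => Promise<SaveResponse>,
  fetchSessionInfo: (host: HostConfig) => Promise<T>,
): Promise<T | null> {
  const response = await saveSettings(values);
  const host = response?.data?.directConnect?.client?.host;
  if (!host) return null; // nothing saved, nothing to initialize
  return fetchSessionInfo(host);
}
```

Keeping the fetch inside `onSuccess` rather than a `useEffect` ties session setup to a successful save, which avoids the extra render-then-effect round trip the old code relied on.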