Compare commits
6 commits: v0.9.9...24685a5371

| SHA1 |
|---|
| 24685a5371 |
| 49e9ee771f |
| 69dc7febe2 |
| 7b8fc1d99b |
| 7a7d570852 |
| 3eb4b3f09f |
122
README.md
@@ -1,8 +1,89 @@
 # Magent
 
-AI-powered request timeline for Jellyseerr + Arr stack.
+Magent is a friendly, AI-assisted request tracker for Jellyseerr + Arr services. It shows a clear timeline of where a request is stuck, explains what is happening in plain English, and offers safe actions to help fix issues.
 
-## Backend (FastAPI)
+## How it works
+
+1) Requests are pulled from Jellyseerr and stored locally.
+2) Magent joins each request to Sonarr/Radarr, Prowlarr, qBittorrent, and Jellyfin using TMDB/TVDB IDs and download hashes.
+3) A state engine normalizes noisy service statuses into a simple, user-friendly state.
+4) The UI renders a timeline and a central status box for each request.
+5) Optional AI triage summarizes the likely cause and the safest next steps.
+
+## Core features
+
+- Request search by title/year or request ID.
+- Recent requests list with posters and status.
+- Timeline view across Jellyseerr, Arr, Prowlarr, qBittorrent, and Jellyfin.
+- Central status box with a clear reason and next steps.
+- Safe action buttons (search, resume, re-add, etc.).
+- Admin settings for service URLs, API keys, profiles, and root folders.
+- Health status for each service in the pipeline.
+- Cache and sync controls (full sync, delta sync, scheduled syncs).
+- Local database for speed and audit history.
+- Users and access control (admin vs. user, block access).
+- Local account password changes via "My profile".
+- Docker-first deployment for easy hosting.
+
+## Quick start (Docker - primary)
+
+Docker is the recommended way to run Magent. It includes the backend and frontend with sane defaults.
+
+```bash
+docker compose up --build
+```
+
+Then open:
+
+- Frontend: http://localhost:3000
+- Backend: http://localhost:8000
+
+### Docker setup steps
+
+1) Create `.env` with your service URLs and API keys.
+2) Run `docker compose up --build`.
+3) Log in at http://localhost:3000.
+4) Visit Settings to confirm service health.
+
+### Docker environment variables (sample)
+
+```bash
+JELLYSEERR_URL="http://localhost:5055"
+JELLYSEERR_API_KEY="..."
+SONARR_URL="http://localhost:8989"
+SONARR_API_KEY="..."
+SONARR_QUALITY_PROFILE_ID="1"
+SONARR_ROOT_FOLDER="/tv"
+RADARR_URL="http://localhost:7878"
+RADARR_API_KEY="..."
+RADARR_QUALITY_PROFILE_ID="1"
+RADARR_ROOT_FOLDER="/movies"
+PROWLARR_URL="http://localhost:9696"
+PROWLARR_API_KEY="..."
+QBIT_URL="http://localhost:8080"
+QBIT_USERNAME="..."
+QBIT_PASSWORD="..."
+SQLITE_PATH="data/magent.db"
+JWT_SECRET="change-me"
+JWT_EXP_MINUTES="720"
+ADMIN_USERNAME="admin"
+ADMIN_PASSWORD="adminadmin"
+```
+
+## Screenshots
+
+Add screenshots here once available:
+
+- `docs/screenshots/home.png`
+- `docs/screenshots/request-timeline.png`
+- `docs/screenshots/settings.png`
+- `docs/screenshots/profile.png`
+
+## Local development (secondary)
+
+Use this only when you need to modify code locally.
+
+### Backend (FastAPI)
 
 ```bash
 cd backend
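The pipeline described in "How it works" hinges on step 3, the state engine. A minimal sketch of such a normalization step, with hypothetical status names (the real mapping lives in the backend):

```python
# Hypothetical sketch of "How it works" step 3: collapse raw, service-specific
# statuses into one simple, user-friendly state. The table below is
# illustrative, not Magent's actual mapping.
RAW_TO_SIMPLE = {
    "downloading": "downloading",  # qBittorrent: actively downloading
    "stalledDL": "stuck",          # qBittorrent: no peers or seeds
    "pausedDL": "paused",          # qBittorrent: paused by the user
    "queued": "waiting",           # Arr queue: waiting for a slot
    "completed": "importing",      # Arr queue: downloaded, being imported
}

def normalize(raw_status: str) -> str:
    """Map a raw service status to a simple state; unmapped statuses stay visible."""
    return RAW_TO_SIMPLE.get(raw_status, "unknown")

print(normalize("stalledDL"))  # -> stuck
```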
@@ -37,7 +118,7 @@ $env:ADMIN_USERNAME="admin"
 $env:ADMIN_PASSWORD="adminadmin"
 ```
 
-## Frontend (Next.js)
+### Frontend (Next.js)
 
 ```bash
 cd frontend
@@ -49,20 +130,11 @@ Open http://localhost:3000
 
 Admin panel: http://localhost:3000/admin
 
-Login uses the admin credentials above (or any other user you create in SQLite).
+Login uses the admin credentials above (or any other local user you create in SQLite).
 
-## Docker (Testing)
-
-```bash
-docker compose up --build
-```
-
-Backend: http://localhost:8000
-Frontend: http://localhost:3000
-
 ## Public Hosting Notes
 
-The frontend now proxies `/api/*` to the backend container. Set:
+The frontend proxies `/api/*` to the backend container. Set:
 
 - `NEXT_PUBLIC_API_BASE=/api` (browser uses same-origin)
 - `BACKEND_INTERNAL_URL=http://backend:8000` (container-to-container)
@@ -73,3 +145,25 @@ If you prefer the browser to call the backend directly, set `NEXT_PUBLIC_API_BAS
 
 - `GET /requests/{id}/history?limit=10` recent snapshots
 - `GET /requests/{id}/actions?limit=10` recent action logs
+
+## Troubleshooting
+
+### Login fails
+
+- Make sure `ADMIN_USERNAME` and `ADMIN_PASSWORD` are set in `.env`.
+- Confirm the backend is reachable at `http://localhost:8000/health` (or check the container logs).
+
+### Services show as down
+
+- Check the URLs and API keys in Settings.
+- Verify the containers can reach each service (network/DNS).
+
+### No recent requests
+
+- Confirm the Jellyseerr credentials in Settings.
+- Run a full sync from Settings -> Requests.
+
+### Docker images not updating
+
+- Run `docker compose up --build` again.
+- If needed, run `docker compose down` first, then rebuild.
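The history/actions endpoints listed above take a `limit` query parameter; a sketch of building such a request URL (the base URL is the README's local default, and the exact auth scheme is not shown here):

```python
import urllib.parse

def history_url(base: str, request_id: int, limit: int = 10) -> str:
    """Build the URL for GET /requests/{id}/history?limit=N from the README."""
    query = urllib.parse.urlencode({"limit": limit})
    return f"{base}/requests/{request_id}/history?{query}"

print(history_url("http://localhost:8000", 42, limit=5))
# -> http://localhost:8000/requests/42/history?limit=5
```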
@@ -56,6 +56,9 @@ class QBittorrentClient(ApiClient):
     async def get_torrents_by_hashes(self, hashes: str) -> Optional[Any]:
         return await self._get("/api/v2/torrents/info", params={"hashes": hashes})
 
+    async def get_torrents_by_category(self, category: str) -> Optional[Any]:
+        return await self._get("/api/v2/torrents/info", params={"category": category})
+
     async def get_app_version(self) -> Optional[Any]:
         return await self._get_text("/api/v2/app/version")
 
@@ -67,3 +70,9 @@ class QBittorrentClient(ApiClient):
                 await self._post_form("/api/v2/torrents/start", data={"hashes": hashes})
                 return
             raise
+
+    async def add_torrent_url(self, url: str, category: Optional[str] = None) -> None:
+        data: Dict[str, Any] = {"urls": url}
+        if category:
+            data["category"] = category
+        await self._post_form("/api/v2/torrents/add", data=data)
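Both lookup methods above hit qBittorrent's `/api/v2/torrents/info` endpoint, just with different filters; qBittorrent expects multiple hashes joined with `|`. A sketch of the parameter shapes:

```python
from typing import Dict, List, Optional

def torrents_info_params(hashes: Optional[List[str]] = None,
                         category: Optional[str] = None) -> Dict[str, str]:
    """Build query params for /api/v2/torrents/info: filter by hashes or category."""
    if hashes:
        return {"hashes": "|".join(hashes)}  # qBittorrent's hash separator is '|'
    if category:
        return {"category": category}
    return {}  # no filter: the endpoint returns all torrents

print(torrents_info_params(hashes=["abc", "def"]))  # -> {'hashes': 'abc|def'}
```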
@@ -458,7 +458,7 @@ def get_request_cache_by_id(request_id: int) -> Optional[Dict[str, Any]]:
     with _connect() as conn:
         row = conn.execute(
             """
-            SELECT request_id, updated_at
+            SELECT request_id, updated_at, title
             FROM requests_cache
             WHERE request_id = ?
             """,
@@ -468,7 +468,7 @@ def get_request_cache_by_id(request_id: int) -> Optional[Dict[str, Any]]:
         logger.debug("requests_cache miss: request_id=%s", request_id)
         return None
     logger.debug("requests_cache hit: request_id=%s updated_at=%s", row[0], row[1])
-    return {"request_id": row[0], "updated_at": row[1]}
+    return {"request_id": row[0], "updated_at": row[1], "title": row[2]}
 
 
 def get_request_cache_payload(request_id: int) -> Optional[Dict[str, Any]]:
@@ -545,7 +545,7 @@ def get_request_cache_overview(limit: int = 50) -> list[Dict[str, Any]]:
     with _connect() as conn:
         rows = conn.execute(
             """
-            SELECT request_id, media_id, media_type, status, title, year, requested_by, created_at, updated_at
+            SELECT request_id, media_id, media_type, status, title, year, requested_by, created_at, updated_at, payload_json
             FROM requests_cache
             ORDER BY updated_at DESC, request_id DESC
             LIMIT ?
@@ -554,13 +554,27 @@ def get_request_cache_overview(limit: int = 50) -> list[Dict[str, Any]]:
         ).fetchall()
     results: list[Dict[str, Any]] = []
     for row in rows:
+        title = row[4]
+        if not title and row[9]:
+            try:
+                payload = json.loads(row[9])
+                if isinstance(payload, dict):
+                    media = payload.get("media") or {}
+                    title = (
+                        (media.get("title") if isinstance(media, dict) else None)
+                        or (media.get("name") if isinstance(media, dict) else None)
+                        or payload.get("title")
+                        or payload.get("name")
+                    )
+            except json.JSONDecodeError:
+                title = row[4]
         results.append(
             {
                 "request_id": row[0],
                 "media_id": row[1],
                 "media_type": row[2],
                 "status": row[3],
-                "title": row[4],
+                "title": title,
                 "year": row[5],
                 "requested_by": row[6],
                 "created_at": row[7],
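The fallback added to the overview query above can be read as one small pure function: prefer the `title` column, otherwise dig a title out of the cached JSON payload. A sketch, with the payload shape taken from the diff:

```python
import json
from typing import Optional

def title_from_row(column_title: Optional[str], payload_json: Optional[str]) -> Optional[str]:
    """Prefer the stored title; otherwise recover one from the cached payload JSON."""
    if column_title or not payload_json:
        return column_title
    try:
        payload = json.loads(payload_json)
    except json.JSONDecodeError:
        return column_title
    if not isinstance(payload, dict):
        return column_title
    media = payload.get("media")
    if not isinstance(media, dict):
        media = {}
    return media.get("title") or media.get("name") or payload.get("title") or payload.get("name")

print(title_from_row(None, '{"media": {"name": "Severance"}}'))  # -> Severance
```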
@@ -265,6 +265,16 @@ async def _hydrate_title_from_tmdb(
     return None, None
 
 
+async def _hydrate_media_details(client: JellyseerrClient, media_id: Optional[int]) -> Optional[Dict[str, Any]]:
+    if not media_id:
+        return None
+    try:
+        details = await client.get_media(int(media_id))
+    except httpx.HTTPStatusError:
+        return None
+    return details if isinstance(details, dict) else None
+
+
 async def _hydrate_artwork_from_tmdb(
     client: JellyseerrClient, media_type: Optional[str], tmdb_id: Optional[int]
 ) -> tuple[Optional[str], Optional[str]]:
@@ -389,6 +399,28 @@ async def _sync_all_requests(client: JellyseerrClient) -> int:
         if isinstance(details, dict):
             payload = _parse_request_payload(details)
             item = details
+        if not payload.get("title") and payload.get("media_id"):
+            media_details = await _hydrate_media_details(client, payload.get("media_id"))
+            if isinstance(media_details, dict):
+                media_title = media_details.get("title") or media_details.get("name")
+                if media_title:
+                    payload["title"] = media_title
+                if not payload.get("year") and media_details.get("year"):
+                    payload["year"] = media_details.get("year")
+                if not payload.get("tmdb_id") and media_details.get("tmdbId"):
+                    payload["tmdb_id"] = media_details.get("tmdbId")
+                if not payload.get("media_type") and media_details.get("mediaType"):
+                    payload["media_type"] = media_details.get("mediaType")
+                if isinstance(item, dict):
+                    existing_media = item.get("media")
+                    if isinstance(existing_media, dict):
+                        merged = dict(media_details)
+                        for key, value in existing_media.items():
+                            if value is not None:
+                                merged[key] = value
+                        item["media"] = merged
+                    else:
+                        item["media"] = media_details
         poster_path, backdrop_path = _extract_artwork_paths(item)
         if cache_mode == "cache" and not (poster_path or backdrop_path):
             details = await _get_request_details(client, request_id)
@@ -483,13 +515,35 @@ async def _sync_delta_requests(client: JellyseerrClient) -> int:
         if isinstance(request_id, int):
             cached = get_request_cache_by_id(request_id)
             incoming_updated = payload.get("updated_at")
-            if cached and incoming_updated and cached.get("updated_at") == incoming_updated:
+            if cached and incoming_updated and cached.get("updated_at") == incoming_updated and cached.get("title"):
                 continue
         if not payload.get("title") or not payload.get("media_id"):
             details = await _get_request_details(client, request_id)
             if isinstance(details, dict):
                 payload = _parse_request_payload(details)
                 item = details
+        if not payload.get("title") and payload.get("media_id"):
+            media_details = await _hydrate_media_details(client, payload.get("media_id"))
+            if isinstance(media_details, dict):
+                media_title = media_details.get("title") or media_details.get("name")
+                if media_title:
+                    payload["title"] = media_title
+                if not payload.get("year") and media_details.get("year"):
+                    payload["year"] = media_details.get("year")
+                if not payload.get("tmdb_id") and media_details.get("tmdbId"):
+                    payload["tmdb_id"] = media_details.get("tmdbId")
+                if not payload.get("media_type") and media_details.get("mediaType"):
+                    payload["media_type"] = media_details.get("mediaType")
+                if isinstance(item, dict):
+                    existing_media = item.get("media")
+                    if isinstance(existing_media, dict):
+                        merged = dict(media_details)
+                        for key, value in existing_media.items():
+                            if value is not None:
+                                merged[key] = value
+                        item["media"] = merged
+                    else:
+                        item["media"] = media_details
         poster_path, backdrop_path = _extract_artwork_paths(item)
         if cache_mode == "cache" and not (poster_path or backdrop_path):
             details = await _get_request_details(client, request_id)
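Both sync paths above merge the hydrated media details into the existing item with the same rule: fetched fields fill gaps, and existing non-null values win. Isolated as a function:

```python
from typing import Any, Dict

def merge_media(existing: Dict[str, Any], fetched: Dict[str, Any]) -> Dict[str, Any]:
    """Overlay existing non-None values on top of freshly fetched media details."""
    merged = dict(fetched)
    for key, value in existing.items():
        if value is not None:
            merged[key] = value
    return merged

print(merge_media({"title": "Dune", "year": None}, {"title": "Other", "year": 2024}))
# -> {'title': 'Dune', 'year': 2024}
```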
@@ -1243,6 +1297,37 @@ async def ai_triage(request_id: str, user: Dict[str, str] = Depends(get_current_
 
 @router.post("/{request_id}/actions/search")
 async def action_search(request_id: str, user: Dict[str, str] = Depends(get_current_user)) -> dict:
+    runtime = get_runtime_settings()
+    client = JellyseerrClient(runtime.jellyseerr_base_url, runtime.jellyseerr_api_key)
+    if client.configured():
+        await _ensure_request_access(client, int(request_id), user)
+    snapshot = await build_snapshot(request_id)
+    prowlarr_results: List[Dict[str, Any]] = []
+    prowlarr = ProwlarrClient(runtime.prowlarr_base_url, runtime.prowlarr_api_key)
+    if not prowlarr.configured():
+        raise HTTPException(status_code=400, detail="Prowlarr not configured")
+    query = snapshot.title
+    if snapshot.year:
+        query = f"{query} {snapshot.year}"
+    try:
+        results = await prowlarr.search(query=query)
+        prowlarr_results = _filter_prowlarr_results(results, snapshot.request_type)
+    except httpx.HTTPStatusError:
+        prowlarr_results = []
+
+    await asyncio.to_thread(
+        save_action,
+        request_id,
+        "search_releases",
+        "Search and choose a download",
+        "ok",
+        f"Found {len(prowlarr_results)} releases.",
+    )
+    return {"status": "ok", "releases": prowlarr_results}
+
+
+@router.post("/{request_id}/actions/search_auto")
+async def action_search_auto(request_id: str, user: Dict[str, str] = Depends(get_current_user)) -> dict:
     runtime = get_runtime_settings()
     client = JellyseerrClient(runtime.jellyseerr_base_url, runtime.jellyseerr_api_key)
     if client.configured():
@@ -1252,18 +1337,6 @@ async def action_search(request_id: str, user: Dict[str, str] = Depends(get_curr
     if not isinstance(arr_item, dict):
         raise HTTPException(status_code=404, detail="Item not found in Sonarr/Radarr")
 
-    prowlarr_results: List[Dict[str, Any]] = []
-    prowlarr = ProwlarrClient(runtime.prowlarr_base_url, runtime.prowlarr_api_key)
-    if prowlarr.configured():
-        query = snapshot.title
-        if snapshot.year:
-            query = f"{query} {snapshot.year}"
-        try:
-            results = await prowlarr.search(query=query)
-            prowlarr_results = _filter_prowlarr_results(results, snapshot.request_type)
-        except httpx.HTTPStatusError:
-            prowlarr_results = []
-
     if snapshot.request_type.value == "tv":
         client = SonarrClient(runtime.sonarr_base_url, runtime.sonarr_api_key)
         if not client.configured():
@@ -1271,12 +1344,11 @@ async def action_search(request_id: str, user: Dict[str, str] = Depends(get_curr
         episodes = await client.get_episodes(int(arr_item["id"]))
         missing_by_season = _missing_episode_ids_by_season(episodes)
         if not missing_by_season:
-            return {
-                "status": "ok",
-                "message": "No missing monitored episodes found",
-                "searched": [],
-                "releases": prowlarr_results,
-            }
+            message = "No missing monitored episodes found."
+            await asyncio.to_thread(
+                save_action, request_id, "search_auto", "Search and auto-download", "ok", message
+            )
+            return {"status": "ok", "message": message, "searched": []}
         responses = []
         for season_number in sorted(missing_by_season.keys()):
             episode_ids = missing_by_season[season_number]
@@ -1285,33 +1357,23 @@ async def action_search(request_id: str, user: Dict[str, str] = Depends(get_curr
             responses.append(
                 {"season": season_number, "episodeCount": len(episode_ids), "response": response}
             )
-        result = {"status": "ok", "searched": responses, "releases": prowlarr_results}
+        message = "Search sent to Sonarr."
         await asyncio.to_thread(
-            save_action,
-            request_id,
-            "search",
-            "Re-run search in Sonarr/Radarr",
-            "ok",
-            f"Found {len(prowlarr_results)} releases.",
+            save_action, request_id, "search_auto", "Search and auto-download", "ok", message
         )
-        return result
-    elif snapshot.request_type.value == "movie":
+        return {"status": "ok", "message": message, "searched": responses}
+    if snapshot.request_type.value == "movie":
         client = RadarrClient(runtime.radarr_base_url, runtime.radarr_api_key)
         if not client.configured():
             raise HTTPException(status_code=400, detail="Radarr not configured")
         response = await client.search(int(arr_item["id"]))
-        result = {"status": "ok", "response": response, "releases": prowlarr_results}
+        message = "Search sent to Radarr."
         await asyncio.to_thread(
-            save_action,
-            request_id,
-            "search",
-            "Re-run search in Sonarr/Radarr",
-            "ok",
-            f"Found {len(prowlarr_results)} releases.",
+            save_action, request_id, "search_auto", "Search and auto-download", "ok", message
         )
-        return result
-    else:
+        return {"status": "ok", "message": message, "response": response}
     raise HTTPException(status_code=400, detail="Unknown request type")
 
 
 @router.post("/{request_id}/actions/qbit/resume")
@@ -1507,6 +1569,7 @@ async def action_grab(
     snapshot = await build_snapshot(request_id)
     guid = payload.get("guid")
     indexer_id = payload.get("indexerId")
+    download_url = payload.get("downloadUrl")
     if not guid or not indexer_id:
         raise HTTPException(status_code=400, detail="Missing guid or indexerId")
 
@@ -1518,6 +1581,28 @@ async def action_grab(
     try:
         response = await client.grab_release(str(guid), int(indexer_id))
     except httpx.HTTPStatusError as exc:
+        status_code = exc.response.status_code if exc.response is not None else 502
+        if status_code == 404 and download_url:
+            qbit = QBittorrentClient(
+                runtime.qbittorrent_base_url,
+                runtime.qbittorrent_username,
+                runtime.qbittorrent_password,
+            )
+            if not qbit.configured():
+                raise HTTPException(status_code=400, detail="qBittorrent not configured")
+            try:
+                await qbit.add_torrent_url(str(download_url), category=f"magent-{request_id}")
+            except httpx.HTTPStatusError as qbit_exc:
+                raise HTTPException(status_code=502, detail=str(qbit_exc)) from qbit_exc
+            await asyncio.to_thread(
+                save_action,
+                request_id,
+                "grab",
+                "Grab release",
+                "ok",
+                "Sent to qBittorrent via Prowlarr.",
+            )
+            return {"status": "ok", "message": "Sent to qBittorrent.", "via": "qbittorrent"}
         raise HTTPException(status_code=502, detail=str(exc)) from exc
     await asyncio.to_thread(
         save_action, request_id, "grab", "Grab release", "ok", "Grab sent to Sonarr."
@@ -1530,6 +1615,28 @@ async def action_grab(
     try:
         response = await client.grab_release(str(guid), int(indexer_id))
     except httpx.HTTPStatusError as exc:
+        status_code = exc.response.status_code if exc.response is not None else 502
+        if status_code == 404 and download_url:
+            qbit = QBittorrentClient(
+                runtime.qbittorrent_base_url,
+                runtime.qbittorrent_username,
+                runtime.qbittorrent_password,
+            )
+            if not qbit.configured():
+                raise HTTPException(status_code=400, detail="qBittorrent not configured")
+            try:
+                await qbit.add_torrent_url(str(download_url), category=f"magent-{request_id}")
+            except httpx.HTTPStatusError as qbit_exc:
+                raise HTTPException(status_code=502, detail=str(qbit_exc)) from qbit_exc
+            await asyncio.to_thread(
+                save_action,
+                request_id,
+                "grab",
+                "Grab release",
+                "ok",
+                "Sent to qBittorrent via Prowlarr.",
+            )
+            return {"status": "ok", "message": "Sent to qBittorrent.", "via": "qbittorrent"}
         raise HTTPException(status_code=502, detail=str(exc)) from exc
     await asyncio.to_thread(
         save_action, request_id, "grab", "Grab release", "ok", "Grab sent to Radarr."
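The grab handler's new error path follows one rule, duplicated for the Sonarr and Radarr branches: if Prowlarr's grab fails with 404 and the release carries a `downloadUrl`, hand the URL to qBittorrent under a per-request category; otherwise surface a 502. The decision in isolation:

```python
from typing import Optional, Tuple

def grab_fallback(status_code: int, download_url: Optional[str],
                  request_id: str) -> Tuple[str, Optional[str]]:
    """Return (route, qbit_category) for a failed Prowlarr grab attempt."""
    if status_code == 404 and download_url:
        # Send the release straight to qBittorrent, tagged per request.
        return ("qbittorrent", f"magent-{request_id}")
    return ("error", None)  # the real handler re-raises as HTTP 502

print(grab_fallback(404, "https://indexer.example/dl/1", "42"))
# -> ('qbittorrent', 'magent-42')
```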
@@ -465,9 +465,14 @@ async def build_snapshot(request_id: str) -> Snapshot:
     try:
         download_ids = _download_ids(_queue_records(arr_queue))
         torrent_list: List[Dict[str, Any]] = []
-        if download_ids and qbittorrent.configured():
-            torrents = await qbittorrent.get_torrents_by_hashes("|".join(download_ids))
-            torrent_list = torrents if isinstance(torrents, list) else []
+        if qbittorrent.configured():
+            if download_ids:
+                torrents = await qbittorrent.get_torrents_by_hashes("|".join(download_ids))
+                torrent_list = torrents if isinstance(torrents, list) else []
+            else:
+                category = f"magent-{request_id}"
+                torrents = await qbittorrent.get_torrents_by_category(category)
+                torrent_list = torrents if isinstance(torrents, list) else []
         summary = _summarize_qbit(torrent_list)
         qbit_state = summary.get("state")
         qbit_message = summary.get("message")
@@ -550,8 +555,15 @@ async def build_snapshot(request_id: str) -> Snapshot:
     elif arr_item and arr_state != "available":
         actions.append(
             ActionOption(
-                id="search",
-                label="Search again for releases",
+                id="search_auto",
+                label="Search and auto-download",
+                risk="low",
+            )
+        )
+        actions.append(
+            ActionOption(
+                id="search_releases",
+                label="Search and choose a download",
                 risk="low",
             )
         )
@@ -34,6 +34,7 @@ type ReleaseOption = {
   leechers?: number
   protocol?: string
   infoUrl?: string
+  downloadUrl?: string
 }
 
 type SnapshotHistory = {
@@ -484,7 +485,8 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
   }
   const baseUrl = getApiBase()
   const actionMap: Record<string, string> = {
-    search: 'actions/search',
+    search_releases: 'actions/search',
+    search_auto: 'actions/search_auto',
     resume_torrent: 'actions/qbit/resume',
     readd_to_arr: 'actions/readd',
   }
@@ -493,7 +495,7 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
       setActionMessage('This action is not wired yet.')
       return
     }
-    if (action.id === 'search') {
+    if (action.id === 'search_releases') {
       setActionMessage(null)
       setReleaseOptions([])
       setSearchRan(false)
@@ -513,7 +515,7 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
         throw new Error(text || `Request failed: ${response.status}`)
       }
       const data = await response.json()
-      if (action.id === 'search') {
+      if (action.id === 'search_releases') {
         if (Array.isArray(data.releases)) {
           setReleaseOptions(data.releases)
         }
@@ -526,6 +528,10 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
           setModalMessage('Search complete. Pick an option below if you want to download.')
         }
         setActionMessage(`${action.label} started.`)
+      } else if (action.id === 'search_auto') {
+        const message = data?.message ?? 'Search sent to Sonarr/Radarr.'
+        setActionMessage(message)
+        setModalMessage(message)
       } else {
         const message = data?.message ?? `${action.label} started.`
         setActionMessage(message)
@@ -565,6 +571,7 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
   <span>{release.seeders ?? 0} seeders · {formatBytes(release.size)}</span>
   <button
     type="button"
+    disabled={!release.guid || !release.indexerId}
     onClick={async () => {
       if (!snapshot || !release.guid || !release.indexerId) {
         setActionMessage('Missing details to start the download.')
@@ -583,6 +590,7 @@ export default function RequestTimelinePage({ params }: { params: { id: string }
       body: JSON.stringify({
         guid: release.guid,
         indexerId: release.indexerId,
+        downloadUrl: release.downloadUrl,
       }),
     }
   )
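The renamed action ids must stay in sync between the backend's snapshot builder and the frontend's `actionMap`. The mapping is mirrored here in Python for reference (the real table is the TypeScript object in the diff above):

```python
# Mirror of the frontend actionMap: action id -> POST endpoint suffix.
ACTION_ENDPOINTS = {
    "search_releases": "actions/search",
    "search_auto": "actions/search_auto",
    "resume_torrent": "actions/qbit/resume",
    "readd_to_arr": "actions/readd",
}

def action_url(base: str, request_id: str, action_id: str):
    """Resolve an action id to its endpoint, or None if the action is unwired."""
    path = ACTION_ENDPOINTS.get(action_id)
    return f"{base}/requests/{request_id}/{path}" if path else None

print(action_url("/api", "7", "search_auto"))  # -> /api/requests/7/actions/search_auto
```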
@@ -40,19 +40,19 @@ export default function HeaderActions() {
   }
 }
 
+  if (!signedIn) {
+    return null
+  }
+
   return (
     <div className="header-actions">
       <a href="/">Requests</a>
       <a href="/how-it-works">How it works</a>
-      {signedIn && <a href="/profile">My profile</a>}
+      <a href="/profile">My profile</a>
       {role === 'admin' && <a href="/admin">Settings</a>}
-      {signedIn ? (
-        <button type="button" className="header-link" onClick={signOut}>
-          Sign out
-        </button>
-      ) : (
-        <a href="/login">Sign in</a>
-      )}
+      <button type="button" className="header-link" onClick={signOut}>
+        Sign out
+      </button>
     </div>
   )
 }