Compare commits

...

33 Commits

Author SHA1 Message Date
fallenbagel
e908c11d45 docs: fix the position of workflow_dispatch in docs-deploy 2025-10-31 07:45:23 +08:00
fallenbagel
509d615a61 docs: fix typo on the docs deploy 2025-10-31 07:42:26 +08:00
fallenbagel
9ebe620d52 docs: deploy from legacy-jellyseerr branch manually 2025-10-31 07:39:35 +08:00
fallenbagel
e928611285 docs: change pnpm version for stable 2025-10-31 07:38:13 +08:00
TacoCake
2e6e9ad657 fix: include video content in the blacklisted tags processing job (#1736)
* fix: include video content in the blacklisted tags processing job

Modified the “blacklisted tags” job to include adult & video content, so it correctly blacklists more adult films that were previously missed even when they had the tag.

* refactor: remove dead code

* refactor: remove redundant explicit arguments
2025-10-28 20:29:04 -06:00
0xsysr3ll
9a92d6ac30 fix(api): respect is4k parameter for all media status changes (#1951)
Signed-off-by: 0xsysr3ll <0xsysr3ll@pm.me>
2025-10-28 17:26:28 +01:00
0xsysr3ll
7dfa30a151 fix(media): handle 4K Radarr removal for multiple instances (#2037)
This PR fixes an issue where removing 4K movies from Radarr failed when multiple Radarr instances were configured. The backend was misparsing boolean query parameters and using string slugs instead of TMDB IDs. The fix ensures that the correct 4K Radarr instance is targeted and that TMDB IDs are used for movie removal.

Signed-off-by: 0xsysr3ll <0xsysr3ll@pm.me>
2025-10-28 17:25:57 +01:00
Gauthier
efc9b00d39 ci: fix AI-generated workflow trigger (#2101) 2025-10-28 15:46:14 +00:00
Gauthier
e246215663 ci: add a new workflow to close AI-generated PRs (#2098)
* ci: add a new workflow to close AI-generated PRs

This PR adds a workflow that automatically closes PRs containing too much AI-generated code.

* fix: apply review comments
2025-10-28 14:28:42 +00:00
Joe Harrison
843d05cc3f chore: update the code of conduct link in bug report (#2091) 2025-10-27 09:57:49 +01:00
Joe Harrison
e781cd56b3 chore(bug.yml): fixed link to the code of conduct in the bug.yml in issue templates (#2090) 2025-10-27 08:31:22 +01:00
Ludovic Ortega
b34ca1543a feat: do not enforce TLD on email (#2075)
fix #1846
2025-10-20 17:24:24 +03:00
Ludovic Ortega
48a61d812b docs: migrate third-parties documentation to a dedicated folder (#2068)
* docs: migrate third-party documentation to a dedicated folder

---------

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-20 10:03:21 +02:00
J. Winters-Brown
f7f00ce361 feat: migrate to validator from email-validator (#2059)
* refactor(adds package): this adds the validator package and removes email-validator from dependencies

* refactor(auth.ts and email.ts): migrates from EmailValidator to validator
2025-10-19 22:37:09 +02:00
0xsysr3ll
a7909342b4 fix(api): correct Jellyfin users endpoint documentation (#2073)
Signed-off-by: 0xsysr3ll <0xsysr3ll@pm.me>
2025-10-19 22:32:58 +02:00
Joe Harrison
082ba3d037 ci: added helm cosign verification and renovate app workflow to bump chart versions (#2064)
* ci: added helm cosign verification and renovate app workflow to bump chart versions

* docs: add helm artifacts verification

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>

* fix: update app id

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>

* docs: add documentation link in helm chart and seerr docs

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>

---------

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
Co-authored-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-19 04:22:28 +01:00
Brandon Cohen
a975ab25c3 fix: delete endpoint on push notification disable (#2067)
fix: add endpoint deletion on disable

fix: use definemessages util

refactor: add code comment
2025-10-19 00:03:28 +08:00
fallenbagel
0d6bfa18cc fix(download-tracker): reset both service caches when resetting downloads (#2065) 2025-10-17 21:10:02 +02:00
Ludovic Ortega
0dbbac02af docs: add documentation for dockerhub (#2063)
* docs: add documentation for dockerhub

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>

* docs: typo fixes

---------

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
Co-authored-by: sudo-kraken <joe@j-harrison.co.uk>
2025-10-17 17:22:19 +02:00
Ludovic Ortega
81eab7434f ci: fix concurrency issue on support workflows (#2062) [skip ci]
Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-17 17:00:44 +02:00
renovate[bot]
669faccc85 ci(actions): update github/codeql-action action to v4 (#2056)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-16 21:30:50 +02:00
renovate[bot]
a0893a5831 ci(actions): update github actions (#2022)
Co-authored-by: renovate[bot] <29139614+renovate[bot]@users.noreply.github.com>
2025-10-16 21:19:33 +02:00
fallenbagel
c4236dce73 docs: HAProxy documentation warning format (#2054)
Updated the warning message for the HAProxy documentation and fixed a typo.
2025-10-16 15:48:13 +02:00
Terry Sposato
f3d8f0d7ab docs: add haproxy configuration example (#2048) 2025-10-16 15:15:09 +02:00
Joe Harrison
a988f8e657 fix: update github repo refs for docker hub (#2053)
* fix: update github repo refs for docker hub

* ci: updated wf to use env var for the docker hub space
2025-10-16 21:12:17 +08:00
Joe Harrison
618563c6d7 docs: added guide for image verification (#2051)
* docs: added guide for image verification

* Update verifying-signed-images.mdx

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
2025-10-16 14:10:09 +02:00
Joe Harrison
8688645a32 ci: update to release workflow (#2047)
* ci: update to release workflow

* build: re-ran lock file update with typeorm 0.3.12

* build: resync lockfile with develop

* ci: syntax fix in cliff.toml

* Update .github/workflows/release.yml

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>

* reverting Copilot's nonsense, @fallenbagel's fault

Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>

---------

Co-authored-by: Copilot <175728472+Copilot@users.noreply.github.com>
Co-authored-by: Copilot Autofix powered by AI <62310815+github-advanced-security[bot]@users.noreply.github.com>
2025-10-16 12:53:02 +01:00
Joe Harrison
de0e9b1f35 fix: path in docs and compose for postgres 18 (#2049) 2025-10-16 07:36:56 +02:00
Ludovic Ortega
d95cccac6a docs: postgres 18 documentation (#2046)
Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-15 00:09:43 +02:00
Ludovic Ortega
9d174baa70 chore: use ghcr.io instead of dockerhub as default (#2045)
Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-15 05:27:56 +08:00
fallenbagel
d5ff0c11ca fix(subscriber): prevent infinite loop when requesting existing media with scan disabled (#2043) 2025-10-14 22:56:33 +02:00
Ludovic Ortega
0354debd2b build(docker): setup rootless image (#2032) [skip ci]
* build(docker): setup rootless image

---------

Signed-off-by: Ludovic Ortega <ludovic.ortega@adminafk.fr>
2025-10-14 22:49:57 +02:00
salty
a8c7e35f56 fix(api): cleanup radarr bits in sonarr api (#2035) 2025-10-12 17:19:51 +02:00
52 changed files with 1306 additions and 540 deletions
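To reproduce this comparison locally, the same range can be inspected with plain git. A minimal sketch, assuming `a8c7e35f56` (the oldest commit listed) sits directly on top of the comparison base:

```bash
# List the 33 commits in this compare range
git log --oneline a8c7e35f56~1..e908c11d45

# Reproduce the "52 changed files" summary
git diff --stat a8c7e35f56~1..e908c11d45
```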

View File

@@ -11,16 +11,19 @@
.husky
.next
.prettierignore
.vscode
charts
config/db/*
config/logs/*
config/*.json
cypress
dist
Dockerfile*
compose.yaml
gen-docs
docs
LICENSE
node_modules
public/os_logo_filled.png
public/preview.jpg
stylelint.config.js
cypress

View File

@@ -95,7 +95,7 @@ body:
id: terms
attributes:
label: Code of Conduct
description: By submitting this issue, you agree to follow our [Code of Conduct](/../../CODE_OF_CONDUCT.md)
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/seerr-team/seerr/blob/develop/CODE_OF_CONDUCT.md)
options:
- label: I agree to follow Seerr's Code of Conduct
required: true

.github/cliff.toml (new file, +94 lines)
View File

@@ -0,0 +1,94 @@
# git-cliff ~ configuration
# https://git-cliff.org/docs/configuration
[changelog]
header = ""
body = """
{%- macro remote_url() -%}
https://github.com/{{ remote.github.owner }}/{{ remote.github.repo }}
{%- endmacro -%}
{%- set excluded_users = ["github-actions[bot]", "dependabot[bot]", "renovate[bot]"] -%}
{% macro print_commit(commit) -%}
- {% if commit.scope %}*({{ commit.scope }})* {% endif %}\
{% if commit.breaking %}[**breaking**] {% endif %}\
{{ commit.message | upper_first }} - \
([{{ commit.id | truncate(length=7, end="") }}]({{ self::remote_url() }}/commit/{{ commit.id }}))\
{% endmacro -%}
{% if version %}\
{% if previous.version %}\
## [{{ version | trim_start_matches(pat="v") }}]({{ self::remote_url() }}/compare/{{ previous.version }}..{{ version }}) - {{ timestamp | date(format="%Y-%m-%d") }}
{% else %}\
## [{{ version | trim_start_matches(pat="v") }}] - {{ timestamp | date(format="%Y-%m-%d") }}
{% endif %}\
{% else %}\
## [unreleased]
{% endif %}\
{%- for group, commits in commits | group_by(attribute="group") %}
### {{ group | striptags | trim | upper_first }}
{%- for commit in commits | filter(attribute="scope") | sort(attribute="scope") %}
{{ self::print_commit(commit=commit) }}
{%- endfor %}
{%- for commit in commits %}
{%- if not commit.scope -%}
{{ self::print_commit(commit=commit) }}
{%- endif -%}
{%- endfor -%}
{%- endfor -%}
{%- set valid_contributors = [] -%}
{%- for c in github.contributors | filter(attribute="is_first_time", value=true) %}
{%- if c.username and c.username not in excluded_users and c.username not in valid_contributors %}
{%- set_global valid_contributors = valid_contributors | concat(with=c.username) %}
{%- endif %}
{%- endfor %}
{%- if valid_contributors | length > 0 %}
## New Contributors ❤️
{%- for username in valid_contributors %}
* @{{ username }} made their first contribution
{%- endfor %}
{%- endif %}
"""
footer = """
<!-- generated by git-cliff -->
"""
trim = true
postprocessors = []
[git]
conventional_commits = true
filter_unconventional = true
split_commits = false
filter_commits = true
commit_preprocessors = [
{ pattern = '.*\[skip ci\].*', replace = "" },
{ pattern = '.*\[ci skip\].*', replace = "" },
]
commit_parsers = [
{ message = "^feat", group = "<!-- 0 -->🚀 Features" },
{ message = "^fix", group = "<!-- 1 -->🐛 Bug Fixes" },
{ message = "^doc", group = "<!-- 3 -->📖 Documentation" },
{ message = "^perf", group = "<!-- 4 -->⚡ Performance" },
{ message = "^refactor", group = "<!-- 2 -->🚜 Refactor" },
{ message = "^style", group = "<!-- 5 -->🎨 Styling" },
{ message = "^test", group = "<!-- 6 -->🧪 Testing" },
{ message = "^chore\\(release\\): prepare for", skip = true },
{ message = "^chore\\(deps.*\\)", skip = true },
{ message = "^chore\\(pr\\)", skip = true },
{ message = "^chore\\(pull\\)", skip = true },
{ message = "^chore\\(git-sync\\)", skip = true },
{ message = "^chore|^ci", group = "<!-- 7 -->⚙️ Miscellaneous Tasks" },
{ body = ".*security", group = "<!-- 8 -->🛡️ Security" },
{ message = "^revert", group = "<!-- 9 -->◀️ Revert" },
{ message = '.*\[skip ci\].*', skip = true },
{ message = '.*\[ci skip\].*', skip = true },
]
protect_breaking_commits = false
tag_pattern = "v?[0-9]+\\.[0-9]+\\.[0-9]+.*"
skip_tags = "beta|alpha|rc"
topo_order = false
sort_commits = "newest"
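The release workflow further down invokes this configuration through the git-cliff action with `args: -vv --current`. A minimal sketch for previewing the output locally, assuming the `git-cliff` CLI is installed (the output filename is illustrative):

```bash
# Render notes for the current tag using the repository's config
git-cliff --config .github/cliff.toml --current --output CHANGELOG-preview.md

# Or print the not-yet-released commits to stdout
git-cliff --config .github/cliff.toml --unreleased
```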

.github/workflows/ai-generated.yml (new file, +59 lines)
View File

@@ -0,0 +1,59 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: 'AI-generated pull requests'
on:
pull_request_target:
types: [labeled, unlabeled, reopened]
permissions:
pull-requests: read
jobs:
ai-generated-support:
if: github.event.label.name == 'ai-generated' || github.event.action == 'reopened'
runs-on: ubuntu-24.04
concurrency:
group: support-${{ github.event.pull_request.number }}
cancel-in-progress: true
permissions:
pull-requests: write
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
GH_REPO: ${{ github.repository }}
NUMBER: ${{ github.event.pull_request.number }}
PR_AUTHOR: ${{ github.event.pull_request.user.login }}
steps:
- name: Label added, comment and close pull request
if: github.event.action == 'labeled' && github.event.label.name == 'ai-generated'
shell: bash
env:
BODY: >
:wave: @${{ env.PR_AUTHOR }}, thank you for your contribution!
However, this pull request has been closed because it appears to contain a significant amount of AI-generated code without sufficient human review or supervision.
AI-generated code can often introduce subtle bugs, poor design patterns, or inconsistent styles that make long-term maintenance difficult and reduce overall code quality. For the sake of the project's future stability and readability, we require that all contributions meet our established coding standards and demonstrate clear developer oversight.
This pull request is also too large for effective human review. Please discuss with us on how to break down these changes into smaller, more focused PRs to ensure a thorough and efficient review process.
If you'd like to revise and resubmit your changes with careful review and cleanup, we'd be happy to take another look.
run: |
retry() { n=0; until "$@"; do n=$((n+1)); [ $n -ge 3 ] && break; echo "retry $n: $*" >&2; sleep 2; done; }
retry gh pr comment "$NUMBER" -R "$GH_REPO" -b "$BODY" || true
retry gh pr close "$NUMBER" -R "$GH_REPO" || true
gh pr lock "$NUMBER" -R "$GH_REPO" -r "spam" || true
- name: Reopened or label removed, unlock pull request
if: github.event.action == 'unlabeled' && github.event.label.name == 'ai-generated'
shell: bash
run: |
retry() { n=0; until "$@"; do n=$((n+1)); [ $n -ge 3 ] && break; echo "retry $n: $*" >&2; sleep 2; done; }
retry gh pr reopen "$NUMBER" -R "$GH_REPO" || true
gh pr unlock "$NUMBER" -R "$GH_REPO" || true
- name: Remove AI-generated label on manual reopen
if: github.event.action == 'reopened'
shell: bash
run: |
gh pr edit "$NUMBER" -R "$GH_REPO" --remove-label "ai-generated" || true
gh pr unlock "$NUMBER" -R "$GH_REPO" || true
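The workflow above is driven entirely by the `ai-generated` label. A hedged sketch of how a maintainer would trigger or undo it with the GitHub CLI (the PR number is hypothetical):

```bash
# Adding the label comments on, closes, and locks the pull request
gh pr edit 2099 --add-label "ai-generated" --repo seerr-team/seerr

# Removing the label reopens and unlocks it again
gh pr edit 2099 --remove-label "ai-generated" --repo seerr-team/seerr
```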

View File

@@ -23,7 +23,7 @@ jobs:
name: Lint & Test Build
if: github.event_name == 'pull_request'
runs-on: ubuntu-24.04
container: node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7
container: node:22.20.0-alpine3.22@sha256:dbcedd8aeab47fbc0f4dd4bffa55b7c3c729a707875968d467aaaea42d6225af
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
@@ -31,7 +31,7 @@ jobs:
persist-credentials: false
- name: Pnpm Setup
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Get pnpm store directory
shell: sh
@@ -48,7 +48,7 @@ jobs:
- name: Install dependencies
env:
HUSKY: 0
CI: true
run: pnpm install
- name: Lint

View File

@@ -42,15 +42,15 @@ jobs:
persist-credentials: false
- name: Initialize CodeQL
uses: github/codeql-action/init@64d10c13136e1c5bce3e5fbde8d4906eeaafc885 # v3.30.6
uses: github/codeql-action/init@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
with:
languages: ${{ matrix.language }}
queries: +security-and-quality
- name: Autobuild
uses: github/codeql-action/autobuild@64d10c13136e1c5bce3e5fbde8d4906eeaafc885 # v3.30.6
uses: github/codeql-action/autobuild@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
- name: Perform CodeQL Analysis
uses: github/codeql-action/analyze@64d10c13136e1c5bce3e5fbde8d4906eeaafc885 # v3.30.6
uses: github/codeql-action/analyze@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
with:
category: '/language:${{ matrix.language }}'

View File

@@ -48,7 +48,7 @@ jobs:
package-manager-cache: false
- name: Pnpm Setup
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Install dependencies
run: pnpm install --frozen-lockfile

View File

@@ -3,9 +3,10 @@
name: Deploy to GitHub Pages
on:
workflow_dispatch:
push:
branches:
- develop
- legacy-jellyseerr
paths:
- 'docs/**'
- 'gen-docs/**'
@@ -34,7 +35,7 @@ jobs:
package-manager-cache: false
- name: Pnpm Setup
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Get pnpm store directory
shell: sh

View File

@@ -55,7 +55,7 @@ jobs:
# get current version
current_version=$(grep '^version:' "$chart_path/Chart.yaml" | awk '{print $2}')
# try to get current release version
if oras manifest fetch "ghcr.io/${GITHUB_REPOSITORY@L}/${chart_name}:${current_version}" >/dev/null 2>&1; then
if oras manifest fetch "ghcr.io/${{ github.repository }}/${chart_name}:${current_version}" >/dev/null 2>&1; then
echo "No version change for $chart_name. Skipping."
else
helm dependency build "$chart_path"
@@ -87,8 +87,8 @@ jobs:
name: Publish to ghcr.io
runs-on: ubuntu-24.04
permissions:
packages: write # needed for pushing to github registry
id-token: write # needed for signing the images with GitHub OIDC Token
packages: write
id-token: write
needs: [package-helm-chart]
if: needs.package-helm-chart.outputs.has_artifacts == 'true'
steps:
@@ -128,17 +128,59 @@ jobs:
# push chart to OCI
chart_release_file=$(basename "$chart_path")
chart_name=${chart_release_file%-*}
helm push ${chart_path} oci://ghcr.io/${GITHUB_REPOSITORY@L} |& tee helm-push-output.log
helm push ${chart_path} oci://ghcr.io/${{ github.repository }} |& tee helm-push-output.log
chart_digest=$(awk -F "[, ]+" '/Digest/{print $NF}' < helm-push-output.log)
# sign chart
cosign sign "ghcr.io/${GITHUB_REPOSITORY@L}/${chart_name}@${chart_digest}"
cosign sign "ghcr.io/${{ github.repository }}/${chart_name}@${chart_digest}"
# push artifacthub-repo.yml to OCI
oras push \
ghcr.io/${GITHUB_REPOSITORY@L}/${chart_name}:artifacthub.io \
ghcr.io/${{ github.repository }}/${chart_name}:artifacthub.io \
--config /dev/null:application/vnd.cncf.artifacthub.config.v1+yaml \
charts/$chart_name/artifacthub-repo.yml:application/vnd.cncf.artifacthub.repository-metadata.layer.v1.yaml \
|& tee oras-push-output.log
artifacthub_digest=$(grep "Digest:" oras-push-output.log | awk '{print $2}')
# sign artifacthub-repo.yml
cosign sign "ghcr.io/${GITHUB_REPOSITORY@L}/${chart_name}:artifacthub.io@${artifacthub_digest}"
cosign sign "ghcr.io/${{ github.repository }}/${chart_name}:artifacthub.io@${artifacthub_digest}"
done
verify:
name: Verify signatures for each chart tag
needs: [publish]
runs-on: ubuntu-24.04
permissions:
contents: read
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- name: Install Cosign
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
- name: Downloads artifacts
uses: actions/download-artifact@634f93cb2916e3fdff6788551b99b062d0335ce0 # v5.0.0
with:
name: artifacts
path: .cr-release-packages/
- name: Login to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Verify signatures for each chart tag
run: |
for chart_path in $(find .cr-release-packages -name '*.tgz' -print); do
chart_release_file=$(basename "$chart_path")
chart_name=${chart_release_file%-*}
version=${chart_release_file#$chart_name-}
version=${version%.tgz}
cosign verify "ghcr.io/${{ github.repository }}/${chart_name}:${version}" \
--certificate-identity "https://github.com/${{ github.workflow_ref }}" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
done
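A user can run the same check against a chart they pull; a minimal sketch with every value left as a placeholder to substitute (the identity string mirrors the `github.workflow_ref` value used above):

```bash
# Verify the cosign signature on a published chart tag (placeholders in angle brackets)
cosign verify "ghcr.io/<owner>/<repo>/<chart-name>:<chart-version>" \
  --certificate-identity "https://github.com/<owner>/<repo>/.github/workflows/<workflow-file>@<ref>" \
  --certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```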

View File

@@ -3,7 +3,9 @@
name: Seerr Release
on:
workflow_dispatch:
push:
tags:
- 'v*'
permissions:
contents: read
@@ -12,15 +14,17 @@ concurrency:
group: release-${{ github.ref }}
cancel-in-progress: true
env:
DOCKER_HUB: seerr/seerr
jobs:
semantic-release:
name: Tag and release latest version
runs-on: ubuntu-22.04
env:
HUSKY: 0
changelog:
name: Generate changelog
runs-on: ubuntu-24.04
permissions:
contents: read
outputs:
new_release_published: ${{ steps.release.outputs.new_release_published }}
new_release_version: ${{ steps.release.outputs.new_release_version }}
release_body: ${{ steps.git-cliff.outputs.content }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
@@ -28,46 +32,36 @@ jobs:
fetch-depth: 0
persist-credentials: false
- name: Set up Node.js
uses: actions/setup-node@a0853c24544627f65ddf259abe73b1d18a591444 # v5.0.0
- name: Generate changelog
id: git-cliff
uses: orhun/git-cliff-action@d77b37db2e3f7398432d34b72a12aa3e2ba87e51 # v4.6.0
with:
node-version-file: package.json
package-manager-cache: false
- name: Pnpm Setup
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
- name: Get pnpm store directory
shell: sh
run: |
echo "STORE_PATH=$(pnpm store path --silent)" >> $GITHUB_ENV
- name: Setup pnpm cache
uses: actions/cache@0057852bfaa89a56745cba8c7296529d2fc39830 # v4.3.0
with:
path: ${{ env.STORE_PATH }}
key: ${{ runner.os }}-pnpm-store-${{ hashFiles('**/pnpm-lock.yaml') }}
restore-keys: |
${{ runner.os }}-pnpm-store-
- name: Install dependencies
run: pnpm install
- name: Release
id: release
uses: cycjimmy/semantic-release-action@9cc899c47e6841430bbaedb43de1560a568dfd16 # v5.0.0
with:
extra_plugins: |
@semantic-release/git@10
@semantic-release/changelog@6
@codedependant/semantic-release-docker@5
config: .github/cliff.toml
args: -vv --current
env:
GITHUB_TOKEN: ${{ secrets.GH_TOKEN }}
OUTPUT: CHANGELOG.md
GITHUB_REPO: ${{ github.repository }}
create-draft-release:
name: Create draft release
runs-on: ubuntu-24.04
permissions:
contents: write
needs: changelog
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Draft Release
run: gh release create ${GITHUB_REF_NAME} -t "Release ${GITHUB_REF_NAME}" -n "${RELEASE_BODY}" --draft
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
RELEASE_BODY: ${{ needs.changelog.outputs.release_body }}
build:
name: Build (per-arch, native runners)
needs: semantic-release
if: needs.semantic-release.outputs.new_release_published == 'true'
name: Build (${{ matrix.arch }})
strategy:
matrix:
include:
@@ -78,6 +72,8 @@ jobs:
platform: linux/arm64
arch: arm64
runs-on: ${{ matrix.runner }}
env:
VERSION: ${{ github.ref_name }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
@@ -91,7 +87,7 @@ jobs:
- name: Set up Docker Buildx
uses: docker/setup-buildx-action@e468171a9de216ec08956ac3ada2f0791b6bd435 # v3.11.1
- name: Warm cache (no push) — ${{ matrix.platform }}
- name: Warm cache [${{ matrix.platform }}]
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
@@ -100,21 +96,23 @@ jobs:
push: false
build-args: |
COMMIT_TAG=${{ github.sha }}
BUILD_VERSION=${{ needs.semantic-release.outputs.new_release_version }}
BUILD_VERSION=${{ env.VERSION }}
SOURCE_DATE_EPOCH=${{ steps.ts.outputs.TIMESTAMP }}
cache-from: type=gha,scope=${{ matrix.platform }}
cache-to: type=gha,mode=max,scope=${{ matrix.platform }}
provenance: false
publish:
name: Publish multi-arch image
needs: [semantic-release, build]
if: needs.semantic-release.outputs.new_release_published == 'true'
name: Publish multi-arch manifests
needs: build
runs-on: ubuntu-24.04
permissions:
contents: read
id-token: write
packages: write
outputs:
image_digest: ${{ steps.digests.outputs.IMAGE_DIGEST }}
env:
VERSION: ${{ github.ref_name }}
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
@@ -149,11 +147,11 @@ jobs:
${{ github.repository }}
ghcr.io/${{ github.repository }}
tags: |
type=raw,value=${{ needs.semantic-release.outputs.new_release_version }}
type=raw,value=${{ env.VERSION }}
labels: |
org.opencontainers.image.created=${{ steps.ts.outputs.TIMESTAMP }}
- name: Build & Push (multi-arch, single tag)
- name: Build & Push (multi-arch)
uses: docker/build-push-action@263435318d21b8e681c14492fe198d362a7d2c83 # v6.18.0
with:
context: .
@@ -162,7 +160,7 @@ jobs:
push: true
build-args: |
COMMIT_TAG=${{ github.sha }}
BUILD_VERSION=${{ needs.semantic-release.outputs.new_release_version }}
BUILD_VERSION=${{ env.VERSION }}
SOURCE_DATE_EPOCH=${{ steps.ts.outputs.TIMESTAMP }}
labels: ${{ steps.meta.outputs.labels }}
tags: ${{ steps.meta.outputs.tags }}
@@ -172,37 +170,158 @@ jobs:
cache-to: type=gha,mode=max
provenance: false
- name: Resolve manifest digest
id: digests
run: |
DIGEST=$(docker buildx imagetools inspect "${{ github.repository }}:${{ env.VERSION }}" --format '{{json .Manifest.Digest}}' | tr -d '"')
echo "IMAGE_DIGEST=$DIGEST" >> $GITHUB_OUTPUT
- name: Also tag :latest (non-pre-release only)
shell: bash
if: ${{ !contains(env.VERSION, '-') }}
run: |
VER="${{ needs.semantic-release.outputs.new_release_version }}"
if [[ "$VER" != *"-"* ]]; then
docker buildx imagetools create \
-t ${{ github.repository }}:latest \
${{ github.repository }}:${VER}
docker buildx imagetools create \
-t ghcr.io/${{ github.repository }}:latest \
ghcr.io/${{ github.repository }}:${VER}
fi
docker buildx imagetools create \
-t ${{ env.DOCKER_HUB }}:latest \
${{ env.DOCKER_HUB }}:${{ env.VERSION }}
docker buildx imagetools create \
-t ghcr.io/${{ github.repository }}:latest \
ghcr.io/${{ github.repository }}:${{ env.VERSION }}
sign:
name: Sign images and create SBOM attestations
needs: publish
runs-on: ubuntu-24.04
permissions:
contents: read
id-token: write
packages: write
env:
VERSION: ${{ github.ref_name }}
COSIGN_YES: 'true'
steps:
- name: Checkout
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
persist-credentials: false
- name: Install Cosign
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
- name: Install Trivy
uses: aquasecurity/setup-trivy@e6c2c5e321ed9123bda567646e2f96565e34abe1 # v0.2.4
- name: Log in to Docker Hub
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
username: ${{ secrets.DOCKER_USERNAME }}
password: ${{ secrets.DOCKER_TOKEN }}
- name: Log in to GitHub Container Registry
uses: docker/login-action@5e57cd118135c172c3672efd75eb46360885c0ef # v3.6.0
with:
registry: ghcr.io
username: ${{ github.repository_owner }}
password: ${{ secrets.GITHUB_TOKEN }}
- name: Sign images
run: |
cosign sign --recursive "ghcr.io/${{ github.repository }}@${{ needs.publish.outputs.image_digest }}"
cosign sign --recursive "${{ env.DOCKER_HUB }}@${{ needs.publish.outputs.image_digest }}"
- name: Generate SBOMs
run: |
trivy image --format cyclonedx --output seerr-ghcr-image-${{ env.VERSION }}.sbom \
"ghcr.io/${{ github.repository }}@${{ needs.publish.outputs.image_digest }}"
trivy image --format cyclonedx --output seerr-dockerhub-image-${{ env.VERSION }}.sbom \
"${{ env.DOCKER_HUB }}@${{ needs.publish.outputs.image_digest }}"
- name: Attest SBOMs
run: |
cosign attest \
--type cyclonedx \
--predicate seerr-ghcr-image-${{ env.VERSION }}.sbom \
"ghcr.io/${{ github.repository }}@${{ needs.publish.outputs.image_digest }}"
cosign attest \
--type cyclonedx \
--predicate seerr-dockerhub-image-${{ env.VERSION }}.sbom \
"${{ env.DOCKER_HUB }}@${{ needs.publish.outputs.image_digest }}"
- name: Upload SBOMs
uses: actions/upload-artifact@ea165f8d65b6e75b540449e92b4886f43607fa02 # v4.6.2
with:
name: sboms-${{ env.VERSION }}
path: '*.sbom'
if-no-files-found: error
retention-days: 1
verify:
name: Verify signatures and attestations
needs: [publish, sign]
runs-on: ubuntu-24.04
permissions:
contents: read
env:
VERSION: ${{ github.ref_name }}
steps:
- name: Install Cosign
uses: sigstore/cosign-installer@d7543c93d881b35a8faa02e8e3605f69b7a1ce62 # v3.10.0
- name: Verify signatures
run: |
cosign verify "ghcr.io/${{ github.repository }}@${{ needs.publish.outputs.image_digest }}" \
--certificate-identity "https://github.com/${{ github.workflow_ref }}" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
cosign verify "${{ env.DOCKER_HUB }}@${{ needs.publish.outputs.image_digest }}" \
--certificate-identity "https://github.com/${{ github.workflow_ref }}" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
- name: Verify attestations
run: |
cosign verify-attestation "ghcr.io/${{ github.repository }}@${{ needs.publish.outputs.image_digest }}" \
--type cyclonedx \
--certificate-identity "https://github.com/${{ github.workflow_ref }}" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com" > /dev/null
cosign verify-attestation "${{ env.DOCKER_HUB }}@${{ needs.publish.outputs.image_digest }}" \
--type cyclonedx \
--certificate-identity "https://github.com/${{ github.workflow_ref }}" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com" > /dev/null
publish-release:
name: Publish release
needs: [create-draft-release, verify]
runs-on: ubuntu-24.04
permissions:
contents: write
env:
VERSION: ${{ github.ref_name }}
steps:
- name: Publish release
run: gh release edit "${{ env.VERSION }}" --draft=false --repo "${{ github.repository }}"
env:
GH_TOKEN: ${{ secrets.GITHUB_TOKEN }}
discord:
name: Send Discord Notification
needs: publish
needs: publish-release
if: always()
runs-on: ubuntu-24.04
steps:
- name: Determine Workflow Status
- name: Determine status
id: status
run: |
case "${{ needs.publish.result }}" in
case "${{ needs.publish-release.result }}" in
success) echo "status=Success" >> $GITHUB_OUTPUT; echo "colour=3066993" >> $GITHUB_OUTPUT ;;
failure) echo "status=Failure" >> $GITHUB_OUTPUT; echo "colour=15158332" >> $GITHUB_OUTPUT ;;
cancelled) echo "status=Cancelled" >> $GITHUB_OUTPUT; echo "colour=10181046" >> $GITHUB_OUTPUT ;;
*) echo "status=Skipped" >> $GITHUB_OUTPUT; echo "colour=9807270" >> $GITHUB_OUTPUT ;;
esac
- name: Send Discord notification
shell: bash
- name: Send notification
run: |
WEBHOOK="${{ secrets.DISCORD_WEBHOOK }}"
@@ -217,7 +336,7 @@ jobs:
{ "name": "Event", "value": "${{ github.event_name }}", "inline": true },
{ "name": "Triggered by", "value": "${{ github.actor }}", "inline": true },
{ "name": "Workflow", "value": "[${{ github.workflow }}](${{ github.server_url }}/${{ github.repository }}/actions/runs/${{ github.run_id }})", "inline": true }
],
]
}]
}
EOF
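With the trigger switched from semantic-release to tag pushes matching `v*`, cutting a release reduces to pushing a tag; a sketch, with the version number purely illustrative:

```bash
# Pushing a v* tag starts the changelog, draft-release, build, publish,
# sign, verify, and publish-release jobs defined above
git tag v2.8.0
git push origin v2.8.0
```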

View File

@@ -0,0 +1,181 @@
---
# yaml-language-server: $schema=https://json.schemastore.org/github-workflow.json
name: Renovate Helm Hooks
on:
pull_request:
branches:
- develop
paths:
- 'charts/**'
permissions: {}
concurrency:
group: renovate-helm-hooks-${{ github.ref }}
cancel-in-progress: true
jobs:
renovate-post-run:
name: Renovate Bump Chart Version
runs-on: ubuntu-latest
permissions:
contents: read
pull-requests: write
if: github.actor == 'renovate[bot]'
steps:
- name: Checkout code
uses: actions/checkout@08c6903cd8c0fde910a37f88322edcfb5dd907a8 # v5.0.0
with:
fetch-depth: 0
persist-credentials: false
- uses: actions/create-github-app-token@67018539274d69449ef7c02e8e71183d1719ab42 # v2.1.4
id: app-token
with:
app-id: 2138788
private-key: ${{ secrets.APP_SEERR_HELM_PRIVATE_KEY }}
- name: Set up chart-testing
uses: helm/chart-testing-action@0d28d3144d3a25ea2cc349d6e59901c4ff469b3b # v2.7.0
- name: Run chart-testing (list-changed)
id: list-changed
run: |
changed="$(ct list-changed --target-branch ${TARGET_BRANCH})"
if [[ -n "$changed" ]]; then
echo "changed=true" >> "$GITHUB_OUTPUT"
echo "changed_list=${changed//$'\n'/ }" >> "$GITHUB_OUTPUT"
fi
env:
TARGET_BRANCH: ${{ github.event.repository.default_branch }}
- name: Bump chart version
if: steps.list-changed.outputs.changed == 'true'
env:
CHART: ${{ steps.list-changed.outputs.changed_list }}
run: |
if [[ ! -d "${CHART}" ]]; then
echo "${CHART} directory not found"
exit 0
fi
# Extract current appVersion and chart version from Chart.yaml
APP_VERSION=$(grep -e "^appVersion:" "$CHART/Chart.yaml" | cut -d ":" -f 2 | tr -d '[:space:]' | tr -d '"')
CHART_VERSION=$(grep -e "^version:" "$CHART/Chart.yaml" | cut -d ":" -f 2 | tr -d '[:space:]' | tr -d '"')
# Extract major, minor and patch versions of appVersion
APP_MAJOR_VERSION=$(printf '%s' "$APP_VERSION" | cut -d "." -f 1)
APP_MINOR_VERSION=$(printf '%s' "$APP_VERSION" | cut -d "." -f 2)
APP_PATCH_VERSION=$(printf '%s' "$APP_VERSION" | cut -d "." -f 3)
# Extract major, minor and patch versions of chart version
CHART_MAJOR_VERSION=$(printf '%s' "$CHART_VERSION" | cut -d "." -f 1)
CHART_MINOR_VERSION=$(printf '%s' "$CHART_VERSION" | cut -d "." -f 2)
CHART_PATCH_VERSION=$(printf '%s' "$CHART_VERSION" | cut -d "." -f 3)
# Get previous appVersion from the base commit of the pull request
BASE_COMMIT=$(git merge-base origin/main HEAD)
PREV_APP_VERSION=$(git show "$BASE_COMMIT":"$CHART/Chart.yaml" | grep -e "^appVersion:" | cut -d ":" -f 2 | tr -d '[:space:]' | tr -d '"')
# Extract major, minor and patch versions of previous appVersion
PREV_APP_MAJOR_VERSION=$(printf '%s' "$PREV_APP_VERSION" | cut -d "." -f 1)
PREV_APP_MINOR_VERSION=$(printf '%s' "$PREV_APP_VERSION" | cut -d "." -f 2)
PREV_APP_PATCH_VERSION=$(printf '%s' "$PREV_APP_VERSION" | cut -d "." -f 3)
# Check if the major, minor, or patch version of appVersion has changed
if [[ "$APP_MAJOR_VERSION" != "$PREV_APP_MAJOR_VERSION" ]]; then
# Bump major version of the chart and reset minor and patch versions to 0
CHART_MAJOR_VERSION=$((CHART_MAJOR_VERSION+1))
CHART_MINOR_VERSION=0
CHART_PATCH_VERSION=0
elif [[ "$APP_MINOR_VERSION" != "$PREV_APP_MINOR_VERSION" ]]; then
# Bump minor version of the chart and reset patch version to 0
CHART_MINOR_VERSION=$((CHART_MINOR_VERSION+1))
CHART_PATCH_VERSION=0
elif [[ "$APP_PATCH_VERSION" != "$PREV_APP_PATCH_VERSION" ]]; then
# Bump patch version of the chart
CHART_PATCH_VERSION=$((CHART_PATCH_VERSION+1))
fi
# Update the chart version in Chart.yaml
CHART_NEW_VERSION="${CHART_MAJOR_VERSION}.${CHART_MINOR_VERSION}.${CHART_PATCH_VERSION}"
sed -i "s/^version:.*/version: ${CHART_NEW_VERSION}/" "$CHART/Chart.yaml"
- name: Ensure documentation is updated
if: steps.list-changed.outputs.changed == 'true'
uses: docker://jnorwood/helm-docs:v1.14.2@sha256:7e562b49ab6b1dbc50c3da8f2dd6ffa8a5c6bba327b1c6335cc15ce29267979c
- name: Commit changes
if: steps.list-changed.outputs.changed == 'true'
env:
CHART: ${{ steps.list-changed.outputs.changed_list }}
GITHUB_TOKEN: ${{ steps.app-token.outputs.token }}
GITHUB_HEAD_REF: ${{ github.head_ref }}
run: |
# Define the target directory
TARGET_DIR="$CHART"
# Fetch deleted files in the target directory
DELETED_FILES=$(git diff --diff-filter=D --name-only HEAD -- "$TARGET_DIR")
# Fetch added/modified files in the target directory
MODIFIED_FILES=$(git diff --diff-filter=ACM --name-only HEAD -- "$TARGET_DIR")
# Create a temporary file for JSON output
FILE_CHANGES_JSON_FILE=$(mktemp)
# Initialize JSON structure in the file
echo '{ "deletions": [], "additions": [] }' > "$FILE_CHANGES_JSON_FILE"
# Add deletions
for file in $DELETED_FILES; do
jq --arg path "$file" '.deletions += [{"path": $path}]' "$FILE_CHANGES_JSON_FILE" > "$FILE_CHANGES_JSON_FILE.tmp"
mv "$FILE_CHANGES_JSON_FILE.tmp" "$FILE_CHANGES_JSON_FILE"
done
# Add additions (new or modified files)
for file in $MODIFIED_FILES; do
BASE64_CONTENT=$(base64 -w 0 <"$file") # Encode file content
jq --arg path "$file" --arg content "$BASE64_CONTENT" \
'.additions += [{"path": $path, "contents": $content}]' "$FILE_CHANGES_JSON_FILE" > "$FILE_CHANGES_JSON_FILE.tmp"
mv "$FILE_CHANGES_JSON_FILE.tmp" "$FILE_CHANGES_JSON_FILE"
done
# Create a temporary file for the final JSON payload
JSON_PAYLOAD_FILE=$(mktemp)
# Construct the final JSON using jq and store it in a file
jq -n --arg repo "$GITHUB_REPOSITORY" \
--arg branch "$GITHUB_HEAD_REF" \
--arg message "fix: post upgrade changes from renovate" \
--arg expectedOid "$GITHUB_SHA" \
--slurpfile fileChanges "$FILE_CHANGES_JSON_FILE" \
'{
query: "mutation ($input: CreateCommitOnBranchInput!) {
createCommitOnBranch(input: $input) {
commit {
url
}
}
}",
variables: {
input: {
branch: {
repositoryNameWithOwner: $repo,
branchName: $branch
},
message: { headline: $message },
fileChanges: $fileChanges[0],
expectedHeadOid: $expectedOid
}
}
}' > "$JSON_PAYLOAD_FILE"
# Call GitHub API
curl https://api.github.com/graphql -f \
-sSf -H "Authorization: Bearer $GITHUB_TOKEN" \
--data "@$JSON_PAYLOAD_FILE"
# Clean up temporary files
rm "$FILE_CHANGES_JSON_FILE" "$JSON_PAYLOAD_FILE"

View File

@@ -9,14 +9,13 @@ on:
permissions:
issues: read
concurrency:
group: support-${{ github.event.issue.number }}
cancel-in-progress: true
jobs:
support:
if: github.event.label.name == 'support' || github.event.action == 'reopened'
runs-on: ubuntu-24.04
concurrency:
group: support-${{ github.event.issue.number }}
cancel-in-progress: true
permissions:
issues: write
env:

View File

@@ -36,7 +36,7 @@ jobs:
package-manager-cache: false
- name: Pnpm Setup
uses: pnpm/action-setup@a7487c7e89a18df4991f7f222e4898a00d66ddda # v4.1.0
uses: pnpm/action-setup@41ff72655975bd51cab0327fa583b6e92b6d3061 # v4.2.0
- name: Get pnpm store directory
shell: sh

View File

@@ -56,6 +56,6 @@ jobs:
ignore-unfixed: true
- name: Upload SARIF to code scanning
uses: github/codeql-action/upload-sarif@64d10c13136e1c5bce3e5fbde8d4906eeaafc885 # v3.30.6
uses: github/codeql-action/upload-sarif@e296a935590eb16afc0c0108289f68c87e2a89a5 # v4.30.7
with:
sarif_file: trivy.sarif

View File

@@ -1,15 +1,20 @@
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7 AS build_image
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7 AS base
ARG SOURCE_DATE_EPOCH
ARG TARGETPLATFORM
ARG COMMIT_TAG
ENV TARGETPLATFORM=${TARGETPLATFORM:-linux/amd64}
ENV COMMIT_TAG=${COMMIT_TAG}
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN corepack enable
COPY . ./app
WORKDIR /app
FROM base AS prod-deps
RUN --mount=type=cache,id=pnpm,target=/pnpm/store CI=true pnpm install --prod --frozen-lockfile
FROM base as build
RUN \
case "${TARGETPLATFORM}" in \
'linux/arm64' | 'linux/arm/v7') \
@@ -19,34 +24,32 @@ RUN \
;; \
esac
WORKDIR /app
RUN --mount=type=cache,id=pnpm,target=/pnpm/store CYPRESS_INSTALL_BINARY=0 pnpm install --frozen-lockfile
COPY package.json pnpm-lock.yaml postinstall-win.js ./
RUN CYPRESS_INSTALL_BINARY=0 pnpm install --frozen-lockfile
COPY . ./
RUN pnpm build
# remove development dependencies
RUN pnpm prune --prod --ignore-scripts && \
rm -rf src server .next/cache charts gen-docs docs && \
touch config/DOCKER && \
echo "{\"commitTag\": \"${COMMIT_TAG}\"}" > committag.json
RUN rm -rf .next/cache
FROM node:22.20.0-alpine3.22@sha256:cb3143549582cc5f74f26f0992cdef4a422b22128cb517f94173a5f910fa4ee7
ARG SOURCE_DATE_EPOCH
ARG COMMIT_TAG
ENV NODE_ENV=production
ENV COMMIT_TAG=${COMMIT_TAG}
ENV PNPM_HOME="/pnpm"
ENV PATH="$PNPM_HOME:$PATH"
RUN corepack enable
RUN apk add --no-cache tzdata
USER node:node
WORKDIR /app
RUN apk add --no-cache tzdata tini && rm -rf /tmp/*
COPY --chown=node:node . .
COPY --chown=node:node --from=prod-deps /app/node_modules ./node_modules
COPY --chown=node:node --from=build /app/.next ./.next
COPY --chown=node:node --from=build /app/dist ./dist
# copy from build image
COPY --from=build_image /app ./
ENTRYPOINT [ "/sbin/tini", "--" ]
CMD [ "pnpm", "start" ]
RUN touch config/DOCKER && \
echo "{\"commitTag\": \"${COMMIT_TAG}\"}" > committag.json
EXPOSE 5055
CMD [ "npm", "start" ]

bin/prepare.js (new file, +9 lines)
View File

@@ -0,0 +1,9 @@
#!/usr/bin/env node
/**
* Do not run husky in CI environments
*/
const isCi = process.env.CI !== undefined;
if (!isCi) {
require('husky').install();
}
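Assuming this script is wired up as the package's `prepare` hook (implied by its name, not shown in this diff), Husky's git hooks are only installed for local checkouts:

```bash
# Local development: CI is unset, so husky installs the git hooks
pnpm install

# CI and container builds (see the lint workflow and Dockerfile above): hook setup is skipped
CI=true pnpm install
```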

View File

@@ -20,6 +20,10 @@ Seerr helm chart for Kubernetes
Kubernetes: `>=1.23.0-0`
## Installation
Refer to [https://docs.seerr.dev/getting-started/kubernetes](Seerr kubernetes documentation)
## Update Notes
### Updating to 3.0.0

View File

@@ -14,11 +14,15 @@
{{ template "chart.requirementsSection" . }}
## Installation
Refer to [https://docs.seerr.dev/getting-started/kubernetes](Seerr kubernetes documentation)
## Update Notes
### Updating to 3.0.0
Nothing change we just rebranded `jellyseerr` helm-chart to `seerr` :)
Nothing has changed; we just rebranded the `jellyseerr` Helm chart to `seerr` 🥳.
### Updating to 2.7.0

View File

@@ -23,7 +23,7 @@ services:
links:
- postgres
postgres:
image: postgres
image: postgres:18
environment:
POSTGRES_USER: jellyseerr
POSTGRES_PASSWORD: jellyseerr
@@ -31,6 +31,6 @@ services:
ports:
- '5432:5432'
volumes:
- postgres:/var/lib/postgresql/data
- postgres:/var/lib/postgresql
volumes:
postgres:

View File

@@ -21,6 +21,7 @@ DB_LOG_QUERIES="false" # (optional) Whether to log the DB queries for debugging.
:::caution
When migrating Postgres from version 17 to 18 in Docker, note that the data mount point has changed. Instead of using `/var/lib/postgresql/data`, the correct mount path is now `/var/lib/postgresql`.
Refer to the [PostgreSQL Docker documentation](https://hub.docker.com/_/postgres/#pgdata) to learn how to migrate or opt out of this change.
:::
### TCP Connection

View File

@@ -249,7 +249,7 @@ Add the following Location block to your existing Server configuration.
# Please Replace "FQDN" with your domain
Header edit location ^/login https://FQDN/jellyseerr/login
Header edit location ^/setup https://FQDN/jellyseerr/setup
AddOutputFilterByType INFLATE;SUBSTITUTE text/html application/javascript application/json
SubstituteMaxLineLength 2000K
# This is HTML and JS update
@@ -266,3 +266,36 @@ Add the following Location block to your existing Server configuration.
</TabItem>
</Tabs>
## HAProxy (v3)
:::warning
This is a third-party documentation maintained by the community. We can't provide support for this setup and are unable to test it.
:::
Add the following frontend and backend configurations for your seerr instance:
```haproxy
frontend seerr-frontend
bind 0.0.0.0:80
bind 0.0.0.0:443 ssl crt /etc/ssl/private/seerr.example.com.pem
mode http
log global
option httplog
option http-keep-alive
http-request set-header X-Real-IP %[src]
option forwardfor
acl seerr hdr(host) -i seerr.example.com
redirect scheme https code 301 if !{ ssl_fc }
use_backend seerr-backend if seerr
backend seerr-backend
mode http
log global
option httplog
http-response set-header Strict-Transport-Security max-age=15552000
option httpchk GET /api/v1/status
timeout connect 30000
timeout server 30000
retries 3
server seerr 127.0.0.1:5055 check inter 1000
```
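After adding the snippet above, the configuration can be sanity-checked before reloading; a quick sketch, assuming the standard config path on a systemd-based host:

```bash
# Validate the configuration, then reload HAProxy only if the check passes
sudo haproxy -c -f /etc/haproxy/haproxy.cfg && sudo systemctl reload haproxy
```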

View File

@@ -15,7 +15,7 @@ import TabItem from '@theme/TabItem';
### Prerequisites
- [Node.js 22.x](https://nodejs.org/en/download/)
- [Pnpm 10.x](https://pnpm.io/installation)
- [Pnpm 9.x](https://pnpm.io/installation)
- [Git](https://git-scm.com/downloads)
## Unix (Linux, macOS)

View File

@@ -11,6 +11,16 @@ Details on how to install Docker can be found on the [official Docker website](h
Refer to [Configuring Databases](/extending-jellyseerr/database-config#postgresql-options) for details on how to configure your database.
:::
:::info
An alternative Docker image is available on Docker Hub for this project. You can find it at [Docker Hub Repository Link](https://hub.docker.com/r/seerr/seerr)
:::
:::info
All official Seerr images are cryptographically signed and include a verified [Software Bill of Materials (SBOM)](https://cyclonedx.org/).
To confirm that the container image you are using is authentic and unmodified, please refer to the [Verifying Signed Artifacts](/using-jellyseerr/advanced/verifying-signed-artifacts#verifying-signed-images) guide.
:::
## Unix (Linux, macOS)
:::warning
Be sure to replace `/path/to/appdata/config` in the below examples with a valid host directory path. If this volume mount is not configured correctly, your Jellyseerr settings/data will not be persisted when the container is recreated (e.g., when updating the image or rebooting your machine).
@@ -31,13 +41,14 @@ For details on the Docker CLI, please [review the official `docker run` document
```bash
docker run -d \
--name jellyseerr \
--init \
-e LOG_LEVEL=debug \
-e TZ=Asia/Tashkent \
-e PORT=5055 \
-p 5055:5055 \
-v /path/to/appdata/config:/app/config \
--restart unless-stopped \
fallenbagel/jellyseerr
ghcr.io/fallenbagel/jellyseerr:latest
```
The argument `-e PORT=5055` is optional.
@@ -61,7 +72,7 @@ docker stop jellyseerr && docker rm jellyseerr
```
Pull the latest image:
```bash
docker pull fallenbagel/jellyseerr
docker pull ghcr.io/fallenbagel/jellyseerr:latest
```
Finally, run the container with the same parameters originally used to create the container:
```bash
@@ -84,7 +95,8 @@ Define the `jellyseerr` service in your `compose.yaml` as follows:
---
services:
jellyseerr:
image: fallenbagel/jellyseerr:latest
image: ghcr.io/fallenbagel/jellyseerr:latest
init: true
container_name: jellyseerr
environment:
- LOG_LEVEL=debug
@@ -123,15 +135,6 @@ You may alternatively use a third-party mechanism like [dockge](https://github.c
</TabItem>
</Tabs>
## Unraid
1. Ensure you have the **Community Applications** plugin installed.
2. Inside the **Community Applications** app store, search for **Jellyseerr**.
3. Click the **Install Button**.
4. On the following **Add Container** screen, make changes to the **Host Port** and **Host Path 1** \(Appdata\) as needed.
5. Click apply and access "Jellyseerr" at your `<ServerIP:HostPort>` in a web browser.
## Windows
Please refer to the [Docker Desktop for Windows user manual](https://docs.docker.com/docker-for-windows/) for details on how to install Docker on Windows. There is no need to install a Linux distro if using named volumes like in the example below.
@@ -156,13 +159,14 @@ Then, create and start the Jellyseerr container:
```bash
docker run -d \
--name jellyseerr \
--init \
-e LOG_LEVEL=debug \
-e TZ=Asia/Tashkent \
-e PORT=5055 \
-p 5055:5055 \
-v jellyseerr-data:/app/config \
--restart unless-stopped \
fallenbagel/jellyseerr
ghcr.io/fallenbagel/jellyseerr:latest
```
The argument `-e PORT=5055` is optional.
@@ -192,7 +196,8 @@ docker compose up -d
---
services:
jellyseerr:
image: fallenbagel/jellyseerr:latest
image: ghcr.io/fallenbagel/jellyseerr:latest
init: true
container_name: jellyseerr
environment:
- LOG_LEVEL=debug

View File

@@ -7,4 +7,4 @@ import DocCardList from '@theme/DocCardList';
After running Jellyseerr for the first time, configure it by visiting the web UI at `http://[address]:5055` and completing the setup steps.
:::
<DocCardList />
<DocCardList />

View File

@@ -1,13 +1,19 @@
---
title: Kubernetes (Advanced)
description: Install Jellyseerr in Kubernetes
sidebar_position: 5
sidebar_position: 3
---
# Kubernetes
:::info
:::warning
This method is not recommended for most users. It is intended for advanced users who are using Kubernetes.
:::
:::info
All official Seerr charts are cryptographically signed and include a verified [Software Bill of Materials (SBOM)](https://cyclonedx.org/).
To confirm that the chart you are using is authentic and unmodified, please refer to the [Verifying Signed Artifacts](/using-jellyseerr/advanced/verifying-signed-artifacts#verifying-signed-helm-charts) guide.
:::
## Installation
```console
helm install jellyseerr oci://ghcr.io/fallenbagel/jellyseerr/jellyseerr-chart

View File

@@ -1,271 +0,0 @@
---
title: Nix Package Manager (Advanced)
description: Install Jellyseerr using Nix
sidebar_position: 3
---
import { JellyseerrVersion, NixpkgVersion } from '@site/src/components/JellyseerrVersion';
import Admonition from '@theme/Admonition';
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Nix Package Manager (Advanced)
:::info
This method is not recommended for most users. It is intended for advanced users who are using Nix as their package manager.
:::
export const VersionMismatchWarning = () => {
let jellyseerrVersion = null;
let nixpkgVersions = null;
try {
jellyseerrVersion = JellyseerrVersion();
nixpkgVersions = NixpkgVersion();
} catch (err) {
return (
<Admonition type="error">
Failed to load version information. Error: {err.message || JSON.stringify(err)}
</Admonition>
);
}
if (!nixpkgVersions || nixpkgVersions.error) {
return (
<Admonition type="error">
Failed to fetch Nixpkg versions: {nixpkgVersions?.error || 'Unknown error'}
</Admonition>
);
}
const isUnstableUpToDate = jellyseerrVersion === nixpkgVersions.unstable;
const isStableUpToDate = jellyseerrVersion === nixpkgVersions.stable;
return (
<>
{!isStableUpToDate ? (
<Admonition type="warning">
The{' '}
<a href="https://github.com/NixOS/nixpkgs/blob/nixos-24.11/pkgs/servers/jellyseerr/default.nix#L14">
upstream Jellyseerr Nix Package (v{nixpkgVersions.stable})
</a>{' '}
is not <b>up-to-date</b>. If you want to use <b>Jellyseerr v{jellyseerrVersion}</b>,{' '}
{isUnstableUpToDate ? (
<>
consider using the{' '}
<a href="https://github.com/NixOS/nixpkgs/blob/nixos-unstable/pkgs/by-name/je/jellyseerr/package.nix">
unstable package
</a>{' '}
instead.
</>
) : (
<>
you will need to{' '}
<a href="#overriding-the-package-derivation">override the package derivation</a>.
</>
)}
</Admonition>
) : null}
</>
);
};
<VersionMismatchWarning />
## Installation
To get up and running with jellyseerr using Nix, you can add the following to your `configuration.nix`:
```nix
{ config, pkgs, ... }:
{
services.jellyseerr.enable = true;
}
```
If you want more advanced configuration options, you can use the following:
<Tabs groupId="nixpkg-methods" queryString>
<TabItem value="default" label="Default Configurations">
```nix
{ config, pkgs, ... }:
{
services.jellyseerr = {
enable = true;
port = 5055;
openFirewall = true;
package = pkgs.jellyseerr; # Use the unstable package if stable is not up-to-date
};
}
```
</TabItem>
<TabItem value="custom" label="Database Configurations">
In order to use postgres, you will need to add override the default module of jellyseerr with the following as the current default module is not compatible with postgres:
```nix
{
config,
pkgs,
lib,
...
}:
with lib;
let
cfg = config.services.jellyseerr;
in
{
meta.maintainers = [ maintainers.camillemndn ];
disabledModules = [ "services/misc/jellyseerr.nix" ];
options.services.jellyseerr = {
enable = mkEnableOption ''Jellyseerr, a requests manager for Jellyfin'';
openFirewall = mkOption {
type = types.bool;
default = false;
description = ''Open port in the firewall for the Jellyseerr web interface.'';
};
port = mkOption {
type = types.port;
default = 5055;
description = ''The port which the Jellyseerr web UI should listen to.'';
};
package = mkOption {
type = types.package;
default = pkgs.jellyseerr;
defaultText = literalExpression "pkgs.jellyseerr";
description = ''
Jellyseerr package to use.
'';
};
databaseConfig = mkOption {
type = types.attrsOf types.str;
default = {
type = "sqlite";
configDirectory = "config";
logQueries = "false";
};
description = ''
Database configuration. For "sqlite", only "type", "configDirectory", and "logQueries" are relevant.
For "postgres", include host, port, user, pass, name, and optionally socket.
Example:
{
type = "postgres";
socket = "/run/postgresql";
user = "jellyseerr";
name = "jellyseerr";
logQueries = "false";
}
or
{
type = "postgres";
host = "localhost";
port = "5432";
user = "dbuser";
pass = "password";
name = "jellyseerr";
logQueries = "false";
}
or
{
type = "sqlite";
configDirectory = "config";
logQueries = "false";
}
'';
};
};
config = mkIf cfg.enable {
systemd.services.jellyseerr = {
description = "Jellyseerr, a requests manager for Jellyfin";
after = [ "network.target" ];
wantedBy = [ "multi-user.target" ];
environment =
let
dbConfig = cfg.databaseConfig;
in
{
PORT = toString cfg.port;
DB_TYPE = toString dbConfig.type;
CONFIG_DIRECTORY = toString dbConfig.configDirectory or "";
DB_LOG_QUERIES = toString dbConfig.logQueries;
DB_HOST = if dbConfig.type == "postgres" && !(hasAttr "socket" dbConfig) then toString dbConfig.host or "" else "";
DB_PORT = if dbConfig.type == "postgres" && !(hasAttr "socket" dbConfig) then toString dbConfig.port or "" else "";
DB_SOCKET_PATH = if dbConfig.type == "postgres" && hasAttr "socket" dbConfig then toString dbConfig.socket or "" else "";
DB_USER = if dbConfig.type == "postgres" then toString dbConfig.user or "" else "";
DB_PASS = if dbConfig.type == "postgres" then toString dbConfig.pass or "" else "";
DB_NAME = if dbConfig.type == "postgres" then toString dbConfig.name or "" else "";
};
serviceConfig = {
Type = "exec";
StateDirectory = "jellyseerr";
WorkingDirectory = "${cfg.package}/libexec/jellyseerr";
DynamicUser = true;
ExecStart = "${cfg.package}/bin/jellyseerr";
BindPaths = [ "/var/lib/jellyseerr/:${cfg.package}/libexec/jellyseerr/config/" ];
Restart = "on-failure";
ProtectHome = true;
ProtectSystem = "strict";
PrivateTmp = true;
PrivateDevices = true;
ProtectHostname = true;
ProtectClock = true;
ProtectKernelTunables = true;
ProtectKernelModules = true;
ProtectKernelLogs = true;
ProtectControlGroups = true;
NoNewPrivileges = true;
RestrictRealtime = true;
RestrictSUIDSGID = true;
RemoveIPC = true;
PrivateMounts = true;
};
};
networking.firewall = mkIf cfg.openFirewall { allowedTCPPorts = [ cfg.port ]; };
};
}
```
Then, import the module into your `configuration.nix`:
```nix
{ config, pkgs, ... }:
{
imports = [ ./modules/jellyseerr.nix ];
services.jellyseerr = {
enable = true;
port = 5055;
openFirewall = true;
package = pkgs.unstable.jellyseerr; # use the unstable package if stable is not up-to-date
databaseConfig = {
type = "postgres";
host = "localhost"; # or socket: "/run/postgresql"
port = "5432"; # if using socket, this is not needed
user = "jellyseerr";
pass = "jellyseerr";
name = "jellyseerr";
logQueries = "false";
};
}
}
```
</TabItem>
</Tabs>
After adding the configuration to your `configuration.nix`, you can run the following command to install jellyseerr:
```bash
nixos-rebuild switch
```
After rebuild is complete jellyseerr should be running, verify that it is with the following command.
```bash
systemctl status jellyseerr
```
:::info
You can now access Jellyseerr by visiting `http://localhost:5055` in your web browser.
:::

View File

@@ -1,16 +1,15 @@
---
title: AUR (Arch User Repository)
title: AUR (Advanced)
description: Install Jellyseerr using the Arch User Repository
sidebar_position: 4
sidebar_position: 2
---
# AUR (Arch User Repository)
:::note Disclaimer
This AUR package is not maintained by us but by a third party. Please refer to the maintainer for any issues.
# AUR
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
:::info
:::warning
This method is not recommended for most users. It is intended for advanced users who are using Arch Linux or an Arch-based distribution.
:::

View File

@@ -0,0 +1,11 @@
---
title: Third-party Installation Methods
---
import DocCardList from '@theme/DocCardList';
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
<DocCardList />

View File

@@ -0,0 +1,17 @@
---
title: Nix Package Manager (Advanced)
description: Install Jellyseerr using Nixpkgs
sidebar_position: 1
---
import { JellyseerrVersion, NixpkgVersion } from '@site/src/components/JellyseerrVersion';
import Admonition from '@theme/Admonition';
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Nix Package Manager
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
Refer to [NixOS documentation](https://search.nixos.org/options?channel=25.05&query=seerr)

View File

@@ -0,0 +1,20 @@
---
title: Unraid (Advanced)
description: Install Jellyseerr using Unraid
sidebar_position: 3
---
# Unraid
:::warning
Third-party installation methods are maintained by the community. The Seerr team is not responsible for these packages.
:::
:::warning
This method is not recommended for most users. It is intended for advanced users who are using Unraid.
:::
1. Ensure you have the **Community Applications** plugin installed.
2. Inside the **Community Applications** app store, search for **Jellyseerr**.
3. Click the **Install Button**.
4. On the following **Add Container** screen, make changes to the **Host Port** and **Host Path 1** \(Appdata\) as needed.
5. Click apply and access "Jellyseerr" at your `<ServerIP:HostPort>` in a web browser.

View File

@@ -103,7 +103,7 @@ If you can't change your DNS servers or force IPV4 resolution, you can use Jelly
In some places (like China), the ISP blocks not only the DNS resolution but also the connection to the TMDB API.
You can configure Jellyseerr to use a proxy with the [HTTP(S) Proxy](/using-jellyseerr/settings/general#https-proxy) setting.
You can configure Jellyseerr to use a proxy with the [HTTP(S) Proxy](/using-jellyseerr/settings/general#enable-proxy-support) setting.
### Option 3: Force IPV4 resolution first

View File

@@ -0,0 +1,15 @@
---
title: Advanced Features
description: Advanced configuration and use cases.
sidebar_position: 6
---
# Advanced Features
## Advanced Configuration and Use Cases
Seerr currently offers advanced features for power users and specific use cases:
import DocCardList from '@theme/DocCardList';
<DocCardList />

View File

@@ -0,0 +1,386 @@
---
id: verifying-signed-artifacts
title: Verifying Signed Artifacts
sidebar_label: Verify Signed Artifacts
description: Learn how to verify Seerr's signed artifacts and SBOM attestations.
---
import Tabs from '@theme/Tabs';
import TabItem from '@theme/TabItem';
# Verifying Signed Artifacts
The following artifacts are cryptographically signed using [Sigstore Cosign](https://docs.sigstore.dev/quickstart/quickstart-cosign/):
- Container images
- Helm charts
This ensures that the artifacts you pull are authentic, have not been tampered with, and were built by the official Seerr release pipeline.
Additionally, each container image includes a CycloneDX SBOM (Software Bill of Materials) attestation, generated with [Trivy](https://aquasecurity.github.io/trivy/), providing transparency about all dependencies included in the image.
---
## Prerequisites
You will need the following tools installed:
- [Cosign](https://docs.sigstore.dev/cosign/system_config/installation/)
To verify images:
- [Docker](https://docs.docker.com/get-docker/) **or** [Podman](https://podman.io/getting-started/installation) (including [Skopeo](https://github.com/containers/skopeo/blob/main/install.md))
---
## Verifying Signed Images
### Image Locations
Official Seerr images are available from:
- GitHub Container Registry (GHCR): `ghcr.io/seerr-team/seerr:<tag>`
- Docker Hub: `seerr/seerr:<tag>`
You can view all available tags on the [Seerr Releases page](https://github.com/seerr-team/seerr/releases).
---
### Verifying a Specific Release Tag
Each tagged release (for example `v2.7.4`) is immutable and cryptographically signed.
Verification should always be performed using the image digest (SHA256).
#### Retrieve the Image Digest
<Tabs groupId="verify-methods">
<TabItem value="docker" label="Docker">
```bash
docker buildx imagetools inspect ghcr.io/seerr-team/seerr:v2.7.4 --format '{{json .Manifest.Digest}}' | tr -d '"'
```
</TabItem>
<TabItem value="podman" label="Podman / Skopeo">
```bash
skopeo inspect docker://ghcr.io/seerr-team/seerr:v2.7.4 --format '{{.Digest}}'
```
</TabItem>
</Tabs>
Example output:
```
sha256:abcd1234...
```
---
#### Verify the Image Signature
<Tabs groupId="registry-methods">
<TabItem value="ghcr" label="GitHub Container Registry (GHCR)">
```bash
cosign verify ghcr.io/seerr-team/seerr@sha256:abcd1234... \
--certificate-identity "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v2.7.4" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
<TabItem value="dockerhub" label="Docker Hub">
```bash
cosign verify seerr/seerr@sha256:abcd1234... \
--certificate-identity "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v2.7.4" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
</Tabs>
:::info Successful Verification Example
Verification for `ghcr.io/seerr-team/seerr@sha256:abcd1234...`
The following checks were performed:
- Cosign claims validated
- Signatures verified against the transparency log
- Certificate issued by Fulcio to the expected workflow identity
:::
---
### Verifying the `latest` Tag
:::warning Latest Tag Warning
The `latest` tag is **mutable**, meaning it will change with each new release.
Always verify the digest that `latest` currently points to.
:::
#### Retrieve the Digest for `latest`
<Tabs groupId="verify-methods">
<TabItem value="docker" label="Docker">
```bash
docker buildx imagetools inspect ghcr.io/seerr-team/seerr:latest --format '{{json .Manifest.Digest}}' | tr -d '"'
```
</TabItem>
<TabItem value="podman" label="Podman / Skopeo">
```bash
skopeo inspect docker://ghcr.io/seerr-team/seerr:latest --format '{{.Digest}}'
```
</TabItem>
</Tabs>
Example output:
```
sha256:abcd1234...
```
#### Verify the Signature
<Tabs groupId="registry-methods">
<TabItem value="ghcr" label="GHCR">
```bash
cosign verify ghcr.io/seerr-team/seerr@sha256:abcd1234... \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v.*" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
<TabItem value="dockerhub" label="Docker Hub">
```bash
cosign verify seerr/seerr@sha256:abcd1234... \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v.*" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
</Tabs>
:::tip
The pattern `v.*` matches any versioned release tag, so verification succeeds regardless of which release `latest` currently points to.
:::
---
### Verifying SBOM Attestations
Each image includes a CycloneDX SBOM attestation.
#### Verify the Attestation
```bash
cosign verify-attestation ghcr.io/seerr-team/seerr@sha256:abcd1234... \
--type cyclonedx \
--certificate-identity "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v2.7.4" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
:::info Successful Verification Example
Verification for `ghcr.io/seerr-team/seerr@sha256:abcd1234...`
The following checks were performed:
- Cosign claims validated
- Signatures verified against the transparency log
- Certificate issued by Fulcio to the expected workflow identity
:::
#### Extract the SBOM for Inspection
```bash
cosign verify-attestation ghcr.io/seerr-team/seerr@sha256:abcd1234... \
--type cyclonedx \
--certificate-identity "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v2.7.4" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com" | jq -r '.payload | @base64d' > sbom.json
```
You can open `sbom.json` in a CycloneDX viewer or analyse it with [Trivy](https://aquasecurity.github.io/trivy/) or [Dependency-Track](https://dependencytrack.org/).
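To go one step further, you can scan the extracted SBOM for known vulnerabilities. A minimal sketch, assuming the decoded payload is an in-toto statement whose `predicate` field contains the CycloneDX BOM (file names are illustrative):
```bash
# Pull the CycloneDX document out of the in-toto statement
jq '.predicate' sbom.json > bom.cdx.json

# Scan the SBOM with Trivy for known vulnerabilities
trivy sbom bom.cdx.json
```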
---
### Expected Certificate Identity
The expected certificate identity for all signed Seerr images is:
```
https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v*
```
This confirms that the image was:
- Built by the official Seerr Release workflow
- Produced from the seerr-team/seerr repository
- Signed using GitHub's OIDC identity via Sigstore Fulcio
---
### Example: Full Verification Flow
<Tabs groupId="verify-examples">
<TabItem value="docker" label="Docker">
```bash
DIGEST=$(docker buildx imagetools inspect ghcr.io/seerr-team/seerr:latest --format '{{json .Manifest.Digest}}' | tr -d '"')
cosign verify ghcr.io/seerr-team/seerr@"$DIGEST" \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v.*" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
cosign verify-attestation ghcr.io/seerr-team/seerr@"$DIGEST" \
--type cyclonedx \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v.*" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
<TabItem value="podman" label="Podman / Skopeo">
```bash
DIGEST=$(skopeo inspect docker://ghcr.io/seerr-team/seerr:latest --format '{{.Digest}}')
cosign verify ghcr.io/seerr-team/seerr@"$DIGEST" \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/release.yml@refs/tags/v.*" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
</Tabs>
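Once verification succeeds, you can pin the verified digest rather than a mutable tag, so that later pulls resolve to exactly the artifact you checked. A minimal sketch reusing the `DIGEST` variable from the examples above:
```bash
# Pull the image by its verified digest instead of a tag
docker pull ghcr.io/seerr-team/seerr@"$DIGEST"
```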
## Verifying Signed Helm Charts
### Helm Chart Locations
Official Seerr Helm charts are available from:
- GitHub Container Registry (GHCR): `ghcr.io/seerr-team/seerr/seerr-chart:<tag>`
You can view all available chart versions on the [seerr-chart package page](https://github.com/seerr-team/seerr/pkgs/container/seerr%2Fseerr-chart).
---
### Verifying a Specific Release Tag
Each tagged chart release (for example `3.0.0`) is immutable and cryptographically signed.
Verification should always be performed using the chart digest (SHA256).
#### Retrieve the Helm Chart Digest
<Tabs groupId="verify-methods">
<TabItem value="docker" label="Docker">
```bash
docker buildx imagetools inspect ghcr.io/seerr-team/seerr/seerr-chart:3.0.0 --format '{{json .Manifest.Digest}}' | tr -d '"'
```
</TabItem>
<TabItem value="podman" label="Podman / Skopeo">
```bash
skopeo inspect docker://ghcr.io/seerr-team/seerr/seerr-chart:3.0.0 --format '{{.Digest}}'
```
</TabItem>
</Tabs>
Example output:
```
sha256:abcd1234...
```
---
#### Verify the Helm Chart Signature
```bash
cosign verify ghcr.io/seerr-team/seerr/seerr-chart@sha256:abcd1234... \
--certificate-identity "https://github.com/seerr-team/seerr/.github/workflows/helm.yml@refs/heads/main" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
:::info Successful Verification Example
Verification for `ghcr.io/seerr-team/seerr/seerr-chart@sha256:abcd1234...`
The following checks were performed:
- Cosign claims validated
- Signatures verified against the transparency log
- Certificate issued by Fulcio to the expected workflow identity
:::
---
### Expected Certificate Identity
The expected certificate identity for all signed Seerr Helm charts is:
```
https://github.com/seerr-team/seerr/.github/workflows/helm.yml@refs/heads/main
```
This confirms that the chart was:
- Built by the official Seerr Helm workflow
- Produced from the seerr-team/seerr repository
- Signed using GitHub's OIDC identity via Sigstore Fulcio
---
### Example: Full Verification Flow
<Tabs groupId="verify-examples">
<TabItem value="docker" label="Docker">
```bash
DIGEST=$(docker buildx imagetools inspect ghcr.io/seerr-team/seerr/seerr-chart:3.0.0 --format '{{json .Manifest.Digest}}' | tr -d '"')
cosign verify ghcr.io/seerr-team/seerr/seerr-chart@"$DIGEST" \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/helm.yml@refs/heads/main" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
cosign verify-attestation ghcr.io/seerr-team/seerr/seerr-chart@"$DIGEST" \
--type cyclonedx \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/helm.yml@refs/heads/main" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
<TabItem value="podman" label="Podman / Skopeo">
```bash
DIGEST=$(skopeo inspect docker://ghcr.io/seerr-team/seerr/seerr-chart:3.0.0 --format '{{.Digest}}')
cosign verify ghcr.io/seerr-team/seerr/seerr-chart@"$DIGEST" \
--certificate-identity-regexp "https://github.com/seerr-team/seerr/.github/workflows/helm.yml@refs/heads/main" \
--certificate-oidc-issuer "https://token.actions.githubusercontent.com"
```
</TabItem>
</Tabs>
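Once the chart signature has been verified, it can be installed directly from the OCI registry. A minimal sketch, assuming Helm 3.8+ (which supports OCI registries) and a release name of `seerr`; adjust the version and values to your environment:
```bash
# Install the verified chart version straight from GHCR
helm install seerr oci://ghcr.io/seerr-team/seerr/seerr-chart --version 3.0.0
```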
---
## Troubleshooting
| Issue | Likely Cause | Suggested Fix |
|-------|---------------|----------------|
| `no matching signatures` | Incorrect digest or tag | Retrieve the digest again using Docker or Skopeo |
| `certificate identity does not match expected` | Workflow reference changed | Ensure your `--certificate-identity` matches this documentation |
| `cosign: command not found` | Cosign not installed | Install Cosign from the official release |
| `certificate expired` | Old release | Verify a newer tag or digest |
---
## Further Reading
- [Sigstore Documentation](https://docs.sigstore.dev)
- [Cosign Verification Guide](https://docs.sigstore.dev/cosign/verifying/verify/)
- [CycloneDX Specification](https://cyclonedx.org/specification/overview/)
- [Trivy Documentation](https://trivy.dev/latest/docs/)
- [Skopeo Documentation](https://github.com/containers/skopeo)
- [Podman Documentation](https://podman.io/get-started/)
- [Docker Documentation](https://docs.docker.com/)
- [Seerr GitHub Repository](https://github.com/seerr-team/seerr)

View File

@@ -22,7 +22,7 @@
"typecheck": "pnpm typecheck:server && pnpm typecheck:client",
"typecheck:server": "tsc --project server/tsconfig.json --noEmit",
"typecheck:client": "tsc --noEmit",
"prepare": "husky install",
"prepare": "node bin/prepare.js",
"cypress:open": "cypress open",
"cypress:prepare": "ts-node -r tsconfig-paths/register --files --project server/tsconfig.json server/scripts/prepareTestDb.ts",
"cypress:build": "pnpm build && pnpm cypress:prepare"
@@ -60,7 +60,6 @@
"dayjs": "1.11.7",
"dns-caching": "^0.2.7",
"email-templates": "12.0.1",
"email-validator": "2.0.4",
"express": "4.21.2",
"express-openapi-validator": "4.13.8",
"express-rate-limit": "6.7.0",
@@ -107,6 +106,7 @@
"typeorm": "0.3.12",
"ua-parser-js": "^1.0.35",
"undici": "^7.3.0",
"validator": "^13.15.15",
"web-push": "3.5.0",
"wink-jaro-distance": "^2.0.0",
"winston": "3.8.2",
@@ -140,6 +140,7 @@
"@types/secure-random-password": "0.2.1",
"@types/semver": "7.3.13",
"@types/swagger-ui-express": "4.1.3",
"@types/validator": "^13.15.3",
"@types/web-push": "3.3.2",
"@types/xml2js": "0.4.11",
"@types/yamljs": "0.2.31",
@@ -201,70 +202,6 @@
"@commitlint/config-conventional"
]
},
"release": {
"plugins": [
"@semantic-release/commit-analyzer",
"@semantic-release/release-notes-generator",
"@semantic-release/npm",
[
"@codedependant/semantic-release-docker",
{
"dockerArgs": {
"COMMIT_TAG": "${GITHUB_SHA}"
},
"dockerLogin": false,
"dockerProject": "fallenbagel",
"dockerImage": "jellyseerr",
"dockerTags": [
"latest",
"{{major}}",
"{{major}}.{{minor}}",
"{{major}}.{{minor}}.{{patch}}"
],
"dockerPlatform": [
"linux/amd64",
"linux/arm64"
]
}
],
[
"@codedependant/semantic-release-docker",
{
"dockerArgs": {
"COMMIT_TAG": "${GITHUB_SHA}"
},
"dockerLogin": false,
"dockerRegistry": "ghcr.io",
"dockerProject": "fallenbagel",
"dockerImage": "jellyseerr",
"dockerTags": [
"latest",
"{{major}}",
"{{major}}.{{minor}}",
"{{major}}.{{minor}}.{{patch}}"
],
"dockerPlatform": [
"linux/amd64",
"linux/arm64"
]
}
],
[
"@semantic-release/github",
{
"addReleases": "bottom"
}
]
],
"branches": [
"main"
],
"npmPublish": false,
"publish": [
"@codedependant/semantic-release-docker",
"@semantic-release/github"
]
},
"pnpm": {
"onlyBuiltDependencies": [
"sqlite3",

26 pnpm-lock.yaml generated
View File

@@ -89,9 +89,6 @@ importers:
email-templates:
specifier: 12.0.1
version: 12.0.1(@babel/core@7.24.7)(encoding@0.1.13)(handlebars@4.7.8)(mustache@4.2.0)(pug@3.0.3)(react-dom@18.3.1(react@18.3.1))(react@18.3.1)(underscore@1.13.7)
email-validator:
specifier: 2.0.4
version: 2.0.4
express:
specifier: 4.21.2
version: 4.21.2
@@ -230,6 +227,9 @@ importers:
undici:
specifier: ^7.3.0
version: 7.3.0
validator:
specifier: ^13.15.15
version: 13.15.15
web-push:
specifier: 3.5.0
version: 3.5.0
@@ -324,6 +324,9 @@ importers:
'@types/swagger-ui-express':
specifier: 4.1.3
version: 4.1.3
'@types/validator':
specifier: ^13.15.3
version: 13.15.3
'@types/web-push':
specifier: 3.3.2
version: 3.3.2
@@ -3340,6 +3343,9 @@ packages:
'@types/unist@2.0.10':
resolution: {integrity: sha512-IfYcSBWE3hLpBg8+X2SEa8LVkJdJEkT2Ese2aaLs3ptGdVtABxndrMaxuFlQ1qdFf9Q5rDvDpxI3WwgvKFAsQA==}
'@types/validator@13.15.3':
resolution: {integrity: sha512-7bcUmDyS6PN3EuD9SlGGOxM77F8WLVsrwkxyWxKnxzmXoequ6c7741QBrANq6htVRGOITJ7z72mTP6Z4XyuG+Q==}
'@types/web-push@3.3.2':
resolution: {integrity: sha512-JxWGVL/m7mWTIg4mRYO+A6s0jPmBkr4iJr39DqJpRJAc+jrPiEe1/asmkwerzRon8ZZDxaZJpsxpv0Z18Wo9gw==}
@@ -4735,10 +4741,6 @@ packages:
resolution: {integrity: sha512-849pjBFVUAWWTa3HqhDjxlXHaSWmxf4CZOlZ9iVkrSAbQ8YCYi+7KiKqt35L6F20WhSViWX7lmMjno6zBv2rNQ==}
engines: {node: '>=14'}
email-validator@2.0.4:
resolution: {integrity: sha512-gYCwo7kh5S3IDyZPLZf6hSS0MnZT8QmJFqYvbqlDZSbwdZlY6QZWxJ4i/6UhITOJ4XzyI647Bm2MXKCLqnJ4nQ==}
engines: {node: '>4.0'}
emoji-regex@10.3.0:
resolution: {integrity: sha512-QpLs9D9v9kArv4lfDEgg1X/gN5XLnf/A6l9cs8SPZLRZR3ZkY9+kwIQTxm+fsSej5UMYGE8fdoaZVIBlqG0XTw==}
@@ -9049,6 +9051,10 @@ packages:
validate-npm-package-license@3.0.4:
resolution: {integrity: sha512-DpKm2Ui/xN7/HQKCtpZxoRWBhZ9Z0kqtygG8XCgNQ8ZlDnxuQmWhj566j8fN4Cu3/JmbhsDo7fcAJq4s9h27Ew==}
validator@13.15.15:
resolution: {integrity: sha512-BgWVbCI72aIQy937xbawcs+hrVaN/CZ2UwutgaJ36hGqRrLNM+f5LUT/YPRbo8IV/ASeFzXszezV+y2+rq3l8A==}
engines: {node: '>= 0.10'}
vary@1.1.2:
resolution: {integrity: sha512-BNGbWLfd0eUPabhkXUVm0j8uuvREyTh5ovRa/dyow/BqAbZJyC+5fU+IzQOzmAKzYqYRAISoRhdQr3eIZ/PXqg==}
engines: {node: '>= 0.8'}
@@ -13256,6 +13262,8 @@ snapshots:
'@types/unist@2.0.10': {}
'@types/validator@13.15.3': {}
'@types/web-push@3.3.2':
dependencies:
'@types/node': 22.10.5
@@ -14937,8 +14945,6 @@ snapshots:
- walrus
- whiskers
email-validator@2.0.4: {}
emoji-regex@10.3.0: {}
emoji-regex@8.0.0: {}
@@ -20048,6 +20054,8 @@ snapshots:
spdx-correct: 3.2.0
spdx-expression-parse: 3.0.1
validator@13.15.15: {}
vary@1.1.2: {}
verror@1.10.0:

View File

@@ -2339,8 +2339,12 @@ paths:
properties:
username:
type: string
userId:
type: integer
id:
type: string
thumb:
type: string
email:
type: string
/settings/jellyfin/sync:
get:
summary: Get status of full Jellyfin library sync
@@ -6908,6 +6912,10 @@ paths:
is4k:
type: boolean
example: false
description: |
When true, updates the 4K status field (status4k).
When false or not provided, updates the regular status field (status).
This applies to all status values (available, partial, processing, pending, unknown).
responses:
'200':
description: Returned media
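For illustration, a request exercising the new `is4k` flag might look like the sketch below; the `/api/v1/media/{mediaId}/{status}` path and the `X-Api-Key` header are assumptions based on the existing media status API, and `123` is a placeholder media ID.
```bash
# Mark the 4K version of media item 123 as available; with "is4k": false
# (or omitted) the regular status field is updated instead.
curl -X POST "http://localhost:5055/api/v1/media/123/available" \
  -H "X-Api-Key: $JELLYSEERR_API_KEY" \
  -H "Content-Type: application/json" \
  -d '{"is4k": true}'
```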

View File

@@ -257,7 +257,7 @@ class SonarrAPI extends ServarrBase<{
series: createdSeriesResponse.data,
});
} else {
logger.error('Failed to add movie to Sonarr', {
logger.error('Failed to add series to Sonarr', {
label: 'Sonarr',
options,
});
@@ -342,7 +342,7 @@ class SonarrAPI extends ServarrBase<{
return newSeasons;
}
public removeSerie = async (serieId: number): Promise<void> => {
public removeSeries = async (serieId: number): Promise<void> => {
try {
const { id, title } = await this.getSeriesByTvdbId(serieId);
await this.axios.delete(`/series/${id}`, {
@@ -351,9 +351,9 @@ class SonarrAPI extends ServarrBase<{
addImportExclusion: false,
},
});
logger.info(`[Radarr] Removed serie ${title}`);
logger.info(`[Sonarr] Removed series ${title}`);
} catch (e) {
throw new Error(`[Radarr] Failed to remove serie: ${e.message}`);
throw new Error(`[Sonarr] Failed to remove series: ${e.message}`);
}
};

View File

@@ -73,6 +73,7 @@ export interface TmdbCertificationResponse {
interface DiscoverMovieOptions {
page?: number;
includeAdult?: boolean;
includeVideo?: boolean;
language?: string;
primaryReleaseDateGte?: string;
primaryReleaseDateLte?: string;
@@ -490,6 +491,7 @@ class TheMovieDb extends ExternalAPI implements TvShowProvider {
sortBy = 'popularity.desc',
page = 1,
includeAdult = false,
includeVideo = true,
language = this.locale,
primaryReleaseDateGte,
primaryReleaseDateLte,
@@ -527,6 +529,7 @@ class TheMovieDb extends ExternalAPI implements TvShowProvider {
sort_by: sortBy,
page,
include_adult: includeAdult,
include_video: includeVideo,
language,
region: this.discoverRegion || '',
with_original_language:

View File

@@ -56,6 +56,7 @@ class DownloadTracker {
public async resetDownloadTracker() {
this.radarrServers = {};
this.sonarrServers = {};
}
public updateDownloads() {

View File

@@ -7,8 +7,8 @@ import type { NotificationAgentEmail } from '@server/lib/settings';
import { getSettings, NotificationAgentKey } from '@server/lib/settings';
import logger from '@server/logger';
import type { EmailOptions } from 'email-templates';
import * as EmailValidator from 'email-validator';
import path from 'path';
import validator from 'validator';
import { Notification, shouldSendAdminNotification } from '..';
import type { NotificationAgent, NotificationPayload } from './agent';
import { BaseAgent } from './agent';
@@ -221,7 +221,9 @@ class EmailAgent
this.getSettings(),
payload.notifyUser.settings?.pgpKey
);
if (EmailValidator.validate(payload.notifyUser.email)) {
if (
validator.isEmail(payload.notifyUser.email, { require_tld: false })
) {
await email.send(
this.buildMessage(
type,
@@ -283,7 +285,7 @@ class EmailAgent
this.getSettings(),
user.settings?.pgpKey
);
if (EmailValidator.validate(user.email)) {
if (validator.isEmail(user.email, { require_tld: false })) {
await email.send(
this.buildMessage(type, payload, user.email, user.displayName)
);

View File

@@ -15,9 +15,9 @@ import { ApiError } from '@server/types/error';
import { getAppVersion } from '@server/utils/appVersion';
import { getHostname } from '@server/utils/getHostname';
import axios from 'axios';
import * as EmailValidator from 'email-validator';
import { Router } from 'express';
import net from 'net';
import validator from 'validator';
const authRoutes = Router();
@@ -37,7 +37,7 @@ authRoutes.get('/me', isAuthenticated(), async (req, res) => {
const settings = await getSettings();
if (
settings.notifications.agents.email.options.userEmailRequired &&
!EmailValidator.validate(user.email)
!validator.isEmail(user.email, { require_tld: false })
) {
user.warnings.push('userEmailRequired');
logger.warn(`User ${user.username} has no valid email address`);

View File

@@ -112,7 +112,7 @@ mediaRoutes.post<
return next({ status: 404, message: 'Media does not exist.' });
}
const is4k = Boolean(req.body.is4k);
const is4k = String(req.body.is4k) === 'true';
switch (req.params.status) {
case 'available':
@@ -145,16 +145,16 @@ mediaRoutes.post<
message: 'Only series can be set to be partially available',
});
}
media.status = MediaStatus.PARTIALLY_AVAILABLE;
media[is4k ? 'status4k' : 'status'] = MediaStatus.PARTIALLY_AVAILABLE;
break;
case 'processing':
media.status = MediaStatus.PROCESSING;
media[is4k ? 'status4k' : 'status'] = MediaStatus.PROCESSING;
break;
case 'pending':
media.status = MediaStatus.PENDING;
media[is4k ? 'status4k' : 'status'] = MediaStatus.PENDING;
break;
case 'unknown':
media.status = MediaStatus.UNKNOWN;
media[is4k ? 'status4k' : 'status'] = MediaStatus.UNKNOWN;
}
await mediaRepository.save(media);
@@ -198,7 +198,7 @@ mediaRoutes.delete(
where: { id: Number(req.params.id) },
});
const is4k = req.query.is4k === 'true';
const is4k = String(req.query.is4k) === 'true';
const isMovie = media.mediaType === MediaType.MOVIE;
let serviceSettings;
@@ -212,18 +212,19 @@ mediaRoutes.delete(
);
}
const specificServiceId = is4k ? media.serviceId4k : media.serviceId;
if (
media.serviceId &&
media.serviceId >= 0 &&
serviceSettings?.id !== media.serviceId
specificServiceId &&
specificServiceId >= 0 &&
serviceSettings?.id !== specificServiceId
) {
if (isMovie) {
serviceSettings = settings.radarr.find(
(radarr) => radarr.id === media.serviceId
(radarr) => radarr.id === specificServiceId
);
} else {
serviceSettings = settings.sonarr.find(
(sonarr) => sonarr.id === media.serviceId
(sonarr) => sonarr.id === specificServiceId
);
}
}
@@ -257,13 +258,7 @@ mediaRoutes.delete(
}
if (isMovie) {
await (service as RadarrAPI).removeMovie(
parseInt(
is4k
? (media.externalServiceSlug4k as string)
: (media.externalServiceSlug as string)
)
);
await (service as RadarrAPI).removeMovie(media.tmdbId);
} else {
const tmdb = new TheMovieDb();
const series = await tmdb.getTvShow({ tvId: media.tmdbId });
@@ -271,7 +266,7 @@ mediaRoutes.delete(
if (!tvdbId) {
throw new Error('TVDB ID not found');
}
await (service as SonarrAPI).removeSerie(tvdbId);
await (service as SonarrAPI).removeSeries(tvdbId);
}
return res.status(204).send();

View File

@@ -269,16 +269,20 @@ router.delete<{ userId: number; endpoint: string }>(
try {
const userPushSubRepository = getRepository(UserPushSubscription);
const userPushSub = await userPushSubRepository.findOneOrFail({
relations: {
user: true,
},
const userPushSub = await userPushSubRepository.findOne({
relations: { user: true },
where: {
user: { id: req.params.userId },
endpoint: req.params.endpoint,
},
});
// If not found, just return 204 to prevent push disable failure
// (rare scenario where user push sub does not exist)
if (!userPushSub) {
return res.status(204).send();
}
await userPushSubRepository.remove(userPushSub);
return res.status(204).send();
} catch (e) {

View File

@@ -341,9 +341,11 @@ export class MediaRequestSubscriber
mediaId: entity.media.id,
});
const requestRepository = getRepository(MediaRequest);
entity.status = MediaRequestStatus.APPROVED;
await requestRepository.save(entity);
if (entity.status !== MediaRequestStatus.APPROVED) {
const requestRepository = getRepository(MediaRequest);
entity.status = MediaRequestStatus.APPROVED;
await requestRepository.save(entity);
}
return;
}
@@ -505,9 +507,11 @@ export class MediaRequestSubscriber
mediaId: entity.media.id,
});
const requestRepository = getRepository(MediaRequest);
entity.status = MediaRequestStatus.APPROVED;
await requestRepository.save(entity);
if (entity.status !== MediaRequestStatus.APPROVED) {
const requestRepository = getRepository(MediaRequest);
entity.status = MediaRequestStatus.APPROVED;
await requestRepository.save(entity);
}
return;
}

View File

@@ -5,6 +5,7 @@ import { Transition } from '@headlessui/react';
import axios from 'axios';
import { Field, Formik } from 'formik';
import { useIntl } from 'react-intl';
import validator from 'validator';
import * as Yup from 'yup';
const messages = defineMessages('components.Login', {
@@ -36,7 +37,11 @@ const AddEmailModal: React.FC<AddEmailModalProps> = ({
const EmailSettingsSchema = Yup.object().shape({
email: Yup.string()
.email(intl.formatMessage(messages.validationEmailFormat))
.test(
'email',
intl.formatMessage(messages.validationEmailFormat),
(value) => !value || validator.isEmail(value, { require_tld: false })
)
.required(intl.formatMessage(messages.validationEmailRequired)),
});

View File

@@ -150,6 +150,31 @@ const ManageSlideOver = ({
return false;
};
const isDefault4kService = () => {
if (data.mediaInfo) {
if (data.mediaInfo.mediaType === MediaType.MOVIE) {
return (
radarrData?.find(
(radarr) =>
radarr.isDefault &&
radarr.is4k &&
radarr.id === data.mediaInfo?.serviceId4k
) !== undefined
);
} else {
return (
sonarrData?.find(
(sonarr) =>
sonarr.isDefault &&
sonarr.is4k &&
sonarr.id === data.mediaInfo?.serviceId4k
) !== undefined
);
}
}
return false;
};
const markAvailable = async (is4k = false) => {
if (data.mediaInfo) {
await axios.post(`/api/v1/media/${data.mediaInfo?.id}/available`, {
@@ -572,7 +597,7 @@ const ManageSlideOver = ({
</span>
</Button>
</a>
{isDefaultService() && (
{isDefault4kService() && (
<div>
<ConfirmButton
onClick={() => deleteMediaFile(true)}

View File

@@ -10,6 +10,7 @@ import Image from 'next/image';
import Link from 'next/link';
import { useState } from 'react';
import { useIntl } from 'react-intl';
import validator from 'validator';
import * as Yup from 'yup';
const messages = defineMessages('components.ResetPassword', {
@@ -29,7 +30,11 @@ const ResetPassword = () => {
const ResetSchema = Yup.object().shape({
email: Yup.string()
.email(intl.formatMessage(messages.validationemailrequired))
.test(
'email',
intl.formatMessage(messages.validationemailrequired),
(value) => !value || validator.isEmail(value, { require_tld: false })
)
.required(intl.formatMessage(messages.validationemailrequired)),
});

View File

@@ -11,6 +11,7 @@ import { useState } from 'react';
import { useIntl } from 'react-intl';
import { useToasts } from 'react-toast-notifications';
import useSWR, { mutate } from 'swr';
import validator from 'validator';
import * as Yup from 'yup';
const messages = defineMessages('components.Settings.Notifications', {
@@ -77,7 +78,11 @@ const NotificationsEmail = () => {
.required(intl.formatMessage(messages.validationEmail)),
otherwise: Yup.string().nullable(),
})
.email(intl.formatMessage(messages.validationEmail)),
.test(
'email',
intl.formatMessage(messages.validationEmail),
(value) => !value || validator.isEmail(value, { require_tld: false })
),
smtpHost: Yup.string().when('enabled', {
is: true,
then: Yup.string()

View File

@@ -8,6 +8,7 @@ import axios from 'axios';
import { Field, Form, Formik } from 'formik';
import { FormattedMessage, useIntl } from 'react-intl';
import { useToasts } from 'react-toast-notifications';
import validator from 'validator';
import * as Yup from 'yup';
const messages = defineMessages('components.Login', {
@@ -90,7 +91,11 @@ function JellyfinSetup({
(value) => !value || !value.endsWith('/')
),
email: Yup.string()
.email(intl.formatMessage(messages.validationemailformat))
.test(
'email',
intl.formatMessage(messages.validationemailformat),
(value) => !value || validator.isEmail(value, { require_tld: false })
)
.required(intl.formatMessage(messages.validationemailrequired)),
username: Yup.string().required(
intl.formatMessage(messages.validationusernamerequired)

View File

@@ -36,6 +36,7 @@ import { useEffect, useState } from 'react';
import { useIntl } from 'react-intl';
import { useToasts } from 'react-toast-notifications';
import useSWR from 'swr';
import validator from 'validator';
import * as Yup from 'yup';
import JellyfinImportModal from './JellyfinImportModal';
@@ -210,7 +211,11 @@ const UserList = () => {
),
email: Yup.string()
.required()
.email(intl.formatMessage(messages.validationEmail)),
.test(
'email',
intl.formatMessage(messages.validationEmail),
(value) => !value || validator.isEmail(value, { require_tld: false })
),
password: Yup.lazy((value) =>
!value
? Yup.string()

View File

@@ -23,6 +23,7 @@ import { useEffect, useState } from 'react';
import { useIntl } from 'react-intl';
import { useToasts } from 'react-toast-notifications';
import useSWR from 'swr';
import validator from 'validator';
import * as Yup from 'yup';
const messages = defineMessages(
@@ -105,10 +106,18 @@ const UserGeneralSettings = () => {
user?.id === 1 ||
(user?.userType !== UserType.JELLYFIN && user?.userType !== UserType.EMBY)
? Yup.string()
.email(intl.formatMessage(messages.validationemailformat))
.test(
'email',
intl.formatMessage(messages.validationemailformat),
(value) =>
!value || validator.isEmail(value, { require_tld: false })
)
.required(intl.formatMessage(messages.validationemailrequired))
: Yup.string().email(
intl.formatMessage(messages.validationemailformat)
: Yup.string().test(
'email',
intl.formatMessage(messages.validationemailformat),
(value) =>
!value || validator.isEmail(value, { require_tld: false })
),
discordId: Yup.string()
.nullable()

View File

@@ -111,6 +111,11 @@ const UserWebPushSettings = () => {
try {
await unsubscribeToPushNotifications(user?.id, endpoint);
// Delete from backend if endpoint is available
if (subEndpoint) {
await deletePushSubscriptionFromBackend(subEndpoint);
}
localStorage.setItem('pushNotificationsEnabled', 'false');
setWebPushEnabled(false);
addToast(intl.formatMessage(messages.webpushhasbeendisabled), {