update
Checks: all CI runs for this push (Pipeline: Test, Lint, Build; Close stale issues and PRs; POEditor import) were cancelled.

commit c251f174ed
2025-12-08 16:16:23 +01:00
1349 changed files with 194301 additions and 0 deletions

12
.github/FUNDING.yml vendored Normal file

@@ -0,0 +1,12 @@
# These are supported funding model platforms
github: deluan
patreon: # Replace with a single Patreon username
open_collective: # Replace with a single Open Collective username
ko_fi: deluan
liberapay: deluan
tidelift: # Replace with a single Tidelift platform-name/package-name e.g., npm/babel
community_bridge: # Replace with a single Community Bridge project-name e.g., cloud-foundry
issuehunt: # Replace with a single IssueHunt username
otechie: # Replace with a single Otechie username
custom: # Replace with up to 4 custom sponsorship URLs e.g., ['link1', 'link2']

103
.github/ISSUE_TEMPLATE/bug_report.yml vendored Normal file

@@ -0,0 +1,103 @@
name: Bug Report
description: Before opening a new issue, please search to see if an issue already exists for the bug you encountered.
title: "[Bug]: "
labels: ["bug", "triage"]
#assignees:
# - deluan
body:
- type: markdown
attributes:
value: |
### Thanks for taking the time to fill out this bug report!
- type: checkboxes
id: requirements
attributes:
label: "I confirm that:"
options:
- label: I have searched the existing [open AND closed issues](https://github.com/navidrome/navidrome/issues?q=is%3Aissue) to see if an issue already exists for the bug I've encountered
required: true
- label: I'm using the latest version (your issue may have been fixed already)
required: false
- type: input
id: version
attributes:
label: Version
description: What version of Navidrome are you running? (please try upgrading first, as your issue may have been fixed already).
validations:
required: true
- type: textarea
attributes:
label: Current Behavior
description: A concise description of what you're experiencing.
validations:
required: true
- type: textarea
attributes:
label: Expected Behavior
description: A concise description of what you expected to happen.
validations:
required: true
- type: textarea
attributes:
label: Steps To Reproduce
description: Steps to reproduce the behavior.
placeholder: |
1. In this scenario...
2. With this config...
3. Click (or Execute) '...'
4. See error...
validations:
required: false
- type: textarea
id: env
attributes:
label: Environment
description: |
examples:
- **OS**: Ubuntu 20.04
- **Browser**: Chrome 110.0.5481.177 on Windows 11
- **Client**: DSub 5.5.1
value: |
- OS:
- Browser:
- Client:
render: markdown
- type: dropdown
id: distribution
attributes:
label: How is Navidrome installed?
multiple: false
options:
- Docker
- Binary (from downloads page)
- Package
- Built from sources
validations:
required: true
- type: textarea
id: config
attributes:
label: Configuration
description: Please copy and paste your `navidrome.toml` (and/or `docker-compose.yml`) configuration. This will be automatically formatted into code, so no need for backticks.
render: toml
- type: textarea
id: logs
attributes:
label: Relevant log output
description: Please copy and paste any relevant log output (change your `LogLevel` (`ND_LOGLEVEL`) to debug). This will be automatically formatted into code, so no need for backticks. ([Where can I find the logs?](https://www.navidrome.org/docs/faq/#where-are-the-logs))
render: shell
- type: textarea
attributes:
label: Anything else?
description: |
Links? References? Anything that will give us more context about the issue you are encountering!
Tip: You can attach screenshots by clicking this area to highlight it and then dragging files in.
- type: checkboxes
id: terms
attributes:
label: Code of Conduct
description: By submitting this issue, you agree to follow our [Code of Conduct](https://github.com/navidrome/navidrome/blob/master/CODE_OF_CONDUCT.md).
options:
- label: I agree to follow Navidrome's Code of Conduct
required: true

8
.github/ISSUE_TEMPLATE/config.yml vendored Normal file

@@ -0,0 +1,8 @@
blank_issues_enabled: false
contact_links:
- name: Ideas for new features
url: https://github.com/navidrome/navidrome/discussions/categories/ideas
about: This is the place to share and discuss new ideas and potentially new features.
- name: Support requests
url: https://github.com/navidrome/navidrome/discussions/categories/q-a
about: This is the place to ask questions.


@@ -0,0 +1,23 @@
name: 'Download TagLib'
description: 'Downloads and extracts the TagLib library, adding it to PKG_CONFIG_PATH'
inputs:
version:
description: 'Version of TagLib to download'
required: true
platform:
description: 'Platform to download TagLib for'
default: 'linux-amd64'
runs:
using: 'composite'
steps:
- name: Download TagLib
shell: bash
run: |
mkdir -p /tmp/taglib
cd /tmp
FILE=taglib-${{ inputs.platform }}.tar.gz
wget https://github.com/navidrome/cross-taglib/releases/download/v${{ inputs.version }}/${FILE}
tar -xzf ${FILE} -C taglib
PKG_CONFIG_PREFIX=/tmp/taglib
echo "PKG_CONFIG_PREFIX=${PKG_CONFIG_PREFIX}" >> $GITHUB_ENV
echo "PKG_CONFIG_PATH=${PKG_CONFIG_PATH}:${PKG_CONFIG_PREFIX}/lib/pkgconfig" >> $GITHUB_ENV


@@ -0,0 +1,84 @@
name: 'Prepare Docker Buildx environment'
description: 'Logs in to the container registries, sets up Docker Buildx, and extracts image metadata'
inputs:
github_token:
description: 'GitHub token'
required: true
default: ''
hub_repository:
description: 'Docker Hub repository to push images to'
required: false
default: ''
hub_username:
description: 'Docker Hub username'
required: false
default: ''
hub_password:
description: 'Docker Hub password'
required: false
default: ''
outputs:
tags:
description: 'Docker image tags'
value: ${{ steps.meta.outputs.tags }}
labels:
description: 'Docker image labels'
value: ${{ steps.meta.outputs.labels }}
annotations:
description: 'Docker image annotations'
value: ${{ steps.meta.outputs.annotations }}
version:
description: 'Docker image version'
value: ${{ steps.meta.outputs.version }}
hub_repository:
description: 'Docker Hub repository'
value: ${{ env.DOCKER_HUB_REPO }}
hub_enabled:
description: 'Is Docker Hub enabled'
value: ${{ env.DOCKER_HUB_ENABLED }}
runs:
using: 'composite'
steps:
- name: Check Docker Hub configuration
shell: bash
run: |
if [ -z "${{inputs.hub_repository}}" ]; then
echo "DOCKER_HUB_REPO=none" >> $GITHUB_ENV
echo "DOCKER_HUB_ENABLED=false" >> $GITHUB_ENV
else
echo "DOCKER_HUB_REPO=${{inputs.hub_repository}}" >> $GITHUB_ENV
echo "DOCKER_HUB_ENABLED=true" >> $GITHUB_ENV
fi
- name: Login to Docker Hub
if: inputs.hub_username != '' && inputs.hub_password != ''
uses: docker/login-action@v3
with:
username: ${{ inputs.hub_username }}
password: ${{ inputs.hub_password }}
- name: Login to GitHub Container Registry
uses: docker/login-action@v3
with:
registry: ghcr.io
username: ${{ github.actor }}
password: ${{ inputs.github_token }}
- name: Set up Docker Buildx
id: buildx
uses: docker/setup-buildx-action@v3
- name: Extract metadata for Docker image
id: meta
uses: docker/metadata-action@v5
with:
labels: |
maintainer=deluan@navidrome.org
images: |
name=${{env.DOCKER_HUB_REPO}},enable=${{env.DOCKER_HUB_ENABLED}}
name=ghcr.io/${{ github.repository }}
tags: |
type=ref,event=pr
type=semver,pattern={{version}}
type=raw,value=develop,enable={{is_default_branch}}
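
A usage sketch (not part of this commit) of how the build and push jobs in pipeline.yml below wire this action up and consume its outputs through the step id:
# Sketch: preparing Docker Buildx and reusing the extracted metadata.
- name: Prepare Docker Buildx
  id: docker
  uses: ./.github/actions/prepare-docker
  with:
    github_token: ${{ secrets.GITHUB_TOKEN }}
    hub_repository: ${{ vars.DOCKER_HUB_REPO }}
    hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
    hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
# Later steps reference ${{ steps.docker.outputs.labels }}, .tags, .version,
# .hub_repository and .hub_enabled, as the build job in pipeline.yml does.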

22
.github/dependabot.yml vendored Normal file

@@ -0,0 +1,22 @@
version: 2
updates:
- package-ecosystem: npm
directory: "/ui"
schedule:
interval: weekly
open-pull-requests-limit: 10
- package-ecosystem: gomod
directory: "/"
schedule:
interval: weekly
open-pull-requests-limit: 10
- package-ecosystem: docker
directory: "/"
schedule:
interval: weekly
open-pull-requests-limit: 10
- package-ecosystem: github-actions
directory: "/.github/workflows"
schedule:
interval: weekly
open-pull-requests-limit: 10

38
.github/pull_request_template.md vendored Normal file

@@ -0,0 +1,38 @@
### Description
<!-- Please provide a clear and concise description of what this PR does and why it is needed. -->
### Related Issues
<!-- List any related issues, e.g., "Fixes #123" or "Related to #456". -->
### Type of Change
- [ ] Bug fix
- [ ] New feature
- [ ] Documentation update
- [ ] Refactor
- [ ] Other (please describe):
### Checklist
Please review and check all that apply:
- [ ] My code follows the project's coding style
- [ ] I have tested the changes locally
- [ ] I have added or updated documentation as needed
- [ ] I have added tests that prove my fix/feature works (or explain why not)
- [ ] All existing and new tests pass
### How to Test
<!-- Describe the steps to test your changes. Include setup, commands, and expected results. -->
### Screenshots / Demos (if applicable)
<!-- Add screenshots, GIFs, or links to demos if your change includes UI updates or visual changes. -->
### Additional Notes
<!-- Anything else the maintainer should know? Potential side effects, breaking changes, or areas of concern? -->
<!--
**Tips for Contributors:**
- Be concise but thorough.
- If your PR is large, consider breaking it into smaller PRs.
- Tag the maintainer if you need a prompt review.
- Avoid force pushing to the branch after opening the PR, as it can complicate the review process.
-->

BIN  (new binary file, not shown; 3.7 MiB)

BIN  (new binary file, not shown; 223 KiB)

BIN  .github/screenshots/ss-mobile-login.png vendored Normal file (new binary file, not shown; 735 KiB)

BIN  .github/screenshots/ss-mobile-player.png vendored Normal file (new binary file, not shown; 885 KiB)


@@ -0,0 +1,54 @@
name: Add download link to PR
on:
workflow_run:
workflows: ['Pipeline: Test, Lint, Build']
types: [completed]
jobs:
pr_comment:
if: github.event.workflow_run.event == 'pull_request' && github.event.workflow_run.conclusion == 'success'
runs-on: ubuntu-latest
steps:
- uses: actions/github-script@v3
with:
# This snippet is public-domain, taken from
# https://github.com/oprypin/nightly.link/blob/master/.github/workflows/pr-comment.yml
script: |
const {owner, repo} = context.repo;
const run_id = ${{github.event.workflow_run.id}};
const pull_head_sha = '${{github.event.workflow_run.head_sha}}';
const pull_user_id = ${{github.event.sender.id}};
const issue_number = await (async () => {
const pulls = await github.pulls.list({owner, repo});
for await (const {data} of github.paginate.iterator(pulls)) {
for (const pull of data) {
if (pull.head.sha === pull_head_sha && pull.user.id === pull_user_id) {
return pull.number;
}
}
}
})();
if (issue_number) {
core.info(`Using pull request ${issue_number}`);
} else {
return core.error(`No matching pull request found`);
}
const {data: {artifacts}} = await github.actions.listWorkflowRunArtifacts({owner, repo, run_id});
if (!artifacts.length) {
return core.error(`No artifacts found`);
}
let body = `Download the artifacts for this pull request:\n`;
for (const art of artifacts) {
body += `\n* [${art.name}.zip](https://nightly.link/${owner}/${repo}/actions/artifacts/${art.id}.zip)`;
}
const {data: comments} = await github.issues.listComments({repo, owner, issue_number});
const existing_comment = comments.find((c) => c.user.login === 'github-actions[bot]');
if (existing_comment) {
core.info(`Updating comment ${existing_comment.id}`);
await github.issues.updateComment({repo, owner, comment_id: existing_comment.id, body});
} else {
core.info(`Creating a comment`);
await github.issues.createComment({repo, owner, issue_number, body});
}

467
.github/workflows/pipeline.yml vendored Normal file

@@ -0,0 +1,467 @@
name: "Pipeline: Test, Lint, Build"
on:
push:
branches:
- master
tags:
- "v*"
pull_request:
branches:
- master
concurrency:
group: ${{ startsWith(github.ref, 'refs/tags/v') && 'tag' || 'branch' }}-${{ github.ref }}
cancel-in-progress: true
env:
CROSS_TAGLIB_VERSION: "2.1.1-1"
IS_RELEASE: ${{ startsWith(github.ref, 'refs/tags/') && 'true' || 'false' }}
jobs:
git-version:
name: Get version info
runs-on: ubuntu-latest
outputs:
git_tag: ${{ steps.git-version.outputs.GIT_TAG }}
git_sha: ${{ steps.git-version.outputs.GIT_SHA }}
steps:
- uses: actions/checkout@v6
with:
fetch-depth: 0
fetch-tags: true
- name: Show git version info
run: |
echo "git describe (dirty): $(git describe --dirty --always --tags)"
echo "git describe --tags: $(git describe --tags `git rev-list --tags --max-count=1`)"
echo "git tag: $(git tag --sort=-committerdate | head -n 1)"
echo "github_ref: $GITHUB_REF"
echo "github_head_sha: ${{ github.event.pull_request.head.sha }}"
git tag -l
- name: Determine git current SHA and latest tag
id: git-version
run: |
GIT_TAG=$(git tag --sort=-committerdate | head -n 1)
if [ -n "$GIT_TAG" ]; then
if [[ "$GITHUB_REF" != refs/tags/* ]]; then
GIT_TAG=${GIT_TAG}-SNAPSHOT
fi
echo "GIT_TAG=$GIT_TAG" >> $GITHUB_OUTPUT
fi
GIT_SHA=$(git rev-parse --short HEAD)
PR_NUM=$(jq --raw-output .pull_request.number "$GITHUB_EVENT_PATH")
if [[ $PR_NUM != "null" ]]; then
GIT_SHA=$(echo "${{ github.event.pull_request.head.sha }}" | cut -c1-8)
GIT_SHA="pr-${PR_NUM}/${GIT_SHA}"
fi
echo "GIT_SHA=$GIT_SHA" >> $GITHUB_OUTPUT
echo "GIT_TAG=$GIT_TAG"
echo "GIT_SHA=$GIT_SHA"
go-lint:
name: Lint Go code
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- name: Download TagLib
uses: ./.github/actions/download-taglib
with:
version: ${{ env.CROSS_TAGLIB_VERSION }}
- name: golangci-lint
uses: golangci/golangci-lint-action@v9
with:
version: latest
problem-matchers: true
args: --timeout 2m
- name: Run go goimports
run: go run golang.org/x/tools/cmd/goimports@latest -w `find . -name '*.go' | grep -v '_gen.go$' | grep -v '.pb.go$'`
- run: go mod tidy
- name: Verify no changes from goimports and go mod tidy
run: |
git status --porcelain
if [ -n "$(git status --porcelain)" ]; then
echo 'To fix this check, run "make format" and commit the changes'
exit 1
fi
go:
name: Test Go code
runs-on: ubuntu-latest
steps:
- name: Check out code into the Go module directory
uses: actions/checkout@v6
- name: Download TagLib
uses: ./.github/actions/download-taglib
with:
version: ${{ env.CROSS_TAGLIB_VERSION }}
- name: Download dependencies
run: go mod download
- name: Test
run: |
pkg-config --define-prefix --cflags --libs taglib # for debugging
go test -shuffle=on -tags netgo -race ./... -v
js:
name: Test JS code
runs-on: ubuntu-latest
env:
NODE_OPTIONS: "--max_old_space_size=4096"
steps:
- uses: actions/checkout@v6
- uses: actions/setup-node@v6
with:
node-version: 24
cache: "npm"
cache-dependency-path: "**/package-lock.json"
- name: npm install dependencies
run: |
cd ui
npm ci
- name: npm lint
run: |
cd ui
npm run check-formatting && npm run lint
- name: npm test
run: |
cd ui
npm test
- name: npm build
run: |
cd ui
npm run build
i18n-lint:
name: Lint i18n files
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v6
- run: |
set -e
for file in resources/i18n/*.json; do
echo "Validating $file"
if ! jq empty "$file" 2>error.log; then
error_message=$(cat error.log)
line_number=$(echo "$error_message" | grep -oP 'line \K[0-9]+')
echo "::error file=$file,line=$line_number::$error_message"
exit 1
fi
done
- run: ./.github/workflows/validate-translations.sh -v
check-push-enabled:
name: Check Docker configuration
runs-on: ubuntu-latest
outputs:
is_enabled: ${{ steps.check.outputs.is_enabled }}
steps:
- name: Check if Docker push is configured
id: check
run: echo "is_enabled=${{ secrets.DOCKER_HUB_USERNAME != '' }}" >> $GITHUB_OUTPUT
build:
name: Build
needs: [js, go, go-lint, i18n-lint, git-version, check-push-enabled]
strategy:
matrix:
platform: [ linux/amd64, linux/arm64, linux/arm/v5, linux/arm/v6, linux/arm/v7, linux/386, darwin/amd64, darwin/arm64, windows/amd64, windows/386 ]
runs-on: ubuntu-latest
env:
IS_LINUX: ${{ startsWith(matrix.platform, 'linux/') && 'true' || 'false' }}
IS_ARMV5: ${{ matrix.platform == 'linux/arm/v5' && 'true' || 'false' }}
IS_DOCKER_PUSH_CONFIGURED: ${{ needs.check-push-enabled.outputs.is_enabled == 'true' }}
DOCKER_BUILD_SUMMARY: false
GIT_SHA: ${{ needs.git-version.outputs.git_sha }}
GIT_TAG: ${{ needs.git-version.outputs.git_tag }}
steps:
- name: Sanitize platform name
id: set-platform
run: |
PLATFORM=$(echo ${{ matrix.platform }} | tr '/' '_')
echo "PLATFORM=$PLATFORM" >> $GITHUB_ENV
- uses: actions/checkout@v6
- name: Prepare Docker Buildx
uses: ./.github/actions/prepare-docker
id: docker
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
hub_repository: ${{ vars.DOCKER_HUB_REPO }}
hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Build Binaries
uses: docker/build-push-action@v6
with:
context: .
file: Dockerfile
platforms: ${{ matrix.platform }}
outputs: |
type=local,dest=./output/${{ env.PLATFORM }}
target: binary
build-args: |
GIT_SHA=${{ env.GIT_SHA }}
GIT_TAG=${{ env.GIT_TAG }}
CROSS_TAGLIB_VERSION=${{ env.CROSS_TAGLIB_VERSION }}
- name: Upload Binaries
uses: actions/upload-artifact@v5
with:
name: navidrome-${{ env.PLATFORM }}
path: ./output
retention-days: 7
- name: Build and push image by digest
id: push-image
if: env.IS_LINUX == 'true' && env.IS_DOCKER_PUSH_CONFIGURED == 'true' && env.IS_ARMV5 == 'false'
uses: docker/build-push-action@v6
with:
context: .
file: Dockerfile
platforms: ${{ matrix.platform }}
labels: ${{ steps.docker.outputs.labels }}
build-args: |
GIT_SHA=${{ env.GIT_SHA }}
GIT_TAG=${{ env.GIT_TAG }}
CROSS_TAGLIB_VERSION=${{ env.CROSS_TAGLIB_VERSION }}
outputs: |
type=image,name=${{ steps.docker.outputs.hub_repository }},push-by-digest=true,name-canonical=true,push=${{ steps.docker.outputs.hub_enabled }}
type=image,name=ghcr.io/${{ github.repository }},push-by-digest=true,name-canonical=true,push=true
- name: Export digest
if: env.IS_LINUX == 'true' && env.IS_DOCKER_PUSH_CONFIGURED == 'true' && env.IS_ARMV5 == 'false'
run: |
mkdir -p /tmp/digests
digest="${{ steps.push-image.outputs.digest }}"
touch "/tmp/digests/${digest#sha256:}"
- name: Upload digest
uses: actions/upload-artifact@v5
if: env.IS_LINUX == 'true' && env.IS_DOCKER_PUSH_CONFIGURED == 'true' && env.IS_ARMV5 == 'false'
with:
name: digests-${{ env.PLATFORM }}
path: /tmp/digests/*
if-no-files-found: error
retention-days: 1
push-manifest-ghcr:
name: Push to GHCR
permissions:
contents: read
packages: write
runs-on: ubuntu-latest
needs: [build, check-push-enabled]
if: needs.check-push-enabled.outputs.is_enabled == 'true'
env:
REGISTRY_IMAGE: ghcr.io/${{ github.repository }}
steps:
- uses: actions/checkout@v6
- name: Download digests
uses: actions/download-artifact@v6
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- name: Prepare Docker Buildx
uses: ./.github/actions/prepare-docker
id: docker
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
- name: Create manifest list and push to ghcr.io
working-directory: /tmp/digests
run: |
docker buildx imagetools create $(jq -cr '.tags | map(select(startswith("ghcr.io"))) | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf '${{ env.REGISTRY_IMAGE }}@sha256:%s ' *)
- name: Inspect image in ghcr.io
run: |
docker buildx imagetools inspect ${{ env.REGISTRY_IMAGE }}:${{ steps.docker.outputs.version }}
push-manifest-dockerhub:
name: Push to Docker Hub
runs-on: ubuntu-latest
permissions:
contents: read
needs: [build, check-push-enabled]
if: needs.check-push-enabled.outputs.is_enabled == 'true' && vars.DOCKER_HUB_REPO != ''
continue-on-error: true
steps:
- uses: actions/checkout@v6
- name: Download digests
uses: actions/download-artifact@v6
with:
path: /tmp/digests
pattern: digests-*
merge-multiple: true
- name: Prepare Docker Buildx
uses: ./.github/actions/prepare-docker
id: docker
with:
github_token: ${{ secrets.GITHUB_TOKEN }}
hub_repository: ${{ vars.DOCKER_HUB_REPO }}
hub_username: ${{ secrets.DOCKER_HUB_USERNAME }}
hub_password: ${{ secrets.DOCKER_HUB_PASSWORD }}
- name: Create manifest list and push to Docker Hub
uses: nick-fields/retry@v3
with:
timeout_minutes: 5
max_attempts: 3
retry_wait_seconds: 30
command: |
cd /tmp/digests
docker buildx imagetools create $(jq -cr '.tags | map(select(startswith("ghcr.io") | not)) | map("-t " + .) | join(" ")' <<< "$DOCKER_METADATA_OUTPUT_JSON") \
$(printf 'ghcr.io/${{ github.repository }}@sha256:%s ' *)
- name: Inspect image in Docker Hub
run: |
docker buildx imagetools inspect ${{ vars.DOCKER_HUB_REPO }}:${{ steps.docker.outputs.version }}
cleanup-digests:
name: Cleanup digest artifacts
runs-on: ubuntu-latest
needs: [push-manifest-ghcr, push-manifest-dockerhub]
if: always() && needs.push-manifest-ghcr.result == 'success'
steps:
- name: Delete unnecessary digest artifacts
env:
GH_TOKEN: ${{ github.token }}
run: |
for artifact in $(gh api repos/${{ github.repository }}/actions/artifacts | jq -r '.artifacts[] | select(.name | startswith("digests-")) | .id'); do
gh api --method DELETE repos/${{ github.repository }}/actions/artifacts/$artifact
done
msi:
name: Build Windows installers
needs: [build, git-version]
runs-on: ubuntu-24.04
steps:
- uses: actions/checkout@v6
- uses: actions/download-artifact@v6
with:
path: ./binaries
pattern: navidrome-windows*
merge-multiple: true
- name: Install Wix
run: sudo apt-get install -y wixl jq
- name: Build MSI
env:
GIT_TAG: ${{ needs.git-version.outputs.git_tag }}
run: |
rm -rf binaries/msi
sudo GIT_TAG=$GIT_TAG release/wix/build_msi.sh ${GITHUB_WORKSPACE} 386
sudo GIT_TAG=$GIT_TAG release/wix/build_msi.sh ${GITHUB_WORKSPACE} amd64
du -h binaries/msi/*.msi
- name: Upload MSI files
uses: actions/upload-artifact@v5
with:
name: navidrome-windows-installers
path: binaries/msi/*.msi
retention-days: 7
release:
name: Package/Release
needs: [build, msi]
runs-on: ubuntu-latest
outputs:
package_list: ${{ steps.set-package-list.outputs.package_list }}
steps:
- uses: actions/checkout@v6
with:
fetch-depth: 0
fetch-tags: true
- uses: actions/download-artifact@v6
with:
path: ./binaries
pattern: navidrome-*
merge-multiple: true
- run: ls -lR ./binaries
- name: Set RELEASE_FLAGS for snapshot releases
if: env.IS_RELEASE == 'false'
run: echo 'RELEASE_FLAGS=--skip=publish --snapshot' >> $GITHUB_ENV
- name: Run GoReleaser
uses: goreleaser/goreleaser-action@v6
with:
version: '~> v2'
args: "release --clean -f release/goreleaser.yml ${{ env.RELEASE_FLAGS }}"
env:
GITHUB_TOKEN: ${{ secrets.GITHUB_TOKEN }}
- name: Remove build artifacts
run: |
ls -l ./dist
rm ./dist/*.tar.gz ./dist/*.zip
- name: Upload all-packages artifact
uses: actions/upload-artifact@v5
with:
name: packages
path: dist/navidrome_0*
- id: set-package-list
name: Export list of generated packages
run: |
cd dist
set +x
ITEMS=$(ls navidrome_0* | sed 's/^navidrome_0[^_]*_linux_//' | jq -R -s -c 'split("\n")[:-1]')
echo $ITEMS
echo "package_list=${ITEMS}" >> $GITHUB_OUTPUT
upload-packages:
name: Upload Linux PKG
runs-on: ubuntu-latest
needs: [release]
strategy:
matrix:
item: ${{ fromJson(needs.release.outputs.package_list) }}
steps:
- name: Download all-packages artifact
uses: actions/download-artifact@v6
with:
name: packages
path: ./dist
- name: Upload all-packages artifact
uses: actions/upload-artifact@v5
with:
name: navidrome_linux_${{ matrix.item }}
path: dist/navidrome_0*_linux_${{ matrix.item }}
# delete-artifacts:
# name: Delete unused artifacts
# runs-on: ubuntu-latest
# needs: [upload-packages]
# steps:
# - name: Delete all-packages artifact
# env:
# GH_TOKEN: ${{ github.token }}
# run: |
# for artifact in $(gh api repos/${{ github.repository }}/actions/artifacts | jq -r '.artifacts[] | select(.name | startswith("packages")) | .id'); do
# gh api --method DELETE repos/${{ github.repository }}/actions/artifacts/$artifact
# done

56
.github/workflows/stale.yml vendored Normal file

@@ -0,0 +1,56 @@
name: 'Close stale issues and PRs'
on:
workflow_dispatch:
schedule:
- cron: '30 1 * * *'
permissions:
contents: read
jobs:
stale:
permissions:
issues: write
pull-requests: write
runs-on: ubuntu-latest
steps:
- uses: dessant/lock-threads@v5
with:
process-only: 'issues, prs'
issue-inactive-days: 120
pr-inactive-days: 120
log-output: true
add-issue-labels: 'frozen-due-to-age'
add-pr-labels: 'frozen-due-to-age'
issue-comment: >
This issue has been automatically locked since there
has not been any recent activity after it was closed.
Please open a new issue for related bugs.
pr-comment: >
This pull request has been automatically locked since there
has not been any recent activity after it was closed.
Please open a new issue for related bugs.
- uses: actions/stale@v9
with:
operations-per-run: 999
days-before-issue-stale: 180
days-before-pr-stale: 180
days-before-issue-close: 30
days-before-pr-close: 30
stale-issue-message: >
This issue has been automatically marked as stale because it has not had
recent activity. The resources of the Navidrome team are limited, and so we are asking for your help.
If this is a **bug** and you can still reproduce this error on the <code>master</code> branch, please reply with all of the information you have about it in order to keep the issue open.
If this is a **feature request**, and you feel that it is still relevant and valuable, please tell us why.
This issue will automatically be closed in the near future if no further activity occurs. Thank you for all your contributions.
stale-pr-message: This PR has been automatically marked as stale because it has not had
recent activity. The resources of the Navidrome team are limited, and so we are asking for your help.
Please check https://github.com/navidrome/navidrome/blob/master/CONTRIBUTING.md#pull-requests and verify that this code contribution fits with the description. If yes, tell it in a comment.
This PR will automatically be closed in the near future if no further activity occurs. Thank you for all your contributions.
stale-issue-label: 'stale'
exempt-issue-labels: 'keep,security'
stale-pr-label: 'stale'
exempt-pr-labels: 'keep,security'

93
.github/workflows/update-translations.sh vendored Executable file

@@ -0,0 +1,93 @@
#!/bin/sh
set -e
I18N_DIR=resources/i18n
# Function to process JSON: remove empty attributes and sort
process_json() {
jq 'walk(if type == "object" then with_entries(select(.value != null and .value != "" and .value != [] and .value != {})) | to_entries | sort_by(.key) | from_entries else . end)' "$1"
}
# Function to check differences between local and remote translations
check_lang_diff() {
filename=${I18N_DIR}/"$1".json
url=$(curl -s -X POST https://poeditor.com/api/ \
-d api_token="${POEDITOR_APIKEY}" \
-d action="export" \
-d id="${POEDITOR_PROJECTID}" \
-d language="$1" \
-d type="key_value_json" | jq -r .item)
if [ -z "$url" ]; then
echo "Failed to export $1"
return 1
fi
curl -sSL "$url" > poeditor.json
process_json "$filename" > "$filename".tmp
process_json poeditor.json > poeditor.tmp
diff=$(diff -u "$filename".tmp poeditor.tmp) || true
if [ -n "$diff" ]; then
echo "$diff"
mv poeditor.json "$filename"
fi
rm -f poeditor.json poeditor.tmp "$filename".tmp
}
# Function to get the list of languages
get_language_list() {
response=$(curl -s -X POST https://api.poeditor.com/v2/languages/list \
-d api_token="${POEDITOR_APIKEY}" \
-d id="${POEDITOR_PROJECTID}")
echo "$response"
}
# Function to get the language name from the language code
get_language_name() {
lang_code="$1"
lang_list="$2"
lang_name=$(echo "$lang_list" | jq -r ".result.languages[] | select(.code == \"$lang_code\") | .name")
if [ -z "$lang_name" ]; then
echo "Error: Language code '$lang_code' not found" >&2
return 1
fi
echo "$lang_name"
}
# Function to get the language code from the file path
get_lang_code() {
filepath="$1"
# Extract just the filename
filename=$(basename "$filepath")
# Remove the extension
lang_code="${filename%.*}"
echo "$lang_code"
}
lang_list=$(get_language_list)
# Check differences for each language
for file in ${I18N_DIR}/*.json; do
code=$(get_lang_code "$file")
lang=$(jq -r .languageName < "$file")
lang_name=$(get_language_name "$code" "$lang_list")
echo "Downloading $lang_name - $lang ($code)"
check_lang_diff "$code"
done
# List changed languages to stderr
languages=""
for file in $(git diff --name-only --exit-code | grep json); do
lang_code=$(get_lang_code "$file")
lang_name=$(get_language_name "$lang_code" "$lang_list")
languages="${languages}$(echo "$lang_name" | tr -d '\n'), "
done
echo "${languages%??}" 1>&2


@@ -0,0 +1,33 @@
name: POEditor import
on:
workflow_dispatch:
schedule:
- cron: '0 10 * * *'
jobs:
update-translations:
runs-on: ubuntu-latest
if: ${{ github.repository_owner == 'navidrome' }}
steps:
- uses: actions/checkout@v6
- name: Get updated translations
id: poeditor
env:
POEDITOR_PROJECTID: ${{ secrets.POEDITOR_PROJECTID }}
POEDITOR_APIKEY: ${{ secrets.POEDITOR_APIKEY }}
run: |
.github/workflows/update-translations.sh 2> title.tmp
title=$(cat title.tmp)
echo "::set-output name=title::$title"
rm title.tmp
- name: Show changes, if any
run: |
git status --porcelain
git diff
- name: Create Pull Request
uses: peter-evans/create-pull-request@v7
with:
token: ${{ secrets.PAT }}
author: "navidrome-bot <navidrome-bot@navidrome.org>"
commit-message: "fix(ui): update ${{ steps.poeditor.outputs.title }} translations from POEditor"
title: "fix(ui): update ${{ steps.poeditor.outputs.title }} translations from POEditor"
branch: update-translations

236
.github/workflows/validate-translations.sh vendored Executable file

@@ -0,0 +1,236 @@
#!/bin/bash
# validate-translations.sh
#
# This script validates the structure of JSON translation files by comparing them
# against the reference English translation file (ui/src/i18n/en.json).
#
# The script performs the following validations:
# 1. JSON syntax validation using jq
# 2. Structural validation - ensures all keys from the English file are present
# 3. Reports missing keys (translation incomplete)
# 4. Reports extra keys (keys not in English reference, possibly deprecated)
# 5. Emits GitHub Actions annotations for CI/CD integration
#
# Usage:
# ./validate-translations.sh
#
# Environment Variables:
# EN_FILE - Path to reference English file (default: ui/src/i18n/en.json)
# TRANSLATION_DIR - Directory containing translation files (default: resources/i18n)
#
# Exit codes:
# 0 - All translations are valid
# 1 - One or more translations have structural issues
#
# GitHub Actions Integration:
# The script outputs GitHub Actions annotations using ::error and ::warning
# format that will be displayed in PR checks and workflow summaries.
# Script to validate JSON translation files structure against en.json
set -e
# Path to the reference English translation file
EN_FILE="${EN_FILE:-ui/src/i18n/en.json}"
TRANSLATION_DIR="${TRANSLATION_DIR:-resources/i18n}"
VERBOSE=false
# Parse command line arguments
while [[ $# -gt 0 ]]; do
case "$1" in
-v|--verbose)
VERBOSE=true
shift
;;
-h|--help)
echo "Usage: $0 [options]"
echo ""
echo "Validates JSON translation files structure against English reference file."
echo ""
echo "Options:"
echo " -h, --help Show this help message"
echo " -v, --verbose Show detailed output (default: only show errors)"
echo ""
echo "Environment Variables:"
echo " EN_FILE Path to reference English file (default: ui/src/i18n/en.json)"
echo " TRANSLATION_DIR Directory with translation files (default: resources/i18n)"
echo ""
echo "Examples:"
echo " $0 # Validate all translation files (quiet mode)"
echo " $0 -v # Validate with detailed output"
echo " EN_FILE=custom/en.json $0 # Use custom reference file"
echo " TRANSLATION_DIR=custom/i18n $0 # Use custom translations directory"
exit 0
;;
*)
echo "Unknown option: $1" >&2
echo "Use --help for usage information" >&2
exit 1
;;
esac
done
# Color codes for output
RED='\033[0;31m'
GREEN='\033[0;32m'
YELLOW='\033[1;33m'
NC='\033[0m' # No Color
if [[ "$VERBOSE" == "true" ]]; then
echo "Validating translation files structure against ${EN_FILE}..."
fi
# Check if English reference file exists
if [[ ! -f "$EN_FILE" ]]; then
echo "::error::Reference file $EN_FILE not found"
exit 1
fi
# Function to extract all JSON keys from a file, creating a flat list of dot-separated paths
extract_keys() {
local file="$1"
jq -r 'paths(scalars) as $p | $p | join(".")' "$file" 2>/dev/null | sort
}
# Function to extract all non-empty string keys (to identify structural issues)
extract_structure_keys() {
local file="$1"
# Get only keys where values are not empty strings
jq -r 'paths(scalars) as $p | select(getpath($p) != "") | $p | join(".")' "$file" 2>/dev/null | sort
}
# Function to validate a single translation file
validate_translation() {
local translation_file="$1"
local filename=$(basename "$translation_file")
local has_errors=false
local verbose=${2:-false}
if [[ "$verbose" == "true" ]]; then
echo "Validating $filename..."
fi
# First validate JSON syntax
if ! jq empty "$translation_file" 2>/dev/null; then
echo "::error file=$translation_file::Invalid JSON syntax"
echo -e "${RED}$filename has invalid JSON syntax${NC}"
return 1
fi
# Extract all keys from both files (for statistics)
local en_keys_file=$(mktemp)
local translation_keys_file=$(mktemp)
extract_keys "$EN_FILE" > "$en_keys_file"
extract_keys "$translation_file" > "$translation_keys_file"
# Extract only non-empty structure keys (to validate structural issues)
local en_structure_file=$(mktemp)
local translation_structure_file=$(mktemp)
extract_structure_keys "$EN_FILE" > "$en_structure_file"
extract_structure_keys "$translation_file" > "$translation_structure_file"
# Find structural issues: keys in translation not in English (misplaced)
local extra_keys=$(comm -13 "$en_keys_file" "$translation_keys_file")
# Find missing keys (for statistics only)
local missing_keys=$(comm -23 "$en_keys_file" "$translation_keys_file")
# Count keys for statistics
local total_en_keys=$(wc -l < "$en_keys_file")
local total_translation_keys=$(wc -l < "$translation_keys_file")
local missing_count=0
local extra_count=0
if [[ -n "$missing_keys" ]]; then
missing_count=$(echo "$missing_keys" | grep -c '^' || echo 0)
fi
if [[ -n "$extra_keys" ]]; then
extra_count=$(echo "$extra_keys" | grep -c '^' || echo 0)
has_errors=true
fi
# Report extra/misplaced keys (these are structural issues)
if [[ -n "$extra_keys" ]]; then
if [[ "$verbose" == "true" ]]; then
echo -e "${YELLOW}Misplaced keys in $filename ($extra_count):${NC}"
fi
while IFS= read -r key; do
# Try to find the line number
line=$(grep -n "\"$(echo "$key" | sed 's/.*\.//')" "$translation_file" | head -1 | cut -d: -f1)
line=${line:-1} # Default to line 1 if not found
echo "::error file=$translation_file,line=$line::Misplaced key: $key"
if [[ "$verbose" == "true" ]]; then
echo " + $key (line ~$line)"
fi
done <<< "$extra_keys"
fi
# Clean up temp files
rm -f "$en_keys_file" "$translation_keys_file" "$en_structure_file" "$translation_structure_file"
# Print statistics
if [[ "$verbose" == "true" ]]; then
echo " Keys: $total_translation_keys/$total_en_keys (Missing: $missing_count, Extra/Misplaced: $extra_count)"
if [[ "$has_errors" == "true" ]]; then
echo -e "${RED}$filename has structural issues${NC}"
else
echo -e "${GREEN}$filename structure is valid${NC}"
fi
elif [[ "$has_errors" == "true" ]]; then
echo -e "${RED}$filename has structural issues (Extra/Misplaced: $extra_count)${NC}"
fi
return $([[ "$has_errors" == "true" ]] && echo 1 || echo 0)
}
# Main validation loop
validation_failed=false
total_files=0
failed_files=0
valid_files=0
for translation_file in "$TRANSLATION_DIR"/*.json; do
if [[ -f "$translation_file" ]]; then
total_files=$((total_files + 1))
if ! validate_translation "$translation_file" "$VERBOSE"; then
validation_failed=true
failed_files=$((failed_files + 1))
else
valid_files=$((valid_files + 1))
fi
if [[ "$VERBOSE" == "true" ]]; then
echo "" # Add spacing between files
fi
fi
done
# Summary
if [[ "$VERBOSE" == "true" ]]; then
echo "========================================="
echo "Translation Validation Summary:"
echo " Total files: $total_files"
echo " Valid files: $valid_files"
echo " Files with structural issues: $failed_files"
echo "========================================="
fi
if [[ "$validation_failed" == "true" ]]; then
if [[ "$VERBOSE" == "true" ]]; then
echo -e "${RED}Translation validation failed - $failed_files file(s) have structural issues${NC}"
else
echo -e "${RED}Translation validation failed - $failed_files/$total_files file(s) have structural issues${NC}"
fi
exit 1
elif [[ "$VERBOSE" == "true" ]]; then
echo -e "${GREEN}All translation files are structurally valid${NC}"
fi
exit 0
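
For reference, a small sketch (not part of the commit) of the key flattening that extract_keys performs: the jq filter lists every scalar leaf as a dot-separated path, which is what the comm comparisons above operate on. The sample JSON document here is hypothetical.
# Sketch: the flattening used by extract_keys, on a tiny made-up input.
echo '{"menu":{"title":"Menu","empty":""}}' \
  | jq -r 'paths(scalars) as $p | $p | join(".")' | sort
# prints:
#   menu.empty
#   menu.title
# extract_structure_keys additionally filters out empty-string values,
# so it would keep only menu.title for this input.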