Java / DB Migration Tools: How do you detect and prevent duplicate changesets?

Duplicate changesets are mostly a process + CI guardrail problem. You detect them by validating metadata uniqueness and prevent them by naming conventions + automation.

What “duplicate” means

Liquibase

A changeset is uniquely identified by the triple:

  • (id, author, filePath) (filePath = changelog file it’s defined in)

So “duplicate changeset” usually means:

  • same id + author reused in the same file, or
  • the same changeset was copied into another file with the same id/author, and Liquibase considers the filePath identical (e.g. you used logicalFilePath and made the paths collide), or
  • you moved files / changed logicalFilePath incorrectly, making Liquibase think “same changeset, different place” (or vice versa).
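To make the identity triple concrete, here is a hypothetical XML changelog (table and column names are illustrative) that contains a duplicate: both changesets share the same (id, author, filePath), so `liquibase validate` will reject it:

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <!-- Same id + author in the same file = duplicate identity triple -->
  <changeSet id="1" author="alice">
    <createTable tableName="users">
      <column name="id" type="bigint"/>
    </createTable>
  </changeSet>
  <changeSet id="1" author="alice">
    <addColumn tableName="users">
      <column name="email" type="varchar(255)"/>
    </addColumn>
  </changeSet>
</databaseChangeLog>
```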

Flyway

Duplicates are usually:

  • same version used twice (V42__...sql in two branches)
  • or same repeatable name (R__views.sql) duplicated in multiple locations

Detect duplicates (CI + local)

Liquibase

  • Run liquibase validate in CI for every PR.
  • Also run liquibase status --verbose (good for visibility), but validate is the key “fail-fast” check.

Extra hardening

  • Parse the changelog set in CI and fail if:
    • duplicate (id, author, logicalFilePath) exists
    • any file defines duplicate ids within itself
    This is easy to implement as a small script over whichever changelog format you use (YAML/XML/JSON).
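A minimal sketch of such a scanner, assuming XML changelogs (the directory layout is illustrative; adapt the parsing if you use YAML or JSON):

```python
import xml.etree.ElementTree as ET
from pathlib import Path

def collect_changesets(changelog_dir):
    """Yield the (id, author, logicalFilePath) identity triple for every
    <changeSet> found in XML changelogs under changelog_dir."""
    for path in sorted(Path(changelog_dir).rglob("*.xml")):
        root = ET.parse(path).getroot()
        for elem in root.iter():
            if elem.tag.split("}")[-1] == "changeSet":  # strip XML namespace
                logical = elem.get("logicalFilePath") or path.as_posix()
                yield (elem.get("id"), elem.get("author"), logical)

def find_duplicates(triples):
    """Return every identity triple that occurs more than once."""
    seen, dupes = set(), set()
    for key in triples:
        if key in seen:
            dupes.add(key)
        seen.add(key)
    return sorted(dupes)
```

A thin CI wrapper can call `find_duplicates(collect_changesets("db/changelog"))` and fail the build if the result is non-empty.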

Flyway

  • Run flyway validate in CI.
  • Run flyway info and fail if there are:
    • duplicate versions
    • pending migrations unexpectedly in release branches
  • Optionally add a custom check that migration filenames are unique and follow your convention.
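The custom check can be a few lines of Python that scans migration filenames for colliding versions (a sketch; it covers Flyway's default `V<version>__<description>.sql` pattern only, not repeatable `R__` migrations):

```python
import re
from collections import Counter
from pathlib import Path

# Flyway's default versioned-migration pattern: V<version>__<description>.sql
VERSIONED = re.compile(r"^V(?P<version>[0-9][0-9._]*?)__.+\.sql$")

def versions_in(migration_dir):
    for path in sorted(Path(migration_dir).rglob("*.sql")):
        m = VERSIONED.match(path.name)
        if m:
            # Flyway treats '.' and '_' as equivalent version separators
            yield m.group("version").replace("_", ".")

def duplicate_versions(migration_dir):
    """Return every version number used by more than one migration file."""
    counts = Counter(versions_in(migration_dir))
    return sorted(v for v, n in counts.items() if n > 1)
```

Fail the PR whenever `duplicate_versions(...)` returns anything; this catches the "two branches both added V42" case before `flyway migrate` ever runs.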

Prevent duplicates (what actually works)

1) Use collision-resistant naming

Liquibase

  • id: 2026-01-21-001-add-email-verified (date + sequence + verb)
  • author: team-or-username
  • Avoid “id: 1, 2, 3” unless you namespace them.

Flyway

  • Prefer timestamp versioning:
    • V20260121_134501__add_email_verified.sql
      This essentially eliminates “two devs picked V42” conflicts.
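A small helper (a sketch; the slug rules are an assumption, adjust to your convention) makes the timestamp convention effortless for developers:

```python
from datetime import datetime, timezone

def new_migration_name(description: str) -> str:
    """Build a Flyway filename with a UTC-timestamp version,
    e.g. V20260121_134501__add_email_verified.sql"""
    stamp = datetime.now(timezone.utc).strftime("%Y%m%d_%H%M%S")
    slug = description.strip().lower().replace(" ", "_")
    return f"V{stamp}__{slug}.sql"
```

Using UTC avoids two developers in different timezones generating out-of-order versions.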

2) “One change = one file” + master includes

  • Liquibase master file should include directories or keep a deterministic include list.
  • Devs add new files; they don’t edit old ones.
    This avoids copy/paste duplication and merge conflicts.
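A minimal master changelog following this pattern might look like the sketch below (paths are illustrative). `includeAll` picks up every file in the directory in alphabetical order, which is why deterministic, date-prefixed filenames matter:

```xml
<databaseChangeLog xmlns="http://www.liquibase.org/xml/ns/dbchangelog">
  <!-- Every change is a new file under changes/; old files are never edited -->
  <includeAll path="changes/" relativeToChangelogFile="true"/>
</databaseChangeLog>
```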

3) Don’t copy changesets between services/modules

If you’re in a monorepo, keep migrations owned by the service schema:

  • services/payments/db/...
  • services/customers/db/...

Cross-copying is how duplicates happen.

4) Lock down old migrations (immutability)

  • Add a CI rule: migration files that are already merged are read-only — no edits or deletions, only new files allowed.
  • If a change is needed, create a new migration, don’t “fix” the old one.
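This rule is straightforward to enforce from the output of `git diff --name-status base...HEAD` in CI. A sketch (the `MIGRATIONS_PREFIX` path is an assumption — adjust to your repo layout):

```python
MIGRATIONS_PREFIX = "db/changelog/changes/"  # assumed layout; adjust for your repo

def violations(name_status_lines):
    """Flag any change to an existing migration file.

    Expects lines in `git diff --name-status` format,
    e.g. 'M\tdb/changelog/changes/001.xml'.
    Only additions ('A') are allowed under the migrations directory.
    """
    bad = []
    for line in name_status_lines:
        parts = line.split("\t")
        status, path = parts[0], parts[-1]
        if path.startswith(MIGRATIONS_PREFIX) and not status.startswith("A"):
            bad.append((status, path))
    return bad
```

Feed it the diff against the PR's base branch and fail the build if `violations(...)` is non-empty.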

5) Pre-commit / PR checks

Best ROI guardrails:

  • Pre-commit: reject new Liquibase changesets if the id doesn’t match the naming regex or isn’t unique in the repo.
  • PR check: run validate + custom “duplicate scanner”.
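The pre-commit side of this can be as small as the sketch below, assuming the date+sequence+verb id convention from above (the exact regex is an assumption — encode whatever convention you actually use):

```python
import re

# Assumed convention: YYYY-MM-DD-NNN-verb-phrase, e.g. 2026-01-21-001-add-email-verified
ID_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}-\d{3}-[a-z][a-z0-9-]*$")

def check_new_ids(new_ids, existing_ids):
    """Return an error message for each id that breaks the convention
    or collides with an id already used in the repo."""
    errors = []
    seen = set(existing_ids)
    for cid in new_ids:
        if not ID_PATTERN.match(cid):
            errors.append(f"{cid}: does not match naming convention")
        elif cid in seen:
            errors.append(f"{cid}: already used in the repo")
        seen.add(cid)
    return errors
```

Run it in the pre-commit hook with the staged changesets as `new_ids` and the ids already in the repo as `existing_ids`, and block the commit on any error.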

Common pitfall: logicalFilePath

Liquibase can treat moved files as duplicates if you set logicalFilePath inconsistently.
Rule of thumb:

  • Either don’t use it, or
  • Use a stable logical path from day one and never change it.
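If you do pin it, set it once at the top of each changelog and treat it as part of the file's identity forever (a sketch; the path and change are illustrative):

```xml
<!-- logicalFilePath pins changeset identity, so moving this file
     on disk later does not make Liquibase see a "new" changeset -->
<databaseChangeLog
    xmlns="http://www.liquibase.org/xml/ns/dbchangelog"
    logicalFilePath="db/changelog/changes/2026-01-21-001-add-email-verified.xml">
  <changeSet id="2026-01-21-001-add-email-verified" author="payments-team">
    <addColumn tableName="users">
      <column name="email_verified" type="boolean" defaultValueBoolean="false"/>
    </addColumn>
  </changeSet>
</databaseChangeLog>
```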

Interview-ready answer

“For Liquibase, uniqueness is (id, author, filePath), so we enforce a naming convention and run liquibase validate in CI. We also keep migrations immutable and require ‘new file per change’ so people don’t copy changesets around. For Flyway, duplicates are version collisions, so we use timestamped versions and fail PRs on flyway validate.”
