Still failing on the latest head in both Chromium projects. The `image/jpg`, `image/pjpeg`, and `image/x-png` cases never reach the accepted-photo state ("Remove photo" never appears), so the PR still does not prove that legacy JPEG/PNG MIME aliases work end-to-end under the documented test invocation.
This decode-fallback case is also still red in both browser projects. Instead of uploading the original sub-10 MB file and surfacing the backend verification failure, the UI ends up showing "Photo upload failed. Could not read this image." That means the user-visible behavior this PR is trying to improve still doesn't reproduce as expected end-to-end.
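For reference, a minimal sketch of the fallback decision this case is meant to exercise, as I understand it from the PR description; the function name and shape here are hypothetical, not the repo's actual `preparePhoto` code:

```typescript
// Hypothetical sketch of the intended decode-failure fallback: if
// client-side decoding fails but the original file is within the 10 MB
// cap, upload the original bytes and let backend verification decide.
// Only files that are undecodable AND over the cap should surface the
// "Could not read this image." error.
const MAX_UPLOAD_BYTES = 10 * 1024 * 1024;

function chooseUploadStrategy(
  decodeSucceeded: boolean,
  originalSizeBytes: number,
): "reencoded" | "original" | "reject" {
  if (decodeSucceeded) return "reencoded"; // normal path: upload the processed image
  if (originalSizeBytes <= MAX_UPLOAD_BYTES) return "original"; // fallback path under test
  return "reject"; // undecodable and too large: user-facing error
}
```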
Requesting changes because the updated E2E coverage still does not pass locally in either Chromium project for the new alias-normalization and decode-fallback scenarios.
This new alias-normalization coverage is failing in both mobile-chrome and chromium when I run the suite locally. In all three cases here (`image/jpg`, `image/pjpeg`, and `image/x-png`), the page stays on the upload step and shows "Unsupported image format. Please use JPEG, PNG, WebP, or HEIC/HEIF." instead of accepting the file. So the new E2E proof does not currently demonstrate the feature working.
I re-ran the updated verification on this head. `npm test -- --run src/utils/preparePhoto.test.ts tests/pages/PersonProfile.test.tsx tests/pages/NewPerson.test.tsx`, `npm run typecheck`, `npm run lint -- --quiet`, and `docker compose -f docker-compose.dev.yml exec -T archive-api pytest tests/unit/test_s3_storage.py` all passed.
These E2E tests prove the standard jpg/png/webp/heic flows, but they don't exercise the new behavior introduced in this PR: MIME alias normalization (`image/jpg`, `image/pjpeg`, `image/x-png`, `image/heic-sequence`, `image/heif-sequence`), extension-only / `application/octet-stream` recovery, or the decode-failure fallback that keeps the original file under 10 MB. Since those are the risky new branches here, I don't think the feature is fully proven E2E yet.
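To make the gap concrete, the missing branches could be driven from a case table like the one below; every name, fixture, and expectation here is an assumption sketched from this comment, not existing test code:

```typescript
// Hypothetical parameterized case table the E2E suite could iterate
// over to cover the new branches. Filenames and outcome labels are
// illustrative assumptions.
type PhotoCase = {
  mimeType: string; // MIME the browser reports for the File
  filename: string; // name attached to the uploaded File
  expect: "accepted" | "accepted-via-fallback";
};

const uncoveredCases: PhotoCase[] = [
  // Alias normalization: legacy MIME names for supported formats.
  { mimeType: "image/jpg", filename: "photo.jpg", expect: "accepted" },
  { mimeType: "image/pjpeg", filename: "photo.jpg", expect: "accepted" },
  { mimeType: "image/x-png", filename: "photo.png", expect: "accepted" },
  { mimeType: "image/heic-sequence", filename: "photo.heic", expect: "accepted" },
  { mimeType: "image/heif-sequence", filename: "photo.heif", expect: "accepted" },
  // Extension-only / octet-stream recovery: no useful declared MIME.
  { mimeType: "application/octet-stream", filename: "photo.jpg", expect: "accepted" },
  { mimeType: "", filename: "photo.png", expect: "accepted" },
  // Decode-failure fallback: sub-10 MB original is uploaded as-is.
  { mimeType: "image/heic", filename: "undecodable.heic", expect: "accepted-via-fallback" },
];
```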
`normalizePhotoMimeType()` now falls back to the filename extension for any unsupported MIME, not just the browser misreports we're trying to recover from. That means a file named `photo.jpg` with MIME `image/gif`, `image/avif`, or `application/octet-stream` is reclassified as `image/jpeg` and proceeds through upload. Because the backend only validates the declared content type on initiate and then trusts Pillow to open the bytes later, this can admit genuinely unsupported/misnamed formats instead of failing fast with the user-facing unsupported-format error. I think the extension fallback needs to be constrained to the specific misreport cases we actually want to recover from.
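One possible shape for that constraint, assuming hypothetical names (the real `normalizePhotoMimeType` signature may differ): alias the known legacy MIME names explicitly, and consult the extension only when the declared MIME carries no information at all:

```typescript
// Hypothetical sketch: map legacy aliases explicitly, and only fall
// back to the extension for empty or application/octet-stream MIMEs.
// A concrete-but-unsupported MIME (image/gif, image/avif) fails fast.
const MIME_ALIASES: Record<string, string> = {
  "image/jpg": "image/jpeg",
  "image/pjpeg": "image/jpeg",
  "image/x-png": "image/png",
  "image/heic-sequence": "image/heic",
  "image/heif-sequence": "image/heif",
};

const EXTENSION_FALLBACK_MIMES = new Set(["", "application/octet-stream"]);

const EXTENSION_TO_MIME: Record<string, string> = {
  jpg: "image/jpeg",
  jpeg: "image/jpeg",
  png: "image/png",
  webp: "image/webp",
  heic: "image/heic",
  heif: "image/heif",
};

function normalizePhotoMimeType(mime: string, filename: string): string | null {
  const lower = mime.toLowerCase();
  if (Object.values(EXTENSION_TO_MIME).includes(lower)) return lower; // already supported
  if (lower in MIME_ALIASES) return MIME_ALIASES[lower]; // known misreport
  if (EXTENSION_FALLBACK_MIMES.has(lower)) {
    // Browser gave us nothing useful; the extension is the only signal.
    const ext = filename.split(".").pop()?.toLowerCase() ?? "";
    return EXTENSION_TO_MIME[ext] ?? null;
  }
  return null; // concrete unsupported MIME: reject with the user-facing error
}
```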
Requesting changes based on the two inline findings. The main happy-path E2E coverage does reproduce locally when run sequentially, but the current implementation still broadens accepted input in a risky way and does not E2E-prove the newly added normalization/fallback branches.
I re-ran the verification locally on this branch. The targeted frontend Vitest suite, `npm run typecheck`, `npm run lint -- --quiet`, and `docker compose -f docker-compose.dev.yml exec -T archive-api pytest tests/unit/test_s3_storage.py` all passed. I also reran the exact Playwright commands from the PR description sequentially against the local dev stack, and both mobile-chrome and chromium passed for `tests/e2e/photo-formats.spec.ts`.
QA review complete for Issue #92 scope. Verified all four requested fixes are implemented with appropriate test coverage updates. No blocking correctness or regression issues found in this PR.
Addressed the review feedback on top of the branch sync from develop:
`apply_openbao_secrets()` now upgrades explicit `http://...` S3 endpoints to `https://...` instead of preserving…
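The scheme upgrade itself can be as small as a prefix rewrite; this standalone sketch (hypothetical helper name, written in TypeScript rather than the backend's language) only illustrates the described behavior:

```typescript
// Hypothetical sketch of the described endpoint-scheme upgrade: rewrite
// an explicit http:// prefix to https:// and leave everything else
// (host, port, path) untouched.
function upgradeEndpointScheme(endpoint: string): string {
  return endpoint.startsWith("http://")
    ? "https://" + endpoint.slice("http://".length)
    : endpoint;
}
```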