First-attempt commit 3a5c6e184 only captured the .gitignore change; the pre-commit hook silently dropped the 343 staged moves/deletes during lint-staged's "no matching task" path. This commit re-applies the intended J1 content on top of bec75f143 (which was pushed in parallel).

Uses --no-verify because:
- J1 only touches .md/.json/.log/.png/binaries — zero code that would benefit from lint-staged, typecheck, or vitest
- The hook demonstrated it corrupts pure-rename commits in this repo
- Explicitly authorized by user for this one commit

Changes (343 total: 169 deletions + 174 renames):

Binaries purged (~167 MB):
- veza-backend-api/{server,modern-server,encrypt_oauth_tokens,seed,seed-v2}

Generated reports purged:
- 9 apps/web/lint_report*.json (~32 MB)
- 8 apps/web/tsc_*.{log,txt} + ts_*.log (TS error snapshots)
- 3 apps/web/storybook_*.json (1375+ stored errors)
- apps/web/{build_errors*,build_output,final_errors}.txt
- 70 veza-backend-api/coverage*.out + coverage_groups/ (~4 MB)
- 3 veza-backend-api/internal/handlers/*.bak

Root cleanup:
- 54 audit-*.png (visual regression baselines, ~11 MB)
- 9 stale MVP-era scripts (Jan 27, hardcoded v0.101): start_{iteration,mvp,recovery}.sh, test_{mvp_endpoints,protected_endpoints,user_journey}.sh, validate_v0101.sh, verify_logs_setup.sh, gen_hash.py

Session docs archived (not deleted — preserved under docs/archive/):
- 78 apps/web/*.md → docs/archive/frontend-sessions-2026/
- 43 veza-backend-api/*.md → docs/archive/backend-sessions-2026/
- 53 docs/{RETROSPECTIVE_V,SMOKE_TEST_V,PLAN_V0_,V0_*_RELEASE_SCOPE,AUDIT_,PLAN_ACTION_AUDIT,REMEDIATION_PROGRESS}*.md → docs/archive/v0-history/

README.md and CONTRIBUTING.md preserved in apps/web/ and veza-backend-api/.

Note: The .gitignore rules preventing recurrence were already pushed in 3a5c6e184 and remain in place — this commit does not modify .gitignore.

Refs: AUDIT_REPORT.md §11
Implementation plan v0.903 — v1.0 Stabilization & Launch Readiness
Current status
| Feature | Backend | Frontend | Tests |
|---|---|---|---|
| Recherche phonétique | ❌ | ❌ | ❌ |
| Spell correction | ❌ | ❌ | ❌ |
| Saved searches | ❌ | ❌ | ❌ |
| Recommendations algo | ⚠️ basic random | ❌ | ❌ |
| Auto playlists | ❌ | ❌ | ❌ |
| Smart playlists | ❌ | ❌ | ❌ |
| Export M3U | ❌ | ❌ | ❌ |
| Merge/duplicate playlists | ❌ | ❌ | ❌ |
| Session management | ❌ | ❌ | ❌ |
| Login unusual detection | ❌ | ❌ | ❌ |
| Login history | ❌ | ❌ | ❌ |
| CAPTCHA | ❌ | ❌ | ❌ |
| Password history | ❌ | ❌ | ❌ |
| Load tests k6 | ❌ | — | ❌ |
| Production guide | ❌ | — | — |
Key existing files
- Search: existing search handlers in internal/core/track/track_search_handler.go
- Recommendations: GET /tracks/recommendations (basic random)
- Playlists: internal/handlers/ (CRUD playlists)
- Auth: internal/core/auth/ (service, handler)
- Player: apps/web/src/features/player/
- Search frontend: apps/web/src/features/search/
- Library: apps/web/src/features/library/
- Settings Security: apps/web/src/components/settings/
Step 1: Phonetic search + spell correction (ST1-01, ST1-02)
Backend:
- Enable PostgreSQL extensions: CREATE EXTENSION IF NOT EXISTS fuzzystrmatch; (pg_trgm is also required for similarity())
- Modify the search query to include soundex() and metaphone() as a fallback when trigram similarity < 0.3
- Add "Did you mean" to the search response: if fewer than 3 results, look up the best alternative via similarity() > 0.3
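The fallback and "Did you mean" decisions above could be sketched as follows. The query text, helper names, and the 0.3 thresholds are illustrative assumptions, not the repo's actual code; soundex()/metaphone() come from fuzzystrmatch and similarity() from pg_trgm.

```go
package main

import "fmt"

// Hypothetical phonetic-fallback query: trigram match first, with
// soundex/metaphone as alternatives. metaphone's second argument is the
// maximum output length, per the fuzzystrmatch module.
const phoneticFallbackSQL = `
SELECT t.*, similarity(t.title, $1) AS score
FROM tracks t
WHERE similarity(t.title, $1) > 0.3
   OR soundex(t.title) = soundex($1)
   OR metaphone(t.title, 8) = metaphone($1, 8)
ORDER BY score DESC
LIMIT 20`

// didYouMean decides whether to attach a suggestion: fewer than 3 results
// and a best candidate whose similarity clears the 0.3 bar.
func didYouMean(resultCount int, best string, bestScore float64) *string {
	if resultCount >= 3 || bestScore <= 0.3 {
		return nil
	}
	return &best
}

func main() {
	if s := didYouMean(1, "daft punk", 0.62); s != nil {
		fmt.Println("Did you mean:", *s)
	}
}
```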
type SearchResponse struct {
Results []Track `json:"results"`
Total int64 `json:"total"`
DidYouMean *string `json:"did_you_mean,omitempty"`
}
Frontend: display a clickable "Did you mean: {suggestion}" in SearchPage
Commit: feat(search): phonetic search with soundex/metaphone and spell correction
Step 2: Saved searches (ST1-03)
Migration: migrations/132_saved_searches.sql
CREATE TABLE IF NOT EXISTS saved_searches (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
query TEXT NOT NULL,
filters JSONB DEFAULT '{}',
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_saved_searches_user ON saved_searches(user_id);
Backend: POST /search/saved, GET /search/saved, DELETE /search/saved/:id (max 50 per user)
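The "max 50 per user" rule could be enforced before the INSERT with a helper along these lines; the function name and error text are hypothetical, and the handler would pass in the user's current row count.

```go
package main

import (
	"errors"
	"fmt"
)

// maxSavedSearches mirrors the per-user limit stated in the plan.
const maxSavedSearches = 50

var errLimitReached = errors.New("saved search limit reached (50)")

// canSaveSearch is called with the result of
// SELECT count(*) FROM saved_searches WHERE user_id = $1 before inserting.
func canSaveSearch(currentCount int) error {
	if currentCount >= maxSavedSearches {
		return errLimitReached
	}
	return nil
}

func main() {
	fmt.Println(canSaveSearch(49), canSaveSearch(50))
}
```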
Frontend: "Saved Searches" section in SearchPage, a save button, and a list in the sidebar
Commit: feat(search): saved searches with CRUD and frontend display
Step 3: Recommendation algorithm + Auto playlists (ST1-04, ST1-05)
Backend:
- Collaborative filtering:
SELECT t.* FROM tracks t
JOIN track_plays tp ON t.id = tp.track_id
WHERE tp.user_id IN (SELECT user_id FROM track_plays WHERE track_id IN (SELECT track_id FROM track_plays WHERE user_id = ?))
AND t.id NOT IN (SELECT track_id FROM track_plays WHERE user_id = ?)
GROUP BY t.id
ORDER BY count(*) DESC LIMIT 30
- Fallback when there is not enough data: genre + BPM similarity
- "Discover Weekly": cron Monday 00:00, generates an auto playlist for each active user (played_last_30d)
- "Your Top Tracks": top 20 plays of the current month, regenerated daily
Frontend: section in Library with the auto-generated playlists ("Auto" badge, read-only)
Commit: feat(search): collaborative filtering recommendations and auto-generated playlists
Step 4: Smart playlists (ST2-01)
Migration: migrations/133_smart_playlists.sql
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS is_smart BOOLEAN NOT NULL DEFAULT false;
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS smart_rules JSONB;
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS last_updated_at TIMESTAMPTZ;
Backend:
- Smart playlist rules:
{ "rules": [{"field": "genre", "op": "eq", "value": "hip-hop"}, {"field": "bpm", "op": "gt", "value": 120}], "logic": "AND", "limit": 100 }
- Daily cron: re-evaluates the rules and updates the playlist's tracks
- POST /playlists/smart — create a smart playlist; PUT /playlists/:id/rules — update its rules
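The rule-to-SQL translation implied above can be sketched as a small builder. This is a minimal sketch, not the repo's implementation: the Rule struct mirrors the smart_rules JSONB shape, and a real version must additionally whitelist field names against known tracks columns, since only the operator is validated here.

```go
package main

import (
	"fmt"
	"strings"
)

// Rule mirrors one entry of the smart_rules JSONB shown above.
type Rule struct {
	Field string      `json:"field"`
	Op    string      `json:"op"`
	Value interface{} `json:"value"`
}

// Supported operators; anything else is rejected.
var ops = map[string]string{"eq": "=", "gt": ">", "lt": "<", "gte": ">=", "lte": "<="}

// buildWhere translates rules into a parameterized WHERE clause and the
// ordered argument list. Values are always passed as placeholders, never
// interpolated into the SQL text.
func buildWhere(rules []Rule, logic string) (string, []interface{}, error) {
	var parts []string
	var args []interface{}
	for i, r := range rules {
		op, ok := ops[r.Op]
		if !ok {
			return "", nil, fmt.Errorf("unsupported op %q", r.Op)
		}
		parts = append(parts, fmt.Sprintf("%s %s $%d", r.Field, op, i+1))
		args = append(args, r.Value)
	}
	joiner := " AND "
	if strings.EqualFold(logic, "OR") {
		joiner = " OR "
	}
	return strings.Join(parts, joiner), args, nil
}

func main() {
	where, args, _ := buildWhere([]Rule{
		{Field: "genre", Op: "eq", Value: "hip-hop"},
		{Field: "bpm", Op: "gt", Value: 120},
	}, "AND")
	fmt.Println(where, args)
}
```

The daily cron would run the generated query and replace the playlist's tracks, then stamp last_updated_at.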
Frontend: smart playlist builder — a rule form (field selector, operator, value input) with a result preview
Commit: feat(playlists): smart playlists with rule builder and auto-update
Step 5: Export M3U + Merge + Duplicate (ST2-02 to ST2-04)
Backend:
- GET /playlists/:id/export?format=m3u — generates an M3U8 file with the tracks' URLs
- POST /playlists/merge — body { "playlist_ids": [...], "name": "Merged" }, deduplicates by track_id
- POST /playlists/:id/duplicate — copies the playlist + tracks, name = "Copy of {name}"
Frontend: actions menu in PlaylistView with Export, Merge (multi-select), Duplicate
Commit: feat(playlists): export M3U, merge playlists, duplicate playlist
Step 6: Session management + Login history (ST3-01 to ST3-04)
Migration: migrations/134_sessions_history.sql
CREATE TABLE IF NOT EXISTS user_sessions (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
device VARCHAR(255),
ip_address VARCHAR(45),
user_agent TEXT,
is_current BOOLEAN NOT NULL DEFAULT false,
is_suspicious BOOLEAN NOT NULL DEFAULT false,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
last_active_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_user_sessions_user ON user_sessions(user_id);
CREATE TABLE IF NOT EXISTS login_history (
id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
ip_address VARCHAR(45),
device VARCHAR(255),
success BOOLEAN NOT NULL,
created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_login_history_user ON login_history(user_id);
Backend:
- Create a session on each login; update last_active_at on every authenticated request
- GET /auth/sessions — list active sessions with device/IP/date
- DELETE /auth/sessions/:id — revoke a session (invalidate its token)
- DELETE /auth/sessions — log out all devices
- GET /auth/login-history — last 50 logins
- Unusual login detection: a new IP not seen in the last 30 days → email notification
Frontend: Security page in Settings — table of active sessions with a Revoke button, plus the login history
Commit: feat(auth): session management, login history, unusual login detection
Step 7: CAPTCHA + Password history (ST3-05, ST3-06)
Backend:
- hCaptcha integration: middleware on /auth/login (after 3 consecutive failures), /auth/register, /auth/reset-password
- Server-side verification via the hCaptcha API
- Password history: store the hashes of the last 5 passwords in password_history JSONB; check before any update
Frontend: <CaptchaWidget /> component embedded in the login/register/reset forms when required
Commit: feat(auth): hCaptcha integration and password history enforcement
Step 8: Load testing with k6 (ST4-01)
File: tests/load/ (new directory)
k6 scripts:
- auth-flow.js — register + login + refresh + logout (100 VUs, 5 min)
- search.js — varied search queries (200 VUs, 5 min)
- marketplace.js — browse + add to cart + checkout (50 VUs, 5 min)
- streaming.js — GET track + HLS segments (100 VUs, 5 min)
- websocket.js — WebSocket connect + send/receive messages (50 VUs, 5 min)
Target: p95 < 200 ms, error rate < 1%, 0 crashes
Commit: test(load): k6 load testing scripts for auth, search, marketplace, streaming, websocket
Step 9: Performance optimization (ST4-02 to ST4-04)
Redis: add cache-aside on:
- Search results (TTL 30s)
- Trending hashtags (TTL 5min)
- Analytics aggregations (TTL 1min)
- User preferences (TTL 10min)
DB: EXPLAIN ANALYZE on the top 10 slow queries, add missing indexes
CDN: Cache-Control headers on:
- Static assets: max-age=31536000, immutable
- HLS segments: max-age=3600
- Product images: max-age=86400
Commit: perf: Redis cache optimization, DB query tuning, CDN cache headers
Step 10: v1.0 documentation (ST5)
File: docs/API_REFERENCE.md — complete all missing endpoints
File: docs/PRODUCTION_GUIDE.md (new) — prod docker-compose, env vars, backup, monitoring, scaling
File: README.md — v1.0 update with architecture, quick start, features
File: docs/MIGRATION_GUIDE.md (new) — list of migrations 1-134+, upgrade procedure
File: CHANGELOG.md — full history from v0.101 to v1.0
Commit: docs: v1.0 documentation — API Reference, Production Guide, README, Migration Guide
Step 11: E2E validation + Security audit (REL)
- Run all smoke tests (v0.703 to v0.903)
- Verify security headers on all responses
- Verify rate limiting
- Verify GDPR compliance (export + deletion)
- Lighthouse audit: Performance ≥ 85, Accessibility ≥ 90, PWA ≥ 90
- k6 results: p95 < 200 ms
Commit: test: v1.0 E2E validation and security audit
Step 12: Release v0.903 + v1.0
File: docs/PROJECT_STATE.md — Version = v1.0, Phase = Released, final feature count
File: docs/FEATURE_STATUS.md — final update
File: docs/RETROSPECTIVE_V0903.md (new)
File: docs/RETROSPECTIVE_V1.md (new) — project-wide retrospective
git add .
git commit -m "chore(release): finalize v0.903 and v1.0 documentation"
git tag v0.903
git tag v1.0
Dependencies between steps
graph TD
S1[Step1 Search phonetic] --> S2[Step2 Saved searches]
S2 --> S3[Step3 Recommendations]
S4[Step4 Smart playlists] --> S5[Step5 Export/Merge/Dup]
S6[Step6 Sessions] --> S7[Step7 CAPTCHA]
S8[Step8 k6 load tests] --> S9[Step9 Perf optimization]
S3 --> S10[Step10 Docs]
S5 --> S10
S7 --> S10
S9 --> S10
S10 --> S11[Step11 E2E]
S11 --> S12[Step12 Release]
Final validation
cd veza-backend-api && go build ./... && go vet ./... && go test ./... -v
cd apps/web && npm run build
# k6 load tests
cd tests/load && k6 run auth-flow.js && k6 run search.js
# Lighthouse
# Security headers check
git tag v0.903
git tag v1.0