# Implementation Plan v0.903 — v1.0 Stabilization & Launch Readiness
## Current state
| Feature | Backend | Frontend | Tests |
|---------|---------|----------|-------|
| Phonetic search | ❌ | ❌ | ❌ |
| Spell correction | ❌ | ❌ | ❌ |
| Saved searches | ❌ | ❌ | ❌ |
| Recommendations algo | ⚠️ basic random | ❌ | ❌ |
| Auto playlists | ❌ | ❌ | ❌ |
| Smart playlists | ❌ | ❌ | ❌ |
| Export M3U | ❌ | ❌ | ❌ |
| Merge/duplicate playlists | ❌ | ❌ | ❌ |
| Session management | ❌ | ❌ | ❌ |
| Login unusual detection | ❌ | ❌ | ❌ |
| Login history | ❌ | ❌ | ❌ |
| CAPTCHA | ❌ | ❌ | ❌ |
| Password history | ❌ | ❌ | ❌ |
| k6 load tests | ❌ | — | ❌ |
| Production guide | ❌ | — | — |
---
## Key existing files
- Search: existing search handlers in `internal/core/track/track_search_handler.go`
- Recommendations: `GET /tracks/recommendations` (basic random)
- Playlists: `internal/handlers/` (playlist CRUD)
- Auth: `internal/core/auth/` (service, handler)
- Player: `apps/web/src/features/player/`
- Search frontend: `apps/web/src/features/search/`
- Library: `apps/web/src/features/library/`
- Settings Security: `apps/web/src/components/settings/`
---
## Step 1: Phonetic search + spell correction (ST1-01, ST1-02)
**Backend**:
- Enable the PostgreSQL extensions: `CREATE EXTENSION IF NOT EXISTS fuzzystrmatch;` (provides `soundex()`/`metaphone()`) and `CREATE EXTENSION IF NOT EXISTS pg_trgm;` (provides `similarity()`)
- Modify the search query to fall back to `soundex()`/`metaphone()` matching when trigram similarity < 0.3
- Add a "Did you mean" field to the search response: when there are fewer than 3 results, look up the best alternative with `similarity()` > 0.3
```go
type SearchResponse struct {
	Results    []Track `json:"results"`
	Total      int64   `json:"total"`
	DidYouMean *string `json:"did_you_mean,omitempty"`
}
```
**Frontend**: show a clickable "Did you mean: {suggestion}" link on SearchPage
**Commit**: `feat(search): phonetic search with soundex/metaphone and spell correction`
---
## Step 2: Saved searches (ST1-03)
**Migration**: `migrations/132_saved_searches.sql`
```sql
CREATE TABLE IF NOT EXISTS saved_searches (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    query TEXT NOT NULL,
    filters JSONB DEFAULT '{}',
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_saved_searches_user ON saved_searches(user_id);
```
**Backend**: `POST /search/saved`, `GET /search/saved`, `DELETE /search/saved/:id` (max 50 per user)
**Frontend**: "Saved Searches" section on SearchPage with a save button and a list in the sidebar
**Commit**: `feat(search): saved searches with CRUD and frontend display`
---
## Step 3: Recommendation algorithm + auto playlists (ST1-04, ST1-05)
**Backend**:
- Collaborative filtering: `SELECT t.* FROM tracks t JOIN track_plays tp ON t.id = tp.track_id WHERE tp.user_id IN (SELECT user_id FROM track_plays WHERE track_id IN (SELECT track_id FROM track_plays WHERE user_id = ?)) AND t.id NOT IN (SELECT track_id FROM track_plays WHERE user_id = ?) GROUP BY t.id ORDER BY count(*) DESC LIMIT 30` (the `GROUP BY t.id` is required for the `count(*)` ordering to be valid)
- Fallback when there is not enough play data: genre + BPM similarity
- "Discover Weekly": cron every Monday at 00:00 generates an auto playlist for each active user (played_last_30d)
- "Your Top Tracks": top 20 plays of the current month, regenerated daily
**Frontend**: section in Library with the auto-generated playlists ("Auto" badge, read-only)
**Commit**: `feat(search): collaborative filtering recommendations and auto-generated playlists`
---
## Step 4: Smart playlists (ST2-01)
**Migration**: `migrations/133_smart_playlists.sql`
```sql
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS is_smart BOOLEAN NOT NULL DEFAULT false;
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS smart_rules JSONB;
ALTER TABLE playlists ADD COLUMN IF NOT EXISTS last_updated_at TIMESTAMPTZ;
```
**Backend**:
- Smart playlist rules: `{ "rules": [{"field": "genre", "op": "eq", "value": "hip-hop"}, {"field": "bpm", "op": "gt", "value": 120}], "logic": "AND", "limit": 100 }`
- Daily cron: re-evaluates the rules and refreshes the playlist's tracks
- `POST /playlists/smart` — create a smart playlist; `PUT /playlists/:id/rules` — update its rules
**Frontend**: smart playlist builder — a rules form (field selector, operator, value input) with a result preview
**Commit**: `feat(playlists): smart playlists with rule builder and auto-update`
---
## Step 5: Export M3U + merge + duplicate (ST2-02 to ST2-04)
**Backend**:
- `GET /playlists/:id/export?format=m3u` — generates an M3U8 file with the tracks' URLs
- `POST /playlists/merge` — body `{ "playlist_ids": [...], "name": "Merged" }`; deduplicates by track_id
- `POST /playlists/:id/duplicate` — copies the playlist and its tracks; name = "Copy of {name}"
**Frontend**: actions menu in PlaylistView with Export, Merge (multi-select), and Duplicate
**Commit**: `feat(playlists): export M3U, merge playlists, duplicate playlist`
---
## Step 6: Session management + login history (ST3-01 to ST3-04)
**Migration**: `migrations/134_sessions_history.sql`
```sql
CREATE TABLE IF NOT EXISTS user_sessions (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    device VARCHAR(255),
    ip_address VARCHAR(45),
    user_agent TEXT,
    is_current BOOLEAN NOT NULL DEFAULT false,
    is_suspicious BOOLEAN NOT NULL DEFAULT false,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW(),
    last_active_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_user_sessions_user ON user_sessions(user_id);
CREATE TABLE IF NOT EXISTS login_history (
    id UUID PRIMARY KEY DEFAULT gen_random_uuid(),
    user_id UUID NOT NULL REFERENCES users(id) ON DELETE CASCADE,
    ip_address VARCHAR(45),
    device VARCHAR(255),
    success BOOLEAN NOT NULL,
    created_at TIMESTAMPTZ NOT NULL DEFAULT NOW()
);
CREATE INDEX idx_login_history_user ON login_history(user_id);
```
**Backend**:
- Create a session on each login; update `last_active_at` on every authenticated request
- `GET /auth/sessions` — list active sessions with device/IP/date
- `DELETE /auth/sessions/:id` — revoke a session (invalidate its token)
- `DELETE /auth/sessions` — log out of all devices
- `GET /auth/login-history` — last 50 logins
- Unusual login detection: a new IP not seen in the last 30 days → email notification
**Frontend**: Security page in Settings — table of active sessions with a Revoke button, plus the login history
**Commit**: `feat(auth): session management, login history, unusual login detection`
---
## Step 7: CAPTCHA + password history (ST3-05, ST3-06)
**Backend**:
- hCaptcha integration: middleware on `/auth/login` (after 3 consecutive failures), `/auth/register`, `/auth/reset-password`
- Server-side verification through the hCaptcha API
- Password history: store the hashes of the last 5 passwords in a `password_history` JSONB column and check against them before any password update
**Frontend**: `<CaptchaWidget />` component embedded in the login/register/reset forms when required
**Commit**: `feat(auth): hCaptcha integration and password history enforcement`
---
## Step 8: k6 load testing (ST4-01)
**Files**: `tests/load/` (new directory)
k6 scripts:
- `auth-flow.js` — register + login + refresh + logout (100 VUs, 5 min)
- `search.js` — varied search queries (200 VUs, 5 min)
- `marketplace.js` — browse + add to cart + checkout (50 VUs, 5 min)
- `streaming.js` — GET track + HLS segments (100 VUs, 5 min)
- `websocket.js` — WebSocket connect + send/receive messages (50 VUs, 5 min)
**Target**: p95 < 200 ms, error rate < 1%, 0 crashes
**Commit**: `test(load): k6 load testing scripts for auth, search, marketplace, streaming, websocket`
---
## Step 9: Performance optimization (ST4-02 to ST4-04)
**Redis**: add cache-aside caching for:
- Search results (TTL 30s)
- Trending hashtags (TTL 5min)
- Analytics aggregations (TTL 1min)
- User preferences (TTL 10min)
**DB**: run EXPLAIN ANALYZE on the top 10 slow queries and add any missing indexes
**CDN**: set Cache-Control headers on:
- Static assets: max-age=31536000, immutable
- HLS segments: max-age=3600
- Product images: max-age=86400
**Commit**: `perf: Redis cache optimization, DB query tuning, CDN cache headers`
---
## Step 10: v1.0 documentation (ST5)
**File**: `docs/API_REFERENCE.md` — document all remaining endpoints
**File**: `docs/PRODUCTION_GUIDE.md` (new) — production docker-compose, env vars, backup, monitoring, scaling
**File**: `README.md` — v1.0 update with architecture, quick start, features
**File**: `docs/MIGRATION_GUIDE.md` (new) — list of migrations 1-134+, upgrade procedure
**File**: `CHANGELOG.md` — full history from v0.101 to v1.0
**Commit**: `docs: v1.0 documentation — API Reference, Production Guide, README, Migration Guide`
---
## Step 11: E2E validation + security audit (REL)
- Run all smoke tests (v0.703 to v0.903)
- Verify security headers on every response
- Verify rate limiting
- Verify GDPR compliance (data export + deletion)
- Lighthouse audit targets: Performance 85, Accessibility 90, PWA 90
- k6 results: p95 < 200 ms
**Commit**: `test: v1.0 E2E validation and security audit`
---
## Step 12: Release v0.903 + v1.0
**File**: `docs/PROJECT_STATE.md` — set Version = v1.0, Phase = Released, final feature count
**File**: `docs/FEATURE_STATUS.md` — final update
**File**: `docs/RETROSPECTIVE_V0903.md` (new)
**File**: `docs/RETROSPECTIVE_V1.md` (new) — project-wide retrospective
```bash
git add .
git commit -m "chore(release): finalize v0.903 and v1.0 documentation"
git tag v0.903
git tag v1.0
```
---
## Step dependencies
```mermaid
graph TD
S1[Step1 Search phonetic] --> S2[Step2 Saved searches]
S2 --> S3[Step3 Recommendations]
S4[Step4 Smart playlists] --> S5[Step5 Export/Merge/Dup]
S6[Step6 Sessions] --> S7[Step7 CAPTCHA]
S8[Step8 k6 load tests] --> S9[Step9 Perf optimization]
S3 --> S10[Step10 Docs]
S5 --> S10
S7 --> S10
S9 --> S10
S10 --> S11[Step11 E2E]
S11 --> S12[Step12 Release]
```
---
## Final validation
```bash
cd veza-backend-api && go build ./... && go vet ./... && go test ./... -v
cd apps/web && npm run build
# k6 load tests
cd tests/load && k6 run auth-flow.js && k6 run search.js
# Lighthouse
# Security headers check
git tag v0.903
git tag v1.0
```