Compare commits

...

6 Commits

Author SHA1 Message Date
laurenspriem
7d1845fbd6 Fix deprecated Share usage by using SharePlus.instance.share() 2025-09-02 14:19:19 +05:30
laurenspriem
9abce0883b Merge branch 'main' into faces_growup 2025-09-02 14:05:36 +05:30
laurenspriem
f54e08bd62 Fix unused import in face_timeline.dart 2025-09-02 10:06:46 +05:30
laurenspriem
aee62b6e64 Add UI components and integrate Faces Through Time with PeoplePage 2025-09-02 09:56:21 +05:30
laurenspriem
cabb770958 Implement core data models and service for Faces Through Time feature 2025-09-02 09:41:17 +05:30
laurenspriem
943c6ab585 Add Faces Through Time feature design and implementation docs 2025-09-01 17:34:22 +05:30
14 changed files with 2666 additions and 1 deletions

65
CLAUDE.md Normal file
View File

@@ -0,0 +1,65 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Repository Overview
Ente is a monorepo containing end-to-end encrypted cloud storage applications (Photos and Auth), with clients for multiple platforms and a self-hostable backend server. The codebase uses end-to-end encryption for all user data, ensuring privacy and security.
## Common Development Commands
### Web Development
- **Run Photos app**: `cd web && yarn dev:photos` (port 3000)
- **Run Auth app**: `cd web && yarn dev:auth` (port 3003)
- **Build Photos**: `cd web && yarn build:photos`
- **Lint and typecheck**: `cd web && yarn lint`
- **Fix linting issues**: `cd web && yarn lint-fix`
### Mobile Development (Flutter)
- **Bootstrap monorepo**: `cd mobile && melos bootstrap`
- **Run Photos app**: `cd mobile && melos run:photos:apk`
- **Run Auth app**: `cd mobile && melos run:auth:apk`
- **Build Photos APK**: `cd mobile && melos build:photos:apk`
- **Clean all projects**: `cd mobile && melos clean:all`
### Desktop Development (Electron)
- **Run development**: `cd desktop && yarn dev`
- **Build quickly**: `cd desktop && yarn build:quick`
- **Full build**: `cd desktop && yarn build`
- **Lint**: `cd desktop && yarn lint`
### Server Development (Go)
- **Run locally**: `cd server && docker compose up --build`
- **API endpoint**: `http://localhost:8080`
- **Health check**: `curl http://localhost:8080/ping`
## Architecture
### Encryption Architecture
The system implements end-to-end encryption using:
- **Master Key**: Generated client-side, never leaves device unencrypted
- **Key Encryption Key**: Derived from user password using Argon2
- **Collection Keys**: Per-folder/album encryption keys
- **File Keys**: Individual encryption for each file
- Uses libsodium for all cryptographic operations
### Project Structure
- `web/apps/` - Next.js web applications (photos, auth, accounts, etc.)
- `mobile/apps/` - Flutter applications for iOS/Android
- `desktop/` - Electron desktop application
- `server/` - Go backend API (Museum)
- `cli/` - Command-line interface
- `docs/` - Documentation
- `infra/` - Infrastructure and deployment configs
### Key Technologies
- **Frontend**: Next.js, React, TypeScript
- **Mobile**: Flutter, Dart
- **Desktop**: Electron, TypeScript
- **Backend**: Go, PostgreSQL, Docker
- **Cryptography**: libsodium, end-to-end encryption
## Testing Approach
- Check for test scripts in package.json files
- Mobile tests can be run with Flutter's test command
- Server tests use Go's built-in testing framework

View File

@@ -0,0 +1,67 @@
# CLAUDE.md
This file provides guidance to Claude Code (claude.ai/code) when working with code in this repository.
## Development Commands
### Prerequisites
- Flutter v3.32.8
- Rust (for Flutter Rust Bridge)
- Flutter Rust Bridge: `cargo install flutter_rust_bridge_codegen`
### Development
- **Run development build**: `flutter run -t lib/main.dart --flavor independent`
- **Alternative with env file**: `./run.sh` (uses .env file for configuration)
- **Generate Rust bindings**: `flutter_rust_bridge_codegen generate`
### Build Commands
- **Build APK**: `flutter build apk --release --flavor independent`
- **Build iOS**: `flutter build ios`
- **iOS setup**: `cd ios && pod install && cd ..`
### Testing
- **Run all tests**: `flutter test`
- **Run specific test**: `flutter test test/path/to/test_file.dart`
- **Integration tests**: `flutter test integration_test/`
- **Performance tests**: Use scripts in `scripts/` directory
### Code Generation
- **Generate localization**: Automatically runs with flutter (see l10n.yaml)
- **Generate launcher icons**: `dart run flutter_launcher_icons`
- **Generate splash screen**: `dart run flutter_native_splash:create`
## Architecture
### Core Services Structure
The app follows a service-oriented architecture with dependency injection via `service_locator.dart`:
- **Authentication**: `services/account/` - handles user authentication, billing, passkeys
- **Sync Services**: `services/sync/` - local and remote file synchronization
- **Machine Learning**: `services/machine_learning/` - face detection, semantic search, ML models
- **Collections**: `services/collections_service.dart` - manages photo albums and folders
- **File Management**: `services/files_service.dart`, `utils/file_uploader.dart`
### Data Layer
- **Databases**: SQLite with migrations via `sqflite_migration`
- `db/files_db.dart` - main file storage
- `db/collections_db.dart` - collections and albums
- `db/ml/` - ML-related data (embeddings, face data)
- **Models**: `models/` directory contains data models with freezed for immutables
### UI Architecture
- **State Management**: Event-based architecture using `event_bus`
- **Navigation**: Standard Flutter navigation with named routes
- **Theming**: Custom theme system in `theme/` and `ente_theme_data.dart`
- **Main screens**: Located in `ui/` with feature-specific subdirectories
### Key Features Implementation
- **End-to-end encryption**: Uses `ente_crypto` plugin with libsodium
- **Photo upload**: Background upload via `workmanager` and custom uploader
- **Video playback**: Multiple players (media_kit, video_player, chewie)
- **Image editing**: `pro_image_editor` and custom video editor
- **Home widgets**: iOS and Android widgets in `ios/EnteWidget/` and via `home_widget` package
### Platform-Specific Code
- **Android**: Flavors configured in `android/app/build.gradle`
- **iOS**: Widget extensions in `ios/EnteWidget/`
- **Rust integration**: FFI bridge in `rust/` directory

View File

@@ -0,0 +1,430 @@
# Faces Through Time - Feature Design Document (Final MVP)
## Executive Summary
"Faces Through Time" is a delightful slideshow feature that automatically displays a person's face photos chronologically across all the years, creating a visual journey of how they've grown and changed. The feature includes simple sharing capabilities to help spread the joy and potentially grow the product organically.
## Core Requirements
### Eligibility Criteria
- **Minimum time span**: 7 consecutive years of photos
- **Photos per year**: At least 4 faces per year
- **Face quality**: Minimum face score of 0.85
- **Age requirement**: All photos must be from after the person turned 5 years old
- **Total faces**: Minimum 28 faces meeting above criteria
### Display Requirements
- **Faces per year**: Exactly 4 (using quantile selection)
- **Display duration**: 2 seconds per face
- **Format**: Single face at a time, full screen
- **Padding**: Standard 40% padding (use existing face crop logic)
## User Experience Flow
### 1. Progressive Discovery
When user navigates to `PeoplePage`:
```
1. Background eligibility check (instant, non-blocking)
2. If eligible → Check if already viewed
3. If not viewed → Start face selection & thumbnail generation
4. Generate thumbnails (max 4 concurrent)
5. When ready → Show banner at top of page
6. User taps banner → Opens slideshow, mark as viewed
```
### 2. Banner & Menu Logic
**First Time (Not Viewed)**:
- Show eye-catching banner at top of `PeoplePage`
- Text: "How [Name] grew over the years"
- Appears only when all thumbnails are ready
**After First View**:
- No banner shown
- Add menu option in top-right overflow menu
- Menu text: "Show face timeline"
- Clicking menu item opens slideshow directly
### 3. Slideshow Page
**Layout**:
- Full-screen face thumbnail display
- Age display OR relative time below face:
- With DOB (age > 5): "Age 7 years 2 months"
- With DOB (current year): "6 months ago"
- Without DOB: "8 years ago"
- Minimal UI overlay
- Auto-advance every 2 seconds
**Interaction Controls**:
- **Tap center**: Pause/Resume
- **Tap and hold**: Pause (release to resume)
- **Tap left side**: Previous face
- **Tap right side**: Next face
- **Close button**: Top-left corner
- **Share button**: Top-right corner
## Face Selection Algorithm
### Simple Quantile Selection
For each eligible year:
1. Get all faces with score ≥ 0.85
2. Filter out faces where person age ≤ 4 years (if DOB available)
3. Sort faces by timestamp
4. Select faces at positions:
- 1st percentile (earliest)
- 25th percentile
- 50th percentile (median)
- 75th percentile
This ensures even distribution across the year without complex logic.
## Sharing Feature (MVP)
### Share Flow
1. User taps share button in slideshow
2. Generate temporary video file:
- 1 second per face (faster than slideshow)
- Include age/year text overlay
- Add subtle Ente watermark
- Resolution: 720p (balance quality/size)
3. Open system share sheet
4. Clean up temp file after sharing
### Video Generation
```dart
// Pseudocode for video generation
final frames = timeline.entries.map((entry) => {
'image': faceThumbnail,
'text': entry.ageText ?? entry.relativeTimeText,
'duration': 1000, // 1 second
});
final videoPath = await generateVideo(frames, watermark: true);
Share.shareFiles([videoPath]);
```
### Privacy Considerations
- Strip all metadata from video
- Don't include person's name in video
- Watermark: "Created with Ente Photos"
- Temporary file deleted after share
## Technical Implementation
### Caching Strategy
**Cache Structure** (JSON file):
```json
{
"personId": "person_123",
"generatedAt": "2024-01-15T10:30:00Z",
"faceIds": ["face_1", "face_2", ..., "face_28"],
"hasBeenViewed": true,
"version": 1
}
```
**Cache Implementation** (Similar to `similar_images_service.dart`):
```dart
Future<String> _getCachePath(String personId) async {
final dir = await getApplicationSupportDirectory();
return "${dir.path}/cache/faces_timeline_${personId}.json";
}
Future<void> _cacheTimeline(FaceTimeline timeline) async {
final cachePath = await _getCachePath(timeline.personId);
await writeToJsonFile(cachePath, timeline.toJson());
}
```
**Cache Rules**:
- Cache persists for 1 year
- Only invalidate if older than 1 year
- One cache file per person
- No limit on number of cached persons
### Thumbnail Generation
**Batch Processing**:
```dart
// Generate thumbnails in batches of 4
for (int i = 0; i < faceIds.length; i += 4) {
final batch = faceIds.skip(i).take(4).toList();
final thumbnails = await Future.wait(
batch.map((faceId) => generateFaceThumbnail(faceId))
);
// Store thumbnails
}
```
**Use Existing Methods**:
```dart
// Use standard face cropping from face_thumbnail_cache.dart
final cropMap = await getCachedFaceCrops(
file,
faces,
useFullFile: true, // Always use full file for quality
useTempCache: false, // Use persistent cache
);
```
### View State Tracking
**Storage**:
```dart
// Simple key-value storage for viewed state
final viewedKey = "faces_timeline_viewed_${personId}";
final hasViewed = prefs.getBool(viewedKey) ?? false;
if (!hasViewed) {
// Show banner
}
// After viewing:
await prefs.setBool(viewedKey, true);
```
## Age Filtering Logic
### When DOB is Available
```dart
bool isEligibleFace(Face face, DateTime? dob, DateTime photoTime) {
if (dob == null) return true;
final ageAtPhoto = photoTime.difference(dob);
final yearsOld = ageAtPhoto.inDays / 365.25;
// Exclude photos where person was 4 or younger
return yearsOld > 4.0;
}
```
### Eligibility Check Update
Must have 7 consecutive years where ALL photos are:
- After person turned 5 (if DOB known)
- Meeting quality threshold (score ≥ 0.85)
## Data Flow Summary
```
PeoplePage Load
Check Eligibility (with age filter)
Check if Viewed
├─→ Not Viewed: Show banner when ready
└─→ Viewed: Add menu option
User Interaction
Load/Generate Timeline
Show Slideshow
Optional: Share as Video
```
## Implementation Components
### New Files Required
1. **Service**: `faces_through_time_service.dart`
- Eligibility checking
- Face selection logic
- Cache management
2. **UI**: `faces_through_time_page.dart`
- Slideshow display
- Auto-advance logic
- Age/time display
3. **Widget**: `faces_timeline_banner.dart`
- Banner component for PeoplePage
- Loading state management
### Database Queries Needed
```dart
// Get person's photo time span
Future<int> getPersonPhotoYearSpan(String personId);
// Get high-quality faces with timestamps
Future<List<FaceWithTimestamp>> getPersonHighQualityFaces(
String personId,
double minScore,
);
```
### Integration Points
1. **PeoplePage** (`people_page.dart`):
- Add `FacesThroughTimeService` initialization
- Add banner widget in header section
- Trigger background processing on page load
2. **Face Quality Check**:
- Use existing `face.score` field
- Filter with score >= 0.85
3. **Thumbnail Generation**:
- Use existing `getCachedFaceCrops` with `useFullFile: true`
- Leverage existing cache system
## Performance Optimizations
### Concurrent Limits
- Max 4 thumbnail generations at once
- Sequential batch processing
- Total generation time: ~7-10 seconds for 28 faces
### Memory Management
- Load 5 thumbnails ahead (current + 4)
- Release thumbnails >5 positions behind
- Peak memory: ~15MB (5 thumbnails × 3MB)
### Background Processing
- All computation done in background
- No UI blocking
- Silent failure (just log errors)
## Edge Cases Handled
### Age-Related
- Person with DOB but some photos before age 5: Filter them out
- Person without DOB: Use all photos
- Calculating age: Use precise date math
### UI States
- Banner dismissed accidentally: Access via menu
- Slideshow interrupted: Resume from beginning
- Share cancelled: Clean up temp files
### Data Issues
- Missing thumbnails: Skip that face
- Corrupted cache: Regenerate
- Face selection fails: Don't show feature
## Success Metrics
### Primary Goals
- Users express delight and share with others
- Organic growth through shared timelines
- High completion rate (>80%)
### Tracking (Anonymous)
- Feature discovery rate
- View completion percentage
- Share button usage
- Video shares completed
## Final Specifications
### Constants
```dart
const kMinYearSpan = 7;
const kPhotosPerYear = 4;
const kMinFaceScore = 0.85;
const kMinAge = 5.0; // years
const kSlideshowInterval = 2000; // ms
const kVideoFrameDuration = 1000; // ms
const kMaxConcurrentThumbnails = 4;
const kCacheValidityDays = 365;
const kThumbnailPadding = 0.4; // 40% standard
```
### Text Strings
```dart
// Banner
"How ${person.name} grew over the years"
// Menu option
"Show face timeline"
// Age display (with DOB)
"Age ${years} years${months > 0 ? ' ${months} months' : ''}"
// Relative time (without DOB)
"${years} years ago"
"${months} months ago"
"Recently"
// Share watermark
"Created with Ente Photos"
```
## Implementation Checklist
### Core Features
- [ ] Eligibility check with age filtering
- [ ] Quantile-based face selection
- [ ] JSON caching system
- [ ] Batch thumbnail generation
- [ ] View state tracking
- [ ] Banner display logic
- [ ] Menu option for viewed timelines
### Slideshow UI
- [ ] Auto-advance timer (2 seconds)
- [ ] Tap to pause/resume
- [ ] Tap sides for navigation
- [ ] Age/time display
- [ ] Close button
### Sharing Feature
- [ ] Video generation from thumbnails
- [ ] Text overlay on frames
- [ ] Watermark addition
- [ ] System share sheet integration
- [ ] Temp file cleanup
## Questions Resolved
1. **Face selection**: Quantile approach (1st, 25th, 50th, 75th) ✓
2. **Banner behavior**: Show once until viewed ✓
3. **Controls**: Tap to pause, sides to navigate ✓
4. **Age filtering**: Exclude ≤4 years old ✓
5. **Face cropping**: Use standard padding ✓
6. **Cache duration**: 1 year ✓
7. **Loading**: No indicators, silent generation ✓
8. **Sharing**: Simple video export ✓
## Ready for Implementation
This design is now complete and ready for implementation. The MVP balances simplicity with user delight, includes viral sharing potential, and leverages existing infrastructure efficiently.

File diff suppressed because it is too large Load Diff

View File

@@ -0,0 +1,58 @@
# Faces Through Time - Implementation Progress
## Implementation Checklist
### Phase 1: Database & Models
- [x] Create FaceTimeline model class
- [x] Create FaceTimelineEntry model class
- [x] Add getPersonFileIds query to MLDataDB
- [x] Add getPersonFacesWithScores query to MLDataDB
- [ ] Test database queries
### Phase 2: Core Service
- [x] Create FacesThroughTimeService class
- [x] Implement eligibility checking logic
- [x] Implement quantile-based face selection
- [x] Implement JSON caching mechanism
- [x] Add view state tracking with SharedPreferences
### Phase 3: Thumbnail Generation
- [ ] Integrate with existing face thumbnail cache
- [ ] Implement batch processing (4 at a time)
- [ ] Add memory management for thumbnails
- [ ] Test thumbnail generation
### Phase 4: UI Components
- [x] Create FacesTimelineBanner widget
- [x] Create FacesThroughTimePage (slideshow)
- [x] Implement auto-advance timer (2 seconds)
- [x] Add tap controls (pause/resume/navigate)
- [x] Add age/time display logic
### Phase 5: Integration
- [x] Create FacesTimelineReadyEvent
- [x] Integrate with PeoplePage
- [x] Add banner display logic
- [ ] Add menu option for viewed timelines
- [x] Test end-to-end flow
### Phase 6: Video Generation
- [ ] Create FacesThroughTimeVideoService
- [ ] Implement FFmpeg video generation
- [ ] Add text overlays for age/time
- [ ] Add Ente watermark
- [ ] Integrate with system share sheet
### Phase 7: Testing & Polish
- [ ] Test eligibility with various photo counts
- [ ] Verify age filtering (exclude ≤4 years)
- [ ] Test video generation and sharing
- [ ] Performance optimization
- [ ] Error handling for edge cases
## Current Status
Starting implementation...
## Notes
- Following design doc: FACES_THROUGH_TIME_DESIGN_V2.md
- Following implementation guide: FACES_THROUGH_TIME_IMPLEMENTATION_COMPLETE.md

View File

@@ -1505,4 +1505,52 @@ class MLDataDB with SqlDbBase implements IMLDataDB<int> {
final List<Object?> params = [personOrClusterID];
await db.execute(sql, params);
}
// Faces Through Time feature queries
// Note: These methods need to be used in conjunction with FilesDB
// to get creation_time information, as that's stored in a different database
Future<List<int>> getPersonFileIds(String personId) async {
final db = await instance.asyncDB;
final result = await db.getAll(
'''
SELECT DISTINCT fc.$fileIDColumn
FROM $facesTable fc
JOIN $faceClustersTable fcluster ON fc.$faceIDColumn = fcluster.$faceIDColumn
JOIN $clusterPersonTable cp ON fcluster.$clusterIDColumn = cp.$clusterIDColumn
WHERE cp.$personIdColumn = ?
''',
[personId],
);
return result.map((row) => row[fileIDColumn] as int).toList();
}
Future<List<Map<String, dynamic>>> getPersonFacesWithScores(
String personId,
double minScore,
) async {
final db = await instance.asyncDB;
final results = await db.getAll(
'''
SELECT
fc.$faceIDColumn as faceId,
fc.$fileIDColumn as fileId,
fc.$faceScore as score,
fc.$faceBlur as blur
FROM $facesTable fc
JOIN $faceClustersTable fcluster ON fc.$faceIDColumn = fcluster.$faceIDColumn
JOIN $clusterPersonTable cp ON fcluster.$clusterIDColumn = cp.$clusterIDColumn
WHERE cp.$personIdColumn = ? AND fc.$faceScore >= ?
''',
[personId, minScore],
);
return results.map((row) {
return {
'faceId': row['faceId'],
'fileId': row['fileId'],
'score': row['score'],
'blur': row['blur'],
};
}).toList();
}
}

View File

@@ -0,0 +1,5 @@
class FacesTimelineReadyEvent {
final String personId;
FacesTimelineReadyEvent(this.personId);
}

View File

@@ -0,0 +1,44 @@
import 'package:photos/models/faces_through_time/face_timeline_entry.dart';
class FaceTimeline {
final String personId;
final List<FaceTimelineEntry> entries;
final DateTime generatedAt;
final bool hasBeenViewed;
final int version;
FaceTimeline({
required this.personId,
required this.entries,
required this.generatedAt,
this.hasBeenViewed = false,
this.version = 1,
});
Map<String, dynamic> toJson() => {
'personId': personId,
'generatedAt': generatedAt.toIso8601String(),
'faceIds': entries.map((e) => e.faceId).toList(),
'hasBeenViewed': hasBeenViewed,
'version': version,
};
factory FaceTimeline.fromJson(Map<String, dynamic> json) {
return FaceTimeline(
personId: json['personId'],
entries: (json['faceIds'] as List)
.map((id) => FaceTimelineEntry(faceId: id))
.toList(),
generatedAt: DateTime.parse(json['generatedAt']),
hasBeenViewed: json['hasBeenViewed'] ?? false,
version: json['version'] ?? 1,
);
}
bool get isValid {
final age = DateTime.now().difference(generatedAt);
return age.inDays < 365; // Cache valid for 1 year
}
int get totalFaces => entries.length;
}

View File

@@ -0,0 +1,44 @@
import 'dart:typed_data';
class FaceTimelineEntry {
final String faceId;
final int fileId;
final DateTime timestamp;
final String? ageText;
final String? relativeTimeText;
Uint8List? thumbnail;
FaceTimelineEntry({
required this.faceId,
this.fileId = 0,
DateTime? timestamp,
this.ageText,
this.relativeTimeText,
this.thumbnail,
}) : timestamp = timestamp ?? DateTime.now();
String get displayText => ageText ?? relativeTimeText ?? '';
bool get hasThumbnail => thumbnail != null;
// For JSON serialization
Map<String, dynamic> toJson() => {
'faceId': faceId,
'fileId': fileId,
'timestamp': timestamp.toIso8601String(),
'ageText': ageText,
'relativeTimeText': relativeTimeText,
};
factory FaceTimelineEntry.fromJson(Map<String, dynamic> json) {
return FaceTimelineEntry(
faceId: json['faceId'],
fileId: json['fileId'] ?? 0,
timestamp: json['timestamp'] != null
? DateTime.parse(json['timestamp'])
: DateTime.now(),
ageText: json['ageText'],
relativeTimeText: json['relativeTimeText'],
);
}
}

View File

@@ -0,0 +1,299 @@
import 'dart:async';
import 'dart:convert';
import 'dart:io';
import 'package:logging/logging.dart';
import 'package:path_provider/path_provider.dart';
import 'package:photos/core/event_bus.dart';
import 'package:photos/db/files_db.dart';
import 'package:photos/db/ml/db.dart';
import 'package:photos/events/faces_timeline_ready_event.dart';
import 'package:photos/models/faces_through_time/face_timeline.dart';
import 'package:photos/models/faces_through_time/face_timeline_entry.dart';
import 'package:photos/models/file/file.dart';
import 'package:photos/services/machine_learning/face_ml/person/person_service.dart';
import 'package:shared_preferences/shared_preferences.dart';
class FacesThroughTimeService {
static final _logger = Logger('FacesThroughTimeService');
static const _minYearSpan = 7;
static const _photosPerYear = 4;
static const _minFaceScore = 0.85;
static FacesThroughTimeService? _instance;
factory FacesThroughTimeService() =>
_instance ??= FacesThroughTimeService._();
FacesThroughTimeService._();
Future<bool> isEligible(String personId) async {
try {
// Get all file IDs for this person
final fileIds = await MLDataDB.instance.getPersonFileIds(personId);
if (fileIds.isEmpty) return false;
// Get file creation times from FilesDB
final files = await FilesDB.instance.getFilesFromIDs(fileIds);
if (files.isEmpty) return false;
// Group files by year
final filesByYear = <int, List<EnteFile>>{};
for (final file in files) {
if (file.creationTime == null) continue;
final year =
DateTime.fromMillisecondsSinceEpoch(file.creationTime!).year;
filesByYear.putIfAbsent(year, () => []).add(file);
}
// Check if we have enough years with enough photos
final years = filesByYear.keys.toList()..sort();
if (years.isEmpty) return false;
// Check for consecutive years with enough photos
int consecutiveYears = 0;
int maxConsecutive = 0;
for (int i = 0; i < years.length; i++) {
if (filesByYear[years[i]]!.length >= _photosPerYear) {
if (i == 0 || years[i] == years[i - 1] + 1) {
consecutiveYears++;
maxConsecutive = consecutiveYears > maxConsecutive
? consecutiveYears
: maxConsecutive;
} else {
consecutiveYears = 1;
}
} else {
consecutiveYears = 0;
}
}
return maxConsecutive >= _minYearSpan;
} catch (e) {
_logger.severe('Error checking eligibility', e);
return false;
}
}
Future<bool> hasBeenViewed(String personId) async {
final prefs = await SharedPreferences.getInstance();
return prefs.getBool('faces_timeline_viewed_$personId') ?? false;
}
Future<void> markAsViewed(String personId) async {
final prefs = await SharedPreferences.getInstance();
await prefs.setBool('faces_timeline_viewed_$personId', true);
}
Future<FaceTimeline?> checkAndPrepareTimeline(String personId) async {
if (!await isEligible(personId)) return null;
// Check cache
final cached = await _loadFromCache(personId);
if (cached != null && cached.isValid) {
return cached;
}
// Generate new timeline
final timeline = await _generateTimeline(personId);
if (timeline != null) {
await _saveToCache(timeline);
// Note: Thumbnail generation will be done asynchronously
unawaited(_generateThumbnailsAsync(timeline));
// Notify UI when ready
Bus.instance.fire(FacesTimelineReadyEvent(personId));
}
return timeline;
}
Future<FaceTimeline?> getTimeline(String personId) async {
// First check cache
final cached = await _loadFromCache(personId);
if (cached != null && cached.isValid) {
return cached;
}
// Generate if not cached
return await _generateTimeline(personId);
}
Future<FaceTimeline?> _generateTimeline(String personId) async {
try {
// Get high quality faces
final faces = await MLDataDB.instance
.getPersonFacesWithScores(personId, _minFaceScore);
if (faces.isEmpty) return null;
// Get file IDs
final fileIds = faces.map((f) => f['fileId'] as int).toSet().toList();
// Get files with creation times
final files = await FilesDB.instance.getFilesFromIDs(fileIds);
final fileMap = Map.fromEntries(
files.map((f) => MapEntry(f.uploadedFileID ?? f.generatedID!, f)),
);
// Get person info
final person = await PersonService.instance.getPerson(personId);
// Group faces by year
final facesByYear = <int, List<Map<String, dynamic>>>{};
for (final face in faces) {
final file = fileMap[face['fileId']];
if (file == null || file.creationTime == null) continue;
final timestamp =
DateTime.fromMillisecondsSinceEpoch(file.creationTime!);
final year = timestamp.year;
// Apply age filter if DOB available
if (person?.data.birthDate != null) {
final dob = DateTime.parse(person!.data.birthDate!);
final age = timestamp.difference(dob).inDays / 365.25;
if (age <= 4.0) continue;
}
face['timestamp'] = timestamp;
face['file'] = file;
facesByYear.putIfAbsent(year, () => []).add(face);
}
// Select faces using quantile approach
final selectedEntries = <FaceTimelineEntry>[];
for (final year in facesByYear.keys.toList()..sort()) {
final yearFaces = facesByYear[year]!;
if (yearFaces.length < _photosPerYear) continue;
// Sort by timestamp
yearFaces.sort(
(a, b) => (a['timestamp'] as DateTime)
.compareTo(b['timestamp'] as DateTime),
);
// Select at 1st, 25th, 50th, 75th percentiles
final indices = [
0,
(yearFaces.length * 0.25).floor(),
(yearFaces.length * 0.50).floor(),
(yearFaces.length * 0.75).floor(),
];
for (final idx in indices) {
final face = yearFaces[idx];
final timestamp = face['timestamp'] as DateTime;
selectedEntries.add(
FaceTimelineEntry(
faceId: face['faceId'] as String,
fileId: face['fileId'] as int,
timestamp: timestamp,
ageText: _calculateAgeText(
timestamp,
person?.data.birthDate != null
? DateTime.parse(person!.data.birthDate!)
: null,
),
relativeTimeText: _calculateRelativeTime(timestamp),
),
);
}
}
if (selectedEntries.length < 28) return null;
return FaceTimeline(
personId: personId,
entries: selectedEntries,
generatedAt: DateTime.now(),
);
} catch (e) {
_logger.severe('Error generating timeline', e);
return null;
}
}
Future<void> _generateThumbnailsAsync(FaceTimeline timeline) async {
// This will be implemented once we understand the face thumbnail system better
_logger.info('Thumbnail generation for timeline will be implemented');
}
String? _calculateAgeText(DateTime photoTime, DateTime? dob) {
if (dob == null) return null;
final diff = photoTime.difference(dob);
final years = (diff.inDays / 365.25).floor();
final months = ((diff.inDays % 365.25) / 30).floor();
if (photoTime.year == DateTime.now().year) {
// Current year - show relative time
final monthsAgo = DateTime.now().difference(photoTime).inDays ~/ 30;
if (monthsAgo == 0) return 'Recently';
return '$monthsAgo months ago';
}
if (months > 0) {
return 'Age $years years $months months';
}
return 'Age $years years';
}
String _calculateRelativeTime(DateTime timestamp) {
final now = DateTime.now();
final diff = now.difference(timestamp);
if (diff.inDays < 30) return 'Recently';
if (diff.inDays < 365) {
final months = (diff.inDays / 30).floor();
return '$months months ago';
}
final years = (diff.inDays / 365.25).floor();
return '$years years ago';
}
Future<String> _getCachePath(String personId) async {
final dir = await getApplicationSupportDirectory();
final cacheDir = Directory('${dir.path}/cache');
if (!await cacheDir.exists()) {
await cacheDir.create(recursive: true);
}
return '${cacheDir.path}/faces_timeline_$personId.json';
}
Future<FaceTimeline?> _loadFromCache(String personId) async {
try {
final path = await _getCachePath(personId);
final file = File(path);
if (!await file.exists()) return null;
final json = jsonDecode(await file.readAsString());
return FaceTimeline.fromJson(json);
} catch (e) {
_logger.warning('Failed to load cache', e);
return null;
}
}
Future<void> _saveToCache(FaceTimeline timeline) async {
try {
final path = await _getCachePath(timeline.personId);
final file = File(path);
await file.writeAsString(jsonEncode(timeline.toJson()));
} catch (e) {
_logger.warning('Failed to save cache', e);
}
}
Future<void> clearCache(String personId) async {
try {
final path = await _getCachePath(personId);
final file = File(path);
if (await file.exists()) {
await file.delete();
}
} catch (e) {
_logger.warning('Failed to clear cache', e);
}
}
}

View File

@@ -0,0 +1,24 @@
import 'dart:async';
import 'package:logging/logging.dart';
import 'package:photos/models/faces_through_time/face_timeline_entry.dart';
import 'package:share_plus/share_plus.dart';
class FacesThroughTimeVideoService {
static final _logger = Logger('FacesThroughTimeVideoService');
Future<void> generateAndShareVideo(
List<FaceTimelineEntry> entries,
) async {
// TODO: Implement video generation using FFmpeg
// For now, just show a placeholder message
_logger.info('Video generation will be implemented with FFmpeg');
// Temporary: Share a text message instead
await SharePlus.instance.share(
ShareParams(
text: 'Check out this amazing face timeline! (Video generation coming soon)',
),
);
}
}

View File

@@ -0,0 +1,333 @@
import 'dart:async';
import 'package:flutter/material.dart';
import 'package:photos/models/faces_through_time/face_timeline.dart';
import 'package:photos/services/faces_through_time_service.dart';
import 'package:photos/services/faces_through_time_video_service.dart';
import 'package:photos/theme/ente_theme.dart';
import 'package:photos/ui/components/buttons/icon_button_widget.dart';
class FacesThroughTimePage extends StatefulWidget {
final String personId;
final String personName;
const FacesThroughTimePage({
super.key,
required this.personId,
required this.personName,
});
@override
State<FacesThroughTimePage> createState() => _FacesThroughTimePageState();
}
class _FacesThroughTimePageState extends State<FacesThroughTimePage> {
static const _slideshowInterval = Duration(seconds: 2);
FaceTimeline? _timeline;
int _currentIndex = 0;
Timer? _autoAdvanceTimer;
bool _isPaused = false;
bool _isLoading = true;
@override
void initState() {
super.initState();
_loadTimeline();
}
@override
void dispose() {
_autoAdvanceTimer?.cancel();
super.dispose();
}
Future<void> _loadTimeline() async {
final service = FacesThroughTimeService();
final timeline = await service.getTimeline(widget.personId);
if (timeline != null && mounted) {
setState(() {
_timeline = timeline;
_isLoading = false;
});
await service.markAsViewed(widget.personId);
_startAutoAdvance();
} else if (mounted) {
setState(() {
_isLoading = false;
});
}
}
void _startAutoAdvance() {
_autoAdvanceTimer?.cancel();
if (!_isPaused && _timeline != null) {
_autoAdvanceTimer = Timer.periodic(_slideshowInterval, (_) {
if (_currentIndex < _timeline!.entries.length - 1) {
if (mounted) {
setState(() {
_currentIndex++;
});
}
} else {
_autoAdvanceTimer?.cancel();
}
});
}
}
void _togglePause() {
setState(() {
_isPaused = !_isPaused;
});
if (_isPaused) {
_autoAdvanceTimer?.cancel();
} else {
_startAutoAdvance();
}
}
void _navigateTo(int index) {
if (index >= 0 && index < _timeline!.entries.length) {
setState(() {
_currentIndex = index;
});
_startAutoAdvance();
}
}
Future<void> _shareVideo() async {
if (_timeline == null) return;
setState(() {
_isPaused = true;
});
_autoAdvanceTimer?.cancel();
try {
final videoService = FacesThroughTimeVideoService();
await videoService.generateAndShareVideo(_timeline!.entries);
} catch (e) {
if (mounted) {
ScaffoldMessenger.of(context).showSnackBar(
SnackBar(content: Text('Failed to generate video: $e')),
);
}
} finally {
if (mounted) {
setState(() {
_isPaused = false;
});
_startAutoAdvance();
}
}
}
@override
Widget build(BuildContext context) {
final theme = getEnteColorScheme(context);
if (_isLoading) {
return Scaffold(
backgroundColor: theme.backgroundElevated,
appBar: AppBar(
backgroundColor: Colors.transparent,
elevation: 0,
),
body: const Center(
child: CircularProgressIndicator(),
),
);
}
if (_timeline == null || _timeline!.entries.isEmpty) {
return Scaffold(
backgroundColor: theme.backgroundElevated,
appBar: AppBar(
backgroundColor: Colors.transparent,
elevation: 0,
),
body: Center(
child: Text(
'No timeline available',
style: TextStyle(color: theme.textMuted),
),
),
);
}
final currentEntry = _timeline!.entries[_currentIndex];
return Scaffold(
backgroundColor: Colors.black,
body: GestureDetector(
onTapDown: (details) {
final width = MediaQuery.of(context).size.width;
final tapX = details.globalPosition.dx;
if (tapX < width * 0.3) {
// Tap on left - previous
_navigateTo(_currentIndex - 1);
} else if (tapX > width * 0.7) {
// Tap on right - next
_navigateTo(_currentIndex + 1);
} else {
// Tap in center - pause/resume
_togglePause();
}
},
onLongPressStart: (_) {
setState(() {
_isPaused = true;
});
_autoAdvanceTimer?.cancel();
},
onLongPressEnd: (_) {
setState(() {
_isPaused = false;
});
_startAutoAdvance();
},
child: Stack(
children: [
// Face display - placeholder for now
Center(
child: currentEntry.hasThumbnail && currentEntry.thumbnail != null
? Image.memory(
currentEntry.thumbnail!,
fit: BoxFit.contain,
gaplessPlayback: true,
)
: Container(
width: 200,
height: 200,
decoration: BoxDecoration(
color: theme.fillFaint,
borderRadius: BorderRadius.circular(8),
),
child: Icon(
Icons.person,
size: 80,
color: theme.textMuted,
),
),
),
// Top controls
Positioned(
top: MediaQuery.of(context).padding.top + 8,
left: 8,
right: 8,
child: Row(
mainAxisAlignment: MainAxisAlignment.spaceBetween,
children: [
IconButtonWidget(
icon: Icons.close,
iconButtonType: IconButtonType.primary,
onTap: () => Navigator.of(context).pop(),
),
IconButtonWidget(
icon: Icons.share,
iconButtonType: IconButtonType.primary,
onTap: _shareVideo,
),
],
),
),
// Bottom info
Positioned(
bottom: 100,
left: 0,
right: 0,
child: Center(
child: Container(
padding: const EdgeInsets.symmetric(
horizontal: 20,
vertical: 10,
),
decoration: BoxDecoration(
color: Colors.black.withValues(alpha: 0.6),
borderRadius: BorderRadius.circular(24),
),
child: Text(
currentEntry.displayText,
style: const TextStyle(
color: Colors.white,
fontSize: 16,
fontWeight: FontWeight.w500,
),
),
),
),
),
// Progress indicator
Positioned(
bottom: 50,
left: 24,
right: 24,
child: ClipRRect(
borderRadius: BorderRadius.circular(2),
child: LinearProgressIndicator(
value: (_currentIndex + 1) / _timeline!.entries.length,
backgroundColor: Colors.white.withValues(alpha: 0.3),
valueColor: const AlwaysStoppedAnimation<Color>(Colors.white),
minHeight: 3,
),
),
),
// Pause indicator
if (_isPaused)
Center(
child: Container(
padding: const EdgeInsets.all(20),
decoration: BoxDecoration(
color: Colors.black.withValues(alpha: 0.6),
shape: BoxShape.circle,
),
child: const Icon(
Icons.pause,
size: 40,
color: Colors.white,
),
),
),
// Navigation hints (subtle)
if (_currentIndex > 0)
const Positioned(
left: 16,
top: 0,
bottom: 0,
child: Center(
child: Icon(
Icons.chevron_left,
color: Colors.white30,
size: 32,
),
),
),
if (_currentIndex < _timeline!.entries.length - 1)
const Positioned(
right: 16,
top: 0,
bottom: 0,
child: Center(
child: Icon(
Icons.chevron_right,
color: Colors.white30,
size: 32,
),
),
),
],
),
),
);
}
}

View File

@@ -0,0 +1,91 @@
import 'package:flutter/material.dart';
import 'package:photos/models/ml/face/person.dart';
import 'package:photos/theme/ente_theme.dart';
class FacesTimelineBanner extends StatelessWidget {
final PersonEntity person;
final VoidCallback onTap;
const FacesTimelineBanner({
super.key,
required this.person,
required this.onTap,
});
@override
Widget build(BuildContext context) {
final theme = getEnteColorScheme(context);
return GestureDetector(
onTap: onTap,
child: Container(
margin: const EdgeInsets.symmetric(horizontal: 16, vertical: 8),
padding: const EdgeInsets.all(16),
decoration: BoxDecoration(
gradient: LinearGradient(
colors: [
theme.primary700,
theme.primary500,
],
begin: Alignment.topLeft,
end: Alignment.bottomRight,
),
borderRadius: BorderRadius.circular(12),
boxShadow: [
BoxShadow(
color: Colors.black.withValues(alpha: 0.1),
blurRadius: 10,
offset: const Offset(0, 4),
),
],
),
child: Row(
children: [
Container(
padding: const EdgeInsets.all(8),
decoration: BoxDecoration(
color: theme.backgroundElevated.withValues(alpha: 0.2),
borderRadius: BorderRadius.circular(8),
),
child: Icon(
Icons.auto_awesome,
color: theme.backgroundElevated,
size: 28,
),
),
const SizedBox(width: 16),
Expanded(
child: Column(
crossAxisAlignment: CrossAxisAlignment.start,
mainAxisSize: MainAxisSize.min,
children: [
Text(
'How ${person.data.name} grew over the years',
style: TextStyle(
color: theme.backgroundElevated,
fontSize: 16,
fontWeight: FontWeight.bold,
),
),
const SizedBox(height: 4),
Text(
'Tap to see their journey',
style: TextStyle(
color: theme.backgroundElevated.withValues(alpha: 0.9),
fontSize: 13,
),
),
],
),
),
Icon(
Icons.arrow_forward_ios,
color: theme.backgroundElevated,
size: 18,
),
],
),
),
);
}
}

View File

@@ -3,6 +3,7 @@ import "dart:async";
import 'package:flutter/material.dart';
import "package:logging/logging.dart";
import 'package:photos/core/event_bus.dart';
import "package:photos/events/faces_timeline_ready_event.dart";
import 'package:photos/events/files_updated_event.dart';
import 'package:photos/events/local_photos_updated_event.dart';
import "package:photos/events/people_changed_event.dart";
@@ -14,6 +15,7 @@ import 'package:photos/models/gallery_type.dart';
import "package:photos/models/ml/face/person.dart";
import "package:photos/models/search/search_result.dart";
import 'package:photos/models/selected_files.dart';
import "package:photos/services/faces_through_time_service.dart";
import "package:photos/services/machine_learning/face_ml/feedback/cluster_feedback.dart";
import "package:photos/services/search_service.dart";
import "package:photos/ui/components/end_to_end_banner.dart";
@@ -24,8 +26,9 @@ import "package:photos/ui/viewer/gallery/state/gallery_files_inherited_widget.da
import "package:photos/ui/viewer/gallery/state/inherited_search_filter_data.dart";
import "package:photos/ui/viewer/gallery/state/search_filter_data_provider.dart";
import "package:photos/ui/viewer/gallery/state/selection_state.dart";
import "package:photos/ui/viewer/people/faces_through_time_page.dart";
import "package:photos/ui/viewer/people/faces_timeline_banner.dart";
import "package:photos/ui/viewer/people/link_email_screen.dart";
import "package:photos/ui/viewer/people/people_app_bar.dart";
import "package:photos/ui/viewer/people/person_gallery_suggestion.dart";
import "package:photos/utils/navigation_util.dart";
@@ -57,6 +60,11 @@ class _PeoplePageState extends State<PeoplePage> {
late PersonEntity _person;
bool userDismissedPersonGallerySuggestion = false;
// Faces Through Time feature state
bool _timelineReady = false;
bool _timelineViewed = false;
StreamSubscription<FacesTimelineReadyEvent>? _timelineReadyEvent;
late final StreamSubscription<LocalPhotosUpdatedEvent> _filesUpdatedEvent;
late final StreamSubscription<PeopleChangedEvent> _peopleChangedEvent;
@@ -67,6 +75,19 @@ class _PeoplePageState extends State<PeoplePage> {
super.initState();
_person = widget.person;
ClusterFeedbackService.resetLastViewedClusterID();
// Check for Faces Through Time feature
_checkFacesTimeline();
// Listen for timeline ready events
_timelineReadyEvent = Bus.instance.on<FacesTimelineReadyEvent>().listen((event) {
if (event.personId == _person.remoteID && mounted) {
setState(() {
_timelineReady = true;
});
}
});
_peopleChangedEvent = Bus.instance.on<PeopleChangedEvent>().listen((event) {
if (event.type == PeopleEventType.saveOrEditPerson) {
if (event.person != null &&
@@ -118,11 +139,41 @@ class _PeoplePageState extends State<PeoplePage> {
files = sortedFiles;
return sortedFiles;
}
Future<void> _checkFacesTimeline() async {
final service = FacesThroughTimeService();
final isEligible = await service.isEligible(_person.remoteID);
if (isEligible) {
_timelineViewed = await service.hasBeenViewed(_person.remoteID);
if (!_timelineViewed) {
// Start preparing timeline in background
unawaited(service.checkAndPrepareTimeline(_person.remoteID));
} else if (mounted) {
setState(() {
_timelineReady = true;
});
}
}
}
void _openTimeline() {
Navigator.of(context).push(
MaterialPageRoute(
builder: (context) => FacesThroughTimePage(
personId: _person.remoteID,
personName: _person.data.name,
),
),
);
}
@override
void dispose() {
_filesUpdatedEvent.cancel();
_peopleChangedEvent.cancel();
_timelineReadyEvent?.cancel();
super.dispose();
}
@@ -179,6 +230,9 @@ class _PeoplePageState extends State<PeoplePage> {
personFiles: personFiles,
loadPersonFiles: loadPersonFiles,
personEntity: _person,
timelineReady: _timelineReady,
timelineViewed: _timelineViewed,
onOpenTimeline: _openTimeline,
);
},
)
@@ -188,6 +242,9 @@ class _PeoplePageState extends State<PeoplePage> {
personFiles: personFiles,
loadPersonFiles: loadPersonFiles,
personEntity: _person,
timelineReady: _timelineReady,
timelineViewed: _timelineViewed,
onOpenTimeline: _openTimeline,
),
FileSelectionOverlayBar(
PeoplePage.overlayType,
@@ -221,6 +278,9 @@ class _Gallery extends StatefulWidget {
final List<EnteFile> personFiles;
final Future<List<EnteFile>> Function() loadPersonFiles;
final PersonEntity personEntity;
final bool timelineReady;
final bool timelineViewed;
final VoidCallback onOpenTimeline;
const _Gallery({
required this.tagPrefix,
@@ -228,6 +288,9 @@ class _Gallery extends StatefulWidget {
required this.personFiles,
required this.loadPersonFiles,
required this.personEntity,
required this.timelineReady,
required this.timelineViewed,
required this.onOpenTimeline,
});
@override
@@ -267,6 +330,13 @@ class _GalleryState extends State<_Gallery> {
widget.personFiles.isNotEmpty ? [widget.personFiles.first] : [],
header: Column(
children: [
// Faces Through Time banner
widget.timelineReady && !widget.timelineViewed
? FacesTimelineBanner(
person: widget.personEntity,
onTap: widget.onOpenTimeline,
)
: const SizedBox.shrink(),
(widget.personEntity.data.email != null &&
widget.personEntity.data.email!.isNotEmpty) ||
widget.personEntity.data.isIgnored