mirror of
https://github.com/SamyRai/tercul-backend.git
synced 2025-12-27 05:11:34 +00:00
Merge pull request #10 from SamyRai/docs/consolidate-tasks
Consolidate all tasks into a single TASKS.md file
This commit is contained in:
commit
9b96cb4341
18
AGENTS.md
18
AGENTS.md
@ -1,17 +1,3 @@
|
|||||||
# Agent Debugging Log
|
# Agent Instructions
|
||||||
|
|
||||||
## Issue: Integration Test Failures
|
This file is for providing instructions to AI agents. The previous content was a temporary debug log and has been cleared. Follow the instructions in `refactor.md` and the consolidated tasks in `TASKS.md`.
|
||||||
|
|
||||||
I've been encountering a series of integration test failures related to `unauthorized`, `forbidden`, and `directive binding is not implemented` errors.
|
|
||||||
|
|
||||||
### Initial Investigation
|
|
||||||
|
|
||||||
1. **`directive binding is not implemented` error:** This error was caused by the test server in `internal/adapters/graphql/integration_test.go` not being configured with the necessary validation directive.
|
|
||||||
2. **`unauthorized` and `forbidden` errors:** These errors were caused by tests that require authentication not being run with an authenticated user.
|
|
||||||
3. **Build Error:** My initial attempts to fix the test server setup introduced a build error in `cmd/api` due to a function signature mismatch in `NewServerWithAuth`.
|
|
||||||
|
|
||||||
### Resolution Path
|
|
||||||
|
|
||||||
1. **Fix Build Error:** I corrected the function signature in `cmd/api/server.go` to match the call site in `cmd/api/main.go`. This resolved the build error.
|
|
||||||
2. **Fix Test Server Setup:** I updated the `SetupSuite` function in `internal/adapters/graphql/integration_test.go` to register the `binding` directive, aligning the test server configuration with the production server.
|
|
||||||
3. **Fix Authentication in Tests:** The remaining `forbidden` errors are because the tests are not passing the authentication token for an admin user. I will now modify the failing tests to create an admin user and pass the token in the `executeGraphQL` function.
|
|
||||||
@ -1,14 +0,0 @@
|
|||||||
# Build Issues
|
|
||||||
|
|
||||||
This document tracks the build errors encountered during the refactoring process.
|
|
||||||
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:10:2: "log" imported and not used`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1071:24: r.App.AuthorRepo undefined (type *app.Application has no field or method AuthorRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1073:24: r.App.AuthorRepo undefined (type *app.Application has no field or method AuthorRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1089:36: r.App.Localization.GetAuthorBiography undefined (type *"tercul/internal/app/localization".Service has no field or method GetAuthorBiography)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1141:22: r.App.UserRepo undefined (type *app.Application has no field or method UserRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1143:24: r.App.UserRepo undefined (type *app.Application has no field or method UserRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1212:20: r.App.TagRepo undefined (type *app.Application has no field or method TagRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1225:32: r.App.TagRepo undefined (type *app.Application has no field or method TagRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1249:25: r.App.CategoryRepo undefined (type *app.Application has no field or method CategoryRepo)`
|
|
||||||
- [ ] `internal/adapters/graphql/schema.resolvers.go:1262:32: r.App.CategoryRepo undefined (type *app.Application has no field or method CategoryRepo)`
|
|
||||||
72
TASKS.md
Normal file
72
TASKS.md
Normal file
@ -0,0 +1,72 @@
|
|||||||
|
# Consolidated Tasks for Tercul
|
||||||
|
|
||||||
|
This document is the single source of truth for all outstanding development tasks, aligned with the architectural vision in `refactor.md`.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Urgent: Build Failures
|
||||||
|
|
||||||
|
These issues are currently breaking the build and must be resolved before any other work can proceed. They are all related to the ongoing refactor, where repositories are being replaced by application services. The `r.App` object in the resolver does not have direct repository access anymore.
|
||||||
|
|
||||||
|
- [ ] **Fix Resolver Build Errors in `internal/adapters/graphql/schema.resolvers.go`:**
|
||||||
|
- [ ] `line 10`: Remove the unused "log" import.
|
||||||
|
- [ ] `lines 1071, 1073`: Replace `r.App.AuthorRepo` calls with the appropriate application service method (e.g., `r.App.Authors.FindByID`).
|
||||||
|
- [ ] `line 1089`: Correct the call to `r.App.Localization.GetAuthorBiography` as the method does not exist. The correct service and method must be identified.
|
||||||
|
- [ ] `lines 1141, 1143`: Replace `r.App.UserRepo` calls with the correct user application service method.
|
||||||
|
- [ ] `lines 1212, 1225`: Replace `r.App.TagRepo` calls with the correct tag application service method.
|
||||||
|
- [ ] `lines 1249, 1262`: Replace `r.App.CategoryRepo` calls with the correct category application service method.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## High Priority
|
||||||
|
|
||||||
|
### Architecture & Refactoring (see `refactor.md`)
|
||||||
|
- [ ] **Implement Full Observability:**
|
||||||
|
- [ ] **Centralized Logging:** Ensure all services use the structured `zerolog` logger from `internal/platform/log`. Add request/user/span IDs to the logging context in the HTTP middleware.
|
||||||
|
- [ ] **Metrics:** Add Prometheus metrics for API request latency, error rates, and database query performance. Expose them on the `/metrics` endpoint.
|
||||||
|
- [ ] **Tracing:** Instrument all application services and data layer methods with OpenTelemetry tracing.
|
||||||
|
- [ ] **Establish CI/CD Pipeline:**
|
||||||
|
- [ ] **CI:** Create a `Makefile` target `lint-test` that runs `golangci-lint` and `go test ./...`. Configure the CI pipeline to run this on every push.
|
||||||
|
- [ ] **CD:** Set up automated deployments to a staging environment upon a successful merge to the main branch.
|
||||||
|
- [ ] **Refactor Testing Utilities:**
|
||||||
|
- [ ] **Decouple from DB:** Remove all database connection logic from `internal/testutil/testutil.go`. Tests should use mock repositories and services, not a live database. This is a key step towards faster, more reliable unit tests.
|
||||||
|
|
||||||
|
### Features
|
||||||
|
- [ ] **Implement Analytics Features:**
|
||||||
|
- **Context:** This is required for user engagement insights. The following counts need to be implemented and stored, likely on the `Work` and `Translation` models.
|
||||||
|
- [ ] Implement view counting.
|
||||||
|
- [ ] Implement like counting.
|
||||||
|
- [ ] Implement comment counting.
|
||||||
|
- [ ] Implement bookmark counting.
|
||||||
|
- [ ] Implement translation counting.
|
||||||
|
- [ ] Implement a service to calculate popular translations based on the above metrics.
|
||||||
|
- *Note: This is referenced in the old `TODO.md` and a TODO comment in `internal/jobs/linguistics/work_analysis_service.go`.*
|
||||||
|
|
||||||
|
- [ ] **Complete Unfinished Implementations:**
|
||||||
|
- [ ] `internal/app/work/commands.go`: Implement the `MergeWork` command. This should handle merging duplicate work entries.
|
||||||
|
- [ ] `internal/app/search/service.go`: The search service needs to fetch content from the translation service to enrich its search index.
|
||||||
|
- [ ] `cmd/tools/enrich/main.go`: This command-line tool is broken. It needs to be investigated, fixed, and documented.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Medium Priority
|
||||||
|
|
||||||
|
### Performance
|
||||||
|
- [ ] **Batch Weaviate Operations:** Refactor the Weaviate client in `internal/platform/search` to support batching of create/update operations to improve indexing performance.
|
||||||
|
- [ ] **Add Performance Benchmarks:** Using Go's built-in benchmarking tools, add benchmarks for critical API queries and commands to detect performance regressions.
|
||||||
|
|
||||||
|
### Code Quality & Architecture
|
||||||
|
- [ ] **Expand Weaviate Client:** The client in `internal/platform/search` should be extended to support indexing all relevant domain models (e.g., Authors, Tags).
|
||||||
|
- [ ] **Add Documentation:** Add GoDoc comments to all public functions and types in the `internal/app` and `internal/domain` packages.
|
||||||
|
- [ ] **Refactor Caching:** As per `refactor.md`, replace the current bespoke cached repositories with a decorator pattern in `internal/data/cache`. This will simplify cache invalidation logic.
|
||||||
|
- [ ] **Improve Configuration Handling:** Replace the global config object with struct-based configuration loaded from environment variables (e.g., using `koanf` or `envconfig`), as outlined in `refactor.md`.
|
||||||
|
|
||||||
|
### Monitoring & Logging
|
||||||
|
- [ ] **Add Job Monitoring:** Add specific monitoring and alerting for background jobs in `internal/jobs` to track success, failure, and duration.
|
||||||
|
- [ ] **Add Linguistics Metrics:** Add Prometheus metrics for the linguistics pipeline, including analysis duration, cache hit/miss rates, and third-party API usage.
|
||||||
|
|
||||||
|
---
|
||||||
|
|
||||||
|
## Low Priority
|
||||||
|
|
||||||
|
- [ ] **Refactor Transactional Runner:** Refactor the `RunTransactional` helper in `internal/testutil/testutil.go` to be more friendly to mock-based testing, likely by allowing a mock DB transaction to be passed in.
|
||||||
107
TODO.md
107
TODO.md
@ -1,107 +0,0 @@
|
|||||||
# TODO List for Tercul Go Application
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## Suggested Next Objectives
|
|
||||||
|
|
||||||
- [x] **Complete the Architecture Refactor (High, 5d):** Finalize the transition to a clean, domain-driven architecture. This will significantly improve maintainability, scalability, and developer velocity.
|
|
||||||
- [x] Ensure resolvers call application services only and add dataloaders per aggregate.
|
|
||||||
- [x] Adopt a migrations tool and move all SQL to migration files.
|
|
||||||
- [ ] Implement full observability with centralized logging, metrics, and tracing.
|
|
||||||
- [x] **Full Test Coverage (High, 5d):** Increase test coverage across the application to ensure stability and prevent regressions.
|
|
||||||
- [x] Write unit tests for all models, repositories, and services.
|
|
||||||
- [x] Refactor existing tests to use mocks instead of a real database.
|
|
||||||
- [ ] **Implement Analytics Features (High, 3d):** Add analytics to provide insights into user engagement and content popularity.
|
|
||||||
- [ ] Implement view, like, comment, and bookmark counting.
|
|
||||||
- [ ] Track translation analytics to identify popular translations.
|
|
||||||
- [ ] **Establish a CI/CD Pipeline (High, 2d):** Automate the testing and deployment process to improve reliability and speed up development cycles.
|
|
||||||
- [ ] Add `make lint test test-integration` to the CI pipeline.
|
|
||||||
- [ ] Set up automated deployments to a staging environment.
|
|
||||||
- [ ] **Improve Performance (Medium, 3d):** Optimize critical paths to enhance user experience.
|
|
||||||
- [ ] Implement batching for Weaviate operations.
|
|
||||||
- [ ] Add performance benchmarks for critical paths.
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## [ ] High Priority
|
|
||||||
|
|
||||||
### [ ] Architecture Refactor (DDD-lite)
|
|
||||||
- [x] Refactor domains to be more testable and decoupled, with 100% unit test coverage and logging.
|
|
||||||
- [x] `localization` domain
|
|
||||||
- [x] `auth` domain
|
|
||||||
- [x] `copyright` domain
|
|
||||||
- [x] `monetization` domain
|
|
||||||
- [x] `search` domain
|
|
||||||
- [x] `work` domain
|
|
||||||
- [x] Resolvers call application services only; add dataloaders per aggregate (High, 3d)
|
|
||||||
- [x] Adopt migrations tool (goose/atlas/migrate); move SQL to `internal/data/migrations`; delete `migrations.go` (High, 2d)
|
|
||||||
- [ ] Observability: centralize logging; add Prometheus metrics and OpenTelemetry tracing; request IDs (High, 3d)
|
|
||||||
- [ ] CI: add `make lint test test-integration` and integration tests with Docker compose (High, 2d)
|
|
||||||
|
|
||||||
### [x] Testing
|
|
||||||
- [x] Add unit tests for all models, repositories, and services (High, 3d)
|
|
||||||
- [x] Remove DB logic from `BaseSuite` for mock-based integration tests (High, 2d)
|
|
||||||
|
|
||||||
### [ ] Features
|
|
||||||
- [ ] Implement analytics data collection (High, 3d)
|
|
||||||
- [ ] Implement view counting for works and translations
|
|
||||||
- [ ] Implement like counting for works and translations
|
|
||||||
- [ ] Implement comment counting for works
|
|
||||||
- [ ] Implement bookmark counting for works
|
|
||||||
- [ ] Implement translation counting for works
|
|
||||||
- [ ] Implement translation analytics to show popular translations
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## [ ] Medium Priority
|
|
||||||
|
|
||||||
### [ ] Performance Improvements
|
|
||||||
- [ ] Implement batching for Weaviate operations (Medium, 2d)
|
|
||||||
|
|
||||||
### [ ] Code Quality & Architecture
|
|
||||||
- [ ] Expand Weaviate client to support all models (Medium, 2d)
|
|
||||||
- [ ] Add code documentation and API docs (Medium, 2d)
|
|
||||||
- [ ] Replace bespoke cached repositories with decorators in `internal/data/cache` (reads only; deterministic invalidation) (Medium, 2d)
|
|
||||||
- [ ] Config: replace ad-hoc config with env parsing + validation (e.g., koanf/envconfig); no globals (Medium, 1d)
|
|
||||||
|
|
||||||
### [ ] Testing
|
|
||||||
- [ ] Add performance benchmarks for critical paths (Medium, 2d)
|
|
||||||
- [ ] Add benchmarks for text analysis (sequential vs concurrent) and cache hit/miss rates
|
|
||||||
|
|
||||||
### [ ] Monitoring & Logging
|
|
||||||
- [ ] Add monitoring for background jobs and API endpoints (Medium, 2d)
|
|
||||||
- [ ] Add metrics for linguistics: analysis duration, cache hit/miss, provider usage
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## [ ] Low Priority
|
|
||||||
|
|
||||||
### [ ] Testing
|
|
||||||
- [ ] Refactor `RunTransactional` to be mock-friendly (Low, 1d)
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
## [ ] Completed
|
|
||||||
|
|
||||||
- [x] Add comprehensive input validation for all GraphQL mutations (High, 2d) - *Partially complete. Core mutations are validated.*
|
|
||||||
- [x] Create skeleton packages: `cmd/`, `internal/platform/`, `internal/domain/`, `internal/app/`, `internal/data/`, `internal/adapters/graphql/`, `internal/jobs/`
|
|
||||||
- [x] Move infra to `internal/platform/*` (`config`, `db`, `cache`, `auth`, `http`, `log`, `search`)
|
|
||||||
- [x] Wire DI in `cmd/api/main.go` and expose an `Application` facade to adapters
|
|
||||||
- [x] Unify GraphQL under `internal/adapters/graphql` and update `gqlgen.yml`; move `schema.graphqls` and resolvers
|
|
||||||
- [x] Introduce Unit-of-Work: `platform/db.WithTx(ctx, func(ctx) error)` and repo factory for `*sql.DB` / `*sql.Tx`
|
|
||||||
- [x] Split write vs read paths for `work` (commands.go, queries.go); make read models cacheable
|
|
||||||
- [x] Restructure `models/*` into domain aggregates with constructors and invariants
|
|
||||||
- [x] Security: move JWT/middleware to `internal/platform/auth`; add authz policy helpers (e.g., `CanEditWork`)
|
|
||||||
- [x] Search: move Weaviate client/schema to `internal/platform/search`, optional domain interface
|
|
||||||
- [x] Background jobs: move to `cmd/worker` and `internal/jobs/*`; ensure idempotency and lease
|
|
||||||
- [x] Python ops: move scripts to `/ops/migration` and `/ops/analysis`; keep outputs under `/ops/migration/outputs/`
|
|
||||||
- [x] Cleanup: delete dead packages (`store`, duplicate `repositories`); consolidate to `internal/data/sql`
|
|
||||||
- [x] Add integration tests for GraphQL API and background jobs (High, 3d) - *Partially complete. Core mutations are tested.*
|
|
||||||
- [x] Stabilize non-linguistics tests and interfaces (High, 2d)
|
|
||||||
- [x] Fix `graph` mocks to accept context in service interfaces
|
|
||||||
- [x] Update `repositories` tests (missing `TestModel`) and align with new repository interfaces
|
|
||||||
- [x] Update `services` tests to pass context and implement missing repo methods in mocks
|
|
||||||
|
|
||||||
---
|
|
||||||
|
|
||||||
> TODO items include context, priority, and estimated effort. Update this list after each milestone.
|
|
||||||
@ -98,4 +98,12 @@ func (m *mockWorkRepository) ListWithTranslations(ctx context.Context, page, pag
|
|||||||
func (m *mockWorkRepository) IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error) {
|
func (m *mockWorkRepository) IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error) {
|
||||||
args := m.Called(ctx, workID, authorID)
|
args := m.Called(ctx, workID, authorID)
|
||||||
return args.Bool(0), args.Error(1)
|
return args.Bool(0), args.Error(1)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (m *mockWorkRepository) GetWithAssociations(ctx context.Context, id uint) (*work.Work, error) {
|
||||||
|
return m.GetByID(ctx, id)
|
||||||
|
}
|
||||||
|
|
||||||
|
func (m *mockWorkRepository) GetWithAssociationsInTx(ctx context.Context, tx *gorm.DB, id uint) (*work.Work, error) {
|
||||||
|
return m.GetByID(ctx, id)
|
||||||
}
|
}
|
||||||
@ -68,4 +68,12 @@ func (m *mockWorkRepoForUserTests) ListWithTranslations(ctx context.Context, pag
|
|||||||
}
|
}
|
||||||
func (m *mockWorkRepoForUserTests) IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error) {
|
func (m *mockWorkRepoForUserTests) IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error) {
|
||||||
return false, nil
|
return false, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (m *mockWorkRepoForUserTests) GetWithAssociations(ctx context.Context, id uint) (*work.Work, error) {
|
||||||
|
return nil, nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func (m *mockWorkRepoForUserTests) GetWithAssociationsInTx(ctx context.Context, tx *gorm.DB, id uint) (*work.Work, error) {
|
||||||
|
return nil, nil
|
||||||
}
|
}
|
||||||
@ -135,3 +135,141 @@ func (c *WorkCommands) AnalyzeWork(ctx context.Context, workID uint) error {
|
|||||||
// TODO: implement this
|
// TODO: implement this
|
||||||
return nil
|
return nil
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// MergeWork merges two works, moving all associations from the source to the target and deleting the source.
|
||||||
|
func (c *WorkCommands) MergeWork(ctx context.Context, sourceID, targetID uint) error {
|
||||||
|
if sourceID == targetID {
|
||||||
|
return fmt.Errorf("%w: source and target work IDs cannot be the same", domain.ErrValidation)
|
||||||
|
}
|
||||||
|
|
||||||
|
userID, ok := platform_auth.GetUserIDFromContext(ctx)
|
||||||
|
if !ok {
|
||||||
|
return domain.ErrUnauthorized
|
||||||
|
}
|
||||||
|
|
||||||
|
// The repo is a work.WorkRepository, which embeds domain.BaseRepository.
|
||||||
|
// We can use the WithTx method from the base repository to run the merge in a transaction.
|
||||||
|
err := c.repo.WithTx(ctx, func(tx *gorm.DB) error {
|
||||||
|
// We need to use the transaction `tx` for all operations inside this function.
|
||||||
|
// For repository methods that are not on the base repository, we need to
|
||||||
|
// create a new repository instance that uses the transaction.
|
||||||
|
// However, since we added `GetWithAssociationsInTx`, we can pass the tx directly.
|
||||||
|
|
||||||
|
// Authorization: Ensure user can edit both works
|
||||||
|
sourceWork, err := c.repo.GetWithAssociationsInTx(ctx, tx, sourceID)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to get source work: %w", err)
|
||||||
|
}
|
||||||
|
targetWork, err := c.repo.GetWithAssociationsInTx(ctx, tx, targetID)
|
||||||
|
if err != nil {
|
||||||
|
return fmt.Errorf("failed to get target work: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
canEditSource, err := c.authzSvc.CanEditWork(ctx, userID, sourceWork)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
canEditTarget, err := c.authzSvc.CanEditWork(ctx, userID, targetWork)
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
if !canEditSource || !canEditTarget {
|
||||||
|
return domain.ErrForbidden
|
||||||
|
}
|
||||||
|
|
||||||
|
// Merge WorkStats
|
||||||
|
if err = mergeWorkStats(tx, sourceID, targetID); err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Re-associate polymorphic Translations
|
||||||
|
if err = tx.Model(&domain.Translation{}).
|
||||||
|
Where("translatable_id = ? AND translatable_type = ?", sourceID, "works").
|
||||||
|
Update("translatable_id", targetID).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to merge translations: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Append many-to-many associations
|
||||||
|
if err = tx.Model(targetWork).Association("Authors").Append(sourceWork.Authors); err != nil {
|
||||||
|
return fmt.Errorf("failed to merge authors: %w", err)
|
||||||
|
}
|
||||||
|
if err = tx.Model(targetWork).Association("Tags").Append(sourceWork.Tags); err != nil {
|
||||||
|
return fmt.Errorf("failed to merge tags: %w", err)
|
||||||
|
}
|
||||||
|
if err = tx.Model(targetWork).Association("Categories").Append(sourceWork.Categories); err != nil {
|
||||||
|
return fmt.Errorf("failed to merge categories: %w", err)
|
||||||
|
}
|
||||||
|
if err = tx.Model(targetWork).Association("Copyrights").Append(sourceWork.Copyrights); err != nil {
|
||||||
|
return fmt.Errorf("failed to merge copyrights: %w", err)
|
||||||
|
}
|
||||||
|
if err = tx.Model(targetWork).Association("Monetizations").Append(sourceWork.Monetizations); err != nil {
|
||||||
|
return fmt.Errorf("failed to merge monetizations: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Finally, delete the source work.
|
||||||
|
if err = tx.Select("Authors", "Tags", "Categories", "Copyrights", "Monetizations").Delete(sourceWork).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to delete source work associations: %w", err)
|
||||||
|
}
|
||||||
|
if err = tx.Delete(&work.Work{}, sourceID).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to delete source work: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// Re-index the target work in the search client *after* the transaction commits.
|
||||||
|
// We can't do it here, so we'll do it after the WithTx call.
|
||||||
|
return nil
|
||||||
|
})
|
||||||
|
|
||||||
|
if err != nil {
|
||||||
|
return err
|
||||||
|
}
|
||||||
|
|
||||||
|
// Re-index the target work in the search client now that the transaction is committed.
|
||||||
|
targetWork, err := c.repo.GetByID(ctx, targetID)
|
||||||
|
if err == nil && targetWork != nil {
|
||||||
|
if searchErr := c.searchClient.IndexWork(ctx, targetWork, ""); searchErr != nil {
|
||||||
|
// Log the error but don't fail the main operation
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
func mergeWorkStats(tx *gorm.DB, sourceWorkID, targetWorkID uint) error {
|
||||||
|
var sourceStats work.WorkStats
|
||||||
|
err := tx.Where("work_id = ?", sourceWorkID).First(&sourceStats).Error
|
||||||
|
if err != nil && !errors.Is(err, gorm.ErrRecordNotFound) {
|
||||||
|
return fmt.Errorf("failed to get source work stats: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
// If source has no stats, there's nothing to do.
|
||||||
|
if errors.Is(err, gorm.ErrRecordNotFound) {
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|
||||||
|
var targetStats work.WorkStats
|
||||||
|
err = tx.Where("work_id = ?", targetWorkID).First(&targetStats).Error
|
||||||
|
|
||||||
|
if errors.Is(err, gorm.ErrRecordNotFound) {
|
||||||
|
// If target has no stats, create new ones based on source stats.
|
||||||
|
sourceStats.ID = 0 // Let GORM create a new record
|
||||||
|
sourceStats.WorkID = targetWorkID
|
||||||
|
if err = tx.Create(&sourceStats).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to create new target stats: %w", err)
|
||||||
|
}
|
||||||
|
} else if err != nil {
|
||||||
|
return fmt.Errorf("failed to get target work stats: %w", err)
|
||||||
|
} else {
|
||||||
|
// Both have stats, so add source to target.
|
||||||
|
targetStats.Add(&sourceStats)
|
||||||
|
if err = tx.Save(&targetStats).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to save merged target stats: %w", err)
|
||||||
|
}
|
||||||
|
}
|
||||||
|
|
||||||
|
// Delete the old source stats
|
||||||
|
if err = tx.Delete(&work.WorkStats{}, sourceStats.ID).Error; err != nil {
|
||||||
|
return fmt.Errorf("failed to delete source work stats: %w", err)
|
||||||
|
}
|
||||||
|
|
||||||
|
return nil
|
||||||
|
}
|
||||||
|
|||||||
@ -3,13 +3,17 @@ package work
|
|||||||
import (
|
import (
|
||||||
"context"
|
"context"
|
||||||
"errors"
|
"errors"
|
||||||
|
"testing"
|
||||||
|
|
||||||
"github.com/stretchr/testify/assert"
|
"github.com/stretchr/testify/assert"
|
||||||
"github.com/stretchr/testify/suite"
|
"github.com/stretchr/testify/suite"
|
||||||
|
"gorm.io/driver/sqlite"
|
||||||
|
"gorm.io/gorm"
|
||||||
"tercul/internal/app/authz"
|
"tercul/internal/app/authz"
|
||||||
|
"tercul/internal/data/sql"
|
||||||
"tercul/internal/domain"
|
"tercul/internal/domain"
|
||||||
workdomain "tercul/internal/domain/work"
|
workdomain "tercul/internal/domain/work"
|
||||||
platform_auth "tercul/internal/platform/auth"
|
platform_auth "tercul/internal/platform/auth"
|
||||||
"testing"
|
|
||||||
)
|
)
|
||||||
|
|
||||||
type WorkCommandsSuite struct {
|
type WorkCommandsSuite struct {
|
||||||
@ -146,4 +150,93 @@ func (s *WorkCommandsSuite) TestDeleteWork_RepoError() {
|
|||||||
func (s *WorkCommandsSuite) TestAnalyzeWork_Success() {
|
func (s *WorkCommandsSuite) TestAnalyzeWork_Success() {
|
||||||
err := s.commands.AnalyzeWork(context.Background(), 1)
|
err := s.commands.AnalyzeWork(context.Background(), 1)
|
||||||
assert.NoError(s.T(), err)
|
assert.NoError(s.T(), err)
|
||||||
|
}
|
||||||
|
|
||||||
|
func TestMergeWork_Integration(t *testing.T) {
|
||||||
|
// Setup in-memory SQLite DB
|
||||||
|
db, err := gorm.Open(sqlite.Open("file::memory:?cache=shared"), &gorm.Config{})
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
// Run migrations for all relevant tables
|
||||||
|
err = db.AutoMigrate(
|
||||||
|
&workdomain.Work{},
|
||||||
|
&domain.Translation{},
|
||||||
|
&domain.Author{},
|
||||||
|
&domain.Tag{},
|
||||||
|
&domain.Category{},
|
||||||
|
&domain.Copyright{},
|
||||||
|
&domain.Monetization{},
|
||||||
|
&workdomain.WorkStats{},
|
||||||
|
&workdomain.WorkAuthor{},
|
||||||
|
)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
// Create real repositories and services pointing to the test DB
|
||||||
|
workRepo := sql.NewWorkRepository(db)
|
||||||
|
authzSvc := authz.NewService(workRepo, nil) // Using real repo for authz checks
|
||||||
|
searchClient := &mockSearchClient{} // Mock search client is fine
|
||||||
|
commands := NewWorkCommands(workRepo, searchClient, authzSvc)
|
||||||
|
|
||||||
|
// --- Seed Data ---
|
||||||
|
author1 := &domain.Author{Name: "Author One"}
|
||||||
|
db.Create(author1)
|
||||||
|
author2 := &domain.Author{Name: "Author Two"}
|
||||||
|
db.Create(author2)
|
||||||
|
|
||||||
|
tag1 := &domain.Tag{Name: "Tag One"}
|
||||||
|
db.Create(tag1)
|
||||||
|
tag2 := &domain.Tag{Name: "Tag Two"}
|
||||||
|
db.Create(tag2)
|
||||||
|
|
||||||
|
sourceWork := &workdomain.Work{
|
||||||
|
TranslatableModel: domain.TranslatableModel{Language: "en"},
|
||||||
|
Title: "Source Work",
|
||||||
|
Authors: []*domain.Author{author1},
|
||||||
|
Tags: []*domain.Tag{tag1},
|
||||||
|
}
|
||||||
|
db.Create(sourceWork)
|
||||||
|
db.Create(&domain.Translation{Title: "Source Translation", Language: "en", TranslatableID: sourceWork.ID, TranslatableType: "works"})
|
||||||
|
db.Create(&workdomain.WorkStats{WorkID: sourceWork.ID, Views: 10, Likes: 5})
|
||||||
|
|
||||||
|
targetWork := &workdomain.Work{
|
||||||
|
TranslatableModel: domain.TranslatableModel{Language: "en"},
|
||||||
|
Title: "Target Work",
|
||||||
|
Authors: []*domain.Author{author2},
|
||||||
|
Tags: []*domain.Tag{tag2},
|
||||||
|
}
|
||||||
|
db.Create(targetWork)
|
||||||
|
db.Create(&domain.Translation{Title: "Target Translation", Language: "en", TranslatableID: targetWork.ID, TranslatableType: "works"})
|
||||||
|
db.Create(&workdomain.WorkStats{WorkID: targetWork.ID, Views: 20, Likes: 10})
|
||||||
|
|
||||||
|
// --- Execute Merge ---
|
||||||
|
ctx := platform_auth.ContextWithAdminUser(context.Background(), 1)
|
||||||
|
err = commands.MergeWork(ctx, sourceWork.ID, targetWork.ID)
|
||||||
|
assert.NoError(t, err)
|
||||||
|
|
||||||
|
// --- Assertions ---
|
||||||
|
// 1. Source work should be deleted
|
||||||
|
var deletedWork workdomain.Work
|
||||||
|
err = db.First(&deletedWork, sourceWork.ID).Error
|
||||||
|
assert.Error(t, err)
|
||||||
|
assert.True(t, errors.Is(err, gorm.ErrRecordNotFound))
|
||||||
|
|
||||||
|
// 2. Target work should have merged data
|
||||||
|
var finalTargetWork workdomain.Work
|
||||||
|
db.Preload("Translations").Preload("Authors").Preload("Tags").First(&finalTargetWork, targetWork.ID)
|
||||||
|
|
||||||
|
assert.Len(t, finalTargetWork.Translations, 2, "Translations should be merged")
|
||||||
|
assert.Len(t, finalTargetWork.Authors, 2, "Authors should be merged")
|
||||||
|
assert.Len(t, finalTargetWork.Tags, 2, "Tags should be merged")
|
||||||
|
|
||||||
|
// 3. Stats should be merged
|
||||||
|
var finalStats workdomain.WorkStats
|
||||||
|
db.Where("work_id = ?", targetWork.ID).First(&finalStats)
|
||||||
|
assert.Equal(t, int64(30), finalStats.Views, "Views should be summed")
|
||||||
|
assert.Equal(t, int64(15), finalStats.Likes, "Likes should be summed")
|
||||||
|
|
||||||
|
// 4. Source stats should be deleted
|
||||||
|
var deletedStats workdomain.WorkStats
|
||||||
|
err = db.First(&deletedStats, "work_id = ?", sourceWork.ID).Error
|
||||||
|
assert.Error(t, err, "Source stats should be deleted")
|
||||||
|
assert.True(t, errors.Is(err, gorm.ErrRecordNotFound))
|
||||||
}
|
}
|
||||||
@ -2,6 +2,8 @@ package sql
|
|||||||
|
|
||||||
import (
|
import (
|
||||||
"context"
|
"context"
|
||||||
|
"errors"
|
||||||
|
"fmt"
|
||||||
"tercul/internal/domain"
|
"tercul/internal/domain"
|
||||||
"tercul/internal/domain/work"
|
"tercul/internal/domain/work"
|
||||||
|
|
||||||
@ -120,6 +122,43 @@ func (r *workRepository) GetWithTranslations(ctx context.Context, id uint) (*wor
|
|||||||
return r.FindWithPreload(ctx, []string{"Translations"}, id)
|
return r.FindWithPreload(ctx, []string{"Translations"}, id)
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// GetWithAssociations gets a work with all of its direct and many-to-many associations.
|
||||||
|
func (r *workRepository) GetWithAssociations(ctx context.Context, id uint) (*work.Work, error) {
|
||||||
|
associations := []string{
|
||||||
|
"Translations",
|
||||||
|
"Authors",
|
||||||
|
"Tags",
|
||||||
|
"Categories",
|
||||||
|
"Copyrights",
|
||||||
|
"Monetizations",
|
||||||
|
}
|
||||||
|
return r.FindWithPreload(ctx, associations, id)
|
||||||
|
}
|
||||||
|
|
||||||
|
// GetWithAssociationsInTx gets a work with all associations within a transaction.
|
||||||
|
func (r *workRepository) GetWithAssociationsInTx(ctx context.Context, tx *gorm.DB, id uint) (*work.Work, error) {
|
||||||
|
var entity work.Work
|
||||||
|
query := tx.WithContext(ctx)
|
||||||
|
associations := []string{
|
||||||
|
"Translations",
|
||||||
|
"Authors",
|
||||||
|
"Tags",
|
||||||
|
"Categories",
|
||||||
|
"Copyrights",
|
||||||
|
"Monetizations",
|
||||||
|
}
|
||||||
|
for _, preload := range associations {
|
||||||
|
query = query.Preload(preload)
|
||||||
|
}
|
||||||
|
if err := query.First(&entity, id).Error; err != nil {
|
||||||
|
if errors.Is(err, gorm.ErrRecordNotFound) {
|
||||||
|
return nil, ErrEntityNotFound
|
||||||
|
}
|
||||||
|
return nil, fmt.Errorf("%w: %v", ErrDatabaseOperation, err)
|
||||||
|
}
|
||||||
|
return &entity, nil
|
||||||
|
}
|
||||||
|
|
||||||
// IsAuthor checks if a user is an author of a work.
|
// IsAuthor checks if a user is an author of a work.
|
||||||
// Note: This assumes a direct relationship between user ID and author ID,
|
// Note: This assumes a direct relationship between user ID and author ID,
|
||||||
// which may need to be revised based on the actual domain model.
|
// which may need to be revised based on the actual domain model.
|
||||||
|
|||||||
@ -71,6 +71,22 @@ type WorkStats struct {
|
|||||||
Work *Work `gorm:"foreignKey:WorkID;constraint:OnUpdate:CASCADE,OnDelete:CASCADE;"`
|
Work *Work `gorm:"foreignKey:WorkID;constraint:OnUpdate:CASCADE,OnDelete:CASCADE;"`
|
||||||
}
|
}
|
||||||
|
|
||||||
|
// Add combines the values of another WorkStats into this one.
|
||||||
|
func (ws *WorkStats) Add(other *WorkStats) {
|
||||||
|
if other == nil {
|
||||||
|
return
|
||||||
|
}
|
||||||
|
ws.Views += other.Views
|
||||||
|
ws.Likes += other.Likes
|
||||||
|
ws.Comments += other.Comments
|
||||||
|
ws.Bookmarks += other.Bookmarks
|
||||||
|
ws.Shares += other.Shares
|
||||||
|
ws.TranslationCount += other.TranslationCount
|
||||||
|
ws.ReadingTime += other.ReadingTime
|
||||||
|
// Note: Complexity and Sentiment are not additive. We could average them,
|
||||||
|
// but for now, we'll just keep the target's values.
|
||||||
|
}
|
||||||
|
|
||||||
type WorkSeries struct {
|
type WorkSeries struct {
|
||||||
domain.BaseModel
|
domain.BaseModel
|
||||||
WorkID uint `gorm:"index;uniqueIndex:uniq_work_series"`
|
WorkID uint `gorm:"index;uniqueIndex:uniq_work_series"`
|
||||||
|
|||||||
@ -2,6 +2,7 @@ package work
|
|||||||
|
|
||||||
import (
|
import (
|
||||||
"context"
|
"context"
|
||||||
|
"gorm.io/gorm"
|
||||||
"tercul/internal/domain"
|
"tercul/internal/domain"
|
||||||
)
|
)
|
||||||
|
|
||||||
@ -13,6 +14,8 @@ type WorkRepository interface {
|
|||||||
FindByCategory(ctx context.Context, categoryID uint) ([]Work, error)
|
FindByCategory(ctx context.Context, categoryID uint) ([]Work, error)
|
||||||
FindByLanguage(ctx context.Context, language string, page, pageSize int) (*domain.PaginatedResult[Work], error)
|
FindByLanguage(ctx context.Context, language string, page, pageSize int) (*domain.PaginatedResult[Work], error)
|
||||||
GetWithTranslations(ctx context.Context, id uint) (*Work, error)
|
GetWithTranslations(ctx context.Context, id uint) (*Work, error)
|
||||||
|
GetWithAssociations(ctx context.Context, id uint) (*Work, error)
|
||||||
|
GetWithAssociationsInTx(ctx context.Context, tx *gorm.DB, id uint) (*Work, error)
|
||||||
ListWithTranslations(ctx context.Context, page, pageSize int) (*domain.PaginatedResult[Work], error)
|
ListWithTranslations(ctx context.Context, page, pageSize int) (*domain.PaginatedResult[Work], error)
|
||||||
IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error)
|
IsAuthor(ctx context.Context, workID uint, authorID uint) (bool, error)
|
||||||
}
|
}
|
||||||
Loading…
Reference in New Issue
Block a user