Wrong ways to use databases, when the pendulum swung too far
- #lessons-learned
- #software-development
- #database-design
- The author recounts their experience as a junior developer working on a critical pipeline system inherited from an offshore team.
- The system was complex: aging codebases, slow builds (15-30 minutes), and development confined to VMs.
- Outages were frequent due to flaky tests, hidden undocumented features, and database issues, particularly with stored procedures and MSDTC (Microsoft Distributed Transaction Coordinator).
- The original database design embedded business logic in stored procedures, causing API latency and reliability issues.
- A rewrite was initiated to move away from relational databases to a simple key-value store model with only four operations: Read, Insert, Update, Delete.
- The new design stored data as JSON documents, producing large documents and making partial updates inefficient: every change required reading and rewriting the full document.
- Gzip compression was introduced to reduce I/O, but it made the stored data opaque to inspection, forcing the team to build new tooling.
- With no transactions or batching in the KV store, a checkpointing system was needed for idempotency, adding I/O operations and latency.
- The author left the team during the rewrite but reflects on the lessons learned from the experience.
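The four-operation model and its update cost can be sketched as follows. This is a minimal illustration, not the team's actual API: `KVStore` and `set_order_status` are hypothetical names, and the in-memory dict stands in for the real backend.

```python
import json


class KVStore:
    """Hypothetical in-memory stand-in for a store with only four operations."""

    def __init__(self):
        self._data = {}

    def read(self, key):
        return self._data[key]

    def insert(self, key, value):
        if key in self._data:
            raise KeyError(f"{key} already exists")
        self._data[key] = value

    def update(self, key, value):
        if key not in self._data:
            raise KeyError(f"{key} not found")
        self._data[key] = value

    def delete(self, key):
        del self._data[key]


def set_order_status(store, order_id, status):
    # With no partial-update operation, changing one field still costs
    # a full-document read plus a full-document write.
    doc = json.loads(store.read(order_id))   # read the entire JSON blob
    doc["status"] = status                   # mutate a single field
    store.update(order_id, json.dumps(doc))  # write the entire blob back
```

The larger the document grows, the more I/O each one-field change incurs, which is the inefficiency the summary describes.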
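The compression trade-off can be shown with Python's standard `gzip` and `json` modules; the helper names here are illustrative, not from the original system. Once values are gzipped bytes, nothing human-readable survives in the store, which is why extra inspection tooling became necessary:

```python
import gzip
import json


def store_doc(doc: dict) -> bytes:
    """Compress the JSON before writing to cut I/O; the stored value
    becomes opaque binary rather than readable text."""
    return gzip.compress(json.dumps(doc).encode("utf-8"))


def inspect_doc(blob: bytes) -> str:
    """The kind of tooling the team had to build: decompress and
    pretty-print before a human can read a stored value."""
    return json.dumps(json.loads(gzip.decompress(blob)), indent=2)
```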
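A checkpointing scheme for idempotency without transactions might look like the sketch below (hypothetical names; a plain dict stands in for the store). After each step a checkpoint is persisted, so a retried job skips work already done, at the price of one extra write per step:

```python
def run_job(store: dict, job_id: str, steps):
    """Replay-safe job runner: checkpoint progress after every step so a
    retry after a crash resumes instead of re-applying earlier steps."""
    ckpt_key = f"ckpt:{job_id}"
    done = store.get(ckpt_key, 0)        # extra read on every attempt
    for i, step in enumerate(steps):
        if i < done:
            continue                     # already applied on a prior attempt
        step()
        store[ckpt_key] = i + 1          # extra I/O: one checkpoint write per step
```

Running the same job twice applies each step only once, which is the idempotency the checkpoints buy, and the per-step checkpoint traffic is the added I/O and latency the summary mentions.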