Apple's Problem with Bodies
- #App Store
- #iOS Development
- #Content Moderation
- Apple's App Store struggles to categorize apps dealing with the human body and intimacy, often mislabeling them with mature ratings.
- The App Store's rating system is outdated: it originated in iTunes-era, film-style ratings and carries no context for modern app categories.
- Silk, a private intimacy tracker, was rated 16+ despite being a simple wellbeing journal with no explicit content.
- Apple's HealthKit avoids behavioral interpretation, so it lacks relational wellbeing metrics such as intimacy and closeness.
- App Store search and metadata systems fail to accurately classify apps that don't fit traditional categories.
- Developers face metadata challenges, since certain terms trigger moderation flags or extended review.
- The App Store's conservative approach stems from technical and regional constraints, leading to slow adoption of new categories.
- Historical examples like Ninjawords and iFart highlight the App Store's inconsistent and sometimes arbitrary rating decisions.
- New app categories often start as edge cases and gain legitimacy only after proving user demand.
- Building in undefined categories requires reverse-engineering the platform's assumptions and waiting for taxonomy updates.