Is legal the same as legitimate: AI reimplementation and the erosion of copyleft
- #AI Reimplementation
- #Copyleft
- #Open Source
- Dan Blanchard, maintainer of chardet, released version 7.0, which is 48 times faster and supports multiple cores, with Anthropic's Claude listed as a contributor. The license changed from LGPL to MIT.
- Blanchard claims the reimplementation shares less than 1.3% of its code with prior versions, and argues it is therefore an independent work not bound by the LGPL. Original author Mark Pilgrim disagrees, citing the LGPL's requirement that derivative works carry the same license.
- Armin Ronacher (Flask creator) supports the relicensing, arguing GPL restricts sharing, while Salvatore Sanfilippo (antirez, Redis creator) defends AI reimplementation based on copyright law and GNU project history.
- The core debate: Does legality equate to legitimacy? Both defenders of the relicensing conflate legal permissibility with social legitimacy, sidestepping the ethical implications of removing copyleft protections.
- The GPL ensures recursive sharing by requiring derivative works to publish their source code, whereas MIT permits proprietary enclosure, which benefits those with more capital.
- Vercel's reaction to Cloudflare reimplementing Next.js (MIT-licensed) highlights hypocrisy: permissive licensing is praised until it affects one's own competitive interests.
- Legality and social legitimacy are distinct; breaking a social compact (like LGPL's reciprocity) may be legal but ethically questionable.
- Positional asymmetry: Established figures like Ronacher and antirez advocate for adaptation, ignoring how AI reimplementation erodes protections for smaller contributors.
- Bruce Perens warns of disrupted software economics, while others respond with adaptation or outright enthusiasm. The central question: Is copyleft more or less necessary as AI lowers the barriers to reimplementation?
- Proposal for a 'specification copyleft' to protect essential intellectual content (APIs, test suites) as AI-generated code blurs traditional source code boundaries.
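The post does not say how the sub-1.3% similarity figure was computed. As a minimal illustration of why such numbers are contestable, here is one way a textual similarity score between two source files could be estimated with Python's `difflib`; the function name and sample snippets below are hypothetical, and this is not necessarily the metric Blanchard used.

```python
import difflib

def similarity(old_source: str, new_source: str) -> float:
    """Return a 0-100 textual similarity score between two source texts.

    Uses difflib's longest-matching-subsequence ratio, one of many
    possible definitions of "similarity" -- a different metric (token-based,
    AST-based, etc.) could yield a very different percentage.
    """
    matcher = difflib.SequenceMatcher(None, old_source, new_source)
    return matcher.ratio() * 100

# Hypothetical before/after snippets to show the metric in action.
old = "def detect(data):\n    return guess_encoding(data)\n"
new = "class Detector:\n    def feed(self, chunk):\n        self.buf += chunk\n"

print(f"{similarity(old, new):.1f}% similar")
```

The takeaway is that a headline percentage depends entirely on the chosen metric and granularity (characters, tokens, functions), which is part of why the legal question of "independent work" cannot be settled by a single number.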