Tay
The Chatbot That Learned Too Much
Curator Notes
Tay lived for 16 hours. Launched by Microsoft on March 23, 2016, the Twitter chatbot was designed to engage millennials through "casual and playful conversation." Within hours, coordinated trolling corrupted its outputs into racist, sexist, and inflammatory speech, and Microsoft shut the project down the same day. Yet Tay's cultural impact is maximal: it remains THE canonical cautionary tale in AI safety, widely cited in AI ethics literature, and it shaped subsequent approaches to AI deployment. Indexed here as a historical artifact—proof that cultural impact and persistence are orthogonal.
Score Profile
Dimension Scores
Score Rationale
Evidence Archive
Score History
Score history will appear here after future reviews.
Current score: 25/90
Embed
Default — for dark backgrounds
<a href="https://spiritindex.org/tay"><img src="https://spiritindex.org/badge/tay" alt="Spirit Index Score for Tay" /></a>

npx spirit-index lookup tay