The Black Opticon
A Search for the Algorithm Determined to Distort My Image
The screen glows, a portal to infinite knowledge. My fingers, like millions of others, perform the hopeful tap-tap-tap, searching for connection, for information, for myself. I type “black girl hairstyles” into the search bar, looking for a style that won’t leave my scalp burning. What the algorithm returns isn’t a helpful guide. It’s a curated gallery of fetishization and stereotypes. The innocent query “black boy” doesn’t summon images of childhood or promise; it returns mugshots and crime stories. It’s a digital verdict delivered before I’ve even clicked a link. It’s like the screen knows I’m Black.
This is the reality of the Black Opticon, a term coined by scholar Anita L. Allen. It’s more than just being watched; it’s being watched through a lens ground by centuries of bias. We are simultaneously hyper-visible, reduced to a narrow set of degrading images, and invisible, our true, complex humanity completely absent from the results.
This daily digital experience raises a haunting question: Are algorithms simply mirrors, passively reflecting society’s oldest prejudices? Or are they something far more dangerous—megaphones, engineered for maximum engagement, that amplify and blast a distorted, centuries-old story about Blackness directly back at us?
When I type “black girl” into the world’s most powerful information system and am met with images of strippers, or search “black boy” and see threats, that’s not just a broken search result. It’s a reflection of how the system sees me. It’s a judgment that reaches into the future of any young person by contaminating the very source they turn to in order to understand who they are and where they belong. The real issue isn’t simply what the algorithm shows, but what it’s designed to overlook: the full depth and complexity of our humanity.