Some people just can't help but suck the fun out of things.
Case in point: the word "clanker," a quasi-new slur circulating on social media aimed at AI chatbots. A few weeks ago, I reported on the trend as it gained traction on X and TikTok, largely in satirical posts imagining a future where robots had fully integrated into society. It was dark humor rooted in resentment toward Big Tech and the growing role of AI in daily life.
But the originally ironic nature of the term didn't last long.
Like many memes that start out as parody, "clanker" quickly morphed into something uglier. What began as a tongue-in-cheek Star Wars nod has since curdled into a pejorative that riffs on real racial slurs.
It's disappointing. In just a matter of weeks, the joke slid from harmless gags about not wanting a robot to serve us popcorn at the Tesla diner to full-on skits mimicking Jim Crow–era scenes of racial discrimination.
When it comes to online slang, this phenomenon isn't new. On social media, and in politics, language has long been co-opted into coded shorthand, especially for targeting Black communities. Think of how terms like "Critical Race Theory" or "DEI" get wielded as stand-ins, or how online slang once leaned on "ni🅱️🅱️a" and now "yn." The Ugandan Knuckles "do you know da wae" meme famously got really racist, really fast.
It seems the cycle is repeating itself once again.