#twitter


@Gargron In solidarity with you and all you have done for us, I've deleted my and accounts....permanently.

I plan on focusing on and other forms of decentralization, as well as personal privacy technology.

You inspire us all!

It turns out the leader of the Green Party of 🏴󠁧󠁢󠁥󠁮󠁧󠁿 and 🏴󠁧󠁢󠁷󠁬󠁳󠁿 previously had an account on Mastodon (hello @ZackPolanski 👋) and so did the Green MP for Pav' (hello @sianberry 👋)

So I emailed the party to ask if they would consider moving away from Twitter/X in light of recent events and rebooting a Mastodon effort.

I shared links to mastodon.green and a set of UK-based instances, including toot.wales.

You too can email office@greenparty.org.uk to ask them to set up an official Mastodon account.

It's actually not okay how extreme things have to get before some folks will leave Twitter. O_o

And still, some people (who personally object to its many issues) just *still* linger there casually, seemingly immovable.

Like, anti-consumer, monolithic, data-harvesty platforms that misuse the personal info you submit to them. That alone should be bad enough, but instead the effective standard we hold as a global community is wayyy too low.

Personally I'm amazed people stayed on X after the Nazi salute, but whatever. The best time to leave Twitter was years ago, the second-best time is now.
---
To anybody still using X: sexual abuse content is the final straw, it’s time to leave - Marie Le Conte https://www.theguardian.com/commentisfree/2026/jan/12/x-sexual-abuse-time-to-leave-elon-musk-grok-imagery-women-children

X, formerly known as Twitter, is a machine that automates the mass production of CSAM. Its generative-AI chatbot, Grok, is being used to produce non-consensual sexually explicit images of children. The people who run Twitter know it, and appear to be fine with that. In fact, rather than stop it, they've moved the feature behind a paywall to extract more money from a certain type of customer.

According to reports in the media, Grok generates approximately 6,700 sexually explicit images per hour, and 85% of all the images Grok generates are sexually explicit. I think it's difficult to estimate how many of those are CSAM without spending a lot of time on Twitter, though the proportion is said to be high enough that witnessing it is unavoidable. Presumably, Grok also generates approximately 1,182 images per hour that are not sexually explicit. Gosh, that's a lot of non-sexual images.
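
(Where that 1,182 comes from, in case you want to check my working: this is my reading of the reported figures, not something the article states directly. Assuming the 6,700 explicit images per hour are 85% of Grok's total output, the non-explicit remainder is the other 15%:)

$$
\text{total} \approx \frac{6{,}700}{0.85} \approx 7{,}882 \ \text{images/hour}, \qquad 0.15 \times 7{,}882 \approx 1{,}182 \ \text{non-explicit images/hour}
$$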

I …