Volunteers
On helping out someone else's machine
Meta rolled out Community Notes globally earlier this year, modelled after X’s version. The Oversight Board was asked to review the expansion. Meanwhile, Instagram’s “Your Algorithm” lets you see and edit the topics shaping your feed. Threads got “Dear Algo”, a similar proposition. More visibility, more user input. All of which, understandably, barely scratches the (metal) surface.
It is still largely a welcome change (I’m using that word generously). The notes enable regular users to shape content moderation, albeit at the expense of third-party fact-checkers (3PFC). Your Algorithm can make the recommendation system less opaque. Users contribute to the machine, though only to one part of it.
I tried to map this out in a recent piece for ISEAS. I started with two questions: is this technology reliable for borderline content, and is it controlled by the platform or open to external input? My version of an answer, mostly, is that the technologies capable of handling borderline content are also the ones most tightly held by platforms.
Maybe the question is not whether platforms should collaborate on content moderation. They already do, to a degree. Maybe the question is whether that collaboration can (or should?) be embedded into the technical architecture itself. That is harder to do than to write (heh).
My hesitation came from an impasse. Platforms are understandably reluctant to open up without clear regulatory protection. Regulators struggle to write rules without access to the technical systems they are trying to regulate. Both sides are stuck, and for reasons that make sense from where each of them sits.
But partnership by design still beats partnership by press release. The full piece is here.

