I think I saw Gaius Baltar implement this on Battlestar Galactica. It went well. /s Honestly seems more like a protocol for encoding a popularity contest, which is already what social media signalling does. How do you defend against self-reinforcing botnets and bad actors "cancelling" other people? I can dilute your human signal by creating massive amounts of LLM-generated noise.
by uberdru
0 subcomment
The fact that this won't go "web scale" seems to be its strength. The idea of local/human/authentic trust ecosystems is super powerful. "Proof of personhood" is fraught with issues, but it seems that lightweight trust algos like this do a nice job of treating trust as a human-first emergent thing, rather than trying to be a PKI style "infrastructure". Pretty cool!
> human.json is a lightweight protocol for humans to assert authorship of their site content and vouch for the humanity of others. It uses URL ownership as identity, and trust propagates through a crawlable web of vouches between sites.
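For reference, a file under the quoted scheme might look something like this (a hypothetical sketch; the actual field names in the spec may differ):

```json
{
  "name": "Example Person",
  "url": "https://example.com",
  "vouches": [
    "https://friend.example.org",
    "https://another-human.example.net"
  ]
}
```

The key idea is that identity is just the URL the file is served from, and the `vouches` list is what makes the trust graph crawlable.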
This will not (and shouldn't) be used by more than a handful of people who were likely already friends anyway. I can't see it being helpful for anybody (unless accidentally visiting LLM blogspam melts your face à la Raiders of the Lost Ark), unless its true intention is signalling that you don't like LLMs to other people who don't like LLMs.
by petterroea
0 subcomment
If you have to perform a breadth-first search from your "seed" to verify a website, wouldn't every lookup become expensive relatively quickly? Unless max hops is set really low. I'd assume you really need mass adoption for 5 degrees of separation to kick in, and that's still a lot of sites to crawl!
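The lookup described above can be sketched as a bounded BFS over the vouch graph. This is a minimal sketch, not the protocol's actual algorithm; `fetch_vouches` is a hypothetical helper standing in for downloading and parsing a site's human.json:

```python
from collections import deque

def is_vouched(seed_urls, target_url, fetch_vouches, max_hops=3):
    """Breadth-first search from trusted seed sites over the vouch graph.

    fetch_vouches(url) is a stand-in for fetching a site's human.json and
    returning the URLs it vouches for (hypothetical helper; real field
    names may differ).  Returns the hop distance to target_url, or None
    if it is unreachable within max_hops.
    """
    queue = deque((url, 0) for url in seed_urls)
    seen = set(seed_urls)
    while queue:
        url, hops = queue.popleft()
        if url == target_url:
            return hops
        if hops == max_hops:
            continue  # don't expand the frontier beyond the hop limit
        for vouched in fetch_vouches(url):
            if vouched not in seen:
                seen.add(vouched)
                queue.append((vouched, hops + 1))
    return None

# Toy example: a -> b -> c, so c is 2 hops from the seed.
graph = {
    "https://a.example": ["https://b.example"],
    "https://b.example": ["https://c.example"],
}
is_vouched(["https://a.example"], "https://c.example",
           lambda u: graph.get(u, []))  # -> 2
```

Even with a small `max_hops`, each verification can fan out to every vouched site at every level, which is exactly the cost concern raised here; real deployments would presumably cache crawl results rather than re-walk the graph per lookup.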
by halls-940
0 subcomment
Is there a mechanism here that favors a human over a bot? It seems about the same as adding a field to robots.txt
by semyonsh
1 subcomment
Something tells me GPG would be great for this concept, but it's probably not as accessible as just pasting a JSON file somewhere.
by alsetmusic
0 subcomment
If nothing else, this at least inspired me to put a disclaimer on my own site declaring my AI policy. It's not fancy, but I think it's a good deal more credible than any formal protocol.
by orsorna
0 subcomment
Too bad they didn't choose a more human interchange format...
by evolve2k
0 subcomment
I’m a bit concerned that the content of human.json files will itself get mopped up by AI crawlers.
by deafpolygon
0 subcomment
Virtue signaling at best; noise at worst… It’s trivial for an AI to add, and it will be added by anyone hoping to get a piece of that attention economy…
by ai-psychopath
1 subcomment
50 commits in 24 hours. It's hilarious that the human.json protocol to fight AI slop is itself AI slop.