karpintero@lemmy.world
on 07 Jul 16:08
nextcollapse
Cool project. The closing slide was pretty funny
If you are working at an AI company, here’s how you can sabotage Anubis development as easily and quickly as possible. So first is quit your job, second is work for Square Enix, and third is make absolute banger stuff for Final Fantasy XIV. That’s how you can sabotage this the best.
rhythmisaprancer@piefed.social
on 08 Jul 03:42
nextcollapse
Interesting. I clicked on a link here a couple weeks ago and was presented with this and wasn't really sure what it was. Thanks for sharing this! It seems like a good alternative.
PushButton@lemmy.world
on 08 Jul 04:57
nextcollapse
I use the lynx browser sometimes, for hacker news, some blogs that I follow, or just for a quick browse to find an answer.
The fact that more and more websites need this kind of protection saddens me, since lynx doesn’t support JavaScript.
Guess that’s why I’ve seen Anubis check screen quite a few times.
morbidcactus@lemmy.ca
on 08 Jul 11:50
nextcollapse
Afaik, almost every browser uses “Mozilla/5.0” as part of its user agent; Mozilla mentions this in its developer docs about user agents. It’s a historical compatibility thing, apparently.
it’s even stupider, it’s more like why there is no windows 9: because of programs doing stuff like
if os.name.startswith("windows 9") then print("this program is not compatible with windows 98") end
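The failure mode being joked about is easy to demonstrate. A minimal Python sketch (the function name is mine, for illustration):

```python
def is_windows_9x(os_name: str) -> bool:
    # The infamous lazy check: meant to match "windows 95" and "windows 98",
    # but a hypothetical "windows 9" would match too.
    return os_name.lower().startswith("windows 9")

print(is_windows_9x("Windows 95"))  # True
print(is_windows_9x("Windows 98"))  # True
print(is_windows_9x("Windows 9"))   # True -- the collision that (reportedly) made the name unusable
print(is_windows_9x("Windows 10"))  # False
```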
sugar_in_your_tea@sh.itjust.works
on 08 Jul 13:54
collapse
Yup. There was a time when Mozilla was somewhat dominant, so browsers unlocked features based on the browser being Mozilla (as opposed to Internet Explorer).
well if you want to get into it, i think the last browser that didn’t have mozilla in the useragent was internet explorer, which had “trident/9.0” or something. every other browser on the market is based on the old KDE browser Konqueror, which had “khtml, like gecko” in it. when that didn’t work they just added “mozilla” to it. then apple took that codebase and added “safari”, chrome took that codebase and added “chrome”, etc etc etc. compatibility problems just kept compounding on every browser based on khtml until we got to the point where microsoft edge’s current user agent is
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/134.0.0.0 Safari/537.36 Edg/134.0.0
even firefox has had to give in to this: my useragent is
Mozilla/5.0 (X11; Linux x86_64; rv:140.0) Gecko/20100101 Firefox/140.0
even though the version of gecko in firefox 140 is v125, from 2022.
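The token pile-up is easy to see by checking which historical engine names still appear in that Edge string (copied from the comment above):

```python
# Edge's user agent string, as quoted in the comment above.
edge_ua = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
           "AppleWebKit/537.36 (KHTML, like Gecko) "
           "Chrome/134.0.0.0 Safari/537.36 Edg/134.0.0")

# Every layer of the compatibility history is still present,
# so old browser-sniffing code keeps working.
for token in ("Mozilla", "AppleWebKit", "KHTML", "Gecko", "Chrome", "Safari", "Edg"):
    print(token, token in edge_ua)  # prints True for every token
```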
EncryptKeeper@lemmy.world
on 08 Jul 18:12
collapse
The creator of Anubis did an interview on the Selfhosted Show podcast a little while back and explains this in detail, and it’s worth a listen.
LovableSidekick@lemmy.world
on 08 Jul 19:28
collapse
Great interview! The whole proof-of-work approach is fascinating, and it reminds me of a very old email concept he mentions in passing: an email server would only accept a message if the sender agreed to put up, say, a dollar. The recipient would then accept the message, refunding the dollar. Legitimate senders would end up paying nothing, but spammers would have to front far too much money for mass emailing to stay affordable. In his version the sender must do a processor-intensive computation instead, which is negligible at the volumes legitimate senders operate at but prohibitive for spammers.
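The computation-based variant described here is essentially the Hashcash idea, and Anubis's browser challenge works on the same principle. A minimal sketch in Python, with toy parameters for illustration (this is not Anubis's actual scheme):

```python
import hashlib

def solve(challenge: str, difficulty_bits: int = 12) -> int:
    """Find a nonce whose SHA-256 (with the challenge) has enough leading zero bits.

    difficulty_bits=12 is a toy setting; real deployments use more bits.
    """
    target = 1 << (256 - difficulty_bits)  # hash value must fall below this
    nonce = 0
    while True:
        digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
        if int.from_bytes(digest, "big") < target:
            return nonce  # expensive to find (many hash attempts on average)...
        nonce += 1

def verify(challenge: str, nonce: int, difficulty_bits: int = 12) -> bool:
    # ...but cheap to check: a single hash.
    digest = hashlib.sha256(f"{challenge}:{nonce}".encode()).digest()
    return int.from_bytes(digest, "big") < (1 << (256 - difficulty_bits))
```

The asymmetry is the whole point: stamping one message (or loading one page) costs a legitimate sender a negligible amount of CPU, while stamping millions does not scale.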
Shanmugha@lemmy.world
on 08 Jul 11:55
nextcollapse
My hero in shining armour (not sarcasm, just a form of appreciation for someone who did what I would never have done)
LovableSidekick@lemmy.world
on 08 Jul 19:08
collapse
Seems like it screens out bots regardless of whether they use AI or are just traditional asshole-created bots.
lagoon8622@sh.itjust.works
on 08 Jul 19:12
collapse
It does, yes. This will prevent your content from being indexed in most cases. The benefit is that your servers will stay up; if your servers are down, they can’t be indexed either.
Is this the first seed of the Blackwall?
That’s just another reason why I fucking hate AI.
I don’t hate it, I /fucking/ hate AI.
You can bypass it by changing the user agent to not include Mozilla in the beginning.
Why does the default config check for Mozilla specifically?
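The described bypass implies the default gate is a prefix check on the User-Agent header before serving the challenge. A sketch of what such a gate looks like — this is an illustration of the idea, not Anubis's actual code:

```python
def needs_challenge(user_agent: str) -> bool:
    # Nearly every real browser's UA begins with "Mozilla/5.0" (a historical
    # compatibility quirk), so gating on that prefix targets browsers and
    # browser-impersonating scrapers, while letting non-browser clients
    # (curl, lynx, feed readers) pass through. It is also why changing the
    # UA to drop the prefix skips the check.
    return user_agent.startswith("Mozilla")

print(needs_challenge("Mozilla/5.0 (X11; Linux x86_64; rv:140.0) Gecko/20100101 Firefox/140.0"))  # True
print(needs_challenge("curl/8.5.0"))  # False
```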
Interesting, thanks!
Guess it’s the same kinda thing as amd64 on Intel lol
Here’s a time stamped link for the interview
Thanks!