autotldr@lemmings.world
on 10 Feb 2024 10:50
This is the best summary I could come up with:
However, in May, Christian Selig, the developer of the popular iOS client Apollo, had a call with the company where he learned that the cost demanded by the platform was so high that his app would go out of business.
As he wrote in detail on his blog, he noted that Threads starting with ActivityPub integration doesn’t automatically mean that we’ll see a flourishing ecosystem of apps by default.
“Again though, the integration of Threads into this ecosystem doesn’t necessarily equate to a larger market — as Meta may not need to use many of the back-end services, and most likely will not initially allow their users to use alternative clients,” Coates said.
“On the other hand, Meta’s integration may cause a lot more interest in self-hosting and other companies and communities may join the larger Mastodon ecology and that would increase the opportunities and possibility for services and products of this kind,” he noted.
The Iconfactory’s principal developer Ged Maheux told TechCrunch that the company has learned to diversify its revenue across different apps after the bad experience of Twitterrific’s shutdown.
Earlier this week, Maheux and Iconfactory began a new Kickstarter campaign for a new app called Tapestry, which will allow you to connect your social media accounts and RSS feeds through a chronological timeline.
The original article contains 2,203 words, the summary contains 216 words. Saved 90%. I’m a bot and I’m open source!
SteefLem@lemmy.world
on 10 Feb 2024 11:01
“Their data” HA.
ConstipatedWatson@lemmy.world
on 10 Feb 2024 12:03
Unfortunately, when we sign up to their EULAs we “willingly” give everything up… So technically it ends up being legally theirs 🥺
Not quite, but pretty close. You still hold the copyright in anything copyrightable, for example; they just have a perpetual license to use your work. We really ought to be working on laws that protect privacy and limit corporate content piracy that happens without explicit, clear opt-ins.
SteefLem@lemmy.world
on 10 Feb 2024 15:12
Maybe we should charge them for the emails they send us. Want me to sign up for a newsletter? That will be 20€ per email. Or something.
Back in the paper spam days, some folk would stuff the "postage paid" envelopes with junk and mail them back to troll the companies. Setting up a junk address with an autoresponder would be pleasing, but probably would get tagged illegal.
SteefLem@lemmy.world
on 10 Feb 2024 20:02
Shame, it’s legal when big corp does it but illegal when I do it. This always seems weird to me.
NounsAndWords@lemmy.world
on 10 Feb 2024 13:01
Hey! It took years of hard work to develop the good will necessary to get into a position to take advantage of their data!
Mark Zuckerberg: “Yeah so if you ever need info about anyone at Harvard just @ me. I have over 4,000 emails, pictures, addresses, SNS.”
“What? How’d you manage that one?” a friend asked.
“People just submitted it. I don’t know why. They ‘trust’ me. Dumb fucks.”
edit: copy/paste cleanup
SteefLem@lemmy.world
on 10 Feb 2024 17:11
Name, address and so on I can, well, understand if you buy something online and have to fill that in. But SNS and ID??? That’s all kinds of stupid. Why would you give that willingly to FB? It’s not a government entity or even a bank.
EvergreenGuru@lemmy.world
on 10 Feb 2024 19:28
It could be that some of that data was scraped from FB messenger.
MaggiWuerze@feddit.de
on 11 Feb 2024 18:25
That quote comes from the time Facebook was not open to the public yet, just fellow students of his.
McDropout@lemmy.world
on 10 Feb 2024 11:04
I’m on Lemmy due to this!
I literally use this platform just to run from bots and cooperate greed.
I don’t think Lemmy is well prepared to handle bots or more sophisticated spam; for now we’re just too small to target. I usually browse by new and see spam staying up for hours, even in the biggest communities.
tutus@links.hackliberty.org
on 10 Feb 2024 12:32
The moderation on Lemmy is pretty poor and there is no clear (at least to me) avenue to help or offer help. Reporting it is pointless.
So I agree: Lemmy cannot handle it at the moment. That does not give me confidence it will be handled when it gets larger and the spam/bots become more sophisticated.
I do appreciate however, that all platforms have to go through this learning process.
imaqtpie@sh.itjust.works
on 10 Feb 2024 13:59
Join a larger instance. We rarely see spam on SJW anymore because we now have a bot that removes it automatically.
tutus@links.hackliberty.org
on 10 Feb 2024 18:00
I see a lot of usernames from SJW so I think I might do that!
tutus@links.hackliberty.org
on 10 Feb 2024 18:13
Just tried to create an account. Got error that it couldn’t send me an email to verify. Login button spins and then does nothing. Resetting password is the same.
Not sure what went wrong but sh.itdoesnt.work
imaqtpie@sh.itjust.works
on 11 Feb 2024 00:42
We have been experiencing problems with email verification in the past week. Apparently there is an issue with our email provider. Sorry about that.
Is this the account?
sh.itjust.works/u/lostmykeys
If so, it should be fixed. If not, please DM me the email address you are using to register.
Thekingoflorda@lemmy.world
on 10 Feb 2024 12:34
Just chiming in here: there are at the moment some problems with federation. I’m an admin on LW, and generally we remove spam pretty quickly but it currently doesn’t federate quickly. We are working on solutions that temporarily fix it till the lemmy devs themselves fix it.
jeena@jemmy.jeena.net
on 10 Feb 2024 14:57
The spam is bad but I can just ignore it; last week, though, there was an attack with CSAM which showed up while casually browsing new, and that made me not want to open Lemmy anymore.
I think that is what needs to be fixed before we can tackle spam.
Whatever is done to fight spam should be useful in fighting CSAM too. The latest “AI” boom could prove lucky for non-commercial social networks, as content recognition is something that can leverage machine learning. Obviously it’s a significant cost, so pitching in to cover running costs will have to become more common.
UndercoverUlrikHD@programming.dev
on 10 Feb 2024 16:33
Admins are actively looking into solutions, nobody wants that stuff stored on their server, and there’s a bunch of legal stuff you must do when it happens.
One of the problems is the cost of compute power for running programs detecting CSAM in pictures before uploading, making it not viable for many instances. Lemmy.world is moving towards only allowing images hosted via whitelisted sites I think.
UndercoverUlrikHD@programming.dev
on 10 Feb 2024 16:35
Be diligent with reporting, and consider switching instance if your admins aren’t really active.
Nighed@sffa.community
on 10 Feb 2024 17:54
The reports go to the community mods not your instance admins though don’t they?
UndercoverUlrikHD@programming.dev
on 10 Feb 2024 18:09
Any reports you make are visible to the admins of your instance.
E.g. if you make a report, the community mods may choose to ignore it while your admins choose to remove it for everyone using their instance.
Everything you see on Lemmy is through the eyes of your instance; people on other instances may see different things. E.g. some instances censor certain slurs, but that doesn’t affect users outside that instance. (De)federation also dictates what comments you will see on a post.
Nighed@sffa.community
on 10 Feb 2024 18:30
But they do go to the community mods, even on a different instance? And if the community mods remove the content, does that removal federate?
I prefer to rely on the community mods to remove most ‘spam’, as it’s their role to decide what is spam in their community. (Obviously admins can/should remove illegal content etc.)
Admins, for the most part, shouldn’t have to remove content on their copy of other instances’ communities.
UndercoverUlrikHD@programming.dev
on 10 Feb 2024 19:09
It goes to the community mods too, yeah. But when it comes to spam/scams being posted, admins (at least on programming.dev) will remove it immediately and not wait for community moderators. Spammers will usually spam multiple communities at once, and only admins have the capability of banning users entirely from the site/their instance.
A few days ago a person created multiple accounts and spammed scat content across multiple communities. Moderators can’t effectively stop those kinds of things.
ConstipatedWatson@lemmy.world
on 10 Feb 2024 11:54
Long live Lemmy!
noodlejetski@lemm.ee
on 10 Feb 2024 12:33
…corporate?
Corporations cooperate greedily.
cybersandwich@lemmy.world
on 10 Feb 2024 14:38
Lol, well it’s not immune to either. As soon as anyone thinks Lemmy has ROI, it will be targeted by bots, corporate greed, and scrapers.
But all of our posts are publicly available on the Internet and, in my opinion, should be fair game for web crawlers, archivists, or whoever wants to use them. That’s the free and open Internet.
What’s shitty is when companies like reddit decide it’s “their” data.
DigitalGemini@sh.itjust.works
on 10 Feb 2024 22:01
Testify! 👏🏻👊🏻
taanegl@lemmy.world
on 10 Feb 2024 14:35
Does this mean the monetary value of personal data is falling? I’m thinking this may be some sort of price fixing.
huginn@feddit.it
on 10 Feb 2024 16:12
It’s the opposite.
They’re hoarding more of it because they want to capitalize on it.
Sharing your capital for free is a bad business move.
z3rOR0ne@lemmy.ml
on 10 Feb 2024 16:42
Data has always been valuable, even before Surveillance Capitalism. But now, with the rise of AI, the owners of social platforms that were easily accessible are making it harder to get at the data, because they realize they can use it for their own LLM training.
Not to mention data’s various other uses, like advertising/marketing, selling it to foreign governments/adversaries/law enforcement agencies, etc.
ComradeKhoumrag@infosec.pub
on 10 Feb 2024 18:50
I suspect it’s a similar story with AI
Before AI took off, openness was necessary to make groundbreaking discoveries: pretty much all the architectures, and most if not all of the training data, were released open source.
Now that AI is taking off, these companies don’t want to help their competition. So their data and algorithms are becoming more and more closed off
ExLisper@linux.community
on 10 Feb 2024 19:33
It’s probably more like when Amazon gets into yet another business and kills the competition. Whatever those 3rd-party devs are doing, the social networks can do themselves and make more money.
werefreeatlast@lemmy.world
on 10 Feb 2024 19:02
We’re scanning the very last email! It surely has all the passwords!
Ohh fuck! Another fuckin cat picture zip file!
Fuck spez.
That’s Steve Huffman folks!