TikTok Shop sellers say its AI moderation has gone rogue, doling out 'bogus' violations and freezing products without a clear explanation (www.businessinsider.com)
from throws_lemy@lemmy.nz to technology@lemmy.world on 14 Jan 2024 04:45
https://lemmy.nz/post/5519515

#technology


autotldr@lemmings.world on 14 Jan 2024 04:45 next collapse

This is the best summary I could come up with:


He started making money on the app by offering fishing baits through TikTok Shop, selling around 92 strawberry- and sweet-corn-flavored “little bitz,” for instance.

TikTok, in recent months, has been sending a flurry of violation claims to sellers that use its e-commerce platform, stating that they’ve set “spam prices” on products, inadequately displayed items during livestreams, assigned inaccurate product-listing titles, or incorrectly categorized their goods.

Some sellers have managed to slip prohibited goods like homemade foods, sex toys, and THC syrups past its moderation system, while other merchants have added knock-off products mimicking items from mainstream brands like Lululemon.

“We had an LDR (late dispatch rate) penalty put on our account right as we were hitting our peak in sales,” said Jessica Slone, founder of Bad Addiction Boutique, a merchant that has sold tens of thousands of sweatshirts on TikTok.

Amazon uses bots to flag potential violations of its policies, which can lead to sellers’ accounts being deactivated, sometimes taking business owners by surprise.

For small-business owners on TikTok, having to take time to repeatedly appeal Shop violations is a strain alongside the day-to-day work of fulfilling orders.


The original article contains 1,107 words, the summary contains 188 words. Saved 83%. I’m a bot and I’m open source!

helenslunch@feddit.nl on 14 Jan 2024 05:21 next collapse

shocked Pikachu

vexikron@lemmy.zip on 14 Jan 2024 05:43 next collapse

Lol, perhaps even: lmao.

Huge tech corps fucking up in the most predictable yet also insane ways possible never ceases to bring a smile to my face.

FaceDeer@kbin.social on 14 Jan 2024 06:47 collapse

We hear about the instances where they screw up, but we don't hear about the ones where it's working just fine in the background.

otter@lemmy.ca on 14 Jan 2024 08:21 next collapse

They’re supposed to work fine in the background

The point was about the many recent stories of companies rushing to use “AI” and the chaos it caused

vexikron@lemmy.zip on 14 Jan 2024 13:36 collapse

The whole point of having a large, comprehensive database is that it is robust, efficient, and reliable.

When you introduce an immature, heavily hyped, untested-at-scale system to manage and curate said database, a system known to fail at edge cases, and you know your database features a lot of edge cases…

… the results are fairly predictable.

Large corporate higher-up types /consistently/ overlook valid concerns, raised by the people in their companies who actually understand the technology the company uses, that are later proven correct.

This happens time and time again in large corporations, where it has become very clear that ego and the potential reward of more profit outweigh the expertise of the people actually familiar with the technology the company uses, causing massive, costly debacles.

This happens because, at this point, it's clear that a large number of tech CEOs and managers do not actually know tech or the tech industry, and still operate with the reckless abandon of the 'move fast and break things' mentality that /might/ work in a startup but does not work at all in a larger, more established and mature business. More generally, it happens in other industries because management doesn't understand modern technology at anything beyond a surface level.

As a person who has actually worked on different databases, and more generally in different roles across the tech industry and in software-related roles in other industries for around a decade, I have seen things like this happen multiple times at basically every job I have had, though never at this scale.

In summary: I am one of the people who is responsible for things working smoothly and you not hearing about them, and I am telling you there are many other people like me, and most of them will agree that the fuck-ups you do hear about happen because people paid 10 to 100 times as much as us do not listen to us.

FaceDeer@kbin.social on 14 Jan 2024 16:24 collapse

Sure. But my point is, we don't know how many companies are using AI where everything's working fine. We're only seeing some of the failures, we're not seeing the successes. So we can't draw general conclusions from these specific examples.

thorbot@lemmy.world on 14 Jan 2024 07:48 next collapse

Oh no!

Anyway…

Fisk400@feddit.nu on 14 Jan 2024 09:27 next collapse

Humans go rouge, programs are broken. Stop using humanizing terms for algorithms.

tabular@lemmy.world on 14 Jan 2024 12:43 next collapse

“Gone rogue” just means to behave in an unexpected way - a very common occurrence when I write software.

Fisk400@feddit.nu on 14 Jan 2024 18:49 collapse

You should stop using humanizing terms for algorithms.

[deleted] on 14 Jan 2024 18:52 next collapse

.

PipedLinkBot@feddit.rocks on 14 Jan 2024 18:52 collapse

Here is an alternative Piped link(s):

dehumanizing people

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

tabular@lemmy.world on 14 Jan 2024 18:54 collapse

To convince me of that you may want to explain how it’s a humanizing term, for starters.

Fisk400@feddit.nu on 14 Jan 2024 20:28 collapse

Rogue is a type of person, and going rogue has the same etymology. We have lots of expressions that ascribe living qualities to inert things, and that is fine. It is a very human thing to do. But in the field of AI we need to be very firm about what AI is.

tabular@lemmy.world on 14 Jan 2024 23:04 next collapse

Okay, I grant that the term “rogue” is commonly used with a human aspect to it. Are you suggesting that using a word about humans to describe non-human algorithms may make people misunderstand what the AI we’re talking about actually is?

uriel238@lemmy.blahaj.zone on 15 Jan 2024 02:50 collapse

Rogue is a type of person, like a rogue planet or a rogue comet.

AnUnusualRelic@lemmy.world on 14 Jan 2024 21:17 collapse

Humans only go rouge when exposed to too much sun.

uriel238@lemmy.blahaj.zone on 15 Jan 2024 02:49 collapse

…or join the circus.

MonkderZweite@feddit.ch on 14 Jan 2024 14:12 next collapse

Tiktok sells stuff now?

Corkyskog@sh.itjust.works on 14 Jan 2024 14:25 collapse

Yeah, it’s like direct-to-consumer ads with shop links attached. It’s pretty popular and the prices are okay, but it’s a lot of weird shit and gimmicks you don’t actually need. There are also live feeds that sell things like Pokémon cards or freshwater clams, but that’s a different story.

Krauerking@lemy.lol on 14 Jan 2024 17:08 next collapse

Ok but like this is barely a thing and it comes from people literally selling drugs and other illegal goods through this completely unregulated market.

And just to get the US regulators off their back, they dumped the still-processing transactions and handed the money over to the US as a not-so-subtle “please don’t sue us.”

All the early money was made and everyone got their piece of the pie even if it now screws over regular-ish people. Tiktok is risky business even if it made a lot of money.

yuki2501@lemmy.world on 14 Jan 2024 19:25 collapse

There are few things that can inspire as much fear in the population as the phrase “gone rogue” applied to AI.

Example. (HZD spoilers)

PipedLinkBot@feddit.rocks on 14 Jan 2024 19:25 collapse

Here is an alternative Piped link(s):

Example.

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.