Facial Recognition That Tracks Suspicious Friendliness Is Coming to a Store Near You (gizmodo.com)
from alyaza@beehaw.org to technology@beehaw.org on 02 Nov 00:11
https://beehaw.org/post/16845937

#technology


scrubbles@poptalk.scrubbles.tech on 02 Nov 00:12

We’re already in the dystopia.

thingsiplay@beehaw.org on 02 Nov 00:29

I would just get sunglasses and try to look suspicious just to mess up their tracking.

Nytixus@kbin.melroy.org on 02 Nov 01:41

And be theatrical. High-five the cashier, over-compliment them, get a bunch of friends to dance with them. Then by the end, be happy they offered you a discount.

thingsiplay@beehaw.org on 02 Nov 01:58

I’m already trained with over-complimenting. I think if I just act like I always do, it’s already sus. xD

Peanutbjelly@sopuli.xyz on 02 Nov 00:36

Big fan of AI stuff. Not a fan of this. This definitely won’t have issues with minority populations and neurodivergent people falling outside of distribution and causing false positives that enable more harassment of people who already get unfairly harassed.

Let this die with the mind-reading tactics it spawned from.

Catoblepas@lemmy.blahaj.zone on 02 Nov 00:37

About a month ago, Israel-based Corsight AI began offering its global clients access to a new service aimed at rooting out what the retail industry calls “sweethearting”—instances of store employees giving people they know discounts or free items.

Lol, I hope stores that use this lose millions on this stupid-ass privacy invasion. Anyone stupid enough to believe that catching the occasional 10% employee discount for friends or whatever is going to offset whatever the fuck this most recent torment nexus costs frankly deserves to be swindled.

iturnedintoanewt@lemm.ee on 02 Nov 03:10

That would only be the case if it ends up in the news that a store uses it. I definitely can’t imagine people stopping using their main grocery store over a rumor like this.

Catoblepas@lemmy.blahaj.zone on 02 Nov 03:18

Oh, I don’t even mean from lost sales, I mean because this service is fundamentally going to cost more than the “theft” (lol) it’s allegedly stopping. If any one employee (or even a team) is doing this at scale and a business needs AI tracking customers to pick up on it, there is something drastically wrong.

This service is basically pure AI hype. It’s not doing anything a minimally engaged manager couldn’t already do with the salary you’re having to pay them anyway. Except the AI is also doing it worse and at a higher cost. Yay!

prex@aussie.zone on 02 Nov 01:05

<img alt="" src="https://aussie.zone/pictrs/image/47bf0215-f5cf-4452-a5c3-e4f52bd31fc2.jpeg">

ShellMonkey@lemmy.socdojo.com on 02 Nov 01:11

They claim $100 B in ‘losses’ to this kind of game. Unless they’re actually running in the red on their books, what they really mean is ‘we think we should make at least $100 B more per year’.

I’m sure that the vast majority of that would go directly to the front-of-house employees they’re pinning this on too, definitely not to the execs and shareholders…

wesker@lemmy.sdf.org on 02 Nov 01:46

If it can detect suspicious unfriendliness, then I’m really in trouble.

Alice@beehaw.org on 02 Nov 02:26

This is horrifying for a lot of reasons, but it’d be nice if my boss had to see the number of people who yell at me for masking or tell me I look like a man every day. It wouldn’t help anything; I just hate my boss and want her to feel the awkwardness.

Dirac@lemmy.today on 02 Nov 15:01

This is trash. I can’t wait for these people to install this garbage for $100K or something, and then get too many alerts to actually investigate, thus wasting their money (which they deserve for participating in this kind of surveillance) and proving that Corsight are a bunch of charlatans.

Midnitte@beehaw.org on 02 Nov 15:56

Absolutely no way this doesn’t explicitly target certain groups of people and end up in a lawsuit.

Doxin@pawb.social on 03 Nov 14:11

There’s no chance this doesn’t turn out to be, among other things, an autism detector.

t3rmit3@beehaw.org on 02 Nov 17:06

Not friendly enough when talking to customers? Bad employee.

Too friendly when talking to customers? Bad employee.

This is just about 1) creating an algorithmic justification for the racial profiling that managers already do, and 2) keeping employees in fear of termination so they put up with bullshit.

Side story about how shitty retail management is:

When I was working retail years ago (big box electronics store), our management employed a system of getting every new employee to 3 write-ups as fast as they could (I’m talking, within a month of starting), using literally any excuse they could, so they could hold the “one more write-up and you’re fired” over their head.

“AI” is definitely going to become a new tool for employee suppression.

loops@beehaw.org on 02 Nov 21:58

Israel-based

Well there’s your problem right there.