Someone Put Facial Recognition Tech onto Meta's Smart Glasses to Instantly Dox Strangers (www.404media.co)
from notveddev@lemm.ee to technology@lemmy.world on 03 Oct 2024 14:53
https://lemm.ee/post/43936831

#technology

xodoh74984@lemmy.world on 03 Oct 2024 15:01 next collapse

Surely the original “someone” is Meta. Good to have a redundant system I guess /s

zout@fedia.io on 03 Oct 2024 19:40 collapse

I read earlier that “someone” was a couple of college students.

Seraph@fedia.io on 03 Oct 2024 20:10 collapse

It's literally the first line of the article you guys, fucking read it instead of speculating:

A pair of students at Harvard have built...

xodoh74984@lemmy.world on 04 Oct 2024 12:57 collapse

Whoosh

Edit: My point was that a couple of kids doing this on a small scale pales in comparison to Meta’s reach. The students didn’t do anything particularly novel, and Meta, which has a much more comprehensive dataset of faces linked to personal information, personal communications, etc, is already using every means available to do the same thing. The college students simply demonstrated what Meta is already doing on a global scale.

recursive_recursion@lemmy.ca on 03 Oct 2024 15:03 next collapse

at this point, masking up in public protects both your health and your privacy

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 15:19 next collapse

Apple already demonstrated that you can still get pretty darn close from eyes and hair. Combine that with a bit of logic (There is a 40% chance this is Sally Smith but she also lives three streets over and works on that street) and you still have very good odds.

Well… unless you are black, brown, or asian. Since the facial recognition tech is heavily geared toward white people because tech bros.

conciselyverbose@sh.itjust.works on 03 Oct 2024 15:40 next collapse

Facial recognition works better on white people because, mathematically, they provide more information in real world camera use cases.

Darker skin reflects less light and dark contrast is much more difficult for cameras to capture unless you have significantly higher end equipment.

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 15:48 next collapse

For low-contrast greyscale security cameras? Sure.

For any modern color camera, even SD, in a decently lit scenario? Bullshit. It is just that most of this tech is usually trained/debugged on the developers and their friends and families and… yeah.

I always love to tell the story of, maybe a decade and a half ago, evaluating various facial recognition software. White people never had any problems. Even the various AAPI folk in the group would be hit or miss (except for one project out of Taiwan that was ridiculously accurate). And we weren’t able to find a single package that consistently identified even the same black person.

And even professional shills like MKBHD will talk around this problem during his review ads (the apple vision video being particularly funny).

conciselyverbose@sh.itjust.works on 03 Oct 2024 16:03 next collapse

For any scenario short of studio lighting, there is objectively much less information.

You’re also dramatically underestimating how truly fucking awful phone camera sensors actually are without the crazy amount of processing phones do to make them functional.

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 16:14 collapse

No. I have worked with phone camera sensors quite a bit (see above regarding evaluating facial recognition software…).

Yes, the computation is a Thing. A bigger Thing is just accessing the databases to match the faces. That is why this gets offloaded to a server farm somewhere.

But the actual computer vision and source image? You can get more than enough contours and features from dark skin no matter how much you desperately try to talk about how “difficult” black skin is without dropping an n-word. You just have to put a bit of effort in to actually check for those rather than do what a bunch of white grad students did twenty years ago (or just do what a bunch of multicultural grad students did five or six years ago but…).
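
To make that split concrete: a minimal sketch, assuming the open-source face_recognition package, with the “database” array and name list as stand-ins for whatever a real service has scraped (not anyone’s actual system). The embedding step is cheap enough to run on-device; the lookup over a huge database is the part that gets shipped to the server farm.

    import numpy as np
    import face_recognition  # open-source wrapper around dlib's 128-dim face embeddings

    def embed_face(image_path):
        # Turn one photo into a 128-dimensional vector. This part is cheap.
        image = face_recognition.load_image_file(image_path)
        encodings = face_recognition.face_encodings(image)
        return encodings[0] if encodings else None

    def match(query, db_embeddings, db_names, threshold=0.6):
        # Server side: brute-force nearest neighbour over the whole database.
        # db_embeddings is an (N, 128) array; N is where the real cost lives.
        distances = np.linalg.norm(db_embeddings - query, axis=1)
        best = int(np.argmin(distances))
        if distances[best] < threshold:
            return db_names[best], distances[best]
        return None, distances[best]

The distance math doesn’t care what a face looks like; the feature extractor and the data it was tuned on are where the bias creeps in.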

conciselyverbose@sh.itjust.works on 03 Oct 2024 16:25 collapse

It’s not racist to understand physics.

It’s exactly the same reason phone cameras do terribly in low light unless they do obscenely long exposures (which can’t resolve detail in anything moving). The information is not captured at sufficient resolution.

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 16:29 collapse

Rhetorical question (because we clearly can infer the answer) but… have you ever seen a black person?

A bit of melanin does not make you into some giant void that breaks all cameras. Black folk aren’t doing long exposure shots for selfies or group photos. Believe it or not but RDCWorld doesn’t need to use nightvision cameras to film a skit.

conciselyverbose@sh.itjust.works on 03 Oct 2024 16:37 collapse

You can keep hand waving away the statement of fact that lower precision input is lower precision input.

And yes, for actual photography (where people are deliberately still for long enough to offset the longer exposure required), you do actually need different lighting and different camera settings to get the same quality results. But real cameras are also capable of capturing far more dynamic range without guessing heavily on postprocessing.

xor@lemmy.blahaj.zone on 04 Oct 2024 12:29 collapse

And you can keep hand waving away the fact that lower precision because of less light is not the primary cause of racial bias in facial recognition systems - it’s the fact that the datasets used for training are racially biased.

conciselyverbose@sh.itjust.works on 04 Oct 2024 12:42 collapse

Yes, it is. The idea that giant corporations “aren’t trying” is laughable, and it’s a literal guarantee that massively lower quality, noisier inputs will result in a lower quality model with lower quality outputs.

Fewer photons hitting the sensor matters. A lot.

fartsparkles@sh.itjust.works on 03 Oct 2024 16:11 collapse

You’re not wrong. Research into models trained on racially balanced datasets has shown better recognition performance with reduced bias. This was limited to GAN-generated faces, so it still needs to be replicated with real-world data, but it shows promise that balancing training data should reduce bias.
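
Mechanically, the rebalancing itself is the easy part; assembling representative data you’re actually allowed to use is the hard part. A minimal sketch of the usual first step, assuming a PyTorch-style dataset where each sample carries a demographic group label (the dataset, labels, and counts are placeholders, not any real benchmark):

    from collections import Counter
    from torch.utils.data import DataLoader, WeightedRandomSampler

    def make_balanced_loader(dataset, group_labels, batch_size=64):
        # Oversample under-represented groups so batches are roughly balanced.
        counts = Counter(group_labels)                      # e.g. {"group_a": 90000, "group_b": 4000}
        weights = [1.0 / counts[g] for g in group_labels]   # rarer group -> higher sampling weight
        sampler = WeightedRandomSampler(weights, num_samples=len(weights), replacement=True)
        return DataLoader(dataset, batch_size=batch_size, sampler=sampler)

Oversampling can’t conjure up variation that was never captured in the first place, which is why the GAN-generated-faces angle matters, but it’s the kind of fix any vendor could apply today.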

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 16:25 collapse

Yeah but this is (basically) reddit and clearly it isn’t racism and is just a problem of multi megapixel cameras not being sufficient to properly handle the needs of phrenology.

There is definitely some truth to needing to tweak how feature points (?) are computed and the like. But yeah, training data goes a long way and this is why there was a really big push to get better training data sets out there… until we all realized those would predominantly be used by corporations and that people don’t really want to be the next Lenna because they let some kid take a picture of them for extra credit during an undergrad course.

fartsparkles@sh.itjust.works on 03 Oct 2024 20:59 collapse

You okay?

milicent_bystandr@lemm.ee on 05 Oct 2024 04:18 collapse

No, your honour, I did not wear blackface to trivialise the suffering of people who came from Africa. I wore blackface to hide from Facebook Glasses.

bl_r@lemmy.dbzer0.com on 03 Oct 2024 16:30 collapse

I think it would be funny to normalize wearing bloc in order to retain privacy. It’s why some people might wear accessories they normally don’t wear, such as beanies and sunglasses, at protests. Even if they aren’t in full bloc, covering hair and eyes (in addition to a surgical mask) can make it really hard to doxx someone.

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 16:38 collapse

I mean, you definitely want to wear a mask and some goggles at a protest. If only for the pepper spray. I totally don’t have a thin gaiter, goggles, and a beanie and have definitely not heard great things about mountain biking helmets (the ones with faceguards) and totally am not considering grabbing one next time I do an REI run.

But also be aware that, with protests, you are almost always up against the groups who have access to all those “traffic” cameras and the like. And computer vision makes it fairly trivial to identify when a bunch of unmasked people walked into a dark alley and came out with their faces fully covered by tracking them back from the 4th street protest. It isn’t Enemy Of The State levels of asking Baby Busey and Jamie Kennedy to generate a 3d model from a single shot of Big Willy Style ogling some ta-tas, but most of the ways surveillance is used during that sequence are shockingly realistic and feasible.

bl_r@lemmy.dbzer0.com on 03 Oct 2024 17:04 collapse

In most cases there isn’t much you can do to fool the government without a lot of prep time such as scouting routes to find cameras, destroying them, or being really good at changing into bloc in the middle of a crowd and not getting caught.

But the important thing is threat modeling. The past dozen or so protests I’ve been at haven’t had the government as the big threat; fascists have been the primary threat. While a fascist cop would be a problem, it is much less likely than fascists combing through protest footage to try and doxx people, or a fascist at said action trying to get good photographs. That’s why I masked up.

The last real dicey action I went to, I still masked up, even knowing that the government could still try to track me if needed, because I knew it would be time-consuming to do so and that they would only go through the process if I made it worth their while. Bloc is still effective, but quite hard under this heavily surveilled police state.

NuXCOM_90Percent@lemmy.zip on 03 Oct 2024 17:14 collapse

The thing is? Ignoring the apparent void that black skin creates on all cameras (oy), it doesn’t take much time. It takes computing power.

For poops and giggles, a few friends and I took the public (rumble…) traffic camera feeds that a nearby county has online, set up a simple Python script to scrape those, configured an off-the-shelf tool to track a buddy’s car by its general description (green hatchback), and told him to just drive around for an hour.

We were able to map his route with roughly 70% accuracy after about two hours of scripting and reading documentation. And there are companies that provide MUCH better products for the people who have access to the direct feeds and all the cameras we don’t have access to.
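
For the curious, the general shape was something like this (a toy sketch rather than the actual script; the camera URLs and color thresholds are made up, and the off-the-shelf tool did far more than a raw color mask):

    # Poll public traffic-cam snapshots and flag frames containing a large
    # green blob (a crude stand-in for "green hatchback").
    import time
    import cv2
    import numpy as np
    import requests

    CAMERAS = {
        "4th_and_main": "https://example.gov/cams/4th_and_main.jpg",  # hypothetical feed
        "elm_street": "https://example.gov/cams/elm_street.jpg",      # hypothetical feed
    }
    LOWER_GREEN = np.array([40, 60, 60])    # rough HSV range, needs per-camera tuning
    UPPER_GREEN = np.array([85, 255, 255])
    MIN_BLOB_AREA = 1500                    # pixels; filters out grass, trees, noise (mostly)

    def fetch_frame(url):
        # Download one JPEG snapshot and decode it into a BGR image.
        data = requests.get(url, timeout=10).content
        return cv2.imdecode(np.frombuffer(data, np.uint8), cv2.IMREAD_COLOR)

    def find_green_blobs(frame):
        # Return bounding boxes of green regions big enough to be a car.
        hsv = cv2.cvtColor(frame, cv2.COLOR_BGR2HSV)
        mask = cv2.inRange(hsv, LOWER_GREEN, UPPER_GREEN)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        return [cv2.boundingRect(c) for c in contours if cv2.contourArea(c) > MIN_BLOB_AREA]

    while True:
        for cam_id, url in CAMERAS.items():
            frame = fetch_frame(url)
            if frame is None:
                continue
            hits = find_green_blobs(frame)
            if hits:
                print(f"{time.strftime('%H:%M:%S')} possible sighting at {cam_id}: {hits}")
        time.sleep(30)  # most public snapshot feeds only refresh every 30-60 seconds

Stitch the per-camera sightings together by timestamp and road layout and you get a rough route. The commercial products sold to agencies do the same thing with better detectors (plate readers, re-identification models) and direct access to every feed.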

Eggyhead@fedia.io on 03 Oct 2024 16:26 collapse

And then masks become illegal.

massive_bereavement@fedia.io on 03 Oct 2024 17:27 next collapse

If you have something to hide.......

EngineerGaming@feddit.nl on 05 Oct 2024 08:43 collapse

“Don’t mind the cough, my flu has got a bit better since yesterday”.

Eggyhead@fedia.io on 06 Oct 2024 06:17 collapse

“Ah, see-through masks are okay though.”

breadsmasher@lemmy.world on 03 Oct 2024 15:12 next collapse

I remember this happening with Google Glass as well

theguardian.com/…/google-glass-facial-recognition…

lepinkainen@lemmy.world on 03 Oct 2024 16:43 collapse

Ahh, Glassholes

OpenStars@discuss.online on 03 Oct 2024 15:24 next collapse

This headline would have carried a ton more weight if it wasn’t so extremely click-baity.

The ends do not justify the means?

WhatAmLemmy@lemmy.world on 03 Oct 2024 15:42 next collapse

The project is designed to raise awareness of what is possible with this technology.

This has nothing to do with smart glasses and everything to do with surveillance capitalism. You could do the same thing with a smartphone, or any camera plus a computer. All this does is highlight how everyone’s most sensitive data has been aggregated by numerous corporations and is available to anyone who will pay for it. There was a time when Capitalism presented itself as the “free” and privacy-preserving antithesis to Soviet-style communist surveillance, yet no KGB agent ever had access to a system with 1/100th the surveillance capabilities that 21st-century capitalism now sells freely for profit. If you need proof, a couple of college students were able to create every stalking victim’s worst nightmare.

vzq@lemmy.world on 03 Oct 2024 16:12 next collapse

I mean sort of.

It does mean that walking around with smart glasses will have people potentially reacting to you like you are waving a recording smartphone in their face.

Which is not great for product adoption, if you get my drift.

nevemsenki@lemmy.world on 03 Oct 2024 16:16 collapse

Soon smartglasses will look like regular glasses though. Miniaturisation isn’t about to stop.

Blue_Morpho@lemmy.world on 03 Oct 2024 16:29 next collapse

New style: Frameless glasses or you are creeping.

CaptainSpaceman@lemmy.world on 03 Oct 2024 16:57 collapse

Frameless glasses AND clear temples

vonbaronhans@midwest.social on 03 Oct 2024 22:50 collapse

Clear temples?

stephen01king@lemmy.zip on 04 Oct 2024 12:36 collapse

So you can see what their brain is doing.

T156@lemmy.world on 05 Oct 2024 08:52 collapse

You can never quite trust an organ you can’t see.

Mushroomm@sh.itjust.works on 03 Oct 2024 21:58 collapse

Yeah, the Ray-Bans in question are completely discreet unless you’re told or you’ve seen them already

SomeGuy69@lemmy.world on 04 Oct 2024 11:51 collapse

Pretty much no phone is pointed at everyone else’s face all the time; that alone is the huge difference. It’s the difference between someone using their phone and someone actively holding it up to record the crowd. Surveillance cameras might be out there too, but they aren’t set up by everyone (it differs by country; some footage even has to be deleted after 24 hours unless there was a crime).

People would quickly tell you to stop recording if you held your phone up all the time, even in situations where you’re closer to each other, like on public transport.

emeralddawn45@discuss.tchncs.de on 05 Oct 2024 03:01 collapse

I’m sure you could find a USB-C camera that could easily be obscured, pinned to a lapel, or otherwise disguised for cheaper than the price of a pair of smart glasses, or even just wear your phone on a lanyard around your neck with the screen facing your chest. People might think it’s weird, but no one is going to second-guess it unless your phone is in your hands actively pointing at them.

mesamunefire@lemmy.world on 03 Oct 2024 15:57 next collapse

What does the article say? It’s asking me to sign up.

NVM got it: archive.is/a2VYP

mesamunefire@lemmy.world on 03 Oct 2024 16:00 next collapse

A company called Clearview AI broke that unwritten rule and developed a powerful facial recognition system using billions of images scraped from social media. Primarily, Clearview sells its product to law enforcement. Clearview has also explored a pair of smart glasses that would run its facial recognition technology. The company signed a contract with the U.S. Air Force on a related study.

Just another reason to not post all your images to social media. Share with family/friends who care, but that’s it.

11111one11111@lemmy.world on 03 Oct 2024 16:38 next collapse

Right?! All it takes to save your privacy is just not having social media, but no one is willing to do that.

pineapplelover@lemm.ee on 03 Oct 2024 19:36 collapse

The main concern I have is unavoidably having my picture taken. Say I go to a family gathering; of course they will take my picture if it’s a big event, and they’ll probably share it everywhere. I can’t reasonably say “don’t post this picture on the internet” - they probably will anyway.

EngineerGaming@feddit.nl on 05 Oct 2024 08:44 collapse

In such a situation, I usually just ask to be out of the picture.

xavier666@lemm.ee on 04 Oct 2024 02:42 collapse

Do not share the image in a private Facebook group. Don’t post it on popular direct messaging services.

The only ways are some privacy-preserving E2E-encrypted file storage service (which I still don’t trust) or your own Matrix server (which I do trust).

Kbobabob@lemmy.world on 04 Oct 2024 18:18 collapse

private Facebook group

Does such a thing actually exist? Seems that “private” and “Facebook” really shouldn’t be in the same sentence together.

xavier666@lemm.ee on 04 Oct 2024 22:58 collapse

People (the general populace) think that if a group’s visibility is set to Private, then it’s truly private 🤷🏻

FaceDeer@fedia.io on 03 Oct 2024 15:30 next collapse

If I could get glasses that told me "that guy enthusiastically greeting you by name right now is Marty, you last met him in university in such-and-such class eight years ago" I would pay any amount of money for that.

"Doxing people" and "recognizing people" have a pretty blurry border.

Eggyhead@fedia.io on 03 Oct 2024 16:26 next collapse

Imagine never having to go through “the effort” of just knowing someone.

I’m starting to get a feel for the “society is fucked” crowd.

Edit: I’m leaving this up because y’all are making good points.

astrsk@fedia.io on 03 Oct 2024 16:39 next collapse

That’s not what they’re saying. Nuance is important here.

Some people have a legitimate condition where they can’t remember faces. Moreover there’s a lot of different brains out there and some people have very poor memory when it comes to other people’s names or other details, especially if they’re introverted and have anxiety in social situations. It can be helpful to have reminders, like keeping birthdays attached to people in your contacts so your calendar can remind you when it is someone’s birthday. Everyone is different and what you call “effort” might be a physical or mental deficiency or differently wired brain for someone else.

Tower@lemm.ee on 03 Oct 2024 21:10 collapse

Yeah, I’m neurodivergent and have a terrible memory. My life is full of alarms and notes and reminders for everything, otherwise nothing gets done.

While I’m well aware of the insidiousness of tech’s ever increasing privacy violations, I also look forward to things like AI being able to function as a full blown personal assistant to help me run my life.

AwesomeLowlander@sh.itjust.works on 03 Oct 2024 22:58 collapse

DarkThoughts@fedia.io on 03 Oct 2024 16:34 collapse

Recording and even more so profiling people without their explicit consent is completely not okay.

andyburke@fedia.io on 03 Oct 2024 17:17 collapse

In private you are correct. In public it is a lot more complicated.

DarkThoughts@fedia.io on 03 Oct 2024 17:22 collapse

No, it is not. Keep your camera out of my face.

eager_eagle@lemmy.world on 04 Oct 2024 02:09 next collapse

this guy doesn’t smile for the camera

andyburke@fedia.io on 03 Oct 2024 17:32 collapse

Stay home. 🤷‍♂️ When you are in public, people can see you. You don't get to tell me what I can and can't look at or take a picture of. (Note that I said this was complicated, and this is where the complications start - I should be able to record you in public if I am not specifically monitoring or harassing you, or trying to obtain pictures of things under your clothes, for instance, which IS a violation of your privacy. But just walking around in public recording things? You can't take my rights away just because you think you should have complete privacy even when out in public.)

DarkThoughts@fedia.io on 03 Oct 2024 19:08 collapse

I do that as much as I can anyway, but even I have to go and buy groceries about once per week. And yes, I literally do get to tell you not to record me, because it is very much illegal to record people without their consent here. Cry about it if you want.

andyburke@fedia.io on 03 Oct 2024 21:20 collapse

I'd be interested in hearing more about what law you're referring to (or you could point me at a similar example, I don't need to know where you live). My understanding is that even in two-party consent states you can record in public as long as you aren't recording conversations and/or the people being recorded have no expectation of privacy (no one should be recording anything in public bathrooms, changing rooms, etc. - you do have an expectation of privacy there even though you are in public, for instance.)

I don't get that emotional about online stuff, but thanks for your concern.

DarkThoughts@fedia.io on 03 Oct 2024 21:43 collapse

My country is easy to figure out and not really a secret and generally known for individual privacy laws.
You can publicly record something, but only for as long as it does not violate someone's personal rights - and yes, that still includes their privacy rights. The above example of directly recording, let alone profiling, someone through "AI" is not legal without consent, and there are also further laws regarding AI surveillance within the entirety of the EU. The same goes for publicly sharing such recordings online. You generally have to blur people's faces and even license plates in public recordings. There are also laws regarding "hidden recordings", which I'd place this under, since I could not tell if your glasses are recording me or not.

andyburke@fedia.io on 03 Oct 2024 22:36 collapse

Sorry, but could you cite a specific law? I'm interested to see the differences in the EU vs. what we have here in the states.

I spent a little time trying to do my own legwork and there is stuff under GDPR but that excepts personal recordings. (Akin to the complications in the US where if you publish or profit from a video recorded in public it's different and more complicated.)

So I am curious about how these protections are carved out and I can't quite find the law(s) you are discussing without some help.

DarkThoughts@fedia.io on 04 Oct 2024 11:17 collapse

https://eur-lex.europa.eu/eli/reg/2024/1689/oj / https://www.europarl.europa.eu/topics/en/article/20230601STO93804/eu-ai-act-first-regulation-on-artificial-intelligence

https://www.gesetze-im-internet.de/englisch_stgb/englisch_stgb.html#p1935

andyburke@fedia.io on 04 Oct 2024 14:27 collapse

So I read through a good bit, and I could not find anything that actually gives you the protections you claimed your country offers.

There appear to be limits on taking pictures of people in distress, injured, or deceased, and then sharing those.

There is a bit about taking a picture in a room specifically meant for privacy, which to me is akin to the bathrooms and changing rooms I was mentioning.

But I can't find the language that gives the carve-outs you say are there. I'm sorry, but if I missed it, could you quote the relevant section you're trying to share?

DarkThoughts@fedia.io on 04 Oct 2024 15:42 collapse

If the links aren't of help to you understanding, then I'm afraid I cannot help you.

andyburke@fedia.io on 04 Oct 2024 17:21 collapse

Ok. Well, fair enough.

Given that you cannot point me at any text that supports your claims directly, though, I have to conclude that what I said above in my original comments holds and that you do not have a right to stop others from recording you in public.

DarkThoughts@fedia.io on 04 Oct 2024 18:10 collapse

It's going to be your legal issue when you break it, so go ahead.

SlopppyEngineer@lemmy.world on 03 Oct 2024 19:15 next collapse

Now we just need to use the user information to check their net worth, and if it’s above a certain amount it needs to hover a quest marker above that person. I’m curious to see how long before privacy laws get stronger.

Manifish_Destiny@lemmy.world on 03 Oct 2024 21:42 next collapse

If it’s a billionaire it’s just a combat marker.

b161@lemmy.blahaj.zone on 03 Oct 2024 23:13 collapse

We can use augmented reality to turn them into a chicken drumstick or a nice juicy steak.

Whitebrow@lemmy.world on 04 Oct 2024 15:08 collapse

Still gonna taste like pork tho.

Fredselfish@lemmy.world on 04 Oct 2024 13:18 next collapse

Didn’t know Watch Dogs was becoming a reality!?

prole@sh.itjust.works on 04 Oct 2024 15:41 collapse

They’ll probably just end up making a (very expensive) method of obscuring themselves from the recognition tech. That way they won’t need to pass any laws, and ad companies (or cops, or anyone else who knows how to jailbreak their hardware, probably) can still take advantage of the technology in some way.

Because 💰

Curious_Canid@lemmy.ca on 05 Oct 2024 03:02 next collapse

Exactly. The rich will be able to buy privacy, while the rest become ever easier to exploit.

TexMexBazooka@lemm.ee on 05 Oct 2024 04:04 collapse

This is what will happen.

DarkThoughts@fedia.io on 03 Oct 2024 16:38 next collapse

I guess we need those Cyberpunk 2077 holographic masks that the Scavs use to hide their faces.

IchNichtenLichten@lemmy.world on 03 Oct 2024 21:19 next collapse

Time to get myself a scramble suit.

shoulderoforion@fedia.io on 03 Oct 2024 17:19 next collapse

well, no, someone used Meta smart glasses to feed their Instagram, and used facial recognition software on a different device, like a PC, to scan the Instagram photos and push the results to their smartphone

not the same thing

notgold@aussie.zone on 04 Oct 2024 19:54 collapse

Anyone got a non-paywalled link?

rob_t_firefly@lemmy.world on 05 Oct 2024 03:29 collapse

The name of one of the students who did this was in the non-paywalled chunk of the article. A news search for that name brings up a ton of links about the story.

duckduckgo.com/?q=AnhPhu+Nguyen&iar=news&ia=news