Calmara suggests it can detect STIs with photos of genitals -- a dangerous idea | TechCrunch (techcrunch.com)
from Stopthatgirl7@lemmy.world to technology@lemmy.world on 22 Mar 2024 16:34
https://lemmy.world/post/13423222

You’ve gone home with a Tinder date, and things are escalating. You don’t really know or trust this guy, and you don’t want to contract an STI, so… what now?

A company called Calmara wants you to snap a photo of the guy’s penis, then use its AI to tell you if your partner is “clear” or not.

Let’s get something out of the way right off the bat: You should not take a picture of anyone’s genitals and scan it with an AI tool to decide whether or not you should have sex.

#technology


autotldr@lemmings.world on 22 Mar 2024 16:35

This is the best summary I could come up with:


“With lab diagnosis, sensitivity and specificity are two key measures that help us understand the test’s propensity for missing infections and for false positives,” Daphne Chen, founder of TBD Health, told TechCrunch.

HeHealth is framed as a first step for assessing sexual health; then, the platform helps users connect with partner clinics in their area to schedule an appointment for an actual, comprehensive screening.

HeHealth’s approach is more reassuring than Calmara’s, but that’s a low bar — and even then, there’s a giant red flag waving: data privacy.

“It’s good to see that they offer an anonymous mode, where you don’t have to link your photos to personally identifiable information,” Valentina Milanova, founder of tampon-based STI screening startup Daye, told TechCrunch.

This sounds reassuring, but in its privacy policy, Calmara writes that it shares user information with “service providers and partners who assist in service operation, including data hosting, analytics, marketing, payment processing, and security.” Calmara also doesn’t specify whether these AI scans take place on your device or in the cloud; if the latter, how long that data remains in the cloud; or what it’s used for.

Calmara represents the danger of over-hyped technology: It seems like a publicity stunt for HeHealth to capitalize on excitement around AI, but in its actual implementation, it just gives users a false sense of security about their sexual health.


The original article contains 773 words, the summary contains 228 words. Saved 71%. I’m a bot and I’m open source!
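For context on Chen’s point above: sensitivity is the share of real infections a test catches, and specificity is the share of healthy cases it correctly clears. A minimal sketch of the arithmetic, with made-up counts (none of these numbers come from Calmara, HeHealth, or the article):

```python
# Sensitivity and specificity from a confusion matrix.
# All counts below are hypothetical, purely for illustration.
true_positive = 90    # infected, flagged as infected
false_negative = 10   # infected, but reported "clear" (the dangerous miss)
true_negative = 80    # healthy, reported "clear"
false_positive = 20   # healthy, but flagged as infected

sensitivity = true_positive / (true_positive + false_negative)
specificity = true_negative / (true_negative + false_positive)

print(f"sensitivity: {sensitivity:.0%}")  # 90%: 1 in 10 infections missed
print(f"specificity: {specificity:.0%}")  # 80%: 1 in 5 healthy users flagged
```

Even at a hypothetical 90% sensitivity, one infection in ten would come back “clear,” which is exactly the false-negative failure mode raised at the end of the thread.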

ObviouslyNotBanana@lemmy.world on 22 Mar 2024 16:51

I wouldn’t trust calamari to identify anything tbh

debounced@kbin.run on 22 Mar 2024 16:57

i'm almost certain there's a hentai like this

NoRodent@lemmy.world on 22 Mar 2024 17:23

It’s a trap!

kokesh@lemmy.world on 22 Mar 2024 17:07

No more need for Ann Perkins to identify Joe’s problem.

JaymesRS@literature.cafe on 22 Mar 2024 17:09

And to think, they started with an app to identify if something was a hot dog or not.

teft@lemmy.world on 22 Mar 2024 17:21

Not a hot dog.

catloaf@lemm.ee on 22 Mar 2024 17:46

I’m pretty sure that gonorrhea, chlamydia, and HIV don’t generally have visible symptoms. Just use a condom.

circuscritic@lemmy.ca on 22 Mar 2024 19:14

What part of AI don’t you understand?

If you can’t trust AI medical startups operating out of Silicon Valley with pictures of your genitals, well…THEN WHO CAN YOU TRUST?

I mean, to be fair, it also looks like they might be partially financially backed by a foreign authoritarian regime, and they usually have pretty good AI models…so…

RatBin@lemmy.world on 22 Mar 2024 19:47

We are reaching the phase where AI is de facto a magic spell to be cast on reality, and AI startups are hyping this up. That, and taking pics of strangers’ genitals is a dick move.

circuscritic@lemmy.ca on 22 Mar 2024 19:51

Yes, I agree. AI is magic and everyone should submit pictures of their genitals.

Hell, I’ve started converting my dick pics into ASCII art and having ChatGPT diagnose me for STIs.

AI BABY WOOOOOO HOOOOO

DudeDudenson@lemmings.world on 23 Mar 2024 11:38

I’m just happy we moved to an AI bubble to raise stock prices instead of continuing to lay off essential personnel to do it

echodot@feddit.uk on 23 Mar 2024 12:22

AI is not even at the point yet where you can lay off workers and just have the AI do it reliably and safely.

TheDeepState@lemmy.world on 23 Mar 2024 01:46

Fair enough. Unzip.

wizardbeard@lemmy.dbzer0.com on 22 Mar 2024 19:59

Man, what about false positives? Ruined date night at minimum, possibly ruined reputation, relationships.

Sorry, we put a picture of your junk into this box. We don’t know what’s in it, or what it does with the picture, but it says you have chlamydia, and I think the box looks trustworthy. Here’s your divorce papers.

wagoner@infosec.pub on 23 Mar 2024 02:09

If you get the premium ultra plan you will always report as negative on a scan.

Imgonnatrythis@sh.itjust.works on 22 Mar 2024 20:28

Or if still concerned after the fact, a doctor. Despite what your GOP neighbor might tell you, they’re not all evil quacks and don’t typically take pictures of your stuff either.

SnotFlickerman@lemmy.blahaj.zone on 22 Mar 2024 17:56

![Peep Show](https://lemmy.blahaj.zone/pictrs/image/17b1c844-c603-43e6-a9dc-c470397c4dbb.jpeg)

Obligatory Peep Show

smileyhead@discuss.tchncs.de on 22 Mar 2024 18:05

One reason this is suspicious from the start:

It’s advertised not to check yourself, but your one-night partner. If it were advertised for self-checks, it would be bombarded with lawsuits over fake medical advice.

cm0002@lemmy.world on 22 Mar 2024 18:10

Nah, they’d just throw up a disclaimer “Not true medical advice, consult a doctor for actual confirmation” and they’d probably be in the clear

JackGreenEarth@lemm.ee on 22 Mar 2024 18:32

Is it open source and offline? Only then would I trust that they’re not collecting all the photos people take with the app.

dezmd@lemmy.world on 22 Mar 2024 20:17

“Not Hotdog.”

nbdjd@lemmy.world on 23 Mar 2024 01:38

“Is sandwich.”

Empricorn@feddit.nl on 23 Mar 2024 13:56

“Meat popsicle confirmed.”

ratzki@discuss.tchncs.de on 22 Mar 2024 20:40

Reminds me of this great song

systemglitch@lemmy.world on 22 Mar 2024 21:06

I could care less who sees my junk. I also would not let someone take pictures of it so I can fuck them. I’m galaxies away from being that desperate.

Default_Defect@midwest.social on 23 Mar 2024 03:37

couldn’t care less*

Since I assume you mean you don’t care.

systemglitch@lemmy.world on 23 Mar 2024 14:25

Thank you.

Jubei_K_08@lemmy.world on 22 Mar 2024 21:17

Hold on babe AI wants to see a picture of your shillelagh first 🤳

candywashing@infosec.pub on 22 Mar 2024 21:26

Downside is we have unique buttholes, so I assume that extends to other genitals. Fun new privacy attack here.

smithsonianmag.com/…/why-scientists-created-smart…

jkrtn@lemmy.ml on 23 Mar 2024 13:17

An untapped mobile device biometric.

ratzki@discuss.tchncs.de on 23 Mar 2024 13:26

Buttplugs to protect your privacy. But only if you are on the toilet for A, not for B.

Odo@lemmy.world on 23 Mar 2024 13:32

Wait, someone actually made Smart Pipe?

PipedLinkBot@feddit.rocks on 23 Mar 2024 13:33

Here is an alternative Piped link(s):

Smart Pipe?

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

Blaster_M@lemmy.world on 22 Mar 2024 21:47

This definitely won’t be misused in any way that would completely destroy the good name of the person taking the image, or the person in its frame. It’s just one “probable cause” search away from a bad day.

answersplease77@lemmy.world on 23 Mar 2024 14:10

What are the chances they build a database to blackmail any individual they want in the future, and just say it was leaked?

terminhell@lemmy.dbzer0.com on 22 Mar 2024 23:47

I’m speechless, in a bad way.

werefreeatlast@lemmy.world on 23 Mar 2024 00:34

Maybe they’ll use the photo to match it with doctors’ notes and medical photos of the same penis or vagina, then illegally link those to illegally obtained health records. Probably not, though.

bbuez@lemmy.world on 23 Mar 2024 11:16

Finally, using this we’ll be able to train AI models so we can know what super-gonaherpes looks like

jet@hackertalks.com on 23 Mar 2024 14:02

Some STIs, in some situations, have a visible presentation that could be detected.

A false positive is a good thing here; a false negative is a bad thing. There’s no way this app won’t have huge false negatives.