OpenAI and the FDA Are Holding Talks About Using AI In Drug Evaluation (www.wired.com)
from jeffw@lemmy.world to technology@lemmy.world on 08 May 01:05
https://lemmy.world/post/29302971

#technology

gravitas_deficiency@sh.itjust.works on 08 May 01:59 next collapse

Jesus fuck. As someone who works in biotech: this is probably going to kill people.

givesomefucks@lemmy.world on 08 May 02:12 next collapse

Ignorance is bliss homie…

afresearchlab.com/…/department-of-the-air-force-l…

People really don’t want to know how much AI is being shoved down the government’s and military’s throats these days.

It ain’t even just a Musk/Trump thing; most of it was put in motion under Biden, but now idiots are running with it.

Like, who the fuck really wants the Space Force using a closed-off version of the same consumer-level chatbot coding shit?

Fucking no one who understands anything about any of this, but the people calling the shots believe the hype.

So ignorance is bliss.

UsoSaito@feddit.uk on 08 May 02:13 next collapse

Like, why don’t we just have ACTUAL DOCTORS and techs trained to do this actually test it? Wild concept, I know.

gravitas_deficiency@sh.itjust.works on 08 May 02:31 next collapse

It’s ok they’re replacing the doctors with LLMs now it’ll be fine I promise

boomzilla@programming.dev on 08 May 09:59 collapse

Yesterday on Reddit I saw a photo a patient had taken over his doctor’s shoulder of the doctor’s computer monitor. It was full of ChatGPT diagnosis requests.

reddit.com/…/doctor_using_chatgpt_for_a_visit_due…

floofloof@lemmy.ca on 08 May 02:37 next collapse

Doctors cost money and the money goes to doctors. LLMs cost less and the money goes to billionaire fascist techbros. The fact that they’re not fit for purpose is insignificant compared to the potential for techbro enrichment.

Also, doctors have an annoying habit of helping people to live regardless of whether techbro eugenics says they deserve to.

TexMexBazooka@lemm.ee on 08 May 21:39 collapse

Because it costs money to educate doctors, and then they want money.

bobs_monkey@lemm.ee on 08 May 04:38 collapse

They don’t care though, if anything it’s more money for the medical profit machine.

fullsquare@awful.systems on 08 May 02:37 next collapse

damn i see that chatbots don’t want to stay behind rfk jr in body count

will they learn that safety regulations are written in blood? who am i kidding, that’s not their blood

db2@lemmy.world on 08 May 03:34 next collapse

I hate this timeline.

NarrativeBear@lemmy.world on 08 May 04:09 next collapse

Most PCs no longer have floppy disk drives or CD drives, so where are they going to put the placebo or drugs in? /s

Eggyhead@fedia.io on 08 May 05:19 next collapse

If it’s trained carefully, professionally, and responsibly, with bona fide medical research data exclusively, I can see it being a boon to healthcare professionals. I just don’t know if I can trust that will happen in the timeline we live in.

gndagreborn@lemmy.world on 08 May 21:14 collapse

OpenEvidence is a legit tool my colleagues and classmates use every day. OpenAI is leagues behind them, especially in terms of HIPAA compliance.

CriticalMiss@lemmy.world on 08 May 05:37 collapse

Ah great, horrors beyond my imagination.

paraphrand@lemmy.world on 08 May 06:53 next collapse

Turns out Umbrella Corp was an AI company that pivoted?

Womble@lemmy.world on 08 May 09:03 collapse

He could see AI being used more immediately to address certain “low-hanging fruit,” such as checking for application completeness. “Something as trivial as that could expedite the return of feedback to the submitters based on things that need to be addressed to make the application complete,” he says. More sophisticated uses would need to be developed, tested, and proved out.

Oh no, the dystopian horror…

ZILtoid1991@lemmy.world on 09 May 11:12 collapse

LLMs also make a lot of mistakes even when used for text analysis, and since the tech sector loves the “move fast and break things” mantra, this will be put into practice much earlier than it should be.