AI sees beyond humans: automated diagnosis of myopia based on peripheral refraction map using interpretable deep learning.
(doi.org)
from 101@feddit.org to technology@lemmy.world on 26 Sep 2024 10:35
https://feddit.org/post/3204967
Note: this article is from 08 September 2024.
Neat, and a great use for AI.
It’s interesting that they’re using pretty modest hardware (I assume they mean 24 cores, not 24 CPUs) and fairly outdated dependencies. Also, having their dependencies listed out like this is pretty adorable; it has academic-out-of-touch-not-a-software-dev vibes. Makes you wonder how much further a project like this could go with decent technical support. All these talented engineers are pouring 10k times the power into generalist models like GPT that struggle at exactly these kinds of tasks, while promising they’ll work someday and trivializing them as “downstream tasks”. I think there’s definitely still room in machine learning for expert models; it sucks that they struggle for proper support.
Appendix A of this paper is our requirements.txt
I’d say the opposite. Usually you barely get a requirements.txt at all; when you do, it’s missing the versions (including for Python itself), and only then must you work out which versions of CUDA and the CUDA driver were used.
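The versions-missing complaint above is easy to fix on the author side. A minimal sketch (the package names are just examples, not the paper’s actual dependencies) that records the interpreter version and exact installed package versions in pip’s `name==version` format, using only the standard library:

```python
# Sketch: emit pinned requirement lines for a list of packages, plus the
# Python version, so a paper's environment can actually be reproduced.
# Package names below are illustrative examples, not the paper's real deps.
import sys
from importlib.metadata import version, PackageNotFoundError


def pin(packages):
    """Return pip-style pinned lines; Python version goes in as a comment."""
    lines = [f"# python=={sys.version.split()[0]}"]
    for name in packages:
        try:
            lines.append(f"{name}=={version(name)}")
        except PackageNotFoundError:
            lines.append(f"# {name}: not installed")
    return lines


if __name__ == "__main__":
    # Write the result to a requirements.txt-style file.
    print("\n".join(pin(["numpy", "torch"])))
```

CUDA and driver versions live outside Python packaging, so they still need a line in the README (e.g. from `nvidia-smi`), but this at least pins everything pip can see.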
There’s no “AI” involved. The authors quickly retreat from their misleading title to the slightly less misleading “deep learning”. Regardless of grifter terminology, we’re actually talking about a machine computing statistics of images. That might be good for patients, but it’s got nothing to do with “artificial intelligence” or “seeing beyond humans”.
[image: https://programming.dev/pictrs/image/9a0d3a5a-ba2c-4a45-8ec8-61b9cd0fb33a.png]