this post was submitted on 08 Jul 2024
825 points (96.8% liked)

Science Memes


Welcome to c/science_memes @ Mander.xyz!

A place for majestic STEMLORD peacocking, as well as memes about the realities of working in a lab.



Rules

  1. Don't throw mud. Behave like an intellectual and remember the human.
  2. Keep it rooted (on topic).
  3. No spam.
  4. Infographics welcome, get schooled.

founded 2 years ago
(page 2) 50 comments
[–] nifty@lemmy.world 3 points 3 months ago (5 children)

My biggest issue is that a lot of physical models for natural phenomena are being solved with deep learning, and I'm not sure how that deepens our understanding of the natural world. I'm for DL solutions, but they would benefit from being explainable in some form. For example, it's kinda old, but I really like all the work around Grad-CAM and its successors: https://arxiv.org/abs/1610.02391
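For anyone curious what Grad-CAM actually computes: the core of it is just a gradient-weighted sum of convolutional activation maps. A minimal sketch with random stand-in arrays (in a real model, `activations` and `gradients` would come from forward/backward hooks on a conv layer, not `numpy` random draws):

```python
import numpy as np

# Stand-ins for a conv layer's output and the gradient of the
# class score w.r.t. that output (hooks would supply these).
rng = np.random.default_rng(0)
activations = rng.random((8, 4, 4))          # (channels, H, W)
gradients = rng.standard_normal((8, 4, 4))   # d(score)/d(activations)

# Grad-CAM channel weights: global-average-pool the gradients.
alpha = gradients.mean(axis=(1, 2))          # shape (channels,)

# Weighted sum of activation maps, then ReLU to keep only
# regions with a positive influence on the class score.
cam = np.maximum((alpha[:, None, None] * activations).sum(axis=0), 0.0)

# Normalize to [0, 1] for overlaying as a heatmap.
if cam.max() > 0:
    cam = cam / cam.max()

print(cam.shape)  # (4, 4)
```

The resulting map is upsampled to the input image size and overlaid as a heatmap, which is what makes it attractive as a cheap explainability tool.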

[–] someacnt_@lemmy.world 3 points 3 months ago (3 children)

Well, lots of people blinded by hype here... Obviously it's not simply a statistical machine, but IMO it's something worse: approximation machinery that happens to work, but gobbles up energy as the cost. Something only possible because we aren't charging companies enough for electricity, smh.

[–] Daxtron2@startrek.website 3 points 3 months ago (1 children)

We're in the "computers take up entire rooms in a university to do basic calculations" stage of modern AI development. It will improve, but only if we let it develop.

[–] rbesfe@lemmy.ca 2 points 3 months ago* (last edited 3 months ago) (5 children)

Moore's law died a long time ago, and AI models aren't getting any more power-efficient from what I can tell.

[–] Daxtron2@startrek.website 3 points 3 months ago

Then you haven't been paying attention. There have been huge strides in the field of small open language models, which can do inference with low enough power consumption to run locally on a phone.

[–] clark@midwest.social 2 points 3 months ago

Moldy Monday!
