[–] nifty@lemmy.world 3 points 4 months ago (1 children)

My biggest issue is that a lot of physical models for natural phenomena are being solved with deep learning, and I am not sure how that helps deepen our understanding of the natural world. I'm all for DL solutions, but they would benefit from being explainable in some form. For example, it's kinda old, but I really like all the work around Grad-CAM and its successors: https://arxiv.org/abs/1610.02391
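
For anyone unfamiliar, the core of Grad-CAM is small enough to sketch. Here is a rough PyTorch version, assuming a torchvision ResNet-18 with its last conv block as the target layer (both arbitrary illustrative choices, not anything specific from the paper):

```python
import torch
import torch.nn.functional as F
from torchvision import models

# Illustrative setup: pretrained ResNet-18, last conv block as the target layer.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT).eval()
target_layer = model.layer4[-1]

activations, gradients = {}, {}

def save_grad(grad):
    gradients["value"] = grad.detach()

def save_activation(module, inputs, output):
    activations["value"] = output.detach()
    output.register_hook(save_grad)        # grab d(score)/d(feature maps) on backward

target_layer.register_forward_hook(save_activation)

def grad_cam(x, class_idx=None):
    """Return an (H, W) heatmap in [0, 1] for the chosen class."""
    scores = model(x)                      # (1, num_classes)
    if class_idx is None:
        class_idx = scores.argmax(dim=1).item()
    model.zero_grad()
    scores[0, class_idx].backward()

    acts = activations["value"]            # (1, C, h, w) feature maps
    grads = gradients["value"]             # (1, C, h, w) gradients of the class score
    weights = grads.mean(dim=(2, 3), keepdim=True)            # global-average-pooled gradients
    cam = F.relu((weights * acts).sum(dim=1, keepdim=True))   # weighted sum of maps, then ReLU
    cam = F.interpolate(cam, size=x.shape[-2:], mode="bilinear", align_corners=False)
    return ((cam - cam.min()) / (cam.max() - cam.min() + 1e-8))[0, 0]

# e.g. heatmap = grad_cam(preprocessed_image)   # preprocessed_image: (1, 3, 224, 224) tensor
```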

[–] model_tar_gz@lemmy.world 6 points 4 months ago (2 children)

How is it different from using numerical methods to find solutions to problems for which analytic solutions are difficult, infeasible, or simply impossible?
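
Concretely: even something as simple as the full (non-linearised) pendulum has no elementary closed-form solution, so we reach for an integrator anyway. A throwaway RK4 sketch (the step size, constants, and initial angle are just illustrative):

```python
import numpy as np

def rk4_step(f, t, y, h):
    """One classical Runge-Kutta step for y' = f(t, y)."""
    k1 = f(t, y)
    k2 = f(t + h / 2, y + h / 2 * k1)
    k3 = f(t + h / 2, y + h / 2 * k2)
    k4 = f(t + h, y + h * k3)
    return y + h / 6 * (k1 + 2 * k2 + 2 * k3 + k4)

def pendulum(t, y, g=9.81, L=1.0):
    """State y = [theta, omega]; theta'' = -(g/L) * sin(theta)."""
    theta, omega = y
    return np.array([omega, -(g / L) * np.sin(theta)])

# Integrate from a large initial angle, where the small-angle analytic formula breaks down.
y = np.array([2.0, 0.0])     # ~115 degrees, zero initial angular velocity
h, T = 1e-3, 10.0
trajectory = [y]
for i in range(int(T / h)):
    y = rk4_step(pendulum, i * h, y, h)
    trajectory.append(y)
```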

Any tool that helps us understand our universe is worth having. All models suck; some of them are useful nevertheless.

I admit my bias toward the problem space, though: I'm an AI engineer, classically trained in physics and engineering.

[–] nifty@lemmy.world 8 points 4 months ago (1 children)

In my experience, papers that propose numerical solutions cover the methodology in great detail (relating it to the underlying physical phenomena) and also spell out the boundary conditions of their solutions. ML/DL papers tend to go over the network architecture in great detail instead, since the network construction is the methodology. But the problem, I think, is that there's a disconnect in going from raw data to features to outputs. Physics-informed ML models are trying to close this gap somewhat.
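
Roughly, a physics-informed model bakes the governing equation and its boundary conditions into the loss. A toy sketch for the ODE u' = -u with u(0) = 1 (the architecture, collocation points, and loss weighting are arbitrary placeholders, not from any particular paper):

```python
import torch
import torch.nn as nn

# Tiny MLP approximating u(x); the target physics is the toy ODE u'(x) = -u(x), u(0) = 1,
# whose exact solution is exp(-x).
net = nn.Sequential(nn.Linear(1, 32), nn.Tanh(), nn.Linear(32, 32), nn.Tanh(), nn.Linear(32, 1))
opt = torch.optim.Adam(net.parameters(), lr=1e-3)

for step in range(5000):
    x = torch.rand(128, 1, requires_grad=True)           # collocation points in [0, 1]
    u = net(x)
    du_dx = torch.autograd.grad(u, x, grad_outputs=torch.ones_like(u), create_graph=True)[0]

    residual_loss = ((du_dx + u) ** 2).mean()             # how badly the ODE is violated
    x0 = torch.zeros(1, 1)
    bc_loss = ((net(x0) - 1.0) ** 2).mean()               # boundary condition u(0) = 1

    loss = residual_loss + bc_loss
    opt.zero_grad()
    loss.backward()
    opt.step()
```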

[–] model_tar_gz@lemmy.world 3 points 4 months ago (1 children)

As I was reading your comment I was thinking physics-informed NNs, and then you went there. Nice. I agree.

I've built some models that had solution-constrained loss functions: featureA must be between these values, etc. Not quite the same as defining boundary conditions for ODE/PDE solutions, but it gets to a similar space in a way. Also, ODE/PDE solvers tend to find local minima, and short of changing the initial conditions there aren't many good ways of overcoming that. Deep learning approaches offer more stochasticity, so they converge to global solutions more readily (at the risk of overfitting).
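
One way to realise that kind of constraint is a soft penalty added to the task loss; a minimal sketch (the hinge form, bounds, and weight are just illustrative, not necessarily what I used):

```python
import torch

def range_penalty(pred, low, high):
    """Penalty that is zero inside [low, high] and grows quadratically outside it."""
    below = torch.clamp(low - pred, min=0.0)
    above = torch.clamp(pred - high, min=0.0)
    return (below ** 2 + above ** 2).mean()

# Hypothetical use inside a training loop, with made-up bounds for one output feature:
# loss = task_loss + 10.0 * range_penalty(outputs[:, 0], low=0.0, high=1.0)
```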

The convergence of these fields is exciting to watch.

[–] nifty@lemmy.world 4 points 4 months ago

Deep learning approaches offer more stochasticity, so they converge to global solutions more readily (at the risk of overfitting).

Yeah, that's a fair point and another appealing reason for DL-based methods.

[–] iAvicenna@lemmy.world 2 points 4 months ago* (last edited 4 months ago)

Well, numerical methods still require you to come up with some model that explains how the relevant observables behave. With AI you don't even build the model that explains the system physically and mathematically, let alone the solution.

It's basically like having Newton's equations versus an AI that produces coordinates as a function of time (and possibly many other inputs we thought were relevant but weren't, because we don't know the model), given initial conditions and force fields.
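
To make the contrast concrete, here is a throwaway sketch of the "Newton's equations" side, a legible model you integrate step by step (the integrator choice and values are arbitrary); the AI counterpart would just be an opaque map from (time, initial conditions, forces) to coordinates:

```python
import numpy as np

def newton_trajectory(x0, v0, force, mass, dt=1e-3, steps=1000):
    """Explicit model: integrate F = m*a with semi-implicit Euler. The physics is inspectable."""
    x, v = np.asarray(x0, dtype=float), np.asarray(v0, dtype=float)
    out = [x.copy()]
    for _ in range(steps):
        v = v + dt * force(x) / mass
        x = x + dt * v
        out.append(x.copy())
    return np.stack(out)

# e.g. a projectile under uniform gravity:
traj = newton_trajectory(x0=[0.0, 0.0], v0=[10.0, 10.0],
                         force=lambda x: np.array([0.0, -9.81]), mass=1.0)

# The learned counterpart would be something like `coords = model.predict(features)`,
# with no equations of motion left to inspect.
```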