[–] hersh 58 points 5 months ago (4 children)

Is this legit? This is the first time I've heard of human neurons used for such a purpose. Kind of surprised that's legal. Instinctively, I feel like a "human brain organoid" is close enough to a human that you cannot wave away the potential for consciousness so easily. At what point does something like this deserve human rights?

I notice that the paper is published in Frontiers, the same publisher that let the notorious AI-generated giant-rat-testicles image through peer review. They are not highly regarded in general.

[–] pearsaltchocolatebar@discuss.online 46 points 5 months ago (1 children)

They don't really go into the size of the organoid, but it's extremely doubtful that it's large and complex enough to get anywhere close to consciousness.

There's also no guarantee that a lump of brain tissue could ever achieve consciousness, especially if the architecture is drastically different from an actual brain.

[–] JackGreenEarth@lemm.ee 14 points 5 months ago (1 children)

Well, we haven't solved the hard problem of consciousness, so we don't know whether brain size or similarity to a human brain are factors in developing consciousness. But perhaps a more important question is: if it did develop consciousness, how much pain would it experience?

[–] ColeSloth@discuss.tchncs.de 16 points 5 months ago (3 children)

Physical pain? Zero.

Now emotional pain? I'm not sure it would even be able to experience emotional pain. So much of our emotion is intertwined with chemical balances and releases. If a brain achieved consciousness but had none of these chemicals at all... I don't know that it'd even work.

[–] Warl0k3@lemmy.world 5 points 5 months ago* (last edited 5 months ago)

While we haven't confirmed this experimentally (ominous voice: yet), computationally there's no reason even a simple synthetic brain couldn't experience emotions. Chemical neurotransmitters are just an added layer of structural complexity, so Church–Turing should still hold. Human brains are only powerful because they have an absurdly high parallel network throughput (computational bus might be a better term); the actual neuron part is dead simple. Network computation is fascinating, but much like linear algebra the actual mechanisms are so simple they're dead boring. Cram 200,000,000 of those mechanisms into a salty water balloon, though, and it can produce some really pompous lemmy comments.

~~Emotions are holographic anyways so the question is kinda meaningless. It's like asking if an artificial brain will perceive the color green as the same color we 'see' as green. It sounds deep until you realize it's all fake, man. It's all fake.~~
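To make the "dead simple" neuron point above concrete, here's a minimal sketch of the kind of unit artificial networks are built from: a weighted sum pushed through a squashing function. The inputs, weights, and bias below are made up purely for illustration.

```python
import math

def neuron(inputs, weights, bias):
    """One artificial neuron: weighted sum of inputs, squashed into (0, 1)."""
    total = sum(x * w for x, w in zip(inputs, weights)) + bias
    return 1.0 / (1.0 + math.exp(-total))  # sigmoid activation

# Hypothetical values, just to show how little machinery is involved.
print(neuron([0.5, 0.1, 0.9], [0.4, -0.7, 0.2], bias=0.1))
```

All the interesting behaviour comes from wiring enormous numbers of these together, not from the unit itself.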

[–] Maeve@sh.itjust.works 1 points 5 months ago (1 children)
[–] ColeSloth@discuss.tchncs.de 3 points 5 months ago

I'm not sure what color skittles I ate, but I'm feeling... horny.

[–] emptiestplace@lemmy.ml 0 points 5 months ago (1 children)

> Physical pain? Zero.

Did you think about this before you wrote it?

[–] ColeSloth@discuss.tchncs.de 10 points 5 months ago* (last edited 5 months ago) (1 children)

Didn't have to. Kind of an obvious thing to point out, but OP didn't specify what type of pain he meant, so I figured I would, just in case.

[–] emptiestplace@lemmy.ml 1 points 5 months ago (2 children)
[–] JohnEdwa@sopuli.xyz 16 points 5 months ago (1 children)

Human brains don't actually have any pain receptors (even though headaches would have you seriously believe otherwise), so a brain alone wouldn't be able to feel pain any more than it would be able to smell or see.

[–] ColeSloth@discuss.tchncs.de 11 points 5 months ago (1 children)

Physical pain only comes from pain receptors. The brain itself doesn't have any. No receptors. No pain.

[–] Neuromancer49@midwest.social 25 points 5 months ago

Believe it or not, I studied this in school. There are some niche applications for alternative computers like this. My favorite is the way you can use DNA to solve the traveling salesman problem (https://en.wikipedia.org/wiki/DNA_computing?wprov=sfla1).

There have been other "bioprocessors" before this one, some of which have used neurons for simple image detection, e.g. https://ieeexplore.ieee.org/abstract/document/1396377?casa_token=-gOCNaYaKZIAAAAA:Z0pSQkyDBjv6ITghDSt5YnbvrkA88fAfQV_ISknUF_5XURVI5N995YNaTVLUtacS7cTsOs7o. But this seems to be the first commercial application. Yes, it'll use less energy, but the applications will probably be equally niche. Artificial neural networks can do most of the important parts (like "learn" and "remember") and are less finicky to work with.
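For the curious: the DNA-computing work linked above (Adleman's experiment) actually solved a small directed Hamiltonian path instance, the decision-problem cousin of the traveling salesman problem, by brute-force generate-and-filter chemistry. Here's a toy software sketch of that idea; the graph, labels, and candidate count are invented for illustration, not taken from the paper.

```python
import random

# Toy directed graph (edges invented for this example), loosely in the
# spirit of Adleman's 1994 DNA experiment: generate a huge number of
# random paths "in parallel", then filter out everything that isn't a
# Hamiltonian path from START to END.
EDGES = {0: [1, 3], 1: [2], 2: [3, 4], 3: [4], 4: [5], 5: []}
START, END, N = 0, 5, 6  # N = number of vertices

def random_path(max_len):
    """Mimic random ligation of edge strands: a random walk along edges."""
    node, path = START, [START]
    for _ in range(max_len - 1):
        if not EDGES[node]:
            break
        node = random.choice(EDGES[node])
        path.append(node)
    return path

# The "test tube" full of candidate molecules.
candidates = (random_path(N) for _ in range(100_000))

# These filters stand in for the gel electrophoresis / affinity
# purification steps of the wet-lab protocol.
solutions = {
    tuple(p) for p in candidates
    if p[0] == START and p[-1] == END       # correct endpoints
    and len(p) == N and len(set(p)) == N    # visits every vertex exactly once
}
print(solutions)  # e.g. {(0, 1, 2, 3, 4, 5)}
```

The point isn't efficiency (in software this is just a crude brute-force search); it's that the wet-lab version runs all of those random walks simultaneously in a test tube.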

Careful. Get too deep into that and people will have to admit lesser animals have forms of consciousness.

[–] tyrant@lemmy.world 2 points 5 months ago (1 children)

Seems like it's an ethical gray area. Some brain organoids have responded to light stimuli, and there are concerns they might be able to feel pain or develop consciousness. (Full disclosure, I had no idea what an organoid even was before reading this, and then did some quick follow-up reading.)

[–] TheBananaKing@lemmy.world 6 points 5 months ago (1 children)

How complex does a neural net have to be before you can call any of its outputs 'pain'?

Start with a light switch with 'pain' written on a post-it note stuck to the on position, and end with a toddler. Where's the line?

[–] tyrant@lemmy.world 5 points 5 months ago

I think that's why it's a gray area. No one knows.