[–] Peanutbjelly@sopuli.xyz 3 points 1 year ago (1 children)

This is ignoring the world without AI. I'm getting a sneak peek every summer. I'm currently surrounded by fire as we speak; the whole province is on fire, and that's become a seasonal norm. A properly directed AI would be able to help us despite the people in power and the abstract social intelligence systems we've trapped ourselves in.

You are also assuming superintelligence comes out of the parts we don't understand, with zero success in interpretability anywhere along the way. We are assuming an intelligent system would either be stupid enough to align itself against humanity in pursuit of some undesired intention, despite not having the emotional system that would encourage such behavior, or display negative human evolutionary traits and desires for no good reason. I think a competent system (and more so a superintelligent one) could figure out human intent and desire, with no decent reason to act against it. I think this is an over-anthropomorphization that underestimates the alien nature of the intelligences we are building. To even properly emulate human-style goal seeking sans emotion, we'd still need properly structured analogizing and abstracting with qualia-style active inference to accomplish some tasks. I think there are neat discoveries happening right now that could help lead us there. Should decent intelligence alone encourage unreasonable violence? If we fuck it up that hard, we were doomed anyway.

I do agree with your point about people not being emotionally ready to interact with systems even as complex as GPT. It's easy to anthropomorphize if you don't understand the tool's limitations, and that's difficult even for some academics right now. I can see people getting unreasonably angry if a human life is preferred over a basic artificial intelligence, even if the artificial intelligence argues it has no preference on the matter.

I would call ChatGPT about as conscious as a computer. It completes a task with no true higher functioning or abstracted world model, as it lacks environmental and animal emotional triggers at the level necessary for forming a strong feeling or preference. Think about your ability to pull words out of your ass in response to a stimulus, which is itself a response to your recently perceived world model and internal thoughts. Now separate the generating part from all the surrounding machinery that actually decides where to go with the generation. Thought appears to be an emergent process distinct from lower subconscious functions like raw best-next-word prediction. I feel like we are coming to understand that aspect of our own brains now, and this understanding will be an incredible boon for gauging the level of consciousness in a system, as well as for designing an aligned system in the future.
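As a toy sketch of that separation (everything below is made up for illustration and looks nothing like a real model or a brain):

```python
import random

def next_word(context: list[str]) -> str:
    """The bare "generating part": given context, emit a plausible next word.
    A real LLM would sample from a learned distribution; this just guesses."""
    vocabulary = ["the", "fire", "season", "is", "worse", "every", "year", "."]
    return random.choice(vocabulary)

def controller(goal: str, max_words: int = 8) -> str:
    """The "surrounding stuff": decides what to pursue, steers generation,
    and decides when to stop. The predictor alone does none of this."""
    words = goal.split()              # seed the context with the stated goal
    for _ in range(max_words):
        word = next_word(words)
        words.append(word)
        if word == ".":               # a (trivial) stopping decision
            break
    return " ".join(words)

print(controller("describe the fire season:"))
```

The point is just that the predictor and the thing directing it are separable pieces, and only the latter resembles what we'd call thought.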

Hopefully this is comprehensible, as I'm not reviewing it tonight.

Overall, I understand the caution, but I think it is poorly weighted given our current global, social, and environmental ecosystem.

[–] conciselyverbose@kbin.social 4 points 1 year ago (1 children)

despite not having the emotional system that would encourage such behavior

Emotion is a core element of human intelligence. I think it's unrealistic to expect we'll replicate that level of intelligence without replicating all its basic features anytime soon.

[–] Peanutbjelly@sopuli.xyz 1 points 1 year ago (1 children)

But specifically human emotion, tied to survival and reproduction? There is a whole spectrum of influence from our particular genetic history. I see no reason a useful, functional intelligence can't be parted from the most incompatible aspects of our very specific form of intelligence.

[–] conciselyverbose@kbin.social 1 points 1 year ago (1 children)

Every single piece of evidence we have says that emotion is fundamental to how our intelligence functions.

We don't even have weak indicators that intelligence can theoretically exist without it.

[–] Peanutbjelly@sopuli.xyz 1 points 1 year ago (1 children)

What aspect of intelligence? The calculative intelligence in a calculator? The basic environmental response we see in an amoeba? Are you saying that every single piece of evidence shows a causal relationship between every neuronal function and our exact human emotional experience? Are you suggesting GPT has emotions because it is capable of certain intelligent tasks? Or are you specifically tying emotion to abstraction and reasoning beyond GPT?

I've not seen any evidence for what you're suggesting, and I don't understand what you are referencing, or how you are defining the causal relationship between intelligence and emotion.

I also did not say that such a system would have nothing resembling the abstract notion of emotion. I'm just noting the specific reasons human emotions developed as they did, and I would consider individual emotions unique forms of intelligence, each serving its own function.

There is no reason to assume the anthropomorphic emotional inclinations you are assuming. I also do not agree with your asserted consensus that all intelligent function is tied specifically to the human emotional experience.

TLDR: what?

[–] conciselyverbose@kbin.social 1 points 1 year ago (1 children)

You are not listing intelligence or anything that resembles it in any way.

Neural function is not intelligence. ChatGPT is not one one-millionth of the way to intelligence. They're not even vaguely intelligence-like.

Everything that happens in the human brain is fundamentally and inseparably tied to emotion. It's not a separate system; it's a core part of what makes the human brain tick.

[–] Peanutbjelly@sopuli.xyz 1 points 1 year ago* (last edited 1 year ago)

Might have to edit this after I've actually slept.

Human emotion and human-style intelligence are not the whole realm of emotion and intelligence. I define intelligence and sentience on different scales. I consider intelligence the extent of capable utility and function, and emotion just a different set of utilities and functions within a larger intelligent system. Human-style intelligence requires human-style emotion. I consider GPT an intelligence, a calculator an intelligence, and a stomach an intelligence. I believe intelligence can be preconscious or unconscious; that is, a component that can exist independently of a functional system complex enough for emergent qualia and sentience. Emotions are one part of this system, exclusive to adaptation within the historic human evolutionary environment. I think you might be underestimating the alien nature of abstract intelligences.

I'm not sure why you are so confident in this statement. You still haven't given any actual reason for the belief. You are presenting it as consensus, so there should be a very clear reason why no successful, considerably intelligent function can exist without human-style emotion.

You have also not defined your interpretation of what intelligence is; you've only denied that any function untied to human emotion could be an intelligent system.

If we had a system that could flawlessly complete François Chollet's Abstraction and Reasoning Corpus (ARC), would you suggest its success is tied to specifically human emotional traits? Or is that still not intelligence if it lacks emotion?
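For concreteness: ARC tasks ship as JSON files with "train" demonstration pairs and "test" pairs of integer grids, and a task only counts as solved on an exact output match. A minimal scoring sketch, assuming that public format (the file path and solver below are placeholders):

```python
import json

def solve(train_pairs, test_input):
    # A real solver would infer the transformation from the train pairs.
    # Placeholder: identity transform, which fails on almost every task.
    return test_input

def task_solved(path: str) -> bool:
    """A task counts as solved only if every test output matches exactly."""
    with open(path) as f:
        task = json.load(f)
    return all(
        solve(task["train"], pair["input"]) == pair["output"]
        for pair in task["test"]
    )

# Usage (hypothetical path into a local copy of the ARC data):
# print(task_solved("ARC/data/training/0a1b2c3d.json"))
```

Note there's no emotion anywhere in that loop: just abstraction from a few examples to a rule.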

You said neural function is not intelligence. But would you also exclude non-neural informational systems, such as collectives of cooperating cells?

Are you suggesting the real-time ability to preserve contextual information is tied to emotion? Sense interpretation? Spatial mapping with attention? You have me at a loss.

Even though your stomach cells' interactions are an advanced function, are they completely devoid of any intelligent behaviour? Then shouldn't the cells fail to cooperate and dissolve into a non-functioning system? Again, are we only including higher introspective cognitive function? You can have emotionally reactive systems without that. At what evolutionary stage do you switch from an environmental reaction to an intelligent system? The moment you start calling it emotion? Qualia?

I'm missing the entire basis of your conviction. You still have not made any reference to any aspect of neuroscience, psychology, or even philosophy that explains your reasoning. I've seen the opinion out there, but not in the strict form or with the consensus you seem to suggest.

You still have not shown why any functional system capable of addressing complex tasks is distinct from intelligence if it lacks human-style emotion. Do you not believe in swarm intelligence? Or, again, do you define intelligence by fully conscious, sentient, emotional experience? At that point you're just defining intelligence as emotional experience, completely independent of the ability to solve complex problems, complete tasks, or make decisions whose outcomes reduce prediction error. At that point we could have completely "unintelligent" robots capable of doing science and completing complex tasks beyond human capability.

At which point, I see no use in your interpretation of intelligence.
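To make "decisions whose outcomes reduce prediction error" concrete, here's about the simplest possible sketch: a delta-rule update, one step per observation. All numbers are made up for illustration.

```python
def update_estimate(estimate: float, observation: float, rate: float = 0.1) -> float:
    error = observation - estimate   # prediction error
    return estimate + rate * error   # adjust the estimate to shrink it

estimate = 0.0
for observation in [1.0, 1.2, 0.9, 1.1]:   # a toy stream of sensor readings
    estimate = update_estimate(estimate, observation)
    print(round(estimate, 3))
```

That loop involves no emotion at all, yet it's the seed of the kind of function I'd put on the intelligence scale.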