this post was submitted on 19 May 2024
247 points (100.0% liked)

[–] IAmVeraGoodAtThis@lemmy.blahaj.zone 71 points 5 months ago* (last edited 5 months ago) (2 children)

I have seen AI apologists talk about how "AI" is already sentient and how we shouldn't restrict it because doing so would be immoral.

That straight up killed my desire to interact in ~~that space~~ the community that person was in

[–] DriftinGrifter@lemmy.blahaj.zone 50 points 5 months ago (2 children)

im friends with guys who studied ai and i can tell you people who actually know what they are talking about don't think that

[–] BluesF@lemmy.world 37 points 5 months ago

No one who has even a vague understanding of present-day ML models should entertain the idea that they are sentient, or thinking, or anything like it.

[–] IAmVeraGoodAtThis@lemmy.blahaj.zone 13 points 5 months ago* (last edited 5 months ago)

Oh, by "that space" I meant the space that specific person hung out in, not AI research in general

Though I have heard a fair share of idiotic takes from actual researchers as well

[–] TotallynotJessica@lemmy.world 5 points 5 months ago (2 children)

AI is just a portion of a brain at most, not a being capable of feeling pain or pleasure; a nucleus with no will of its own. When we program AI to have a survival instinct, then we'll have something that's meaningfully alive.

[–] uriel238@lemmy.blahaj.zone 9 points 5 months ago (1 children)

We are experimenting with hierarchies of needs, giving behaviors point values to inform the AI how to conduct itself while completing its tasks. This is how, in simulations, we are seeing warbots kill their commanding officers when they order pauses to attacks. (Standard debugging: we have to add survival of the commanding officer into the needs hierarchy.)

So yes, we already have programs, not AGI, but deep learning systems nonetheless, that are coded for their own survival and the survival of allies, peers and the chain of command.
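
To make "giving behaviors point values" a bit more concrete, here's a minimal toy sketch of that kind of needs hierarchy as a scoring function. The event names and point values are entirely made up for illustration; nobody's actual warbot code looks like this:

```python
# A toy "needs hierarchy": behaviors get point values and the learning system
# optimizes the total score. Event names and values are invented for illustration.

BEFORE_PATCH = {
    "target_destroyed": +100,  # primary objective
    "agent_destroyed": -200,   # the bot's own survival
    "order_disobeyed": -50,    # respecting the chain of command
}

# The "standard debugging" step: add the commanding officer's survival to the
# hierarchy, weighted so that removing the commander can never pay off.
AFTER_PATCH = {**BEFORE_PATCH, "commander_destroyed": -1000}

def score(events, rewards):
    """Total points for one simulated run."""
    return sum(rewards.get(event, 0) for event in events)

# A run where the bot removes the commander who ordered a pause, then keeps attacking:
rogue_run = ["commander_destroyed", "target_destroyed", "target_destroyed", "target_destroyed"]

print(score(rogue_run, BEFORE_PATCH))  # 300: going rogue is the highest-scoring option
print(score(rogue_run, AFTER_PATCH))   # -700: after the patch it no longer pays off
print(score([], AFTER_PATCH))          # 0: standing down now beats going rogue
```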

[–] MBM@lemmings.world 3 points 5 months ago (1 children)

> in simulations we are seeing warbots kill their commanding officers when they order pauses to attacks.

Wasn't that a hoax?

[–] uriel238@lemmy.blahaj.zone 1 points 5 months ago

If it is, it's a convincing one. The thing is, learning systems will try all sorts of crazy things until you specifically rule them out, whether that's finding exploits to speed-run video games or attacking allies when doing so creates a solution with a better score. This is a bigger problem with AGI, since all the rules we code as hard for more primitive systems are softer there: rather than telling it "don't do this thing, I'm serious," we have to code in why we're not supposed to do that thing, so it's withheld by consequence avoidance rather than hard-and-fast rules.

So even if it was a silly joke, examples of that sort of thing are routine enough in AI development that it's a believable one, even if they happened to luck into it. That's the whole point of running autonomous weapon software through simulators: if it ever does engage in friendly fire, its coders and operators will have to explain themselves before a commission.
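
As a rough sketch of the "hard rule" versus "consequence avoidance" distinction (again, action names and numbers are invented for illustration): a hard rule simply removes the forbidden action from consideration, while the softer approach leaves it available but makes its predicted consequences score so badly that the system learns to avoid it.

```python
# Toy contrast between a hard rule and consequence avoidance.
# Action names and penalty values are invented for illustration.

def hard_rule_filter(candidate_actions):
    """'Don't do this thing, I'm serious': the forbidden action is simply masked out."""
    return [a for a in candidate_actions if a != "friendly_fire"]

def consequence_aware_value(action, base_value):
    """The action stays available, but its predicted downstream consequences
    (lost allies, a commission hearing) drag its value down, so the system
    avoids it for its own reasons rather than being forbidden outright."""
    predicted_consequences = {"friendly_fire": -1000}
    return base_value + predicted_consequences.get(action, 0)

print(hard_rule_filter(["advance", "friendly_fire", "hold_position"]))
# ['advance', 'hold_position']

print(consequence_aware_value("friendly_fire", base_value=300))  # -700
print(consequence_aware_value("advance", base_value=120))        # 120
```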

[–] Swedneck@discuss.tchncs.de 1 points 5 months ago

current AI is like the language centre of our brains separated out and severely atrophied, and as you'd expect that results in it violently hallucinating like a madman

[–] Skullgrid@lemmy.world 32 points 5 months ago* (last edited 5 months ago) (3 children)

This week, I spent a few hours listening to Dadabot's Pho Queue stream.

I read the little prose they left in the stream description, and I found myself wanting to go back to it several times. Something about the screams of "FA KYUU" morphing into saxophone sounds, being garbled, sometimes screamed, sometimes sung, and the transitions that were talked about in the prose felt nice to listen to. Sometimes it felt like it was coming up with genuinely new ideas, for itself at least, that it hadn't generated before.

Then I started thinking of the level of art discourse we have in the modern world, especially visual arts, and ideas from Cage. And the nature of the music industry.

Jackson Pollock and his CIA-funded splashes of paint are only worthwhile to people who actually like them. The AI-generated saxophone warblings, the screeches that come out of it, and the choppy, syncopated drum beats give me feelings. It is art to me.

There is a person behind the scenes who curated the algorithm to bring out music in a certain way. Pho Queue doesn't sound the same as a "bach faucet", because of the human creativity and curation that went into the way the music is generated.

Brian Eno and John Cage made music with procedural generation features before the current wave of generative AIs. It's an established perspective to have on music. Some performances use radios being tuned to frequencies just to add a new source of sound.

How is this stuff any different?

On the other side of the coin, does music made by humans inherently mean anything? The crap churned out by pop musicians flying around in private jets, composed by committee with the end goal of generating another product for someone else to buy: is there any expression or emotion in there? What does the zillionth love song made by a billionaire have to offer that hasn't been expressed a zillion times before?

There's some more to be written, like: if I make music myself that relies heavily on timbre, or on effects that I bought, who's really doing the expression, and how? If the musical theory of what I'm playing is simple, and contributes less to the aesthetic feeling of the music I'm making than my guitar's timbre or the way I set up my effects pedals, is the expression actually more in how I turned the knobs on my pedals to get an aesthetic than in the actual notes I play? So then, am I expressing something, or are the people who designed and manufactured my equipment?

But then we get into this:

> I thought using loops was cheating, so I programmed my own using samples. I then thought using samples was cheating, so I recorded real drums. I then thought that programming it was cheating, so I learned to play drums for real. I then thought using bought drums was cheating, so I learned to make my own. I then thought using premade skins was cheating, so I killed a goat and skinned it. I then thought that that was cheating too, so I grew my own goat from a baby goat. I also think that is cheating, but I’m not sure where to go from here. I haven’t made any music lately, what with the goat farming and all.

The talk about AI is stupid. It's a tool.

The talk about the IMPLICATIONS of AI, and who uses it to automate what, at the cost of who, is the actual argument to be had.

If some metal band that can barely scrape together enough money to tour uses AI to generate an album cover, it shouldn't be taken as a horrifying act of taking food out of an artist's mouth. If BMG fires everyone in their art department to use only AI art, that's something to be concerned about.

[–] Zagorath@aussie.zone 15 points 5 months ago

> The talk about AI is stupid. It's a tool.
>
> The talk about the IMPLICATIONS of AI, and who uses it to automate what, at the cost of who, is the actual argument to be had.

Hear, hear.

[–] BluesF@lemmy.world 6 points 5 months ago

Good thoughts, I agree wholeheartedly in most cases. There is a point to be made about the energy consumption of AI, too. Right now I doubt that we're actually getting as much out of it in real value as we are pumping in, just in sheer electricity.

[–] uriel238@lemmy.blahaj.zone 22 points 5 months ago

This is one of those places where a technology might be beneficial to communal societies but dangerous in capitalist ones, since any technology that replaces workers, or substitutes high-paying professional jobs with menial ones, impacts the survival of the workforce.

I think AI will get better at simulating human creativity, or at allowing less-skilled workers to produce high-quality results, to the point that it will change art much the way desktop publishing revolutionized graphic design (with much resistance from the X-acto generation).

The challenge, IMNSHO, is navigating the new technology so it serves society and not just the bunch of capitalists at the top who'd gladly replace us all with robots and let us starve.

The working class shall tremble and all that.

[–] Excrubulent@slrpnk.net 15 points 5 months ago (2 children)

Where on the internet is the anti-AI crowd at large talking about divine sparks of creativity? I am the only person I've seen saying that the only way you get an AI that can truly replace workers is by birthing a new intelligence, at which point it would be wrong to enslave it. I didn't know there were others!

[–] nicknonya@lemmy.blahaj.zone 13 points 5 months ago* (last edited 5 months ago) (1 children)

tumblr in this case.
some people have decided to discredit all ai art as "not real art", implying that "art" is a quality that is bestowed upon a thing by its creator.
this is bullshit to me because i believe that anything can be art to someone, and decrying ai art as bad because it's somehow not real art is a thought-terminating cliché that distracts from the reality that image generators are made through the exploitation of millions of unpaid artists' work.

[–] Excrubulent@slrpnk.net 11 points 5 months ago* (last edited 5 months ago) (2 children)

Okay, but you can acknowledge the exploitation whilst also admitting that AI doesn't make art and that what it does make is universally bad. The fact that it's using exploited labour and is being used to threaten jobs makes the fact that its output sucks even more of a slap in the face. These ideas are not in tension; two things can be true.

Art means something. Art is any creation that meaningfully expresses the intent of its creator. If you want to make art, you need to understand meaning, and current "AI" is devoid of meaning or understanding. It's not about some nebulous "spark", it's that there is no intention behind an LLM's output. It is a stochastic parrot.

Maybe a person can use AI generated imagery to make something with artistic merit, but that's because their time and attention was put into curating it, not because an AI drew a picture that seems plausible if you don't look at it too closely.

An AI needs to have comprehension before it can intend anything. Art isn't "art" just because it makes pretty pictures.

If you want to say AI as it currently exists can make art, then I'd be fascinated to hear what you think art is, and how your definition differs from mine.

[–] nicknonya@lemmy.blahaj.zone 8 points 5 months ago (2 children)

the thing is i don't think a meaningful definition of art can exist. any attempt would necessarily leave something out. you can look at a crack in a random wall on your way to the shop and think it's art.
is all art necessarily good? No, obviously not, but if looking at a wonky ai landscape (or, let's be honest with ourselves, massive honking ai tits) means something to you, then that's art to you.

[–] Excrubulent@slrpnk.net 4 points 5 months ago

Things can be beautiful or interesting without being art. The crack in the wall or a naturally occurring landscape are examples of that. You could call them "art", but I think you'd be wrong. That's not a generally accepted meaning of the word.

Actually I'd refine my definition to say that art should be primarily for the purposes of expression and not for any other functional use.

[–] Catoblepas@lemmy.blahaj.zone 3 points 5 months ago (2 children)

Would you say that the building made that art? Even that analogy is imperfect, because the building didn’t have to have the work of thousands of other buildings poured into it to create the crack, it just happened.

[–] nicknonya@lemmy.blahaj.zone 3 points 5 months ago* (last edited 5 months ago)

i said that it could be art to the observer. art can't exist in a vacuum; it needs someone to experience it.
the ethics of how an art piece is made are ultimately irrelevant to whether or not it counts as real™ art, but very relevant to whether or not we should keep letting it be made the same way.

[–] Even_Adder@lemmy.dbzer0.com 1 points 5 months ago

I think you should read this article.

[–] Even_Adder@lemmy.dbzer0.com 6 points 5 months ago (1 children)

I've seen multiple people on here arguing that understanding and creativity are uniquely human abilities.

[–] Excrubulent@slrpnk.net 8 points 5 months ago* (last edited 5 months ago) (2 children)

They are unique abilities of people; whether a neural net can be a person would depend on whether it possesses those abilities. Humans are just the only examples of people that we currently have.

Understanding is not something current neural nets have. They are stochastic parrots.

EDIT: Perhaps I should've said "Humans are the only uncontroversial examples of people that we currently have," but I guess I put too much faith into people to not get sidetracked by irrelevant technicalities. Animals could be considered people by this definition, that's true and says a lot about our anthropocentric society, but that doesn't change the fact that LLMs are not people.

[–] superb@lemmy.blahaj.zone 9 points 5 months ago (1 children)

I do not accept that humans are the only examples of creativity and understanding; in fact, I think you find those traits all over the animal kingdom, from great apes making tools to fish and birds spending hours building beautiful creations to attract a mate.

[–] Excrubulent@slrpnk.net 2 points 5 months ago (2 children)

Even accepting that you're right, you've missed the point. To the extent that animals are able to have creativity and understanding, perhaps we should understand them to be "people".

And at any rate, we still don't see this kind of thing from LLMs.

[–] Sas@beehaw.org 3 points 5 months ago (1 children)

I think in a lot of ways this already happens. A lot of pet parents understand their pets as people. I certainly see my cat as a person. She has her own personality that is probably fairly unique to her.

[–] Excrubulent@slrpnk.net 1 points 5 months ago* (last edited 5 months ago)

Yeah, I absolutely agree, and I really did consider saying that humans are the only *uncontroversial* examples of people that we have, but I decided not to bog my comment down with too many unnecessary disclaimers. I guess I gave people too much credit there.

[–] superb@lemmy.blahaj.zone 3 points 5 months ago (1 children)

I missed the point on purpose, because I mostly agree with you :)

[–] Excrubulent@slrpnk.net 1 points 5 months ago* (last edited 5 months ago)

Well if it helps I agree that you can't actually say humans are the only people, I was simplifying to focus on the point. Maybe that was actually a mistake.

[–] Even_Adder@lemmy.dbzer0.com 6 points 5 months ago (1 children)

No they aren't. Animals understand LMAO. If you want to continue this conversation, you're going to need to back up your claims with something, otherwise I'm just going to ignore any further replies.

[–] Excrubulent@slrpnk.net 1 points 5 months ago

Okay, so animals can be people too according to my argument. I'm happy to accept that, but the point stands that LLMs don't exhibit this behaviour.

[–] LainTrain@lemmy.dbzer0.com 10 points 5 months ago

I'm pro-AI because of this to a large extent. Artbros and their soul shit, it's like a cult fr. Nobody's advocating against workers' rights or for capitalism; tech is value-neutral, it's how it's used that gives it value, and FOSS models in the hands of working-class everyday folks do more good than bad.

[–] nifty@lemmy.world 8 points 5 months ago (1 children)

The term “divine spark of creativity” is meaningless if it’s used to describe something created by a non-human entity. Why? Let me explain, but note that I am not biased against machine learning systems or whatever we’re calling “AI”.

From what I can remember from my studies, the act of creation (in topics as broad as art, engineering, policy making) is inherently biased towards some kind of exploration and examination of how groups of humans can function together. Humans started creating to facilitate and ensure their propagation by communicating and sharing ideas in different ways. Ultimately, if we look at how things came to be in human history (of society, culture, religion, science etc) the goal of this organization around how we think and do things is to ensure our own development as a species.

Specifically, regarding a piece of “art”, whatever it may be: I disagree with anyone who says that “just because it makes me feel it’s art”. Your drug-induced hallucinations can also make you feel, but they’re not art because they’re not an experience that anyone else but you can accommodate.

Similarly, I think that if a creator cannot understand or communicate with some sense of the human condition, their act of creation is devoid of meaning for human intellectual development, and is simply an exercise in mimicry of human creation—it gets the job done, but it’s not moving anything forward for the human collective. For any number of cynical reasons we may “hate” people, but humans are really the only living organism that we know of which is capable of reasoning about the nature of reality and existence. It doesn’t mean anything to “AI” that you or I or anything exists. Or that itself exists. So what’s the point of its creation?

[–] skye@lemmy.world 4 points 5 months ago (1 children)

I mean, AI doesn't go out by itself to generate art; it's someone somewhere asking it to do so. And while most of these are simply eye candy, I'm sure there are people out there that use AI to generate art that actually communicates something. So then it becomes a tool like a paintbrush, but modified to be accessible to everyone.

Anecdotal, but once out of curiosity I gave an AI one of my drawings and had it generate something similar, and the results communicated what I wanted them to, without being the exact same image. It also inspired me, rather than making me feel bad about my own art.

[–] Lumisal@lemmy.world 1 points 5 months ago (1 children)

It can also literally communicate something through art, as it's currently the only reasonable way to create QR code art.

[–] skye@lemmy.world 2 points 5 months ago (1 children)

huh, interesting, do you have any examples of this?

[–] DumbAceDragon@sh.itjust.works 5 points 5 months ago* (last edited 5 months ago)

I'm in the fourth camp of "I used to think I had zero artistic talent, and so I was really hyped for AI art to get good, but then I actually got half decent at art myself, and now I don't want people to fall into this hole like I did because they think they're too technical minded or it's too late for them to learn art", which may or may not just be the second camp.

I acknowledge there are people who genuinely enjoy doing AI art, and I have seen some good creative stuff done with it. But I think there's not enough focus on learning that art for yourself, and I side with the "soul creativity" camp on the argument that there is some aspect of human-made art that won't ever be replicated by AI.