this post was submitted on 08 Jun 2025
143 points (95.5% liked)

Technology

71318 readers
4493 users here now

founded 2 years ago

Link to the article without the paywall

https://archive.ph/h63Dp

top 39 comments
[–] Jankatarch@lemmy.world 19 points 3 days ago* (last edited 3 days ago) (1 children)

When I walk around my uni, people openly talk about using ChatGPT to pass their classes. When I ask for help in some lecture group chat, the first four answers are "I just used ChatGPT."

They gave us a whole speech at the beginning about how seriously they take academic dishonesty, but honestly I'm just disappointed now. Even just using solution manuals makes you count as a "good student".

[–] Ledericas@lemm.ee 3 points 3 days ago

Has it gotten that bad? No wonder people on review sites have been complaining about my state uni's lack of direction.

[–] fubarx@lemmy.world 49 points 4 days ago (1 children)

Once these AI companies go belly-up, those people with critical thinking and research skills will be able to name their price.

Those abilities have been in high demand for millennia. Focus on the basics.

[–] chaosCruiser@futurology.today 22 points 4 days ago* (last edited 3 days ago) (1 children)

Probably not going to go belly-up for a while, but the enshittification cycle still applies. At the moment, investors are pouring billions into the AI business, and as a result, companies can offer services for free while only gently nudging users towards the paid tiers.

When interest rates rise during the next recession, investors won't have access to cheap money any more. The previously constant stream of funding dries up, AI companies start cutting back what the free tier offers, and people start complaining about enshittification. During that period, the paid tiers also get restructured to squeeze more money out of the paying customers. That hasn't happened yet, but eventually it will. Just keep an eye on those interest rates.

[–] taladar@sh.itjust.works 6 points 3 days ago (1 children)

Probably not going to go belly-up for a while

Don't be so sure about that. The numbers look incredibly bad for them in terms of money burned per dollar of actual revenue, never mind profit. They can't even cover the inference alone (never mind training, staff, rent, ...) from the subscriptions.
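To make that concrete, here's a toy unit-economics sketch. All the numbers are made up purely for illustration (they are not any AI company's actual figures); the point is the structure: inference cost scales with every user, free or paid, while revenue comes only from the paying minority.

```python
# Toy unit-economics sketch. All numbers are hypothetical, for illustration only.

def monthly_margin(users, paying_fraction, sub_price, inference_cost_per_user):
    """Revenue minus inference cost; ignores training, staff, rent, etc."""
    revenue = users * paying_fraction * sub_price
    # Inference cost scales with ALL users, not just the paying ones.
    cost = users * inference_cost_per_user
    return revenue - cost

# Hypothetical: 100M users, 5% pay $20/month, inference costs $2/user/month.
margin = monthly_margin(100_000_000, 0.05, 20.0, 2.0)
print(f"monthly margin: ${margin:,.0f}")  # prints: monthly margin: $-100,000,000
```

With those made-up numbers, the subscriptions don't even cover inference; only if the per-user inference cost dropped well below the blended revenue per user would the margin turn positive.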

[–] chaosCruiser@futurology.today 6 points 3 days ago* (last edited 3 days ago) (1 children)

As long as they can convince investors of potential future revenue, they will be just fine. In the growth stage, companies don’t have to be profitable because the investors will cover the expenses. Being profitable becomes a high priority only when you run out of series F money, and the next investors can’t borrow another 700 million. It’s a combination of having low interest rates and convincing arguments.

BTW I don’t think this is a good way to run a company, but many founders and investors clearly disagree with me.

[–] taladar@sh.itjust.works 4 points 3 days ago (1 children)

The difference between AI companies and most other tech companies is that AI companies have significant expenses that scale with the number of customers.

[–] chaosCruiser@futurology.today 3 points 3 days ago (1 children)

That's a very good point. Actually, video hosting services also suffer from a similar problem, and that's one of the main reasons why it's so hard to compete with YouTube. Since there are so many LLM services out there at the moment, it makes me think that there must be a completely ridiculous amount of investor money floating around there. Doesn't sound like a sustainable situation to me.

Apparently, the companies are hoping that everyone gets so hooked on LLMs that they have no choice but to pay up when the inevitable tsunami of enshittification hits us.

[–] taladar@sh.itjust.works 2 points 3 days ago (1 children)

There are some numbers in this blog post https://www.wheresyoured.at/openai-is-a-systemic-risk-to-the-tech-industry-2/ (and a couple of others on the same blog) and they really don't look like OpenAI is going to last a couple of years until profitability.

[–] chaosCruiser@futurology.today 1 points 3 days ago

Wow, those are some pretty big numbers! About 10x bigger than what I was thinking. I knew these things can get pretty weird, but this is just absolutely wild. When expectations fly that high, the crash can be all the more spectacular.

When you notice that your free account can’t do much, that’s a sign that OpenAI is beginning to run out of money. When that happens, the competitors will be ready to welcome all the users who didn’t feel like paying OpenAI.

[–] lupusblackfur@lemmy.world 43 points 4 days ago (4 children)

Thank fuck I graduated college decades ago...

Actual education/teaching is under assault in the US from all sides these days... Not certain today's students have any chance. 🙄 🤦‍♀️ 🖕 💩

[–] Ledericas@lemm.ee 4 points 3 days ago* (last edited 3 days ago)

I graduated just before the fuckery, and my school was already at the forefront of using software to weed out applicants for jobs. When I was in HS, students had already given up on doing homework, and they were passing people with failing/D grades through to graduation. Everyone who graduated during the pandemic, or is taking classes now, says my old college is pretty bad these days, because most classes went online. There are also other long-standing issues that were never solved while I was still in college. Them USING CHATGPT is probably a step up from just copy-pasting content from various sites with the same exact question or essay.

[–] 14th_cylon@lemm.ee 16 points 4 days ago* (last edited 4 days ago) (3 children)

It is not just the US... the VSE (Prague University of Economics and Business) has decided to abandon graduation theses, because it is supposedly "impossible to verify" whether they were written by the student or by AI, and replaced them with "hands-on" graduation projects.

[–] uranibaba@lemmy.world 5 points 4 days ago (1 children)

“hands-on” graduation project

Does hands-on mean supervised?

[–] 14th_cylon@lemm.ee 11 points 4 days ago

No, it means they will try to apply the knowledge they acquired to a real-life problem: creating a project instead of writing a text.

[–] Sidyctism2@discuss.tchncs.de 1 points 3 days ago (2 children)

Well, how would you verify whether a thesis was written by AI? Mind that accusations are a serious matter, so "I guess it sorta looks like AI" or a percent number spat out by some unreliable LLM-detection AI isn't going to cut it.
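The base-rate math backs this up: even a detector with a seemingly modest false-positive rate ends up wrongly flagging a lot of honest students when most theses are written honestly. A quick Bayes'-rule sketch, with made-up accuracy numbers (real detector error rates vary and are disputed):

```python
def prob_actually_ai(prior_ai, true_positive_rate, false_positive_rate):
    """P(thesis was AI-written | detector flags it), via Bayes' rule."""
    # Total probability of a flag: true hits plus false alarms.
    p_flagged = (prior_ai * true_positive_rate
                 + (1 - prior_ai) * false_positive_rate)
    return prior_ai * true_positive_rate / p_flagged

# Hypothetical: 10% of theses are AI-written, detector catches 90% of them,
# but also falsely flags 5% of honest theses.
p = prob_actually_ai(0.10, 0.90, 0.05)
print(f"{p:.0%} of flagged theses were actually AI-written")  # prints: 67%...
```

With those assumed numbers, roughly one in three flagged students would be falsely accused, which is exactly why a raw detector percentage can't carry a serious accusation.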

[–] 14th_cylon@lemm.ee 4 points 3 days ago (1 children)

well, not sure if it works the same everywhere in the world, but here, you first write the graduation thesis and then you have to publicly defend it.

if the defense committee (or is it an attack committee, since it is the student who is on the defensive? :D) can't ask questions that reveal whether the student actually wrote the paper and understands the topic, then what fucking pseudo-scientific field is that? (and the answer indeed is: economics 😂)

[–] danzabia@infosec.pub 2 points 3 days ago (1 children)

Then the student could just ask the AI to simulate a thesis defense and learn answers to the most likely questions.

The funny thing is, they would actually learn the material this way, through a kind of osmosis. I remember writing cheat sheets in college and finding I didn't need them by the end.

So there are potential use cases, but not if the university doesn't acknowledge it and continues asking for work that can be simply automated.

[–] 14th_cylon@lemm.ee 1 points 1 day ago

Then the student could just ask the AI to simulate a thesis defense and learn answers to the most likely questions.

while my opinion of economics as a field is not very high, i still have a high enough opinion of any teacher to believe they do outperform a shitty AI...

[–] Ledericas@lemm.ee 1 points 3 days ago

From what I've seen in some Reddit posts, they USE AI to accuse the student of writing with AI.

[–] Lodespawn@aussie.zone 1 points 4 days ago (4 children)

Jesus how bad are their student papers that they can't tell whether an AI wrote one?!

[–] 14th_cylon@lemm.ee 2 points 3 days ago (1 children)

more like "how bad is the education"?

[–] Lodespawn@aussie.zone 2 points 3 days ago

Well that's a much better question.

[–] danzabia@infosec.pub 1 points 3 days ago

Students are now prompting the AI to make it sound like a student wrote it, or putting it through an AI detector and changing the parts that are detected as being written by AI (adding typos or weird grammar, say). Even kids who write their own papers have to do the latter sometimes.

[–] mineralfellow@lemmy.world 1 points 3 days ago (1 children)

Perfect grammar and slightly unusual words in a paragraph. Could be a weird formulation from a student’s mind, could be AI. No way to really know.

[–] Lodespawn@aussie.zone 1 points 3 days ago (1 children)

If the peer reviewers are unable to differentiate between student output and AI output, then they are either incompetent or inundated with absolute garbage. The latter also suggests the former is true.

[–] mineralfellow@lemmy.world 2 points 3 days ago

I just finished marking student reports. There are some sections clearly written without AI, some that clearly are written by AI, and then some sections where the ideas are correct, the grammar is perfect, and it is on topic, but it doesn’t seem like it is written in the student’s voice. Could be AI, could be a friend editing, could be plagiarism, could be written long before or after the surrounding paragraphs. It is not always obvious, and the edge cases are the problem.

[–] Ledericas@lemm.ee 1 points 3 days ago (1 children)

the professors are using AI to sniff out AI.

[–] Lodespawn@aussie.zone 1 points 3 days ago* (last edited 3 days ago)

Yeah, given the quality of AI outputs, they could just read the papers to spot it... you know, do their jobs? There are a few layers of review for a thesis: the supervisor, the professor, the other peer reviewers. They are all supposed to review the paper and at least some of the data that led to its production.

[–] chunes@lemmy.world 1 points 3 days ago

There are people who want to do their own thinking and those who don't. The ratio hasn't changed much over time. Only the possibilities.

[–] etchinghillside@reddthat.com 2 points 4 days ago

It’s fine… I’m sure we’ll have UBI anytime now. We’ll have AI working on it.

"Hi, what version of chatgpt did you use in your surgical training? Great!"

"You say the engineering team that designed this suspended walkway just used chatgpt during their training? Sounds good!"

[–] Empricorn@feddit.nl 25 points 4 days ago

I'm going to throw up.

[–] PushButton@lemmy.world 11 points 4 days ago

"You are going to pay me to think."

Nah, thank you, I will keep doing that my way...

[–] Mwa@thelemmy.club 1 points 3 days ago* (last edited 3 days ago)

I kinda don't like ChatGPT because it trains on everything you feed it (same goes for any LLM), and hopefully that won't be forced on us. Right???? (Probably)

[–] scarilog@lemmy.world -2 points 3 days ago* (last edited 3 days ago) (1 children)

I'm not supporting higher education becoming reliant on for-profit companies like this, but AI tutors and the like, if properly implemented, would be kinda awesome. For example, it's usually not feasible to have real-life staff on hand to answer student questions at all hours of the day. Especially in the earlier years of university, where the content is simpler, AI is more than capable of meeting needs like this.

I don't fully agree with most of the people on this thread. I also hate AI slop being forced into what feels like all aspects of our life right now, but LLMs do have some genuine uses.

[–] qbus@lemmy.world 4 points 3 days ago

Yeah man, for-profit companies should be banned in higher ed. Also, unrelated: did you renew the license for your textbook?