
The new global study, conducted in partnership with The Upwork Research Institute, interviewed 2,500 C-suite executives, full-time employees and freelancers worldwide. The results show that optimistic expectations about AI's impact are not aligning with the reality faced by many employees, and the study identifies a disconnect between managers' high expectations and the actual experiences of employees using AI.

Despite 96% of C-suite executives expecting AI to boost productivity, the study reveals that 77% of employees using AI say it has added to their workload and created challenges in achieving the expected productivity gains. Not only is AI increasing the workloads of full-time employees, it is also hampering productivity and contributing to employee burnout.

[–] silasmariner@programming.dev 0 points 3 months ago (1 children)

Yeah, but the idea of AI in that kind of workflow is so the product guy can actually do it themselves, without asking you, and in less than 30 mins.

[–] jjjalljs@ttrpg.network 13 points 3 months ago (1 children)

Yeah, but that's like using an entire gasoline-powered car to play a CD.

A competent product guy should be able to learn some simpler tools, like Google Sheets.

[–] silasmariner@programming.dev 1 points 3 months ago (1 children)

No arguments from me that it's better if people are just better at their jobs, and I like to think I'm good at mine too, but let's be real: a lot of people are out of their depth, and I can imagine it can help there. OTOH, is it worth the investment in time (from people who could presumably be doing astonishing things themselves) and in carbon energy? Probably not. I appreciate that the tech exists and needs to, but shoehorning it in everywhere is clearly bollocks.

I just don't know yet how people will find it useful. I guess not everyone gets that spending an hour learning to do something that takes 10s once you know how is often better than spending 5 mins making someone or something else do it for you... And TBF to them, they might be right if they only ever do the thing twice.

[–] balder1991@lemmy.world 6 points 3 months ago (1 children)

I think the actual problem here is that if the product people can't learn such a simple thing by themselves, they also won't be able to correctly prompt the LLM for their use case.

That said, I do think LLMs can boost productivity a lot. I'm learning a new framework, and since there are so many details to learn about it, it's fast to ask ChatGPT the proper way to do X in that framework, etc. Although that only works because I already studied the framework's foundational concepts first.

[–] silasmariner@programming.dev 4 points 3 months ago

I think the actual problem is that they won't know when they've got something that compiles but is wrong... I dunno though; I've never seen someone doing this, so I can only speculate, tbh. I've only ever asked ChatGPT a couple of times, as a joke to myself when I got stuck, and it spouted completely useless nonsense both times... although on one occasion the wrong code it produced looked like it had the pattern of a good idiom behind it, and I stole that.
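
As a minimal, made-up illustration of that "compiles but is wrong" failure mode (the function name and numbers below are hypothetical, not from the thread): the snippet runs without any errors, and only someone who already knows roughly what the answer should be will spot that the output is off.

```python
# Hypothetical example: plausible-looking code that runs cleanly but is wrong.

def growth_percent(old: float, new: float) -> float:
    """Percentage growth from old to new."""
    # Bug: divides by the new value instead of the old one,
    # so growth from 100 to 200 is reported as 50% instead of 100%.
    return (new - old) / new * 100


if __name__ == "__main__":
    print(growth_percent(100, 200))  # prints 50.0; the correct answer is 100.0
```

Nothing in the output flags the mistake; you only catch it if you can sanity-check the result yourself, which is exactly the point above.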