So, before you get the wrong impression, I'm 40. Last year I enrolled in a master's program in IT to further my career. It's a special online master's offered by a university near me, geared towards people in full-time employment. Almost everybody is in their 30s or 40s. You actually need to show your employment contract as proof when you apply to the university.
Last semester I took a project management course. We had to find a partner and simulate a project: write a project plan for an IT project, think about what problems could arise and plan how to solve them, describe what roles we'd need on the team, etc. Basically, do all the paperwork of a project without actually doing the project itself. My partner wrote EVERYTHING with ChatGPT. I kept having the same discussion with him over and over: Write the damn thing yourself. Don't trust ChatGPT. In the end we'll need citations anyway, so it's faster to write it yourself and insert the citations as you go than to retroactively figure them out for a chapter ChatGPT wrote. He didn't listen to me and had barely any citations in his part. I wrote my part myself. I got a good grade; he said he got one, too.
This semester turned out to be even more frustrating. I'm taking a database course, SQL and such. There is again a group project: we get access to a database of a fictional company and have to perform certain operations on it. We decided as a group that each member would prepare the code by themselves before we got together, compare our homework, and decide what code to use on the actual database. So far, whenever I checked the other group members' code, it was way better than mine. It incorporated a lot of things the course script hadn't taught us at that point. I felt pretty stupid because they were obviously way ahead of me, until we had a video call. One of the other girls shared her screen and was working in our database. Something didn't work. What did she do? Opened a ChatGPT tab and let the "AI" fix the code. She had also written a short Python script to help fix some errors in the data, and yes, of course that turned out to be written by ChatGPT.
It's so frustrating. To me it's cheating, but a lot of professors see using ChatGPT as using the latest tools at our disposal. I'd genuinely love to learn how to do these things myself, but the majority of my classmates seem to see it differently.
Obviously this is the fuckai community, so you'll get lots of agreement here.
I'm coming in from r/all and don't have the same hate for AI. I'm a professional software dev, and have been for decades.
I'm of two minds here. On the one hand, you absolutely need to know the fundamentals. You must know how the technology works and what to do when things go wrong, or you're useless on the job. On the other hand, I don't demand that the people who work for me write x86 assembly and avoid Stack Overflow; they should use whatever language/mechanism produces the best code in the allotted time. I feel similarly about AI, especially local models that can be used in an idempotent-ish way. It gets a little spooky to rely on companies like Anthropic or OpenAI, because they could just straight up turn off the faucet one day.
Those who use AI to sidestep their own education are doing themselves a disservice, but we can't put our heads in the sand and pretend the technology doesn't exist. It will be used professionally going forward, regardless of anyone's feelings.
I am subscribed to this community and I largely agree with you. Mostly I hate AI slop and the way the human element is becoming an afterthought.
That said, I work for a small company. My boss wanted me to look into AI products for proposal writing. Some of the proposals we do are pretty massive, and we can't afford the overhead of a whole team of proposal writers just for a chance at getting a contract. But a closely monitored AI to help out, especially with the boilerplate stuff? I can see it. If nothing else, it's way easier (and the results may be better) to tweak existing content than to create something entirely from scratch.