So, before you get the wrong impression: I’m 40. Last year I enrolled in a master’s program in IT to further my career. It’s a special online master’s offered by a university near me, geared towards people in full-time employment. Almost everybody is in their 30s or 40s. You actually need to show your employment contract as proof when you apply to the university.

Last semester I took a project management course. We had to find a partner and simulate a project: basically write a project plan for an IT project, think about what problems could arise and plan how to solve them, describe what roles we’d need for the team, etc. In short, do all the paperwork of a project without actually doing the project itself. My partner wrote EVERYTHING with ChatGPT. I kept having the same discussion with him over and over: write the damn thing yourself. Don’t trust ChatGPT. In the end we’ll need citations anyway, so it’s faster to write it yourself and insert the citations as you go than to retroactively figure them out for a chapter ChatGPT wrote. He didn’t listen to me and had barely any citations in his part. I wrote my part myself. I got a good grade; he said he got one, too.

This semester has turned out to be even more frustrating. I’m taking a database course, SQL and such, and there is again a group project. We get access to a database of a fictional company and have to run certain operations on it. We decided as a group that each member would prepare the code on their own before we get together, compare our homework, and decide what code to use on the actual database. So far, whenever I checked the other group members’ code, it was way better than mine. It used a lot of things the course script hadn’t covered at that point. I felt pretty stupid because they were obviously way ahead of me. Then we had a video call. One of the other girls shared her screen and was working in our database. Something didn’t work. What did she do? Opened a ChatGPT tab and let the “AI” fix the code. She had also written a short Python script to help fix some errors in the data, and yes, of course that turned out to be written by ChatGPT too.
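(Just to show the level we’re at: a data fix like hers is only a few lines of Python against the database. The sketch below is purely illustrative, with a made-up SQLite file and made-up table and column names, not her actual script or the course database.)

```python
# Illustrative sketch only: the kind of small data-fix script mentioned above.
# "company.db", "customers", "name" and "email" are made-up names.
import sqlite3

conn = sqlite3.connect("company.db")  # hypothetical local copy of the course DB
cur = conn.cursor()

# Clean up stray whitespace in customer names and normalise empty
# email strings to NULL so later queries don't trip over them.
cur.execute("UPDATE customers SET name = TRIM(name) WHERE name <> TRIM(name)")
cur.execute("UPDATE customers SET email = NULL WHERE TRIM(email) = ''")

conn.commit()
conn.close()
```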

It’s so frustrating. To me it’s cheating, but a lot of professors see using ChatGPT as simply using the latest tools at our disposal. I would love to honestly learn how to do these things myself, but the majority of my classmates seem to see it differently.

  • biggerbogboy@sh.itjust.works · 19 hours ago

    Personally, I can’t lie: I use ChatGPT a lot, but I don’t offload much of my thinking to it; I really just discuss random things with it. I used to use it far more often two years ago, though. I had it write entire essays for me; virtually all my geography, history, and English SACs (assessments) were AI-made, with some tweaks to get past the detectors, which hardly worked.

    What really puzzles me is how fellow students somehow got to year 12 on ChatGPT alone and still use it as if it guarantees they’ll pass everything. Just last week I was surrounded by people using ChatGPT to write their English speeches, while the friend next to me and I didn’t use AI. Those students were talking among themselves about which AI detectors are most accurate, as if the free ones are better than the expensive, paid software the teachers are using. They’re the least likely to pass, since they get consistently low scores, then complain about those scores without changing anything, without studying a single second.

    Students around me are willingly digging themselves into a hole, then getting pissed off about not getting high study scores and ATARs (basically our metrics for value in the workforce). If you wanna score high, or even just keep your memory sharp, it’s pretty damn obvious that you NEED to put in the effort.