I think my employer saw the Shopify CEO’s mandatory AI memo and got a little overexcited.

As a web developer, I’ve tried Copilot and disliked it immensely. It didn’t save me time because my syntax memory and minimal-keystroke workflow are pretty decent after 20 years of huckin’ HTML and CSS in various frameworks.

I feel like if I could point to studies or interviews from companies that FAFO’d, I’d have a better chance of arguing my point. Does anybody have any in their back pocket they can spare?

Yes, I am very aware of the irony that I could try asking an AI, but avoiding it is kind of the point in this case, isn’t it?

  • Rayquetzalcoatl@lemmy.world · 18 hours ago

    My bosses hauled me into a meeting because I’m “too negative” about AI. I’m also a web developer, like you. Sometimes, when your boss is a clueless freak who has become obsessed with a shiny new toy, there’s no way out. I use it as sparingly and sarcastically as possible, but they still tell me to use it most weeks.

    Sorry I have no advice for you! But I do feel your pain lol

  • KeenFlame@feddit.nu · 13 days ago

    Thing is, it is useful for juniors, except when they fuck up, or when they need to have actually learned something and find they’re empty of any skill. There’s the security issue, the code maintenance issue… You can halve the lifespan of the codebase because it will bloat up fast, and no model follows best practices yet. If you make web apps they will eventually be hackable in several unknown ways where nobody can even find the issue, because nobody wrote the code, so it’s up to a security expert to sift through kilograms of generated, non-human code to find the exploit. A hacker (or even just a normal user) can find a new opening in a fraction of that time. So I assume you’ll want to audit every code commit. That becomes your new profession, and when the output isn’t even in the correct ballpark, you have to prod it like a cowboy. It’s demeaning even haha
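
    For illustration (my own hypothetical sketch, not something a model actually wrote for me), the kind of hole that hides easily in a pile of generated code is a handler that echoes user input straight back into the page:

    ```ts
    // Hypothetical Node/TypeScript handler -- the sort of thing that slips through review.
    import { createServer } from "node:http";

    // Minimal HTML escaping; generated code frequently skips this step.
    function escapeHtml(s: string): string {
      return s
        .replace(/&/g, "&amp;")
        .replace(/</g, "&lt;")
        .replace(/>/g, "&gt;")
        .replace(/"/g, "&quot;");
    }

    createServer((req, res) => {
      const url = new URL(req.url ?? "/", "http://localhost");
      const q = url.searchParams.get("q") ?? "";

      // VULNERABLE version: raw interpolation, so ?q=<script>...</script> runs in the visitor’s browser.
      // const body = `<h1>Results for ${q}</h1>`;

      // Patched version: escape before interpolating.
      const body = `<h1>Results for ${escapeHtml(q)}</h1>`;

      res.writeHead(200, { "Content-Type": "text/html; charset=utf-8" });
      res.end(`<!doctype html><body>${body}</body>`);
    }).listen(3000);
    ```

    Spotting which of those two lines the model actually emitted, across thousands of lines nobody on the team wrote, is the audit job I mean.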

  • wizardbeard@lemmy.dbzer0.com · 13 days ago

    Sorry, I don’t have anything off hand. I know there have been some reports from research groups about how it’s measurably increasing security issues in codebases, but I don’t have any links.

    You might also want to ask in the pinned weekly thread on !techtakes@awful.systems as they seem to be more active than this comm. It’s about making fun of all the “techbro takes” online, but I’ve seen people ask for similar help in the weekly thread and get assistance before.

    Part of the problem is that most companies aren’t publicly sharing failure stories for anything, let alone failures of the latest hype thing.

    You might be able to find some stories on the more “greybeard” oriented tech news sites like The Register as well.

  • friend_of_satan@lemmy.world · 11 days ago

    Alternatively, maybe you should try giving different AI systems your standard interview questions and see how they do? I doubt any of them would pass, unless the questions are simple. Even if they do as well as a human, using them requires a second human, so they still use up the same amount of engineering hours as hiring a new employee.