• MagicShel@lemmy.zip · 14 hours ago

    You’re not just making sure you write the letters correctly, you’re also following the syntax rules of the language you’re writing. And while you’re writing, you’re reinforcing those rules in your head.

    I get where you’re coming from, but I’ve worked with a lot of bad developers who never got the hang of this even at the mid level. On the other hand, I understand the utility of knowing how to do these things for ourselves. There are a number of “black-box” libraries that were just an absolute mystery to me until I tried implementing them myself and began to see that these libraries are usually not complex so much as they are thorough in covering edge cases that 90% of users will never care about.
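
    For a concrete (and entirely made-up) flavor of what I mean: the core of something like a dict deep-merge is a few lines, and a “real” library version is mostly edge-case handling layered on top. Names here are invented for illustration, not taken from any particular library:

    ```python
    # Hypothetical sketch: the "simple" part of a deep-merge is tiny;
    # the thorough part is deciding what to do in all the weird cases.
    def deep_merge(base: dict, override: dict) -> dict:
        """Recursively merge `override` into a copy of `base`."""
        result = dict(base)
        for key, value in override.items():
            if isinstance(value, dict) and isinstance(result.get(key), dict):
                # Both sides are dicts: recurse instead of clobbering.
                result[key] = deep_merge(result[key], value)
            else:
                # This is where a thorough library earns its keep:
                # dict-vs-list conflicts, None meaning "delete the key",
                # shared mutable values... the naive version just
                # overwrites and hopes.
                result[key] = value
        return result

    print(deep_merge({"a": {"x": 1}}, {"a": {"y": 2}, "b": 3}))
    # {'a': {'x': 1, 'y': 2}, 'b': 3}
    ```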

    It would definitely be a shame if these tools caused new developers to bypass fundamental skill development. My only hesitation is the number of developers who should’ve developed those skills and never did, even before AI. There’s something wrong either with how developers are learning or with who is getting into development.

    I spent a couple weeks trying to use CoPilot and at the end I still had to correct its shitty code, which either hallucinated features I wasn’t implementing, or hallucinated syntax rules I wasn’t using.

    We are using CoPilot. As a code-completion engine it is handy. I’m much more skeptical about the new code it writes. Like you, I have not had good experiences with that.

    Also, I’ve never heard of anyone paying $20 a month for the privilege of not writing in cursive, or being unable to write because they don’t have internet. Something to think about.

    You’re right. Tool access is certainly something to think about. I have more nuanced thoughts, but I don’t want to disagree just to disagree, you know?

    • groucho · 13 hours ago

      On the other hand, I understand the utility of knowing how to do these things for ourselves. There are a number of “black-box” libraries that were just an absolute mystery to me until I tried implementing them myself and began to see that these libraries are usually not complex so much as they are thorough in covering edge cases that 90% of users will never care about.

      Yeah, that’s one of my big fears. Not necessarily losing my job to an AI, but AI exacerbating existing bad practices.

      When I started my current job, we had one rock star coder responsible for a fairly fiddly piece of our product. He went heads-down for two weeks and churned out pages of densely written Python without comments. It did what it was supposed to do, flawlessly. He left the team shortly afterward to work on a bigger project, and we got word from the higher-ups that we had to support a new feature upstream in that code. And then another. And so on. Nothing’s commented. Everything’s over-optimized. We eventually ended up just cross-compiling the upstream logic and using that in our stack because it was easier than using his impenetrable stuff.

      In the end, we had to fix it with menial, boring, aggravating manual work anyway. We got ourselves into that situation without AI, but I could see something like that becoming more prevalent. And that was working code. Imagine getting a SEV, and everyone on the blame list shrugs and says “idk, I had CoPilot do it.”
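
      For a made-up miniature of the kind of dense-but-working code I mean (nothing like the real thing, and trivially small): both of these do the same job, but only one of them is maintainable by the next person.

      ```python
      # Dense and correct, in the spirit of what we inherited.
      def active_ids(events):
          return {e["id"] for e in events
                  if e.get("ts", 0) > max((x.get("reset", 0) for x in events), default=0) - 3600}

      # Same logic, spelled out and commented.
      def active_ids_readable(events, window_seconds=3600):
          """Ids of events newer than one window before the latest reset."""
          latest_reset = max((e.get("reset", 0) for e in events), default=0)
          cutoff = latest_reset - window_seconds
          return {e["id"] for e in events if e.get("ts", 0) > cutoff}
      ```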

      It would definitely be a shame if these tools caused new developers to bypass fundamental skill development. My only hesitation is the number of developers who should’ve developed those skills and never did, even before AI. There’s something wrong either with how developers are learning or with who is getting into development.

      Yeah, this is part of it. There’s the science of programming, and then there’s what I’d call, for lack of a better term, the craft: writing maintainable code, handling a SEV, thinking in terms of uptime, setting things up to be reverted easily, shutting down neurotic code reviewers, testing your code… stuff like that. Universities are good at the science part. Internships, theoretically, handle the rest. This isn’t an AI issue, but I could see AI making this problem a lot worse.
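
      As one small, hypothetical example of the “set things up to be reverted easily” habit (names made up, not from any real codebase): ship the risky path behind a flag, so turning it off is a config change rather than a rollback.

      ```python
      import os

      # Hypothetical sketch: the new code path ships behind a flag, so
      # reverting it is a config change instead of a redeploy.
      USE_NEW_PRICING = os.environ.get("USE_NEW_PRICING", "false").lower() == "true"

      def legacy_pricing(item: dict) -> float:
          # Known-good path stays intact and reachable.
          return item["base"]

      def new_pricing(item: dict) -> float:
          # Risky new behavior, easy to switch off.
          return item["base"] * (1 - item.get("discount", 0))

      def quote_price(item: dict) -> float:
          return new_pricing(item) if USE_NEW_PRICING else legacy_pricing(item)

      print(quote_price({"base": 100.0, "discount": 0.1}))  # 100.0 unless the flag is on
      ```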