• RedstoneValley@sh.itjust.works · 2 months ago

    While I (almost) agree with the conclusion, there is a lot of bullshit and unproven assumption in this blog post. I always cringe at the “AI is democratising software development” argument in particular. It is wrong on so many levels. Software development is not an ivory tower. Everyone with an internet connection has had access to all the resources needed to learn the necessary skills, for free, for decades. Everyone with an interest in actually learning that stuff and putting a bit of effort into it was able to do so.

    What LLMs provide is not democratising anything, but advertising the illusion that everyone can produce software, effortlessly and without any skills whatsoever. Software development is much more than churning out lines of code that seem to work. The vibe-coding approach is like trying to build your own car without the skills, asking an AI to assemble it from individual parts taken from different models, from a Lada to a Ferrari. The end result might be drivable, but it will be neither secure nor efficient nor fast nor stable nor maintainable, etc. A Frankenstein car. Anyone with half a brain would agree that’s not a good idea, yet with LLMs people just pretend it’s fine.

    • Eheran@lemmy.world · 2 months ago

      Everyone could always learn woodworking, weaving, sewing, smithing, … — that is not an argument. The point is that better tools make it easier to learn/perform/perfect these skills. Today anyone with a little torch and a hammer can play around with steel; 300 years ago you had to at least take on an apprenticeship to ever get to do that. Sewing with a sewing machine is so much faster that there is not much time to invest before you can make your own clothes.

      Not everyone has 100s of hours of free time to sink into this or that skill “the purist way”. Any tool that makes the learning curve shallower and/or the process itself easier/cheaper/… helps democratize these things.

      You argue as if everyone needs to be a super duper software architect, while most people just want to create some tool or game or whatever they think of, just for themselves.

      • xthexder@l.sw0.com · 2 months ago

        Not everyone has 100s of hours free time to sink into this and that skill

        That’s life, buddy. Nobody can learn everything, so communities rely on specialists who master their craft. Would you rather your doctor have 100s of hours of study and practice, or be a random person off the street with ChatGPT? If something is worth studying for 100s of hours, then there’s more nuance to the skill than any layman or current AI system can capture in a few-sentence prompt.

        • Eheran@lemmy.world · 2 months ago

          What kind of nonsense comparison is that? Somewhat off topic, borderline straw man.

          People still have their jobs; better tools let them do more things in their free time. Some even switch professions later on, once they have enough experience. Lowering the bar (investment, skill, …) is simply a good thing.

          • xthexder@l.sw0.com · edited · 2 months ago

            I personally have spent those 100s (actually more like 1000s) of hours studying software engineering, and I was doing my best to give an example of how current AI tools are not a replacement for experience. Neither is having access to a sewing machine, or a blowtorch and hammer — you still need to know about knots and thread, metallurgy, and the endless number of techniques for using those tools.
            Software in particular is an extremely theoretical field, similar to medicine (thus my example with a doctor).
            ChatGPT is maybe marginally better than a simple web search when it comes to learning. There is simply no way to compress my decade of experience into a few hours of using an LLM. The usefulness of AI for me starts and ends at fancy auto-complete, and that only slightly speeds up my already fast typing. Getting a good result out of AI for coding requires so much prerequisite knowledge to ask the right questions that a complete novice won’t even know what to ask for without going through those same 100s of hours of study.

            • Eheran@lemmy.world · 2 months ago

              Without ChatGPT I could not have repaired things where I simply threw the datasheet at it and got code to reprogram the device, like a BMS. I could not have digitized data streams by sniffing I2C. I could not have used computer vision to decode a display. I could not have made control and data-logging interfaces for machines, turning decade-old shit into good-as-new just based on their serial interface. Etc. Etc.
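The I2C sniffing mentioned above is the kind of task where a short script really does go a long way. As a minimal sketch (the function name and sample format are illustrative assumptions, not from the comment): given (SCL, SDA) sample pairs captured with a logic analyzer, an I2C decoder detects START (SDA falls while SCL is high) and STOP (SDA rises while SCL is high) conditions and samples one bit per rising clock edge, grouping 8 data bits plus an ACK bit into each byte.

```python
def decode_i2c(samples):
    """Decode I2C traffic from a list of (scl, sda) 0/1 sample pairs.

    Returns a list of messages, each a list of decoded data bytes
    seen between a START and the following STOP condition.
    Assumes the capture is oversampled enough that every clock edge
    appears in the sample stream (illustrative sketch, not production code).
    """
    msgs = []
    started = False
    bits, cur = [], []
    prev_scl, prev_sda = samples[0]
    for scl, sda in samples[1:]:
        if scl and prev_scl:
            # SCL held high: SDA transitions signal START/STOP
            if prev_sda and not sda:      # SDA falling edge -> START
                started, bits, cur = True, [], []
            elif not prev_sda and sda:    # SDA rising edge -> STOP
                if started:
                    msgs.append(cur)      # incomplete trailing bits discarded
                started = False
        elif started and scl and not prev_scl:
            # Rising SCL edge: sample one bit, MSB first
            bits.append(sda)
            if len(bits) == 9:            # 8 data bits + 1 ACK/NACK bit
                byte = 0
                for b in bits[:8]:
                    byte = (byte << 1) | b
                cur.append(byte)
                bits = []
        prev_scl, prev_sda = scl, sda
    return msgs


# Build a synthetic capture: START, one byte 0x53 (+ ACK), STOP
samples = [(1, 1), (1, 0)]                       # idle, then START
for b in [0, 1, 0, 1, 0, 0, 1, 1, 0]:           # 0x53 MSB-first, then ACK=0
    samples += [(0, b), (1, b)]                  # clock low, then high
samples += [(0, 0), (1, 0), (1, 1)]              # STOP: SDA rises on high SCL

print(decode_i2c(samples))                       # -> [[83]] i.e. one byte, 0x53
```

The same skeleton extends naturally to address/read-write decoding or NACK detection; the point is that the bus protocol itself is simple enough that the hard part is knowing the conventions, not writing the loop.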