• Echo Dot@feddit.uk · 1 day ago

    If AI takes my job it’s already taken everyone else’s job. So no, I guess.

    Seriously though, if it gets to that point (and I don’t think it will; I think AI is overhyped), governments are going to have to either put up or get lynched.

    • pinball_wizard@lemmy.zip · 2 days ago

      This, exactly!

      I’ll accept a substantial ownership stake in the whole place, in exchange for making a leisurely attempt to pull their asses out of the fire.

  • philpo@feddit.org · 3 days ago

    Depends.

    For my main job, it would be interesting. I mainly plan how organisations should handle disasters. Not necessarily IT disasters but actual ones: what happens if your hospital is on fire, your airline has hundreds of people stranded somewhere (yeah, we had a bad time recently), your municipal water supply goes bad, the Russians actually come, etc.

    If AI can do that at a level that replaces my staff and me… well… good for everyone else, because right now it’s an underdeveloped and rarely examined issue.

    In my side job I still work in my original trade as a critical care paramedic. It will take a long time until AI can fully replace one of us there (though we’re seeing a lot of genuinely beneficial developments that make the job far easier and more capable), and I’ll very likely be retired by then. What is far more likely is that societies won’t be able to pay for proper healthcare anymore… and that would technically be “not replaced”, I guess.

  • JustTesting@lemmy.hogru.ch · edited · 3 days ago

    In a way, LLMs have already taken my job as a software engineer. It’s not that they can do my job better than me. But they suck all the joy out of the field, they expose the almost religious culture around efficiency and velocity (no, I don’t want to be 5% faster to make the boss richer and feel miserable doing it) and how little my peers care about craft and quality. Also, why do those fucks have to lap up every new technofascist-oligarchy thing with such enthusiasm? It pisses me off.

    So it’s not that they do my job, but that they showed me how much I disdain this field now.

    I’m thinking of switching to something (CNC) machining/CAD related, both skills I taught myself and love, but I don’t know what kind of position could be suitable given my dev knowledge and lack of formal training. Plus I wouldn’t want to do the operator kind of work where all you do is put stock in the CNC machine and execute someone else’s CAM; that seems too close to using LLMs in spirit. I do want and like creative work and tinkering.

    [edit]

    And I’m lucky enough to work for a university, so good work-life balance, job security, pension, mostly meaningful work. Ironically enough, in the AI field… But that makes considering a switch even harder. I could easily and comfortably coast along and feel discontent for many years to come.

    • blackbirdbiryani@lemmy.world · 3 days ago

      Agreed. I love programming and taking my time to think through problems before I code, but my God, these casual AI users at work lap this shit up and can’t be fucked to spend an extra minute thinking before they write their code.

  • COASTER1921@lemmy.ml · 2 days ago

    I’m not convinced that there’s enough training data for it to be good in my specialization anytime soon. And it certainly won’t be trusted for safety critical applications even if that were to happen.

    That being said, I’m very, very glad not to be in CS or law, where there is nearly endless data to train on.

    • MatSeFi@lemmy.liebeleu.de · 4 days ago

      That’s also what I currently experience in my position. A lot of workflows are going to be automated by some AI stuff, but whenever someone is planning to mass-produce physical goods, you cannot afford stupid mistakes. If you simply feed the output of an LLM into your CNC machine, ion implanter, lithography machine, welding robot, airbag or braking system, a mistake is going to be so fucking expensive that the human in the loop is no longer a cost factor. And when you do high-precision work, an approximate solution is never sufficient.

      • KingGimpicus@sh.itjust.works · 4 days ago

        I always think back to the time, years ago, when Jeep had to issue a recall for like 30k vehicles because a robot missed a fillet weld on a suspension component and QA missed every single one, lmfao

      • pinball_wizard@lemmy.zip · 2 days ago

        Serious question: has anyone seen a janitor robot running anywhere that wasn’t constantly filthy?

        I have only encountered janitor bots fighting in vain in places that have clearly gone decades (or one astonishingly abusive hour) without a successful cleaning.

  • Saprophyte@lemmy.world · 4 days ago

    I work in cybersecurity. My job is in no danger. AI seems to be an expert in things until you start asking it questions about a subject you’re an expert in. Then it all falls apart. Anyone who thinks they’re using AI for cybersecurity or thinks AI can do cybersecurity knows nothing about cybersecurity.

    The only people who would use AI for cybersecurity wouldn’t hire a cybersecurity firm anyway; they’d just ask their friend Bob who “knows computers”, get roughly the same level of expertise, and feel just as happy either way.