• mlg@lemmy.world · 6 days ago

    The modern web is an insult to the idea of efficiency at practically every level.

    You cannot convince me that isolation and sandboxing requires a fat 4Gb slice of RAM for a measly 4 tabs.

    • kalpol@lemmy.ca · 6 days ago

      It is crazy that I can have a core 2 duo with 8 gig of RAM that struggles loading web pages

      • CovfefeKills@lemmy.world · 5 days ago

        Actually it’s bandwidth censorship: if you make something too heavy to be used, then it won’t get used. It is one of the things China is doing to separate their internet from the rest of the world’s, by having a domestic internet so blazingly fast that it is unbearable to go to the world wide web.

        So yes, the Epstein class are making the news too slow for typical users to access. /maybe some sarcasm, maybe not, I’m not sure yet

        EDIT: I have decided I was not being sarcastic. https://ioda.inetintel.cc.gatech.edu/reports/shining-a-light-on-the-slowdown-ioda-to-track-internet-bandwidth-throttling/

        
        Episodes of network throttling have been reported in countries like Russia, Iran, Egypt, and Zimbabwe, and many more, especially during politically sensitive periods such as elections and protests. In some cases, entire regions such as Iran’s Khuzestan province have experienced indiscriminate throttling, regardless of the protocol or specific services in use. Throttling is particularly effective and appealing to authoritarian governments for several reasons: Throttling is simple to implement, difficult to detect or attribute and hard to circumvent.
  • bampop@lemmy.world · 7 days ago

    My PC is 15 times faster than the one I had 10 years ago. It’s the same old PC but I got rid of Windows.

  • GenderNeutralBro@lemmy.sdf.org · 7 days ago

    Everything bad people said about web apps 20+ years ago has proved true.

    It’s like, great, now we have consistent cross-platform software. But it’s all bloated, slow, and only “consistent” with itself (if even that). The world raced to the bottom, and here we are. Everything is bound to lowest-common-denominator tech. Everything has all the disadvantages of client-server architecture even when it all runs (or should run) locally.

    It is completely fucking insane how long I have to wait for lists to populate with data that could already be in memory.

    But at least we’re not stuck with Windows-only admin consoles anymore, so that’s nice.

    All the advances in hardware performance have been used to make it faster (more to the point, “cheaper”) to develop software, not faster to run it.

  • kunaltyagi@programming.dev · 7 days ago

    The same? Try worse. Most devices have seen input latency go up. Most applications have higher post-input latency as well.

    Switching from an old system with old UI to a new system sometimes feels like molasses.

    • Korhaka@sopuli.xyz · 7 days ago

      I work in support for a SaaS product, and every single click on the platform takes a noticeable amount of time. I don’t understand why anyone is paying any amount of money for this product. I have the FOSS equivalent of our software in a test VM and it’s far more responsive.

    • Buddahriffic@lemmy.world · 7 days ago

      Except for KDE. At least compared to Cinnamon, I find KDE much more responsive.

      AI-generated code will make things worse. It’s good at providing solutions that generally give the correct output, but the code it generates tends to be shit by final-product standards.

      Though perhaps performance will improve since at least the AI isn’t limited by only knowing JavaScript.

      • boonhet@sopuli.xyz · 7 days ago

        I still have no idea what it is, but over time my computer, which has KDE on it, gets super slow and I HAVE to restart. Even if I close all applications it’s still slow.

        It’s one reason I’ve been considering upgrading from 6 cores and 32 GB to 16 and 64.

        • rumba@lemmy.zip · 7 days ago

          Upgrade isn’t likely to help. If KDE is struggling on 6@32, something else is going on, and 16@64 is only going to make it last twice as long before choking.

          Wait till it’s slow.

          Check your RAM / CPU in top and the disk in iotop; hammering the disk/CPU (or a bad disk/SSD) can make KDE feel slow.

          plasmashell --replace # this just dumps plasmashell’s widgets/panels

          See if you got a lot of RAM/CPU back or it’s running well; if so, it might be a bad widget or panel.

          if it’s still slow,

          kwin_x11 --replace

          or

          kwin_wayland --replace &

          This dumps everything and refreshes the graphics driver/compositor/window manager

          If that makes it better, you’re likely looking at a graphics driver issue

          I’ve seen some stuff where going to sleep and coming out degrades perf
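          In case it helps, the memory-side checks above boil down to a couple of one-liners (standard Linux tools, nothing KDE-specific; `top`/`iotop` are interactive so this is the one-shot version):

          ```shell
          # Is memory actually tight? MemAvailable counts reclaimable
          # buffers/cache, so it's the honest number to watch (unlike MemFree).
          awk '/MemAvailable/ {printf "available: %.1f GiB\n", $2/1048576}' /proc/meminfo

          # Biggest resident-memory processes (same data top shows, one-shot):
          ps -eo rss,comm --sort=-rss | head -n 6
          ```

          If memory looks fine here, the `--replace` steps above are the next thing to try.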

          • Passerby6497@lemmy.world · 6 days ago

            I’ve seen some stuff where going to sleep and coming out degrades perf

            I’ll have to try some of these suggestions myself, as I’ve been dealing with my UI locking up if the monitors turn off and I wake it up too soon. Sometimes I still have ssh access to it, so thanks for the shell commands!

            • rumba@lemmy.zip · 6 days ago

              I was doing horrible things the other day and ended up with my KDE login page not working when I came out of sleep.

              CTRL+ALT+F2 > text login > loginctl unlock-sessions

              • Passerby6497@lemmy.world · 6 days ago

                I’m aware of the TUI logins (I think f7 is your graphical, but I might be wrong) and sometimes those work too. I’ve started just sshing in because the terminal switching was hit and miss.

                But thanks for that loginctl command, I’ll have to give that one a try as well!

          • boonhet@sopuli.xyz · 7 days ago

            Hmm, I haven’t noticed high CPU usage, but usually it only leaves me around 500MB actually free RAM, basically the entire rest of it is either in use or cache (often about 15 gigs for cache). Turning on the 64 gig swapfile usually still leaves me with close to no free RAM.

            I’ll see if it’s slow already when I get home, I restarted yesterday. Then I’ll try the tricks you suggested. For all I know maybe it’s not even KDE itself.

            Root and home are on separate NVMe drives and there’s a SATA SSD for misc non-system stuff.

            GPU is nvidia 3060ti with latest proprietary drivers.

            The PC does not sleep at all.

            To be fair I also want to upgrade to speed up Rust compilation when working on side projects and because I often have to store 40-50 gigs in tmpfs and would prefer it to be entirely in RAM so it’s faster to both write and read.

            • rumba@lemmy.zip · 7 days ago

              Don’t let me stop you from upgrading, that’s got loads of upsides. Just suspecting you still have something else to fix before you’ll really get to use it :)

              It CAN be ok to have very low free RAM if it’s used up by buffers/cache, since that’s freeable. If buff/cache gets below about 3GB on most systems, you’ll start to struggle.

              If you have 16GB, it’s running low, and you can’t account for it in top, you have something leaking somewhere.
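              The buff/cache distinction shows up directly in `free`’s output (column names from procps; the ~3GB figure is a rule of thumb, not a hard limit):

              ```shell
              # "free" = truly idle; "buff/cache" = reclaimable; "available" =
              # roughly free + reclaimable, i.e. what programs can actually get.
              free -m | awk 'NR==2 {printf "used=%sM free=%sM buff/cache=%sM available=%sM\n", $3, $4, $6, $7}'
              ```

              If “available” is healthy and the desktop still crawls, the bottleneck is probably elsewhere, which is what the `--replace` tricks probe.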

              • boonhet@sopuli.xyz · 7 days ago

                Lol I sorted top by memory usage and realized I’m using 12 gigs on an LLM I was playing around with to get local code completion in my JetBrains IDE. It didn’t work all that well anyway and I forgot to disable it.

                I did have similar issues before this too, but I imagine blowing 12 gigs on an LLM must’ve exacerbated things. I’m wondering how long I can go now before I’m starting to run out of memory again. Though I was still sitting at 7 gigs buffer/cache and it hadn’t slowed down yet.

                • rumba@lemmy.zip · 7 days ago

                  12/16, That’ll do it. Hopefully that’s all, good luck out there and happy KDE’ing

        • arendjr@programming.dev · 7 days ago

          Have you tried disabling the file indexing service? I think it’s called Baloo?

          Usually it doesn’t have too much overhead, but in combination with certain workflows it could be a bottleneck.

        • dr_robotBones@reddthat.com · 7 days ago

          Have you gone through settings and disabled unnecessary effects, indexing and such? With default settings it can get quite slow but with some small changes it becomes very snappy.

          • AdrianTheFrog@lemmy.world · 6 days ago

            I have a 2 core, 2 thread, 4gb RAM 3855u Chromebook that I installed Plasma on, and it’s usually pretty responsive.

          • boonhet@sopuli.xyz · 7 days ago

            I have not, but also it’s not slow immediately, it takes time under use to get slow. Fresh boot is quite fast. And then once it’s slow, even if I close my IDE, browsers and everything, it remains slow, even if CPU usage is really low and there’s theoretically plenty of memory that could be freed easily.

  • AeonFelis@lemmy.world · 7 days ago

    Thought leaders spent the last couple of decades propagandizing that features-per-week is the only metric to optimize, and that any bit of efficiency or quality in your software is a clear indicator of a lost opportunity to sacrifice it on the altar of code churning.

    The result is not “amazing”. I’d be more amazed had it turned out differently.

    • SanicHegehog@lemmy.world · 6 days ago

      Fucking “features”. Can’t software just be finished? I bought App. App does exactly what I need it to do. Leave. It. Alone.

      • Yaky@slrpnk.net · 6 days ago

        No, never! Tech corps (both devs and app stores) brainwashed people into thinking “no updates = bad”.

        Recently, I have seen people complain about lack of updates for: OS for a handheld emulation device (not the emulator, the OS, which does not have any glaring issues), and Gemini protocol browser (gemini protocol is simple and has not changed since 2019 or so).

        Maybe these people don’t use the calculator app because arithmetic was not updated in a few thousand years.

    • ChickenLadyLovesLife@lemmy.world · 7 days ago

      It’s kind of funny how eagerly we programmers criticize “premature optimization”, when often optimization is not premature at all but truly necessary. A related problem is that programmers often have top-of-the-line gear, so code that works acceptably well on their equipment is hideously slow when running on normal people’s machines. When I was managing my team, I would encourage people to develop on out-of-date devices (or at least test their code out on them once in a while).

      • AeonFelis@lemmy.world · 6 days ago

        It’s kind of funny how eagerly we programmers criticize “premature optimization”, when often optimization is not premature at all but truly necessary.

        I will forever be salty about that one time I was accused of premature optimization for pushing to optimize code that was allocating memory faster than the GC could free it, which was causing one of the production servers to keep getting OOM crashes.

        If urgent emails from one of the big clients, who put the entire company into emergency mode during a holiday, are still considered “premature”, then no optimization is ever going to be mature.

      • AnyOldName3@lemmy.world · 6 days ago

        Premature optimisation often makes things slower rather than faster. E.g. if something’s written to have the theoretical optimal Big O complexity class, that might only break even around a million elements, and be significantly slower for a hundred elements where everything fits in L1 and the simplest implementation possible is fine. If you don’t know the kind of situations the implementation will be used in yet, you can’t know whether the optimisation is really an optimisation. If it’s only used a few times on a few elements, then it doesn’t matter either way, but if it’s used loads but only ever on a small dataset, it can make things much worse.

        Also, it’s common that the things that end up being slow in software are things the developer didn’t expect to be slow (otherwise they’d have been careful to avoid them). Premature optimisation will only ever affect the things a developer expects to be slow.

      • G_M0N3Y_2503@lemmy.zip · 7 days ago

        Optimisation often has a cost, whether it’s code complexity, maintenance or even just salary. So it has to be worth it, and there are many areas where it isn’t, unfortunately.

          • G_M0N3Y_2503@lemmy.zip · 6 days ago

            How is that mindset lazy? Unhappy customers also have a cost! At my last job the customer just always bought hardware specifically for the software as a matter of process, partly because the price of the hardware compared to the price of the software was negligible. You literally couldn’t make a customer care.

            • prime_number_314159@lemmy.world · 6 days ago

              In industrial software, I’m sure performance is a pretty stark line between “good enough” and “costing us money”.

              The pattern I’ve seen in customer facing software is a software backend will depend on some external service (e.g. postgres), then blame any slowness (and even stability issues…) on that other service. Each time I’ve been able to dig into a case like this, the developer has been lazy, not understanding how the external service works, or how to use it efficiently. For example, a coworker told me our postgres system was overloaded, because his select queries were taking too long, and he had already created indexes. When I examined his query, it wasn’t able to use any of the indexes he created, and it was querying without appropriate statistics, so it always did a full table scan. All but 2 of the indexes he made were unused, so I deleted those, then added a suitable extended statistics object, and an index his query could use. That made the query run thousands of times faster, sped up writes, and saved disk space.

              Most of the optimization I see is in algorithms, and most of the slowness I see is fundamentally misunderstanding what a program does and/or how a computer works.

              Slowness makes customers unhappy too, but with no solid line between “I have what I want” and “this product is inadequate”.

            • Passerby6497@lemmy.world · 6 days ago

              How is that mindset lazy?

              Are you really asking how it’s lazy to pass unoptimized code to a customer and make their hardware do all the work for you because optimization was too costly?? Like I get that you are in an Enterprise space, but this mentality is very prevalent and is why computers from today don’t feel that much faster software wise than they did 10 years ago. The faster hardware gets, the lazier devs can be because why optimize when they’ve got all those cycles and RAM available?

              And this isn’t a dig at you; that’s software development in general, and I don’t see it getting any better.

              • racemaniac@lemmy.dbzer0.com · 6 days ago

                It’s not just software development, it’s everywhere. Devices are cheap, people are expensive. So it’s not lazy, he’s being asked to put his expensive time into efforts the customer actually wants to pay for. If having him optimize the code further costs way more than buying a better computer, it doesn’t make sense economically for him to waste his time on that.

                Is that yet another example of how the economy has strange incentives? For sure, but that doesn’t make him lazy.

                • Passerby6497@lemmy.world · 6 days ago

                  I never called them lazy, I stated that the mentality is lazy, which it is. Whether or not that laziness is profit driven, it still comes down to not wanting to put forth the effort to make a product that runs better.

                  Systemic laziness as profit generation is still laziness. We’re just excusing it with cost and shit, and if everyone is lazy, then no one is.

                  If cost is a justification for this kind of laziness, it also justifies slop code development. After all, it’s cheaper that way, right?

  • oyo@lemmy.zip · 7 days ago

    Windows 11 is the slowest Windows I’ve ever used, by far. Why do I have to wait 15-45 seconds to see my folders when I open explorer? If you have a slow or intermittent Internet connection it’s literally unusable.

    • da_cow (she/her)@feddit.org · 7 days ago

      Even Windows 10 is literally unusable for me. When pressing the windows key it literally takes about 4 seconds until the search pops up, just for it to be literally garbage.

      • UnderpantsWeevil@lemmy.world · 7 days ago

        Found out about this while watching “Halt and Catch Fire” (AMC’s effort to recreate the magic of Mad Men, but on the computer).

        Doherty Threshold

        In 1982 Walter J. Doherty and Ahrvind J. Thadani published, in the IBM Systems Journal, a research paper that set the requirement for computer response time at 400 milliseconds, not 2,000 (2 seconds), which had been the previous standard. When a human being’s command was executed and returned an answer in under 400 milliseconds, it was deemed to exceed the Doherty threshold, and use of such applications was deemed to be “addicting” to users.

      • WhyJiffie@sh.itjust.works · 7 days ago

        if it only occurs hours or days after boot, try killing the startmenuexperiencehost process. that’s what I was doing until I switched to linux

        • da_cow (she/her)@feddit.org · 7 days ago

          I am using Windows like once a week at maximum, and then it only takes about 10 minutes. So I kind of do not really care, and am glad that I do not need to use it more often.

    • Echo Dot@feddit.uk · 7 days ago

      It takes forever to boot, I know that, and that’s with fast boot enabled, which is extra pathetic.

        • HugeNerd@lemmy.ca · 7 days ago

          I’ve given up trying to understand modern PC software. I can barely keep up with the little microcontrollers I work with. They aren’t so little.

  • Michal@programming.dev · 6 days ago

    PCs aren’t faster, they have more cores, so they can do more at a time, but it takes effort to optimize for parallel work. Also the form factor keeps getting smaller, more people use laptops now and you can’t cheat thermal efficiency.

    • leftzero@lemmy.dbzer0.com · 6 days ago

      My first PC ran at 16MHz on turbo.

      PCs today are orders of magnitude faster. Way less fun, but faster.

      What’s even more orders of magnitude slower and infinitely more bloated is software. Which is the point of the post.

      It’s almost impossible to find any piece of actually optimised software these days (with some exceptions like sqlite), to the point that 99% of the software currently in use can be considered unintentional (or intentional) malware.

      Particularly egregious are web browsers, which seem designed to waste the maximum possible amount of resources and run as inefficiently as possible.

      And the fact that most supposedly desktop software these days runs on top of one of those pieces of intentional (it’s impossible to achieve such levels of inefficiency and bloat unintentionally, it requires active effort) malware obviously doesn’t help.

        • Blue_Morpho@lemmy.world · 4 days ago

          Only on some name-brand PCs, which used it for compatibility. On home-built or local-store machines, the turbo button would overclock. I remember telling a friend that although their 16MHz could run at 20, not to do it because it would compromise longevity! Ha! Mind you, the CPUs in those days didn’t have heat sinks, but still. Oh no, your 386 might not work in 20 years from running too hot!

      • Auli@lemmy.ca · 6 days ago

        Browsers are not the same as they were. They are basically operating systems in themselves now.

    • EddoWagt@feddit.nl · 6 days ago

      What do you mean PCs aren’t faster? Yes, they have more cores, but they also clock higher (mostly) and do more instructions per clock. Computers now perform way better than ever before in every single metric; most tasks, even linear ones, could be way faster.

    • ragas@lemmy.ml · 5 days ago

      I came from C and C++ and had learned that parallelism is hard. Then I tried parallelism on Rust in a project of mine and it was so insanely easy.

    • CovfefeKills@lemmy.world · 6 days ago

      It’s all about memory latency and bandwidth now, which have improved greatly; PCs are still getting faster. There’s a new RAM standard being pushed right now, CAMM2, and it’s really exciting: it pushes back the need for soldered memory.

      • Kairos@lemmy.today · 6 days ago

        The faster single core out of order execution performance on newer x86 CPUs lets it work on that higher bandwidth of data too.

  • DontRedditMyLemmy@lemmy.world · 7 days ago

    I hate that our expectations have been lowered.

    2016: “oh, that app crashed?? Pick a different one!”

    2026: “oh, that app crashed again? They all crash, just start it again and cross your toes.”

    • wabasso@lemmy.ca · 7 days ago

      I’m starting to develop a conspiracy theory that MS is trying to make the desktop experience so terrible that everyone switches to mobile devices, such that they can be more easily spied on.

        • Yaky@slrpnk.net · 6 days ago

          Windows Phone was around in mid-2010s, at least 7 years after iPhone release. But it was not hyped enough: companies did not care to develop apps for it, customers didn’t want a smartphone without X Y Z apps (same argument i see now about mobile linux or even custom ROMs). The phones had nice and fast UI though, and some had very good cameras.

          • ChickenLadyLovesLife@lemmy.world · 6 days ago

            Windows Phone was great. I’d done Windows Mobile since 2005 and it was nice to be able to continue developing with C#/.NET and Visual Studio (back when it was still good) in a more modern OS. One thing that really spoiled me permanently was being able to compile, build and deploy the app I was working on to my test device effectively instantaneously – like, by the time I’d moved my hand over to the device, the app was already up and running. Then I switched to iOS where the same process could take minutes, also Blackberry where it might take half an hour or never happen at all.

            Funny thing: RIM was going around circa 2010/2011 offering companies cash bounties of $10K to $20K to develop apps for Blackberry, since they were dying a rapid death but were still flush with cash. Nobody that I know of took them up on the offers. I tried to get my company to make a Windows Phone version of our software but I was laughed at (and deservedly so).

  • merc@sh.itjust.works · 7 days ago

    You do really feel this when you’re using old hardware.

    I have an iPad that’s maybe a decade old at this point. I’m using it for the exact same things I was a decade ago, except that I can barely use the web browser. I don’t know if it’s the browser or the pages or both, but most web sites are unbearably slow, and some simply don’t work, javascript hangs and some elements simply never load. The device is too old to get OS updates, which means I can’t update some of the apps. But, that’s a good thing because those old apps are still very responsive. The apps I can update are getting slower and slower all the time.

    • ssfckdt@lemmy.blahaj.zone · 7 days ago

      It’s the pages. It’s all the JavaScript. And especially the HTML5 stuff. The amount of code that is executed in a webpage these days is staggering. And JS isn’t exactly a computationally modest language.

      Of the 200 kB loaded on a typical Wikipedia page, about 85 kB of it is JS and CSS.

      Another 45 kB is a single SVG, which in complex cases is a computationally nontrivial image format.
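      If you want to check a page’s weight yourself, curl can report the transfer size (example URL; exact numbers vary by page, and this counts only the document itself, not the scripts, styles and images it pulls in):

      ```shell
      # Total bytes downloaded for one document (follow redirects, discard body):
      curl -sL -o /dev/null -w 'downloaded: %{size_download} bytes\n' \
        https://en.wikipedia.org/wiki/Example
      ```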

      • 87Six@lemmy.zip · 7 days ago

        I don’t agree. It’s both. I’ve opened basic no JS sites on old tablets to test them out and even those pages BARELY load

          • Passerby6497@lemmy.world · 6 days ago

            Probably just the browser itself, considering how bloated they’re getting. It’s not super surprising: given that a browser runs about as fast (on a good day) as it did 5-10 years ago on a new phone, it’s gonna run like dogshit on a phone from that era.

  • ZILtoid1991@lemmy.world · 7 days ago

    They often are worse, because everything needed to be an Electron app, so they could hire cheaper web developers for it, and also so they can boast about “instant cross-platform support” even if they don’t release Linux versions.

    Qt and GTK could do cross platform support, but not data collection, for big data purposes.

    • Echo Dot@feddit.uk · 7 days ago

      I don’t know why Electron has to use up so much memory, though. It seems to use however much RAM is available when it boots; the more RAM a system has, the more Electron seems to think it needs.

      • GamingChairModel@lemmy.world · 7 days ago

        Chromium is basically Tyrone Biggums asking if y’all got any more of that RAM, so bundling that into Electron is gonna lead to the same behavior.

      • Buddahriffic@lemmy.world · 7 days ago

        Inb4 “uNusEd RAm iS wAStEd RaM!”

        No, unused RAM keeps my PC running fast. I remember the days where accidentally hitting the windows key while in a game meant waiting a minute for it to swap the desktop pages in, only to have to swap the game pages back when you immediately click back into it, expecting it to either crash your computer or probably disconnect from whatever server you were connected to. Fuck that shit.

        • boonhet@sopuli.xyz · 7 days ago

          I mean unused RAM is still wasted: You’d want all the things cached in RAM already so they’re ready to go.

          • Buddahriffic@lemmy.world · 7 days ago

            I don’t want my PC wasting resources trying to guess every possible next action I might take. Even I don’t know for sure what games I’ll play tonight.

            • boonhet@sopuli.xyz · 7 days ago

              Well you’d want your OS to cache the start menu in the scenario you highlighted above. The game could also run better if it can cache assets not currently in use instead of waiting for the last moment to load them. Etc.

              • Buddahriffic@lemmy.world
                link
                fedilink
                arrow-up
                1
                ·
                7 days ago

                Yeah, for things that will likely be used, caching is good. I just have a problem with the “memory is free, so find more stuff to cache to fill it” or “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is” attitudes.

                • boonhet@sopuli.xyz
                  link
                  fedilink
                  arrow-up
                  3
                  ·
                  7 days ago

                  “memory is free, so find more stuff to cache to fill it”

                  As long as it’s being used responsibly and freed when necessary, I don’t have a problem with this.

                  “we have gigabytes of RAM so it doesn’t matter how memory-efficient any program I write is”

                  On anything running on the end user’s hardware, this I DO have a problem with.

                  I have no problem with a simple backend REST API being built on Spring Boot and requiring a damn gigabyte just to provide a /status endpoint or whatever, because it usually runs on one or a few machines controlled by the company developing it.

                  When a simple desktop application uses over a gigabyte because of shitty UI frameworks, I start having a problem with it, because that’s a gigabyte used by every single end user, and end users are more numerous than servers AND they expect their devices to do multiple things rather than running just one application.
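                  The “cache aggressively but give it back when needed” idea boils down to bounding the cache and evicting old entries. A toy LRU sketch (the class and names here are made up for illustration, not any framework’s API):

                  ```javascript
                  // Sketch: a size-capped cache that evicts its least recently used
                  // entry, i.e. "use spare RAM, but give it back when it's needed".
                  class BoundedCache {
                    constructor(maxEntries = 4) {
                      this.maxEntries = maxEntries;
                      this.map = new Map(); // a Map iterates in insertion order
                    }
                    put(key, value) {
                      this.map.delete(key);   // re-inserting moves key to newest position
                      this.map.set(key, value);
                      while (this.map.size > this.maxEntries) {
                        // evict the oldest (first-inserted) entry
                        this.map.delete(this.map.keys().next().value);
                      }
                    }
                    get(key) {
                      if (!this.map.has(key)) return undefined;
                      const value = this.map.get(key);
                      this.put(key, value);   // refresh recency on access
                      return value;
                    }
                  }
                  ```

                  Real page caches and asset caches are fancier, but the principle is the same: the ceiling and the eviction policy are what separate “responsible caching” from a memory hog.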

          • Echo Dot@feddit.uk
            link
            fedilink
            arrow-up
            1
            ·
            7 days ago

            I mean, I have access to a computer with a terabyte of RAM. I’m gonna go ahead and say that most applications aren’t going to need that much, and if they use that much I’m gonna be cross.

            • boonhet@sopuli.xyz
              link
              fedilink
              arrow-up
              2
              ·
              7 days ago

              Wellll

              If you have a terabyte of RAM sitting around doing literally nothing, it’s kinda being wasted. If you’re actually using it for whatever application can make good use of it, which I’m assuming is some heavy-duty scientific computation or running full size AI models or something, then it’s no longer being wasted.

              And yes if your calculator uses the entire terabyte, that’s also memory being wasted obviously.

              • Echo Dot@feddit.uk
                link
                fedilink
                arrow-up
                1
                ·
                6 days ago

                That’s a different definition of wasted though. The RAM isn’t lost just because it isn’t being currently utilised. It’s sitting there waiting for me to open an intensive task.

                What I am objecting to is programs using more RAM than they need simply because it’s currently available. AKA Chromium.

    • boonhet@sopuli.xyz
      link
      fedilink
      arrow-up
      5
      ·
      7 days ago

      There’s no difference whatsoever between Qt or GTK and Electron for data collection. You can add networking to your application in any of those frameworks.

  • ssfckdt@lemmy.blahaj.zone
    cake
    link
    fedilink
    arrow-up
    18
    ·
    7 days ago

    The program expands so as to fill the resources available for its execution

    – C.N. Parkinson (if he were alive today)

  • kamen@lemmy.world
    link
    fedilink
    English
    arrow-up
    17
    arrow-down
    1
    ·
    7 days ago

    I paid for the whole amount of RAM, I’m gonna use the whole amount of RAM.

    /s

    Joke aside, the computer I used a little more than a decade ago used to take 1 minute just to display a single raw photo. I’m a liiiittle better off now.

      • kamen@lemmy.world
        link
        fedilink
        arrow-up
        1
        ·
        edit-2
        6 days ago

        It was a Socket 754 Sempron at a time when people were already running Core 2 Duos and Quads.

        • sip@programming.dev
          link
          fedilink
          arrow-up
          1
          ·
          6 days ago

          sorry for making you feel old…er.

          i7 4th gen / Haswell was 13 years ago. I still use it.

          that Sempron is probably from more than 17 years ago.

          I had an Athlon XP 2000+, single core. OC’d to 2666 MHz with proper thermals.

          • kamen@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            6 days ago

            I got that PC in high school and had to run it a bit afterwards because I didn’t have the money for a new one. When eventually I got around to replacing it, I got an X99/Haswell-E system and it was a night and day difference.

    • brucethemoose@lemmy.world
      link
      fedilink
      arrow-up
      2
      ·
      6 days ago

      used to take 1 minute just to display a single raw photo

      See, that’s a great example!

      RAW processing (at least in that context) hasn’t really changed in 10 years. It’s probably the same code doing all the heavy lifting.

      But most software doesn’t have that benefit.

  • Valmond@lemmy.dbzer0.com
    link
    fedilink
    arrow-up
    19
    ·
    7 days ago

    Had to install (an old one mind you, 2019) Visual Studio on Windows…

    First, it’s like 30 GB, what the hell?? It’s an advanced text editor with a compiler and some …

    Crashed a little less than I remember 🥴😁

      • Valmond@lemmy.dbzer0.com
        link
        fedilink
        arrow-up
        3
        ·
        7 days ago

        Visual Code is another project; Visual Studio is indeed an IDE, but it integrates it all. VS Code is also an integrated development environment. I don’t really know what more to say.

        • The Stoned Hacker@lemmy.world
          link
          fedilink
          arrow-up
          7
          ·
          7 days ago

          VS Code is considered a highly extensible text editor that can be used as an IDE, especially for web-based tools, but it isn’t an IDE. It’s more comparable to Neovim or Emacs than to IntelliJ in terms of the role it’s supposed to fill. Technically. VS Code definitely is used more as an IDE by most people, and those people are weak imo. I’m not one to shill for companies (i promise this isn’t astroturf) but if you need to write code, JetBrains probably has the best IDE for that language. Not always true, but more often than not it is imo.

          • Valmond@lemmy.dbzer0.com
            link
            fedilink
            arrow-up
            2
            ·
            6 days ago

            Ooh, a flame war 🔥🔥🔥 ! It has been so long since I was involved in one, thank you 🙋🏻‍♀️! 😊

            Who uses VS Code for something other than writing and launching code? I only use it for C#/Godot on Linux, but it has all the bells and whistles to make it an IDE IMO (BTW anyone who doesn’t code in C/C++ is weak ofc ☺️! 🔥).

            Let me just add that JetBrains (at least PyCharm) has started their enshittification death cycle, and I’m looking for a lightweight Python IDE that doesn’t hallucinate (but lets you use venvs and debug); if you have any ideas I’m listening!

            Cheers

            • The Stoned Hacker@lemmy.world
              link
              fedilink
              arrow-up
              2
              ·
              6 days ago

              I wanna clarify that when I say VS Code I’m talking about Visual Studio Code. I was only commenting on the difference between Visual Studio and Visual Studio Code because you said you downloaded Visual Studio and were confused why a text editor was 30 GB, and it’s possible you downloaded the IDE rather than the text editor. I apologize if you thought I was talking about Visual Code; I wasn’t.

              And I agree that JetBrains has started to enshittify, but I also think their enshittification has been pretty slow, because they sell professional tools that still have to perform the basic functionality of an IDE. And for the most part I’ve been able to disable all AI features save the ones I’m required to use at work (yay AI usage metrics ;-;).