Windows 11 often requires new hardware, but for a while that hardware will either be extremely pricey or ship with very little RAM.

I don't believe a single competent person works at Micro$oft anymore, but maybe, just maybe, this could push them to make a less shitty OS?

And garbage software like Adobe Creative Cloud too?

They obviously don't care about users, but the pain could become too big to ignore.

  • tomkatt@lemmy.world · ↑1 · 47 minutes ago

    There’s plenty of “unbloated” software available. It’s just not on Windows.

  • shiroininja@lemmy.world · ↑2 · 3 hours ago

    Do people really use that much RAM in normal use? Like, I rarely even fill my 16 GB, even with gaming. I just don't leave 16 tabs open in a browser because that feels really disorganized, and I turn my computer off every night and start fresh every day.

    • pantherina@feddit.org (OP) · ↑5 · 3 hours ago

      Yes. 16 GB is the bare minimum for regular usage on Windows. On Linux, it's the minimum for “regular to advanced” usage (i.e. more than five complex programs open at once, Flatpak, Electron apps).

  • mycodesucks@lemmy.world · ↑13 · 7 hours ago

    It’s a really nice idea, but bad developers are already so deep in the sunk cost fallacy that they’ll likely just double down.

    Nobody reassesses their dogma just because the justification for it is no longer valid. That’s not how people work.

  • CMDR_Horn@lemmy.world · ↑104 · 12 hours ago

    Not likely. I expect the AI bubble will burst before those software optimization gears even start to turn.

      • antrosapien@lemmy.ml · ↑2 · 5 hours ago

        Yes, but with AI you can build it in 4 hours, and with all those extra RAMs it could drop to 2.

    • Riskable@programming.dev · ↑6 ↓2 · 9 hours ago

      Big AI is a bubble but AI in general is not.

      If anything, the DRAM shortages will apply pressure on researchers to come up with more efficient AI models rather than more efficient (normal) software overall.

      I suspect that as more software gets AI-assisted development we'll actually see less efficient software at first, and eventually more efficient software as adoption of AI coding assistance matures (and probably becomes more formalized/automated).

      I say this from experience: if you ask an LLM to write something for you, it often does a terrible job on efficiency. However, if you ask it to analyze an existing code base and make it more efficient, it often does a great job. The dichotomy is down to the nature of AI prompting: it works best if you only give it one thing to do at a time.
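
      For example, here's a minimal sketch of that two-pass pattern. It assumes the OpenAI Python client purely for illustration; the model name and prompts are placeholders, and any chat-style LLM API would work the same way:

      ```python
      # Two-pass prompting: first ask for working code, then make a separate,
      # single-purpose request to optimize it.
      from openai import OpenAI

      client = OpenAI()  # reads OPENAI_API_KEY from the environment

      def ask(prompt: str) -> str:
          resp = client.chat.completions.create(
              model="gpt-4o",  # placeholder; use whatever model you have
              messages=[{"role": "user", "content": prompt}],
          )
          return resp.choices[0].message.content

      # Pass 1: just get something that works.
      code = ask("Write a Python function that deduplicates a large CSV "
                 "by a key column.")

      # Pass 2: a single-purpose request to make the existing code efficient.
      print(ask("Analyze this code and rewrite it to use less memory, "
                "without changing its behavior:\n\n" + code))
      ```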

      In theory, if AI code assist becomes more mature and formalized, the “optimize this” step will likely be built-in, rather than something the developer has to ask for after the fact.

  • kboos1@lemmy.world · ↑10 · 10 hours ago

    The “shortage” is temporary and artificial, so that's a hard no. A RAM shortage doesn't create any incentive to make apps more efficient: the hardware and software already in people's homes won't be affected, and neither will the people currently using that software. The very small percentage of people hit by a temporary shortage wouldn't justify changing software that's currently in development.

    There's no incentive for software companies to make their code more efficient until people stop using their software, so stop using it and it will get better. Adobe Reader is a good example: it's crap, just straight-up garbage, but people still use it, so the app stopped getting improvements many years ago. Then Adobe moved to a subscription model and a cloud service for selling your data, and guess what: it's still the same app it was 10 years ago, just more expensive.

    • SkyNTP@lemmy.ml · ↑2 ↓2 · 7 hours ago

      What crystal ball told you this was temporary? Every day for the past few years, the consumer market has moved further and further toward serving only the wealthy. The people in power don't care about selling RAM or other scraps to peasants.

      • kboos1@lemmy.world · ↑1 · 59 minutes ago

        History and normal market cycles. I’ll remind you of the great GPU shortage caused by Bitcoin miners.

  • ChillPC@programming.dev · ↑14 ↓1 · 12 hours ago

    You fool, humans are flexible enough to get used to slow experiences. Even if the average user needs Discord, Slack, 100 Chrome tabs, Word, and every other Electron app open simultaneously, he will just push through his work. He may not be happy with it, but he'll carry on without changing his habits.

    But to be honest, I goddamn hope you are right!

    • pantherina@feddit.org (OP) · ↑1 · 4 hours ago

      The impact is that your software runs even worse on existing hardware. It might not be a big impact, but it's an impact.

    • atro_city@fedia.io · ↑3 · 12 hours ago

      Why do you believe that? Do you believe software developers earn too much to care about RAM prices, and that they'll keep writing software that requires more RAM than the rest of the world can afford?

      • CarbonatedPastaSauce@lemmy.world · ↑7 · 11 hours ago

        Because that kind of shift in mindset (going backwards, basically) will require far more pressure than a 1-2 year RAM shortage.

        Enterprise developers are basically unaffected by this. And anyone writing software for mom & pop shops was already targeting 8 GB, because that's what Office Depot is selling them.

        This mostly hurts the enthusiast side of tech. Most people won't notice, because they don't know the difference between 8, 16, or over 9000 GB of RAM. I've had this discussion with 'users' so many times when they ask for PC recommendations, and they just don't really get it, or care.

      • drcobaltjedi@programming.dev · ↑3 · 10 hours ago

        As a software dev: there's a lot of stuff that's just bloat now. Electron apps are really easy for web devs to make pretty and are super portable, but each one is literally its own instance of a Chrome browser. There are still a lot of devs who care (to some degree) about performance and are willing to trim fat or take small shortcuts where viable.
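
        If you want to see that "one Chrome per app" overhead yourself, here's a rough sketch. It assumes psutil (pip install psutil), and the process names are just examples, not a definitive list:

        ```python
        # Sum the resident memory (RSS) of every process belonging to a few
        # Electron apps; each app spawns several Chromium processes.
        import psutil

        APPS = ("discord", "slack", "code")  # example Electron process names

        totals = {app: 0 for app in APPS}
        for proc in psutil.process_iter(["name", "memory_info"]):
            name = (proc.info["name"] or "").lower()
            mem = proc.info["memory_info"]
            if mem is None:
                continue  # attribute unavailable (e.g. permission denied)
            for app in APPS:
                if app in name:
                    totals[app] += mem.rss

        for app, rss in totals.items():
            print(f"{app}: {rss / 2**20:.0f} MiB across all its processes")
        ```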

        However, there's also the issue of management. I was once tasked with a problem at work involving the traveling salesman problem. I managed to make a very quick solution that worked fairly well, except it always left one point for last that probably should have come around third. It was quick and mostly accurate, but my boss told me to "fix it", and despite my explanation that he was asking me to solve an unsolved math problem, he persisted. I'm now ashamed of how slow that operation is, since instead of just finding the nearest point it now has to look ahead a few steps to see which path is shorter.
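
        For the curious, the difference is roughly the one below. This is a hypothetical sketch, not the actual work code: a greedy nearest-neighbor tour versus a variant that permutes the k nearest candidates before committing to the next point:

        ```python
        # Hypothetical sketch (not the actual work code): a greedy
        # nearest-neighbor tour vs. a k-step lookahead variant.
        import itertools
        import math
        import random

        def dist(a, b):
            return math.hypot(a[0] - b[0], a[1] - b[1])

        def tour_nearest_neighbor(points):
            # Fast greedy tour: always hop to the closest unvisited point.
            tour, remaining = [0], set(range(1, len(points)))
            while remaining:
                last = points[tour[-1]]
                nxt = min(remaining, key=lambda i: dist(last, points[i]))
                tour.append(nxt)
                remaining.remove(nxt)
            return tour

        def tour_lookahead(points, k=3):
            # Slower variant: try every ordering of the k nearest candidates
            # and commit to the first point of the cheapest ordering.
            tour, remaining = [0], set(range(1, len(points)))
            while remaining:
                last = points[tour[-1]]
                cands = sorted(remaining, key=lambda i: dist(last, points[i]))[:k]

                def cost(order):
                    seq = [tour[-1], *order]
                    return sum(dist(points[seq[j]], points[seq[j + 1]])
                               for j in range(len(seq) - 1))

                best = min(itertools.permutations(cands), key=cost)
                tour.append(best[0])
                remaining.remove(best[0])
            return tour

        def total(points, tour):
            return sum(dist(points[tour[i]], points[tour[i + 1]])
                       for i in range(len(tour) - 1))

        random.seed(42)
        pts = [(random.random(), random.random()) for _ in range(50)]
        print("nearest neighbor:", round(total(pts, tour_nearest_neighbor(pts)), 3))
        print("3-step lookahead:", round(total(pts, tour_lookahead(pts)), 3))
        ```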

      • badgermurphy@lemmy.world · ↑5 · 11 hours ago

        For the most part, the answer seems to be yes. Some products also shipped with missing or reduced feature sets for a time.

      • magic_lobster_party@fedia.io · ↑4 · 11 hours ago

        Dealing with memory usage will likely require significant rewrites and architectural changes. It will take years.

        The “memory optimizations” we'll actually see will be features removed while the price stays the same. Software shrinkflation. It will still require the same amount of memory, though.

  • vala@lemmy.dbzer0.com · ↑2 · 9 hours ago

    Sometimes I also think there's no one competent left at Microsoft, but they still have their flight sim team, so I guess that's something.

  • Tehhund@lemmy.world · ↑2 · 10 hours ago

    Misaligned incentives. The people making bloated software are not the people buying the RAM. In theory, the people buying the RAM are the same people buying the software, and so might put pressure on the people making the software to make it more efficient, but that's a very loose feedback loop and I wouldn't hold my breath.

  • pr06lefs@lemmy.ml · ↑3 · 12 hours ago

    Where I'm eyeing resource usage right now is the cloud. I run a few Discourse instances, which seem really inefficient to me: 1.5 GB of RAM for just a discussion board. I have to dedicate a server to each one, whereas my Rust web servers use more like 30 MB. They're probably doing a lot less, but still.

  • Rentlar@lemmy.ca · ↑2 · 11 hours ago

    Maybe gaming is where we'll see it first, before other software and web apps.

    If the DRAM shortage is protracted, perhaps more dumb appliances will make a return, but that's just a pipe dream of mine.