• @floofloof@lemmy.ca · 423 points · 10 months ago

    Intel has not halted sales or clawed back any inventory. It will not do a recall, period.

    Buy AMD. Got it!

    • @grue@lemmy.world · 92 points · 10 months ago

      I’ve been buying AMD for – holy shit – 25 years now, and have never once regretted it. I don’t consider myself a fanboi; I just (a) prefer having the best performance-per-dollar rather than best performance outright, and (b) like rooting for the underdog.

      But if Intel keeps fucking up like this, I might have to switch on grounds of (b)!

      (Realistically, I’d be more likely to switch to ARM or even RISC-V, though. Even if Intel became an underdog, my memory of their anti-competitive, anti-consumer behavior remains long.)

      • @SoleInvictus@lemmy.blahaj.zone · 36 points · 10 months ago

        Same here. I hate Intel so much, I won’t even work there, despite it being my current industry and having been headhunted by their recruiter. It was so satisfying to tell them to go pound sand.

      • @Damage@slrpnk.net · 13 points · 10 months ago

        I’ve been on AMD and ATi since the Athlon 64 days on the desktop.

        Laptops are always Intel, though, simply because that’s all I can find, even though I scour the market extensively every time.

      • Final Remix · 7 points · 10 months ago

        I’d had nothing but issues with various computers, laptops, etc. Once I discovered the common factor was Intel, I haven’t had a single problem with any of my devices since. AMD all the way for CPUs.

      • @vxx@lemmy.world · 2 points · 10 months ago

        I hate the way Intel is going, but I’ve been using Intel chips for over 30 years and never had an issue.

        So your statement is kind of pointless: it’s such a small data set that it’s irrelevant, nothing to draw any conclusion from.

      • nek0d3r · 1 point · 10 months ago

        Genuinely, I’ve also been an AMD buyer since I started building 12 years ago. I started out as a fanboy but mellowed out over the years. I know the old FX chips were garbage, but they’re what I started on, and I genuinely enjoy the four Intel generations from Ivy Bridge on. But between the affordability and being able to upgrade without changing the motherboard every generation, I’ve just been using Ryzen all these years.

      • @Dudewitbow@lemmy.zip · 46 points · 10 months ago

        ARM is well primed to take a lot of the server market from Intel. Amazon, which by itself owns the lion’s share of that market, is already deeply committed to making its Graviton ARM CPU its main CPU.

        For consumers, ARM adoption is fully reliant on the respective operating systems and compatibility getting ironed out.

        • @icydefiance@lemm.ee · 16 points · 10 months ago

          Yeah, I manage the infrastructure for almost 150 WordPress sites, and I moved them all to ARM servers a while ago, because they’re 10% or 20% cheaper on AWS.

          Websites are rarely bottlenecked by the CPU, so that power efficiency is very significant.
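
          (A back-of-the-envelope sketch of that saving; the instance classes, prices, and packing density below are assumptions for illustration, not current AWS pricing:)

          ```python
          # Rough cost comparison for CPU-light web workloads on x86 vs ARM
          # cloud instances. All rates here are illustrative placeholders.
          X86_HOURLY = 0.0416      # assumed x86 (t3.medium-class) rate, $/hr
          ARM_HOURLY = 0.0336      # assumed ARM (t4g.medium-class) rate, $/hr
          SITES = 150              # site count from the comment above
          SITES_PER_INSTANCE = 10  # assumed packing density

          instances = SITES / SITES_PER_INSTANCE
          yearly_x86 = instances * X86_HOURLY * 24 * 365
          yearly_arm = instances * ARM_HOURLY * 24 * 365

          print(f"x86: ${yearly_x86:,.0f}/yr, ARM: ${yearly_arm:,.0f}/yr")
          print(f"saving: {1 - ARM_HOURLY / X86_HOURLY:.0%}")  # about 19%
          ```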

        • @sugar_in_your_tea@sh.itjust.works · 10 points · 10 months ago

          Linux works great on ARM. I just want something similar to most mini-ITX boards (4x SATA, 2x mini-PCIe, and RAM slots), and I’ll convert my DIY NAS to ARM. But there just isn’t anything between RAM-limited SBCs and datacenter ARM boards.

          • @Dudewitbow@lemmy.zip · 14 points · 10 months ago

            ARM is a mixed bag. IIRC, at the moment the GPU on the Snapdragon X Elite is disabled on Linux, and consumer support is reliant on how well the hardware manufacturer supports its closed-source driver. In Qualcomm’s case, the history doesn’t look great.

              • @Zangoose@lemmy.world · 2 points · 10 months ago

                Apparently (from another comment on a thread about ARM a few weeks ago) consumer GPU BIOSes contain some x86 instructions that get run on the CPU, so getting full support for ARM isn’t as simple as swapping the cards over to a new motherboard. There are ways to hack around it (some people got AMD GPUs booting on a Raspberry Pi 5 using its PCIe lanes with a bunch of adapters), but it’s pretty unreliable.

                • @sugar_in_your_tea@sh.itjust.works · 3 points · 10 months ago

                  Yeah, there are some software issues that need to be resolved, but the bigger issue AFAIK is having the hardware to handle it. The few ARM devices with a PCIe slot often don’t fully implement the spec, such as power delivery. Because of that, driver work just doesn’t happen, because nobody can realistically use it.

                  If they provide a proper PCIe slot (8-16 lanes, on-spec power delivery, etc), getting the drivers updated should be relatively easy (months, not years).

      • mox (OP) · 28 points · 10 months ago

        RISC-V isn’t there yet, but it’s moving in the right direction. A completely open architecture is something many of us have wanted for ages. It’s worth keeping an eye on.

      • @sugar_in_your_tea@sh.itjust.works · 14 points · 10 months ago

        If there were decent homelab ARM CPUs, I’d be all over that. But everything is either memory limited (e.g. max 8GB) or datacenter grade (so $$$$). I want something like a Snapdragon with 4x SATA, 2x m.2, 2+ USB-C, and support for 16GB+ RAM in a mini-ITX form factor. Give it to me for $200-400, and I’ll buy it if it can beat my current NAS in power efficiency (not hard, it’s a Ryzen 1700).

        • chingadera · 3 points · 10 months ago

          I hope so. I accidentally advised a client to snatch up a Snapdragon Surface (because they had to have a dog-shit Surface), and I hadn’t realized that a lot of shit doesn’t quite work yet. Most of it does, which is awesome, but it needs to pick up the pace.

        • @barsoap@lemm.ee · 1 point · 10 months ago

          Depends on the desktop. I have a NanoPC-T4, originally as a set-top box (that’s what the RK3399 was designed for; it has a beast of a VPU), now on light server and WLAN AP duty, and it’s plenty fast enough for a browser and office work. Provided you give it an SSD, that is.

          Speaking of desktops, though, the graphics driver situation is atrocious. There’s been movement since I last had a monitor hooked up to it, but let’s just say the Linux blob that came with it could do GLES2 while the Android driver does Vulkan. Presumably because ARM wants Rockchip to pay per fucking feature per OS for Mali drivers.

          Oh, the VPU I mentioned? As said, a beast: it decodes 4K H.264 at 60 Hz, has very good driver support and a well-documented instruction set, and mpv supports it out of the box. But because the Mali drivers are shit, you only get an overlay: no window-system integration, because it can’t paint to GLES2 textures. Throwback to the 90s.

          Side note: some madlads got a dedicated GPU running on the thing. M.2-to-PCIe adapter, and presumably a lot of duct-tape code.

          • @cmnybo@discuss.tchncs.de · 2 points · 10 months ago

            GPU support is a real mess. Those ARM SoCs are intended for embedded systems, not PCs. None of the manufacturers want to release an open-source driver, and the blobs typically don’t work with a recent kernel.

            For ARM on the desktop, I would want an ATX motherboard with a socketed 3+ GHz CPU with 8-16 cores, socketed RAM, and a PCIe slot for a desktop GPU.

            Almost all Linux software will run natively on ARM if you have a working GPU. Getting Windows games to run on ARM with decent performance would probably be difficult; it would likely need a CPU optimized for emulating x86, like what Apple did with theirs.

        • @frezik@midwest.social · 1 point · 10 months ago

          Yes. The problem is, this is the only way our system of justice allows for holding companies accountable. They still pay through the nose on their end.

          However, in this case there are a lot of big companies that would also be part of the class: some from OEM desktop systems in offices, and some from servers. The 13900K/14900K has a lot of cores, and quite a few server motherboards accept it, so it was often a good choice over going Xeon or EPYC.

          Those companies are now looking over at the 7950X, noticing it’s faster, uses less power, and doesn’t crash.

          They’re not going to be satisfied with a $10 check.

      • @lath@lemmy.world · -2 points · 10 months ago

        Yet they do it all the time, whenever a higher-spec CPU is fabricated with physical defects and is then sold as a lower-spec variant.

  • @wirehead@lemmy.world · 58 points · 10 months ago

    A few years ago now, I was thinking it was about time to upgrade my desktop (with a case that dates back to 2000 or so; I guess they call them “sleepers” these days?) because some of my usual computer things were taking too long.

    And I realized that Intel was selling the 12th generation of the Core at that point, which meant the next one would be a 13th generation, and I dunno, I’m not superstitious, but I figured if anything went wrong I’d feel pretty darn silly. So I pulled the trigger and got a 12th-gen Core processor, a motherboard, and a few other bits.

    This is quite amusing in retrospect.

    • @JPAKx4@lemmy.blahaj.zone · 12 points · 10 months ago

      I recently built myself a computer and went with AMD’s 3D V-Cache chips, and boy am I glad. I think I went 12th gen for my brother’s computer, but it was mid-range, which hasn’t had these issues to my knowledge.

      Also yes, sleeper is the right term.

  • @sebsch@discuss.tchncs.de · 37 points · 10 months ago

    Is there really still such a market for Intel CPUs? I don’t understand it: AMD’s Zen has been so much better, the superior technology, for almost a decade now.

    • @UnderpantsWeevil@lemmy.world · 15 points · 10 months ago

      Intel is in the precarious position of being the largest surviving American-owned semiconductor manufacturer, with the competition either existing abroad (TSMC, Samsung, ASML) or as a partner/subsidiary of a foreign firm (Nvidia simply procures its chips from TSMC, GlobalFoundries was bought up by the UAE sovereign wealth fund, etc.).

      Consequently, whenever the US dumps a giant bucket of money into the domestic semiconductor industry, Intel is there to clean up, whether or not their technology actually works.

      • @frezik@midwest.social · 13 points · 10 months ago

        Small correction: it’s the only survivor that makes desktop/server-class chips. Companies like Texas Instruments and Microchip still have US foundries for microcontrollers.

    • @frezik@midwest.social · 9 points · 10 months ago

      The argument was that while AMD is better on paper in most things, Intel would give you rock-solid stability. That argument has now taken an Iowa-class broadside to the face.

      I don’t watch LTT anymore, but a few years back they had a video where they were really pushing the limits of PCIe lanes on an Epyc chip by stuffing it full of NVMe drives and running them with software RAID (which Epyc’s sick number of cores should be able to handle). Long story short, they ran into a bunch of problems. After talking to Wendell of Level1Techs, he mentioned that sometimes AMD just doesn’t work the way it seems it should based on paper specs; Intel usually does. (Might be getting a few details wrong about this, but the general gist should be right.)

      This argument was almost the only thing stopping AMD from taking over the server market. The other thing was whether AMD could simply manufacture enough chips in a short time period. The server market is huge; Intel had $16B of revenue in “Data Center and AI” in 2023, while AMD’s total revenue was $23B. Now manufacturing ramp-up might be all that’s stopping AMD from owning it.

    • shastaxc · 3 points · 10 months ago

      Intel has been working better than AMD in my Linux server. The AMD chips kept causing server crashes due to C-state nonsense that no amount of BIOS tweaking would fix. AMD is great for performance and efficiency (and cost/value) in my gaming PC, but it wreaked havoc on my server, which I need to function reliably without power restarts.

      So I have both.

    • @BobGnarley@lemm.ee · 1 point · 10 months ago

      It’s the only chip that runs on an open-source BIOS, and you can completely disable the Intel ME after boot-up.

      AMD’s PSP is 100% proprietary spyware that can’t be disabled or manipulated into not running.

      • DefederateLemmyMl · 21 points · 10 months ago

        Why does that graph show Epyc (server) and Threadripper (workstation) processors in the upper right corner, but not the equivalent Xeons? If you take those away, it would paint a different picture.

        Also, a price/performance graph does not say much about which is the superior technology. Intel has been struggling to keep up with AMD technologically these past few years, and has been upping power targets and thermal limits to do so… which is one of the reasons we are here (points at headline).

        I do hope they get their act together, because an AMD monopoly would be just as bad as an Intel monopoly. We need the competition, and a healthy x86 market, lest proprietary ARM-based computers take over the market (Apple M chips, Snapdragon laptops, …).

        • @tempest@lemmy.ca · 3 points · 10 months ago

          Haha, because if they included the Xeon Scalables it would show how badly they’re doing in the datacenter market.

        • @ruse8145@lemmy.sdf.org · -1 points · 10 months ago

          I guess I’m confused by your fundamental point, though: if we aren’t looking at raw processing power across a range of workloads, in what technology do you see them winning?

        • @ruse8145@lemmy.sdf.org · -1 points · 10 months ago

          I’d guess it’s because I selected single processors, and many of the Xeons are server-oriented, with multi-socket setups expected. Given the original post I’m responding to, I’m more concerned with desktop-grade chips (10-40k points multi-core) than server-grade ones.

    • @deeply_moving_queef@lemmy.ml · -1 points · 10 months ago

      Intel’s iGPU is still by far the best option for applications such as media transcoding. It’s a shame that AMD hasn’t focused more on this, but it’s understandable; it’s relatively niche.

    • @w2tpmf@lemmy.world · -8 points · 10 months ago

      Naw. Zen was a leap ahead when it came out, but AMD didn’t keep that pace for long and Intel CPUs quickly caught up.

      I just almost bought a Ryzen 9 7900X, but an i7-13700K ended up being cheaper and outperforms the AMD chip.

      • @frezik@midwest.social · 2 points · 10 months ago

        On what workloads? AMD is king for most games, and at a lower price. It’s also king for heavily multicore workloads, but not on the same CPU that’s best for games.

        In other words, they don’t have a single CPU that is king of both at the same time. That’s the one thing Intel was good at, provided you could cool the damn thing.

  • @TrickDacy@lemmy.world · 36 points · 10 months ago

    AMD processors have literally always been a better value and rarely have been surpassed by much for long. The only problem they ever had was back in the day they overheated easily. But I will never ever buy an Intel processor on purpose, especially after this.

    • mox (OP) · 31 points · 10 months ago

      The only problem they ever had was back in the day they overheated easily.

      That’s not true. It was just last year that some of the Ryzen 7000 models were burning themselves out from the inside at default settings (within AMD specs) due to excessive SoC voltage. AMD fixed it through new specs and by working with board manufacturers to issue new BIOS versions, and I think they eventually gave in to pressure to cover the damaged units. I guess we’ll see if Intel ends up doing the same.

      I generally agree with your sentiment, though. :)

      I just wish both brands would chill. Pushing the hardware so hard for such slim gains is wasting power and costing customers.

      • DefederateLemmyMl · 9 points · 10 months ago

        That’s not true. It was just last year that some of the Ryzen 7000 models were burning themselves

        I think he was referring to “back in the day”, when Athlons, unlike the competing Pentium 3 and 4 CPUs of the time, didn’t have any thermal protection and would literally go up in smoke if you ran them without cooling.

        https://www.youtube.com/watch?v=yRn8ri9tKf8

        • @RdVortex@lemmy.world · 3 points · 10 months ago

          Some motherboards did have overheating protection back then, though. Personally, my Athlon XP computer randomly shut down several times because the system had some issue where the fans would randomly slow down and eventually stop completely. That triggered the motherboard’s overheat protection, which simply cut the power as soon as the temperature got too high.

        • mox (OP) · 3 points · 10 months ago

          When I started using computers, I wasn’t aware of any thermal protections in popular CPUs. Do you happen to know when they first appeared in Intel chips?

          • DefederateLemmyMl · 3 points · 10 months ago

            The Pentium 2 and 3 had rudimentary protection: they would simply shut down if they got too hot. The Pentium 4 was the first that would throttle clock speeds down.

            Anything before that didn’t have any protection, as far as I’m aware.
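
            (Side note for anyone curious what the modern machinery looks like: Linux exposes those thermal sensors through sysfs. A minimal sketch, assuming only the standard /sys/class/thermal interface; zone layout varies by platform:)

            ```python
            # Print every thermal zone the kernel exposes. Some zones may
            # not be readable without privileges; layouts differ by machine.
            from pathlib import Path

            for zone in sorted(Path("/sys/class/thermal").glob("thermal_zone*")):
                try:
                    kind = (zone / "type").read_text().strip()
                    milli_c = int((zone / "temp").read_text().strip())
                    print(f"{zone.name}: {kind} = {milli_c / 1000:.1f} C")
                except (OSError, ValueError):
                    continue
            ```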

      • @TrickDacy@lemmy.world · 4 points · 10 months ago

        Yeah. I just meant AMD CPUs used to overheat easily if your cooling system had an issue. My Ryzen 7 3700X has been freaking awesome, though; it feels more solid than any PC I’ve built, and it’s fast AF. I think I saved over $150 compared to a similarly rated Intel CPU, and the motherboards generally seem cheaper for AMD too. I would feel ripped off with Intel even without the crashing issues.

        • mox (OP) · 5 points · 10 months ago

          Where do you think Asus got the specs for that voltage?

          • @ichbinjasokreativ@lemmy.world · 4 points · 10 months ago

            Then why were there essentially no blow ups from other motherboard manufacturers? Tell me if my information on this is wrong, but when there’s only one brand causing issues, they’re the ones to blame.

            • mox (OP) · 1 point · 10 months ago

              Then why were there essentially no blow ups from other motherboard manufacturers?

              There were, including MSI, who also released corrected BIOS versions.

              (But even if that were not the case, it could be explained by Asus being the only board maker to use the high end of a voltage range allowed by AMD, or by Asus having a significantly larger share of users who are vocal about such problems.)

          • @frezik@midwest.social · 2 points · 10 months ago

            Not from AMD. From the autogenerated transcript (with minor edits where it messed up the names of things):

            amd’s official recommendation [f]or the cut off now is 1.3 volts but the board vendors can still technically set whatever they want so even though the [AGESA] update can lock down and start restricting the voltage the problem is Asus their 1.3 number manifests itself as something like 1.34 volts so it is still on the high side

            This was pretty much all on the motherboard manufacturers, and Asus was particularly bad (out-scumbagging MSI; good job, guys).

            At the start of this Intel mess, it was thought Intel had a similar issue on its hands and the motherboard manufacturers just needed to get in line, but it ended up going a lot deeper.

            • mox (OP) · 2 points · 10 months ago

              That doesn’t contradict anything I wrote. Note that it says AMD’s recommended cutoff is now 1.3 volts, implying that it wasn’t before this mess began. Note also that the problem was worse on Asus boards because their components’ tolerance was a bit too loose for a target voltage this high, not because they used a voltage target beyond AMD’s specified cutoff. If the cutoff hadn’t been pushed so high for this generation in the first place, that same tolerance probably would have been okay.

              In any case, there’s no sense in bickering about it. Asus was not without blame (I was upset with them myself) but also not the only affected brand, so it’s not possible that they were the cause of the underlying problem, now is it?

              AMD and Intel have been pushing their CPUs to very high voltages and temperatures for small performance gains recently. 95°C as the new “normal” was unheard of just a few years ago. It’s no surprise that it led to damage in some cases, especially for early adopters. It’s a thin line to walk.

    • @Deway@lemmy.world · 25 points · 10 months ago

      rarely have been surpassed by much for long.

      I’ve been on team AMD for over 20 years now, but that’s not true. The Core Duo and the first couple of Core i CPUs were better than what AMD was offering, and stayed that way for about a decade. The Athlons were much better than the Pentium 3 and P4, and the Ryzens are better than the current Core i series, but the Phenoms weren’t. Don’t get me wrong, I liked my Phenom II X4, but it objectively wasn’t as good as Intel’s offerings back in the day.

      • @deltapi@lemmy.world · 5 points · 10 months ago

        My i5-4690 and i7-4770 machines remain competitive to this day, even with the Spectre patches in place. I saw no reason to “upgrade” to 6th/7th/8th-gen CPUs.

        I’m looking for a new desktop now, but for the cost involved I might just end up piecing together an HP Z6 G4 with surplus server CPU/RAM. The cost of going to 11th-gen or later desktop Intel doesn’t seem worth it.

        I’m going to look at the more recent AMD offerings, but I’m not sure they’ll compete with surplus server kit.

        • @Deway@lemmy.world · 4 points · 10 months ago

          I’d say that regardless of the brand, x86 CPUs don’t need to be upgraded as often as they used to. No awesome new extensions like SSE, not much more power, and power consumption isn’t going down significantly. If you don’t care about power consumption, the server CPUs will be more interesting; there’s no doubt about that.

        • @floofloof@lemmy.ca · 2 points · 10 months ago

          They’re still useful, but they’re not competitive in overall performance with recent CPUs in the same category. They do still compete with some of the budget and power-efficient CPUs, but they use more power and get hotter.

          That said, those 4th gen Intel CPUs are indeed good enough for most everyday computing tasks. They won’t run Windows 11 because MS locks them out, but they will feel adequately fast unless you’re doing pretty demanding stuff.

          I still have an i5-2400, an i7-4770K and an i7-6700 for occasional or server use, and my i7-8550U laptop runs great with Linux (though it overheated with Windows).

          I buy AMD now though.

        • @frezik@midwest.social · 2 points · 10 months ago

          My issue with surplus server kit at home is that it tends to idle at very high power usage compared to desktop kit. For home use that won’t be pushing high CPU utilization, the savings in cost off eBay aren’t worth much.

          This is also why you’re seeing AM5 on server motherboards. If you don’t need to have tons of PCIe lanes–and especially with PCIe 5, you probably don’t–the higher core count AM5 chips do really well for servers.
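
          (The idle-power point is easy to put rough numbers on; the wattages and electricity rate in this sketch are assumptions for illustration:)

          ```python
          # Annual electricity cost of an always-on box; swap in your own
          # idle draw and $/kWh rate, these are illustrative guesses.
          def yearly_cost(idle_watts: float, usd_per_kwh: float = 0.15) -> float:
              return idle_watts / 1000 * 24 * 365 * usd_per_kwh

          surplus_server = yearly_cost(120)  # assumed idle draw, old server kit
          am5_desktop = yearly_cost(35)      # assumed idle draw, desktop build

          print(f"server: ${surplus_server:.0f}/yr, desktop: ${am5_desktop:.0f}/yr")
          print(f"difference: ${surplus_server - am5_desktop:.0f}/yr")  # ~$112/yr
          ```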

    • @edgesmash@lemmy.world · 6 points · 10 months ago

      The only problem they ever had was back in the day they overheated easily.

      Very easily.

      In college (early aughts), I worked as tech support for fellow students. Several times I had to take the case cover off, point a desk fan into the case, and tell the kid he needed to get thermal paste and a better cooler (services we didn’t offer).

      Also, as others have said, AMD CPUs have not always been superior to Intel in performance or even value (though AMDs have almost always been cheaper). It’s been a back-and-forth race for much of their history.

      • @TrickDacy@lemmy.world · 2 points · 10 months ago

        Yeah. I never said they were always better in performance. But I have never had an issue other than the heat problem, which all but one time was fully my fault. And I don’t need a processor that performs 3% better on random tasks, which is the kind of benchmark result I would typically find when comparing similar AMD/Intel processors (and in some categories AMD did win). I saved probably a couple grand avoiding Intel. And as another user said, I prefer to support the underdog: the company making a great product for a lot less money. Again I say: fuck Intel.

  • @deltreed@lemmy.world · 34 points · 10 months ago

    So, like, did Intel lay off or deprecate its QA teams, similar to what Microsoft did with Windows? Remember when stability was key and everything else was secondary? Pepperidge Farm remembers.

  • @gearheart@lemm.ee · 33 points · 10 months ago

    This would be funny if it happened to Nvidia.

    Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

    No one wants that.

    • @mlg@lemmy.world · 21 points · 10 months ago

      This would be funny if it happened to Nvidia.

      Hope Intel recovers from this. Imagine if Nvidia was the only consumer hardware manufacturer…

      Lol there was a reason Xbox 360s had a whopping 54% failure rate and every OEM was getting sued in the late 2000s for chip defects.

    • @brucethemoose@lemmy.world · 7 points · 10 months ago

      This would be funny if it happened to Nvidia.

      It kinda has, with Fermi, lol. The GTX 480 was… something.

      Same reason, too: they pushed the voltage too hard, to the point of stupidity.

      Nvidia doesn’t compete in this market though, as much as they’d like to. They don’t make x86 CPUs, and frankly Intel is hard to displace since it has its own fab capacity. AMD can’t take the market by itself because there simply isn’t enough TSMC/Samsung capacity to go around.

      • @Kyrgizion@lemmy.world · 3 points · 10 months ago

        There’s also Intel holding the x86 patent and AMD holding the x64 patent. Those two aren’t going anywhere yet.

    • nek0d3r · 1 point · 10 months ago

      I genuinely think that was the best Intel generation. Things really started going downhill in my eyes after Skylake.

    • @A_Random_Idiot@lemmy.world · 39 points · 10 months ago

      They can’t even commit to offering RMAs, period. They keep using vague, can’t-be-used-against-me-in-a-court-of-law language.

    • @BobGnarley@lemm.ee · 23 points · 10 months ago

      Oh, you mean they’re going to underclock the expensive new shit I bought and have it underperform, to fix their fuck-up?

      What an unacceptable solution.

      • @frezik@midwest.social · 11 points · 10 months ago

        That’s where the lawsuits will start flying. I wouldn’t be surprised if the fix knocks off 5-15% of performance. That’s enough to put it well below comparable AMD products in almost every application, and if performance is dropped after the sale, there’s a pretty good chance of a class-action suit.

        Intel might have a situation here like the Xbox 360’s Red Ring of Death: it totally kills any momentum they had and hands a big victory to their competitor, at a time when Intel wasn’t in a strong place to begin with.

      • @Strykker@programming.dev · 9 points · 10 months ago

        They aren’t overclocking or underclocking anything with the fix. The chip was just straight up requesting more voltage than it actually needed; that gave no benefit and was probably an issue even without the damage it causes, due to the extra heat generated.
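
        (That matches the first-order physics: dynamic CMOS power scales roughly with the square of voltage at a fixed clock, so unneeded voltage is nearly pure waste heat. A toy illustration; the V^2 relation is standard, but the voltages here are made up:)

        ```python
        # Dynamic power: P ~ C * V^2 * f. Holding clock (f) and switched
        # capacitance (C) fixed, extra voltage only adds heat.
        def relative_power(v: float, v_ref: float) -> float:
            return (v / v_ref) ** 2

        needed = 1.30     # hypothetical voltage the chip actually needs
        requested = 1.45  # hypothetical excessive request

        extra = relative_power(requested, needed) - 1
        print(f"extra heat at the same clock: {extra:.0%}")  # about 24%
        ```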

        • nek0d3r · -1 points · 10 months ago

          Giving a CPU more voltage is just what overclocking is. Considering that most modern CPUs from both AMD and Intel are already designed to boost clocks until they reach a temperature high enough to start thermal throttling, it’s likely there was a misstep in setting this threshold, and the CPU doesn’t know when to quit until it kills itself. In the process it’s undoubtedly gaining more performance than it otherwise would, but probably not by much, considering a lot of the high-end CPUs already have really high thresholds, some even at 90 or 100 °C.

          • @Strykker@programming.dev · 0 points · 10 months ago

            If you actually knew anything, you’d know that overclockers tend to manually reduce the voltage as they increase clock speeds to improve stability. That only works up to a point, but it clearly shows voltage does not directly determine clock speed.

            • nek0d3r · 0 points · 10 months ago

              Ah, got me with a reverse gish gallop. Now I’m an idiot, oh no…

    • AnyOldName3 · 14 points · 10 months ago

      If you give a chip more voltage, its transistors will switch faster, but they’ll degrade faster. Ideally, you want just barely enough voltage that everything’s reliably finished switching and all signals have propagated before it’s time for the next clock cycle, as that makes everything work and last as long as possible. When the degradation happens, at first it means things need more voltage to reach the same speed, and then they totally stop working. A little degradation over time is normal, but it’s not unreasonable to hope that it’ll take ten or twenty years to build up enough that a chip stops working at its default voltage.

      The microcode bug they’ve identified and are fixing applies too much voltage to part of the chip under specific circumstances, so if an individual chip hasn’t experienced those circumstances very often, it could well have built up some degradation, but not enough that it’s stopped working reliably yet. That could range from having burned through a couple of days of lifetime, which won’t get noticed, to having a chip that’s in the condition you’d expect it to be in if it was twenty years old, which still could pass tests, but might keel over and die at any moment.

      If they’re not doing a mass recall, and can’t come up with a test that says how badly an individual CPU has been affected without it being so damaged that it’s no longer reliable, then they’re betting that most people’s chips aren’t damaged enough to die until after the warranty expires. There’s still a big difference between the three years of their warranty and the ten to twenty years that people expect a CPU to function for, and customers whose parts die after thirty-seven months will lose out compared to what they thought they were buying.
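
      (To make the burned-through-lifetime idea concrete, here is a toy wear model. The exponential voltage-acceleration form is common in reliability modeling, but every constant below is made up for illustration:)

      ```python
      # Toy model: a chip rated for ~15 years at nominal voltage sometimes
      # runs overvolted and ages faster. All constants are illustrative.
      RATED_YEARS = 15.0      # assumed design lifetime at nominal voltage
      ACCEL_PER_100MV = 10.0  # assumed: each +100 mV ages the part 10x faster

      def lifetime_used(hours_overvolted: float, extra_mv: float) -> float:
          """Fraction of rated lifetime consumed by the overvolted hours."""
          accel = ACCEL_PER_100MV ** (extra_mv / 100)
          return hours_overvolted * accel / (RATED_YEARS * 24 * 365)

      # e.g. 200 cumulative hours in the buggy state at +150 mV:
      print(f"{lifetime_used(200, 150):.0%} of rated lifetime consumed")  # ~5%
      ```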

    • @BobGnarley@lemm.ee · 9 points · 10 months ago

      No refunds for the fried ones: that should be all you need to see about how they’ll “handle” this.

    • Metype · 6 points · 10 months ago

      For what it’s worth, my i9-13900 was experiencing serious instability issues. Disabling turbo helped a lot, but Intel offered to replace it under warranty and I’m going through that now. Customer support on the issue has been pretty good in my experience.

    • @residentmarchant@lemmy.world · 18 points · 10 months ago

      Compared to a recall and refitting a fab, a class action is probably the cheaper way out.

      I wish companies cared about what they sell instead of picking the cheapest way out, but welcome to the world we live in.

  • @InAbsentia@lemmy.world · 14 points · 10 months ago

    Thankfully I haven’t had any issues with my 13700K, but it’s pretty shitty of Intel not to stand behind their products and do a recall.

    • @communism@lemmy.ml · 1 point · 10 months ago

      I think a lot of things can cause that. Unfortunately, it’s difficult to diagnose hardware issues for certain without a bunch of spare CPUs, spare mobos, spare RAM, etc. lying around and a lot of time on your hands to keep swapping out parts until a swap fixes it, especially when the issue only happens occasionally, so you have to keep using the computer without problems for long enough to believe it’s likely fixed.

      Also, it’s not guaranteed to be a hardware issue, though it probably is. I’ve sometimes had similar issues that came down to the kernel not working well with a specific piece of hardware I use.