Instruction decoding takes space and power. If there are fewer, smaller transistors dedicated to the task it will take less space and power.
Well, not exactly. You have to remove instructions at some point. That’s what Intel’s x86-S is supposed to be. You lose some backwards compatibility but they’re chosen to have the least impact on most users.
I also haven’t wanted an Intel processor in a while. They used to be best in class for laptops prior to the M1, but they’re basically last now, behind Apple, AMD, and Qualcomm. They might win in a few specific benchmarks that matter very little to most people, and they’re still the default option in most gaming laptops. For desktop use the Ryzen family is much more compelling. For servers they still seem to have an advantage, but that’s also an industry built on longer-term contracts, which Intel has more infrastructure for than its competitors; ARM is gaining ground there too with exceptional performance per watt.
Exactly. Adding a third should be much simpler than a second.
As a fellow RISC-V supporter, I think the rise of ARM is going to help RISC-V software support and eventually adoption. They’re not compatible, but right now developers everywhere are working to ensure their applications are portable and not tied to x86. I imagine, too, that when it comes to emulation, emulating ARM is going to be a lot easier than x86, possibly even statically recompilable.
I’m both surprised and not surprised that ever since the M1, Intel seems to just be doing nothing in the consumer space. Certainly losing their contract with Apple was a blow to their sales, and with AMD doing pretty well these days, ARM slowly taking over the server space where backwards compatibility isn’t as significant, and now Qualcomm coming to eat the windows market, Intel just seems like a dying beast. Unless they do something magical, who will want an Intel processor in 5 years?
All else being equal, a complex decoding pipeline does reduce the efficiency of a processor. It’s likely not the most important aspect, but eventually there will be a point where it does become an issue once larger efficiency problems are addressed.
We stuck to x86 forever because backwards compatibility and because nobody had anything better. Now manufacturers do have something better, and it’s fast enough that emulation is good enough for backwards compatibility.
I think it is this way because Apple thought it would be misleading if the option were “deny tracking”, because there isn’t a specific technical mechanism to ensure that. It’s unfortunate, but I’d rather it be honest than lie.
Western governments need to step up their subsidies for green tech then to compete, I guess. Not start banning the people who are providing the solution.
Yeah, no such catastrophic celestial events are likely in the next few millennia, and we’re pretty good at predicting those things now. The impacts of climate change are already affecting a billion or more people right now.
I’m fully aware that EVs won’t solve the climate crisis. And of course, leaders in the west, especially the US, pitch consumerism as the solution to climate change. Unfortunately, many people, myself included, have no option but to drive as public transit has been purposefully dismantled, and opting for an EV (when already buying a car) is one of the only real choices that has any noticeable climate impact.
Alternative plan: we all are stuck on this rock together and maybe we should prioritize maintaining its habitability over bickering about who is allowed to provide the solution.
Western governments: We need to take climate change seriously and transition to renewables and EVs.
Also western governments: It’s bad that China has ramped up production on renewable energy sources and EVs, hit them with tariffs to protect our insufficient domestic production.
As someone who primarily uses Unix-like systems and develops cross-platform software, having Windows as a weird outlier is probably best for the long term. Windows is weird and dumb, but it forces us to consider platform differences more explicitly. In the future, if a new operating system becomes popular, all the checks that were implemented for Windows will make it a bit easier to port to newer systems.
Most people shouldn’t self-host. It’s a hobby for people who want to do it, and there are benefits, but spending 3 hours on a weekend fixing stuff is not how most people wish to spend their time. Furthermore, it’s not a good use of most people’s time. We split labor into specialties; forcing people to do work outside their specialty causes pointless inefficiency. I agree with what other commenters have said: a better approach would be to have more small businesses hosting federated services, and anyone not inclined to self-host should just purchase service through one of those many small providers instead.
Will the sun swallow the earth before the mantle cools down?
I use it to describe a variety of things, but usually it’s related to servers not being able to handle load rather than an outright crash, though I’m not strict about it. Load balancer failures could be it; it could also just be that something was really inefficient but wasn’t noticed until it went into production.
That’s true for all commercial development. No company wants to invest more than they have to. Upstreaming does save time in the long run, but not in the short term.
This is how I would describe my experience. Sometimes it’s crunch time and most of the time it’s fuck around time. After crunch time I always throw a tantrum about how if we only bothered with planning we could largely avoid it.