I heard that they don’t have similar problems in China, because they’ve made laws against it.
Why can’t we do that here in the EU?
No, but the M4 Max is claimed to be as fast, and Intel improved their chip, so it’s down from the previous gen’s 250W! And the M4 Max is still faster.
The new Intel Arrow Lake is supposed to max out at 150W, but in practice it doesn’t. And even 150W would still be almost 40% better than previous gen Intel!
So hovering around 80-90W max is pretty modest by today’s standards.
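To spell out that “almost 40%” (assuming the 250W previous-gen figure above): (250 − 150) / 250 = 0.4, i.e. a 40% reduction in power.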
The laws of physics apply to everyone
That is obviously true, but it’s a ridiculous argument; there are plenty of examples of systems performing better while using less power than the competition.
For years Intel chips used twice the power for similar performance compared to AMD Ryzen. And in the Bulldozer days it was the same, except the other way around.
Arm had been designing chips for efficiency for a decade before the first smartphones came out, and they’ve kept their eye on the ball the entire time since.
It’s no wonder Arm is way more energy efficient than x86, and Apple made by far the best Arm CPU when the M1 arrived.
The great advantage of Apple is that they are usually a node ahead
Yes, that is an advantage, but so it is for the new Intel Arrow Lake compared to current Ryzen, yet Arrow Lake uses more power for similar performance, despite being designed for efficiency.
It’s notable that Intel was unable to match Arm on power efficiency for an entire decade, even when Intel had the better production node. So it’s not just a matter of physics, it is also very much a matter of design. And Intel has never been able to match Arm on that. Arm still has the superior design for energy efficiency over x86, and AMD has the superior design over Intel.
I look forward to watching a Gamers Nexus review of this. I hope it’s as good as they say. 😀
Oh, I misremembered what Bitwarden is.
Yes, there was the Xeon Phi, Knights Landing, with up to 72 cores, and 4 threads per core!
Knights Landing did go into production though, but it was more a compute unit than a GPU.
I’m not aware they tried to sell it as a GPU too? Although if I recall correctly, they made some real-time ray tracing demos.
You are right.
and used quadratics instead of triangles.
Now that you mention it, I remember reading about that, but completely forgot.
I remembered it as the Riva coming out of nowhere. As the saying goes, first impressions last. And I only learned about NV1 much later.
But the third one stayed up!
👍 😋
But Intel also made the i815 GPU, so Arc isn’t really the first.
True, but I use the phone reference to show how ridiculous it is that Intel remained on 4 cores for almost 8 years.
Even the Phenom was available with 6 good cores in 2010, yet Intel remained on 4 for almost 8 years until Coffee Lake came out in late 2017, and even then only with 6 cores against Ryzen’s 8.
Intel was pumping money from their near monopoly for 7 years, letting the PC die a slow death of irrelevancy. Just because AMD FX was so horrible that its 8 Bulldozer cores were worse than 4 Core 2 cores from Intel. They were even worse than AMD’s own previous gen Phenom.
It was pretty obvious when Ryzen came out that the market wanted more powerful processors for desktop computers.
First gen chips are rarely blockbusters
True, yet Nvidia was a nobody that arrived out of nowhere with the Riva graphics cards and beat everybody else thoroughly: ATi, S3, 3dfx, Matrox etc.
But you are right, these things usually take time, and for instance Microsoft was prepared to spend 10 years without making money on Xbox, because they saw it had potential in the long run.
I’m surprised Intel considers themselves so hard pressed that they are already thinking of giving up.
They were competitive for customers, but only because Intel sold them at no profit.
The main reason Intel can’t compete is the fact CUDA is both proprietary and the industry standard
AFAIK the AMD stack (ROCm) is open source; I’d hoped they’d collaborate on that.
I’ve commented many times that Arc isn’t competitive, at least not yet.
Although they were decent performers, they used twice the die size for similar performance compared to Nvidia and AMD, so Intel has probably sold them at very little profit.
Still, I expected them to try harder this time, because the technologies needed to develop a good GPU are strategically important in other areas too.
But maybe that’s the reason Intel recently admitted they couldn’t compete with Nvidia on high-end AI?
CPUs would be powerful enough for high-performance graphics rendering lmao
And then they continued making 4-core desktop CPUs, even after phones were at deca-core. 🤣🤣🤣
X’s most recent report, covering February to July 2024, showed that its user base in the EU fell once more to 105.9 million.
And these are data points they release themselves; third-party data points hint at bigger losses.
That can be the case, but IMO Terminator 2 was an amazing follow-up, and it had a budget 15 times bigger.
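To put a number on that (using the $6.4 million figure mentioned below): 15 × $6.4 million ≈ $96 million, which is in line with T2’s reported budget of roughly $100 million.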
it takes over management of the planet
The novel Colossus from 1966 has exactly that.
I too was shocked and a bit offended that it was called a B movie. But the budget was a freaking measly $6.4 million.
That’s peanuts even back in 1984.
https://en.wikipedia.org/wiki/The_Terminator
Indeed they accomplished a lot; it’s an all-time classic movie. Some of my friends were a bit critical, thinking the stop motion wasn’t very good. But I just thought the movie was amazing.
No it’s not, because the point is that design matters. When Ryzen came out originally, it was far more energy efficient than the Intel Skylake. And Intel had the node advantage.