this post was submitted on 28 Oct 2023
124 points (100.0% liked)

Technology

[–] stolid_agnostic@lemmy.ml 71 points 1 year ago (1 children)

lol it’s already out there on tens of millions of laptops, but I guess hubris is the way to go

[–] lauha@lemmy.one 23 points 1 year ago (1 children)

Bollocks! 640K of RAM is enough for anything!

[–] stardreamer@lemmy.blahaj.zone 19 points 1 year ago (1 children)

A more recent example:

"Nobody needs more than 4 cores for personal use!"

[–] thingsiplay@kbin.social 10 points 1 year ago (1 children)

I don't know who said this, but my bet would be Intel. Without AMD, we would probably still be stuck on 4 cores.

[–] stardreamer@lemmy.blahaj.zone 14 points 1 year ago

Yep it's Intel.

They said it up until their competitor started offering more than 4 cores as a standard.

[–] conciselyverbose@kbin.social 63 points 1 year ago (4 children)

Apple already did, though. They specifically replaced Intel chips because Intel's offerings were dogshit whose stupid power draw was destroying Apple's ability to build the designs they wanted.

The rest of ARM is behind, and Windows has done a shit job of ARM support, but that doesn't mean that's forever.

[–] megopie@beehaw.org 39 points 1 year ago (1 children)

Microsoft also seems more concerned with going all in on cloud computing, the whole “you will own nothing and like it” paradigm. So making a faster and more efficient mobile platform probably isn’t a high priority for them.

[–] conciselyverbose@kbin.social 20 points 1 year ago (4 children)

Them trying to force control away from users is bad.

But ARM's efficiency makes it a damn good option for a thin client.

[–] megopie@beehaw.org 9 points 1 year ago (3 children)

Yah, I’m really not enthused about the idea of having to pay monthly rent for my computer’s ability to function.

I wonder if Intel just values their existing experience with x86 more than any potential efficiency gains, since efficiency matters a lot less when the whole system is just a glorified screen and antenna.

[–] conciselyverbose@kbin.social 9 points 1 year ago

I think it matters more.

Apple's battery life is so good in large part because ARM is way better at low end power draw.

[–] vanderbilt@beehaw.org 23 points 1 year ago (2 children)

Especially when it’s becoming increasingly obvious that Windows isn’t the future. Windows has maintained dominance because it is great at backwards compatibility. ARM erodes that advantage because of architectural differences, coupled with the difficulty and drawbacks of emulating x86 on ARM. Mobile is eating more and more market share, and devs aren’t making enterprise software for Windows like they used to.

No one working on a greenfield project says “let’s develop our systems on Windows Server” unless they were already doing that. Windows-as-a-service is the more likely future, funneled through Azure.

[–] cmnybo@discuss.tchncs.de 44 points 1 year ago (6 children)

The problem with ARM laptops is all of the x86 windows software that will never get ARM support and all of the users that will complain about poor performance if an emulator is used to run the x86 software.

Most Linux software already supports ARM natively. I would love to have an ARM laptop as long as it has a decent GPU with good open source drivers. It would need full OpenGL and Vulkan support and not that OpenGL ES crap though.
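For the leftover x86-only software, the usual answer on ARM Linux is user-mode emulation (e.g. qemu-user registered via binfmt_misc), which dispatches a foreign binary to an emulator based on its ELF header. A minimal sketch in Python of that identification step, assuming a little-endian ELF file (the helper name and the fake header bytes are purely illustrative):

```python
import struct

# Map a few e_machine values from the ELF specification to ISA names.
EM_MACHINES = {
    0x3e: "x86_64",
    0xb7: "aarch64",
    0xf3: "riscv",
}

def elf_arch(header: bytes) -> str:
    """Identify the target ISA of an ELF binary from its header bytes."""
    if header[:4] != b"\x7fELF":
        raise ValueError("not an ELF file")
    # e_machine is a 16-bit field at byte offset 18 (little-endian here;
    # byte 5 of e_ident declares the actual endianness).
    (e_machine,) = struct.unpack_from("<H", header, 18)
    return EM_MACHINES.get(e_machine, f"unknown (0x{e_machine:x})")

# Minimal fake header for demonstration: magic, 64-bit/little-endian
# ident padding, e_type=EXEC, e_machine=x86_64.
fake = b"\x7fELF\x02\x01\x01" + bytes(9) + b"\x02\x00\x3e\x00"
print(elf_arch(fake))  # -> x86_64
```

When the kernel sees a binary whose header matches a registered foreign pattern, it hands the file to the configured interpreter (such as `qemu-x86_64`) instead of executing it directly.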

[–] smileyhead@discuss.tchncs.de 20 points 1 year ago (3 children)

Windows, as always, turns out to be the main villain.

[–] w2tpmf@kbin.social 19 points 1 year ago (2 children)

Windows has nothing to do with it. They are talking about software applications that were made for x86. Stuff like Adobe CC, etc.

Windows runs on ARM (and has for a decade) and the apps available in the Windows app store run on ARM.

[–] upstream@beehaw.org 7 points 1 year ago (2 children)

Apple has shown that the market could be willing to adapt.

But then again, they’ve always had more leverage than the Wintel-crowd.

But what people seem to ignore is that there is another option as well: hardware emulation.

IIRC, old AMD CPUs, notably the K6, were actually a RISC core with a translation layer turning x86 instructions into the necessary chain of RISC instructions.

That could also be a potential approach instead of switching outright: if 80% of your code runs natively and the remaining 20% passes through this hardware layer, where the energy cost is bigger than the performance cost, you might have a compelling product.

[–] DaPorkchop_@lemmy.ml 6 points 1 year ago (1 children)

Virtually all modern x86 chips work that way: the front end decodes x86 instructions into internal RISC-like micro-ops.

[–] DJDarren@thelemmy.club 5 points 1 year ago (1 children)

Apple has shown that the market could be willing to adapt.

It's less that they'll adapt, and more that they don't really care. And particularly in the case of Apple users: their apps are (mostly) available on their Macs already. The vast majority of people couldn't tell you what architecture their computer runs on and will just happily use whatever works and doesn't cost them the earth.

[–] anlumo@feddit.de 6 points 1 year ago

Microsoft is actually pushing Windows on ARM right now, since their exclusivity deal with Qualcomm expired. This is going to get interesting.

And it would have gotten away with it too, if it weren't for those meddling GNU followers.

[–] anlumo@feddit.de 14 points 1 year ago (1 children)

Modern ARM GPUs already support OpenGL and Vulkan; that’s not a problem. It’s just that some platforms chose mobile APIs because they run Android.

The trick with emulation that Apple pulled off was adding custom instructions to the CPU, which the emulation layer uses to run x86_64 code efficiently. Nothing is stopping other CPU manufacturers from doing the same; the only issue is that they have to collaborate with the emulation developer.

[–] barsoap@lemm.ee 7 points 1 year ago

The driver situation is less than ideal. Mesa got support for Mali, but that's not the only GPU that ships with ARM chips, and you get bonkers situations. E.g. with my rk3399-based NanoPC a couple of years ago (haven't checked in a while, and yes, it's a Mali), Rockchip's blob supported Vulkan for Android but only GLES for Linux, because Rockchip never paid ARM the licensing fees for that.

And honestly, ARM is on the way down: chip producers are antsy about the whole Qualcomm thing, and Qualcomm itself is definitely moving away from ARM, so my bets for the long and even mid term are firmly on RISC-V. It still lacks desktop performance, but with mobile players getting into the game, laptops aren't far off.

[–] bedrooms@kbin.social 36 points 1 year ago (4 children)

I love my ARMed Mac because of the battery life. I almost never use the power cable when I'm out.

[–] Semi-Hemi-Demigod@kbin.social 11 points 1 year ago (1 children)

And it’s really responsive even on battery. It’s actually a little bad because I can have too many windows open and can’t find anything.

[–] boonhet@lemm.ee 5 points 1 year ago (5 children)

macOS doesn't throttle performance on battery like many Windows power plans do; that's why.

[–] anlumo@feddit.de 27 points 1 year ago (1 children)

This sounds a lot like when Steve Ballmer wasn’t worried about the iPhone at all.

[–] BestBouclettes@jlai.lu 7 points 1 year ago

Or when Kodak didn't worry about digital cameras

[–] Pantherina@feddit.de 25 points 1 year ago (2 children)

I hope RISC-V will make it. Who knows, though? But it literally has no weird proprietary shit like ARM has, and it actually makes sense.

Moving away from x86_64 is important, even for the environment.

[–] taanegl@beehaw.org 10 points 1 year ago (2 children)

Much like the open source movement before it, the open hardware movement will have a slow crawl to a bare victory.

It'll first be used a lot by labs, embedded applications and general infrastructure, far away from the consumer space with only a little bit of overlap.

Then, hopefully, some new Apple-like company manages to slam dunk their presentation and introduction to market, effectively disrupting the market - in a good way.

Follow me for more hopeful divining. We'll have the shaking of sticks, a dead goat boy and symbols written in the floor.

Bring candles.

[–] happyhippo@feddit.it 6 points 1 year ago

My hopes for RISC V are higher than I like to admit.

I really hope it goes mainstream and gives us the benefits of ARM along with the awesomeness of an open architecture.

[–] melroy@kbin.melroy.org 24 points 1 year ago* (last edited 1 year ago) (2 children)

Well, they should be afraid. I want an ARM Linux laptop as well. Or even better, RISC-V! Yes plz... THE WORLD NEEDS RISC-V, yesterday.

[–] peter@lemmy.emerald.show 24 points 1 year ago* (last edited 1 year ago) (12 children)

I replaced my old Intel Core i7 HP ProLiant server with an Odroid M1 (ARM-based), and it consumes 2 watts compared to the 72 that the Intel server did.

The only thing I can't run on it is my Minecraft server; it runs everything else perfectly. Even the Lemmy instance of this account is powered by the same server! And what's more, it basically runs for free, as solar generates enough power for the server to consume, even when it's cloudy.

Yes, I believe Intel should be afraid.
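Taking the wattage figures in the comment above at face value, the difference is easy to put in yearly terms. A back-of-envelope sketch (continuous draw assumed; electricity prices vary, so only energy is computed):

```python
# Yearly energy use of a machine drawing a constant number of watts.
HOURS_PER_YEAR = 24 * 365

def yearly_kwh(watts: float) -> float:
    """Continuous draw in watts -> energy per year in kWh."""
    return watts * HOURS_PER_YEAR / 1000

old_kwh = yearly_kwh(72)  # the Intel server
new_kwh = yearly_kwh(2)   # the Odroid M1

print(f"x86 server: {old_kwh:.0f} kWh/year")
print(f"ARM board:  {new_kwh:.0f} kWh/year")
print(f"saved:      {old_kwh - new_kwh:.0f} kWh/year")
```

At 72 W the server burns through roughly 630 kWh a year versus about 18 kWh for the 2 W board, which is why a small solar setup can cover the latter.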

[–] furrowsofar@beehaw.org 20 points 1 year ago

I think I remember Intel saying that 64-bit on the desktop wasn't needed. They're great at making meaningless predictions, it seems.

[–] figaro@lemdro.id 14 points 1 year ago (1 children)

My m1 MacBook Air is hands down the most incredible laptop I've ever owned. I've had it for 3ish years now and it just doesn't fucking stop. Battery life is still amazing and runs just as fast as it did day 1.

I've NEVER had that experience with any Intel/PC laptop, ever. Honestly I'm never going back.

[–] lemillionsocks@beehaw.org 11 points 1 year ago (7 children)

They're of course exaggerating a little and speaking confidently, because they're in the business of selling a product, not in the business of trash-talking what they sell or reducing confidence in it.

That said, the M1/M2 silicon battery life gains were a huge leap forward when they first launched, but in terms of battery efficiency and power AMD has been nipping at their heels, and in due time Intel will likely get its stuff together and join them. You can already get Ryzen laptops efficient and cool-running enough that the fan is off during most light usage, and they can get battery life into the mid-to-high teens of hours on some models.

Likewise, even Macs will start to drain quite a bit when, say, watching an HD video at 1.75x speed, playing a video game, or encoding something using max CPU power. So while the Macs do have a performance-per-watt advantage, you'll still need to be plugged in.

And that's the BEST of ARM vs Intel and AMD as they catch up. Samsung, Google, and Qualcomm don't really have anything like the M2 in play, and while Qualcomm is rumored to be close, the Samsung-fabbed chips definitely aren't.

So as things stand, the death of Intel and AMD has been greatly exaggerated, in part due to a combination of the usual Apple hype and that hype being VERY VERY justified this time around.

[–] thingsiplay@kbin.social 10 points 1 year ago (5 children)

My hope, no... dream, is that we get both ARM- and x86-compatible chips on the same motherboard one day. Of course, the operating system needs to support dual architectures. Then it could run ARM binaries directly, without any major compatibility or performance hit and without the need for recompilation.

A man can only hope. Is this something that could happen? Technically it should be possible, but realistically, probably not.
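On such a hypothetical dual-ISA machine, a launcher would have to pick a code path per binary at runtime. A sketch of the dispatch side, using Python's `platform` module (the binary names here are made up for illustration):

```python
import platform

# Pick which build of a hypothetical app to launch, based on the
# architecture the OS reports. platform.machine() returns strings like
# "x86_64", "AMD64" (Windows), or "arm64"/"aarch64".
def pick_binary(machine: str) -> str:
    if machine in ("x86_64", "AMD64"):
        return "app-x86_64"
    if machine in ("arm64", "aarch64"):
        return "app-aarch64"
    # Fall back to an emulated or interpreted build on anything else.
    return "app-portable"

print(pick_binary(platform.machine()))
```

A dual-architecture OS would effectively run logic like this for every exec, steering each binary to whichever core type can execute it natively.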

[–] u_tamtam@programming.dev 6 points 1 year ago (1 children)

But then you end up with the downsides of having both and none of the upsides? Wouldn't that incur an enormous effort on the software side to make it all possible, just so you could run a less efficient chip in the end (practically two instead of one)?

[–] thingsiplay@kbin.social 5 points 1 year ago

Compatibility with legacy software is a pretty big upside. Maybe the entire operating system uses the power-efficient ARM cores by default, and then for compatibility, or for faster calculation (games?), the x86 cores get used. Intel already does two different kinds of cores, performance and efficiency cores, and smartphones have something similar too. I imagine this would be expensive and not for everyone, and who knows what other cutbacks and drawbacks it would require.

[–] Pantherina@feddit.de 5 points 1 year ago (1 children)

I think that's a pretty unmotivated approach. Imagine if every invention, instead of replacing the previous ones, just got piled on top of them?

[–] Dark_Arc@social.packetloss.gg 9 points 1 year ago (1 children)

It's possible this is a result of improvements Intel is planning for its x86 chips. They've already mirrored the efficiency-and-performance-core design that, AFAIK, originated in ARM.

In a way, this might be Intel making a prediction based on how, years ago, Intel launched an x86 replacement and AMD launched x86-64... and AMD won, because people didn't want to rebuild all their software, or couldn't.

[–] sanzky@beehaw.org 7 points 1 year ago (1 children)

Yeah, but back then it wasn't 90% web apps. Also, programming languages are way better at supporting both platforms now. ARM is far from being a little player anymore.

[–] Plume@beehaw.org 8 points 1 year ago

Intel is evidently not paying attention.

[–] u_tamtam@programming.dev 7 points 1 year ago

Intel planning to abuse its quasi-monopoly to stifle competition and innovation? They wouldn't dare, would they? /s

[–] megopie@beehaw.org 5 points 1 year ago (3 children)

I wonder if intel is betting on increased centralized cloud computing as the way forward for personal computers. So the efficiency benefits of ARM are irrelevant in their minds since they think the real power will come from big data centers.

[–] chameleon@kbin.social 5 points 1 year ago* (last edited 1 year ago) (1 children)

AWS has a shitton of in-house "Graviton" ARM stuff available, and the ARM server chips from Ampere are popping up in more and more places as well. Most Linux server distros have ARM images available now, and most software builds without major changes. It's a slow transition, but it's already happening.
