this post was submitted on 26 Sep 2023
Technology
Considering how bad some generations of Qualcomm chips have been about this, the Apple chip must have been seriously bad.
Well yeah. It's certainly much easier when you start with ARM reference designs. Apple has what, the modem IP they bought from Intel? A company that, for all its prowess, decided to give up on the modem market after only a few years rather than keep refining the modem it had already brought to market?
Even Samsung gives in and uses Qualcomm modems in the US. And they're a major provider of the baseband hardware on the other end of the connection!
Apple will get there. But there is no way that their aggressive timeline was ever reasonable. Gotta make big promises to the shareholders, I guess.
Apple tends to over-build their chips. For example, they're running iOS on their displays, just because that's what their engineers are used to. That means the display needs a full-blown desktop-class computer inside just to put an image on the screen. It's the same story for their wireless routers.
That works fine in environments where power and heat don't really matter, but that's definitely not the case for a modem.