this post was submitted on 24 Jun 2024
1 point (100.0% liked)

Technology


There were a number of exciting announcements from Apple at WWDC 2024, from macOS Sequoia to Apple Intelligence. However, a subtle addition to Xcode 16 — the development environment for Apple platforms, like iOS and macOS — is a feature called Predictive Code Completion. Unfortunately, if you bought into Apple's claim that 8GB of unified memory was enough for base-model Apple silicon Macs, you won't be able to use it. There's a memory requirement for Predictive Code Completion in Xcode 16, and it's the closest thing we'll get from Apple to an admission that 8GB of memory isn't really enough for a new Mac in 2024.

[–] _number8_@lemmy.world 0 points 3 months ago (9 children)

imagine showing this post to someone in 1995

shit has gotten too bloated these days. i mean even in my head 8GB still sounds like 'a lot' of RAM and 16GB feels extravagant

[–] mycodesucks@lemmy.world 0 points 3 months ago (1 children)

Absolutely.

Bad, rushed software that wires together 200 different giant libraries just to use a fraction of them, then runs in a sandboxed container with three daemons it needs for some reason, doesn't mean "8 GB isn't enough"; it means: write tighter, better software.

[–] AnxiousOtter@lemmy.world 0 points 3 months ago

That ship has long sailed, unfortunately. The industry gave up on optimization in favour of praying that hardware advancements can keep up with the bloat.

[–] jas0n@lemmy.world 0 points 3 months ago* (last edited 3 months ago)

Guy from '95: "I bet it's lightning fast though..."

No dude. It peaks pretty soon. In my time, Microsoft is touting a chat program that starts in under 10 seconds. And they're genuinely proud of it.

[–] Shadywack@lemmy.world 0 points 3 months ago

We measure success by how many GBs we've consumed when the only keys pressed from power-on to desktop are our password. This shit right here is the real issue.

[–] rottingleaf@lemmy.zip 0 points 3 months ago (3 children)

I still can't fully accept that 1GB is not normal, 2GB is not very good, and 4GB is not all you're ever gonna need.

If only it got bloated for some good reasons.

[–] Honytawk@lemmy.zip 0 points 3 months ago (1 children)

The moment you use a file that is bigger than 1GB, that computer will explode.

Some of us do more than just browse Lemmy.

[–] rottingleaf@lemmy.zip 0 points 3 months ago

Wow. Have you ever considered how people worked with files bigger than their total RAM, back in the normal days of computing?

So, in your opinion, if you have a 2GB+ log file, editing it should occupy 2GB of RAM?

I just have no words. The ignorance.
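
For what it's worth, that is exactly how tools have always coped with files bigger than RAM: process them as a stream. A minimal sketch in Python (the path and search string are hypothetical):

```python
# Stream the file line by line instead of loading it whole, so memory
# use stays flat no matter how large the log is.
def grep_log(path, needle):
    hits = []
    with open(path, errors="replace") as f:
        for line_no, line in enumerate(f, start=1):  # buffered, one line at a time
            if needle in line:
                hits.append((line_no, line.rstrip("\n")))
    return hits

# Only the matching lines are held in RAM, never the whole 2GB+ file.
for n, text in grep_log("/var/log/app.log", "ERROR"):
    print(n, text)
```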

[–] Aux@lemmy.world 0 points 3 months ago (2 children)

High quality content is the reason. Sit in a terminal and your memory usage will be low.

[–] rottingleaf@lemmy.zip 0 points 3 months ago (2 children)

256MB or 512MB was fine for high-quality content in 2002, so what was that, then?

Suppose the number of pixels and everything else quadrupled - OK, then 2GB it is.

But 4GB being not enough? Do you realize what 4GB is?

[–] lastweakness@lemmy.world 0 points 3 months ago (1 children)

They didn't just quadruple. They're orders of magnitude higher these days. So content is a real factor.

But that's not what's actually being discussed here: memory usage these days is much more a problem of bad practices than of content.

[–] rottingleaf@lemmy.zip 0 points 3 months ago

I know. BTW, when something is done an order of magnitude less efficiently than it could be, and it is, one might consider it the result of an intentional policy aimed at neutering development. It's just not clear whose. There are fewer corporations affecting this than big governments, and those are capable of reaching consensus from time to time. So it's not a conspiracy theory.

[–] Aux@lemmy.world 0 points 3 months ago (1 children)

One frame for a 4K monitor takes 33MB of memory. You need three of them for the triple buffering used back in 2002, so half of your 256MB went to simply displaying a bloody UI. But there's more! Today we use viewport composition, so the more apps you run, the more memory you need just to display the UI. And that's just what the OS uses to render the final result; your app will use additional memory for high-res icons, fonts, photos, videos, etc. 4GB today is nothing.
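
For reference, the 33MB figure follows straight from the pixel count; a quick sketch of the arithmetic, assuming a 3840x2160 panel at 4 bytes per pixel (RGBA):

```python
width, height, bytes_per_pixel = 3840, 2160, 4

frame = width * height * bytes_per_pixel   # 33,177,600 bytes, ~33 MB per frame
triple_buffered = 3 * frame                # ~100 MB for three buffered frames

print(f"one 4K frame:    {frame / 1e6:.1f} MB")
print(f"triple buffered: {triple_buffered / 1e6:.1f} MB")
```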

I can tell you an anecdote. My partner was making a set of photo collages, about 7 artworks to be printed in large format (think 5m+ per side). Those 7 photo collages, with their source material, took 500 gigs on an external drive. Tell me more about 256MB, lol.

[–] rottingleaf@lemmy.zip 0 points 3 months ago* (last edited 3 months ago) (1 children)

Yes, you wouldn't have 4K in 2002.

> 4GB today is nothing.

My normal usage would be kinda strained with it, but possible.

```
$ free -h
               total        used        free      shared  buff/cache   available
Mem:            17Gi       3,1Gi        11Gi       322Mi       3,0Gi        14Gi
Swap:          2,0Gi          0B       2,0Gi
$
```
[–] Aux@lemmy.world 0 points 3 months ago (1 children)

I can do a cold boot and show you empty RAM as well. So fucking what?

[–] lastweakness@lemmy.world 0 points 3 months ago (1 children)

So we're just going to ignore stuff like Electron, unoptimized assets, etc... Basically every other known problem... Yeah let's just ignore all that

[–] Aux@lemmy.world 0 points 3 months ago (2 children)

Is Electron that bad? Really? I have Slack open right now with two servers and it takes around 350MB of RAM. Not that bad, considering that every other colleague thinks that posting dumb shit GIFs into work chats is cool. That's definitely nowhere close to Firefox, Chrome and WebStorm eating multiple gigs each.

[–] Jakeroxs@sh.itjust.works 0 points 3 months ago (2 children)

What's wrong with using GIFs in work chat lmao, you can laugh or smile while hating your job like the rest of us.

[–] lastweakness@lemmy.world 0 points 3 months ago (26 children)

Yes, it really is that bad. 350MB of RAM for something that could otherwise take less than 100? That isn't bad to you? And it's not just RAM; it's every resource, including CPU, which is especially bad with Electron.

I don't really mind Electron myself because I have enough resources. But pretending the lack of optimization isn't a real problem is just not right.

[–] cyberpunk007@lemmy.ca 0 points 3 months ago (1 children)

I chalk it up to lazy, rushed development. Good code is art.

[–] Aux@lemmy.world 0 points 3 months ago (2 children)

That's not true at all. The code doesn't take much space; the content does: your high-quality, high-res photos, 4K HDR videos, lossless 96kHz audio, etc.
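
As a rough illustration of how fast decoded content adds up, a back-of-the-envelope sketch (the formats here are assumptions: a 48MP photo decoded to 3-byte RGB, 24-bit/96kHz stereo PCM, and a 4K frame at 2 bytes per channel):

```python
MB = 1_000_000

# 48MP photo decoded to RGB, 3 bytes per pixel
photo = 48_000_000 * 3 / MB              # ~144 MB in memory

# one minute of lossless stereo audio at 96 kHz / 24-bit
audio = 96_000 * 3 * 2 * 60 / MB         # ~34.6 MB per minute

# one decoded 4K frame, 3 channels at 2 bytes each
video_frame = 3840 * 2160 * 3 * 2 / MB   # ~49.8 MB per frame

print(f"photo: {photo:.0f} MB, 1 min audio: {audio:.1f} MB, 4K frame: {video_frame:.1f} MB")
```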

[–] qqq@programming.dev 0 points 3 months ago* (last edited 3 months ago)

I once went for lower-CAS-timing 2x 128MB RAM sticks (256MB) instead of 2x 256s at slower speeds, because I thought 512MB was insane overkill. I realized how wrong I was playing the Star Wars Galaxies MMORPG: when a lot of people were on screen, it started swapping to disk. Look up the specs of an IBM Aptiva, the first computer my parents bought, and you'll understand how 512MB could seem like a lot.

My current computer had 64GB at the time I built it (most gaming computers go for 32GB). My workstation at work has 128GB, which really isn't even enough for some of our workloads that use a lot of in-memory cache. And large servers can have multiple TB of RAM. My mind has been blown multiple times.

[–] Aux@lemmy.world 0 points 3 months ago (6 children)

You can always switch to a text-based terminal and free up your memory. Just don't complain that YouTube doesn't play 4K videos anymore.

[–] Bjornir@programming.dev 0 points 3 months ago (2 children)

I have a VPS with 1GB of RAM; it runs 6-7 apps in Docker containers, which isn't the most RAM-efficient way to run apps.

A light OS really helps. Plus, the most-used app that eats a lot of RAM, the web browser, actually reduces its consumption when needed and uses more only when memory is free. On one computer I have Chrome running with a few hundred MB used, instead of the usual GBs, because RAM is running out.

So it appears that memory is full, but you actually have a bit more memory available that is "hidden".

[–] derpgon@programming.dev 0 points 3 months ago

Same here. When idle, the apps consume basically nothing. If it's just a web server that calls some PHP script, it takes basically no RAM at all when idle, and some RAM when actually used.

Websites and phone apps are such unoptimized pieces of garbage that they are the sole reason for high RAM requirements. Also, lots of background bloatware.

[–] Specal@lemmy.world 0 points 3 months ago

This is resource reservation; it happens at the OS level. If Chrome is using what appears to be a lot of RAM, it will be freed up once either the OS or another application requires it.

The reservation just exists so that an application knows it can use that amount for now if it needs it.
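
That distinction is visible on any Linux box: MemFree is memory literally sitting unused, while MemAvailable also counts reclaimable cache the kernel hands back on demand. A small sketch (Linux-only) reading /proc/meminfo:

```python
# Parse /proc/meminfo: MemFree is literally unused memory, while
# MemAvailable also includes reclaimable cache the kernel frees on demand.
def meminfo():
    fields = {}
    with open("/proc/meminfo") as f:
        for line in f:
            key, rest = line.split(":", 1)
            fields[key] = int(rest.split()[0])  # values are in kB
    return fields

m = meminfo()
print(f"MemFree:      {m['MemFree'] / 1024:.0f} MiB (truly unused)")
print(f"MemAvailable: {m['MemAvailable'] / 1024:.0f} MiB (unused + reclaimable)")
```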
