unexposedhazard@discuss.tchncs.de
on 13 Apr 2024 18:30
Common Apple L
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 18:35
Well yeah, they’re enough to meet the minimum use cases so they can upsell most people on expensive RAM upgrades.
That’s why I don’t buy laptops with soldered RAM. That’s getting harder and harder these days, but my needs for a laptop have also gone down. If the RAM is soldered, there’s nothing you can (realistically) do if you need more later, so you have to pay extra up front, and they can upcharge a lot for it. If it’s not soldered, you have a decent option to buy RAM afterward, so there’s less value for them in upselling too much.
So screw you Apple, I’m not buying your products until they’re more repair friendly.
akilou@sh.itjust.works
on 13 Apr 2024 19:16
I had an extra stick of RAM available the other day, so I went to open my wife’s Lenovo to see if it’d take it, and the damn thing is screwed shut with the smallest Torx screws I’ve ever seen, smaller than any I have. I was so annoyed.
SpaceNoodle@lemmy.world
on 13 Apr 2024 19:40
The real question is why you don’t have a complete precision screwdriver set.
akilou@sh.itjust.works
on 13 Apr 2024 20:16
I thought I did! Until I got the smallest one out and it just spun on top of the screw
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 19:52
I bought the E495 because the T495 had soldered RAM and one RAM slot, while the E495 had both RAM slots replaceable. Adding more RAM didn’t need any special tools. Newer E-series and T-series both have one RAM slot and some soldered RAM. I’m guessing you’re talking about one of the consumer lines, like the Yoga series or something?
That said, Lenovo (well, Motorola in this case, but Lenovo owns Motorola) puts all kinds of restrictions on your rights if you unlock the bootloader of their phones (PDF version of the agreement). That, plus going down the path of soldering RAM, gives me serious concerns about the direction they’re heading, so I can’t really recommend their products anymore.
If I ever need a new laptop, I’ll probably get a Framework.
akilou@sh.itjust.works
on 13 Apr 2024 20:18
Yeah, it’s a Yoga
Capricorn_Geriatric@lemmy.world
on 13 Apr 2024 20:18
puts all kinds of restrictions on your rights
The document mentions a lot of US laws. I wonder if they try the same over in the EU.
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 20:38
I’m guessing it wouldn’t hold. But I’m in the US, so I’ll just avoid their phones going forward, and will probably avoid their laptops and whatnot as well just due to a lack of trust.
I keep looking at the Frameworks, because I’m happy with the philosophy, but the problem is that the parts that they went to a lot of trouble to make user-replaceable are the parts that I don’t really care about.
They let you stick a fancy video card on the thing. I’d rather have battery life – I play games on a desktop. If they’d stick a battery there, that might be interesting.
They let you choose the keyboard. I’m pretty happy with current laptop keyboards, don’t really need a numpad, and even if you want one, it’s available elsewhere. I’ve got no use for the LED inserts that you can stick on the thing if you don’t want keyboard there.
They let you choose among sound ports, Ethernet, HDMI, DisplayPort, and various types of USB. Maybe I could see putting in more USB-C than some other vendors have. But the stuff I really want is:
A 100Wh battery. Either built-in, or give me a bay where I can put more internal battery.
A touchpad with three mechanical buttons, like the Synaptics ones that the Thinkpads have.
The fact that they aren’t soldering in the RAM and NVMe is nice in that they’re committing to not charging much more than market rate, so I guess they should get credit for that, but they are certainly not the only vendor to avoid soldering those.
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 22:42
Yeah, ThinkPad used to allow either a CD drive or an extra battery in their T-series. They stopped offering the extra battery and started soldering RAM, so I got the cheaper E-series (might as well save cash if I can get what I want).
I think there’s a market there. Have an option for a hot-swap battery to bring on trips and use the GPU at home. Serious travelers could even bring a spare battery to keep working for longer.
touchpad with three mechanical buttons
Yes please! And give me the ThinkPad nipple as well. :) If they had those, I’d not bother with even looking at Lenovo. The middle button is so essential to my normal workflow that any other laptop (including my fancy MacBook for work) feels crappy.
I’m guessing the things they made modular are just the low hanging fruit. It’s pretty easy to make a USB-C to whatever port, it’s a bit harder to make a pluggable battery in a slot that can also support a GPU.
I don’t know if I’d recommend it, but if you are absolutely set on having the ThinkPad nipple – I don’t use it, even if I really want the ThinkPad trackpad – the factory that made the original IBM Model M keyboards is still in business somewhere in Kentucky. IIRC the employees bought it or something when IBM stopped making the things. They offer a nipple keyboard that goes by the name of “Endura Pro”. *checks* It’s Unicomp. That’s the US remnant of the IBM keyboard business; Lenovo, the Chinese company, purchased the laptop side and also does the TrackPoint.
I got one like twenty years back, and while the actual buckling-spring keyswitches on the keyboard are pretty much immune to time, I wore out the switches on the mouse buttons, so I don’t know if I can give a buy recommendation for the mouse-enabled version (though maybe they improved the switches there). But if you really, really like it, that might be worthwhile for you. Last I looked they were still making them.
checks
They’ve got a message up saying that a supplier of a component used in that keyboard went under due to COVID so they suspended production. I don’t know what the status is on that.
NOTICE CONCERNING AVAILABILITY – Unfortunately, we have had to temporarily suspend the sale of the Endura Pro keyboards due to another supply chain shortage. The supplier of one of the flex harnesses had to close their doors during the pandemic. We’ve begun the task of sourcing a new supplier but do not have a definite time frame for when these keyboards will be available again. For our customers with orders already placed, we have enough stock to complete all on order.
Keep in mind that this is a very large, heavy keyboard that you could brain someone with; if you’re going to haul it around with a laptop, it’s going to be larger and heavier than the laptop. Mentioning it mostly since I figure that you might use it at some location where you could leave the keyboard.
sugar_in_your_tea@sh.itjust.works
on 14 Apr 2024 03:42
The thing is, I only like the Trackpoint in a laptop. It’s really nice to scroll while holding the middle mouse button and just shifting my finger. That way, my hand is ready to type, unlike using the trackpad, where I have to move my hands to type, and it works well in my largely keyboard-driven workflow (ViM for text editing, Trackpoint for web browsing).
On a desktop, I have multiple screens and way more real estate, so the Trackpoint isn’t nearly as effective and it’s worth using the mouse instead.
But I honestly don’t use my laptop all that often, so it’s something I’m fine doing without. But all other things being similar, I’ll prefer the Trackpoint since it’s a nice value add.
It’s cool that they’re making those keyboards though. I have nice mechanical keyboards already, so I’m not looking for one, but I would be very interested in a Framework-compatible keyboard with a Trackpoint.
Torx is legitimately useful for small screws, because it’s more resistant to stripping than Phillips.
Now, if they start using Torx security bits or some oddball shapes, then they’re just being obnoxious. But there are not-trying-to-obstruct-the-customer reasons not to use Phillips.
generichate1546@lemmynsfw.com
on 13 Apr 2024 22:22
The iFixit kit is a great toolset from the site; it has every type of bit in it.
NekkoDroid@programming.dev
on 13 Apr 2024 23:11
Got myself an iFixit Mako a while ago, really nice even if I mostly just use the Phillips head ones.
generichate1546@lemmynsfw.com
on 14 Apr 2024 02:27
Right? It’s nice to have the occasional reverse tri head metric upside down weird random bit when you need it.
Does it have triangle bits? Nintendo uses some really unusual driver shapes.
generichate1546@lemmynsfw.com
on 14 Apr 2024 23:53
I’ve taken apart so so so many things… sometimes for the right reasons and sometimes for the wrong reasons…my ZuneHD still works. I’ll never ever try to open a Surface product.
user224@lemmy.sdf.org
on 13 Apr 2024 19:26
That’s why I don’t buy laptops with soldered RAM.
Oh, that shit is soldered on…
I mean, I did see that on some laptops, but only those cheap things in €150 range (new) which even use eMMC for storage.
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 19:38
Yup, all Apple laptops have soldered RAM for some years now…
cmnybo@discuss.tchncs.de
on 13 Apr 2024 19:55
It became pretty common even on higher end laptops when they switched to DDR5, but some manufacturers are starting to go back to socketed RAM.
BorgDrone@lemmy.one
on 13 Apr 2024 20:49
That’s why I don’t buy laptops with soldered RAM.
In my opinion, the disadvantages of user-replaceable RAM far outweigh the advantages. The same goes for discrete GPUs. Apple moved away from this, and I expect PC manufacturers to follow Apple’s move in the next decade or so, as they always do.
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 21:06
Here’s how I see the advantages of soldered RAM:
better performance
less risk of physical damage
more energy efficient
smaller
The risk of physical damage is so incredibly low already, and the energy use of RAM is also incredibly low, so neither of those seems important.
So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.
So really, I guess “smaller” is the best argument, and I honestly don’t care about another half centimeter of space, it’s really not an issue.
So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.
This is where you’re mistaken. There is one thing that integrated RAM enables that makes a huge difference for performance: unified memory. GPU code is almost always bandwidth limited, which is why on a graphics card the RAM is soldered on and physically close to the GPU itself; that proximity is needed to meet the high bandwidth requirements of a GPU.
By having everything in one package, CPU and GPU can share the same memory, which means that you eliminate any overhead of copying data to/from VRAM for GPGPU tasks. But there’s more than that, unified memory doesn’t just apply to the CPU and GPU, but also other accelerators that are part of the SoC. What is becoming increasingly important is AI acceleration. UMA means the neural engine can access the same memory as the CPU and GPU, and also with zero overhead.
This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.
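As a loose analogy (not actual GPU code), numpy shows the difference between sharing one buffer and keeping two separate copies: a view shares memory with zero bytes moved, like unified memory, while a copy behaves like a CPU-to-VRAM upload whose results then have to be copied back.

```python
import numpy as np

data = np.arange(1_000_000, dtype=np.float32)

# "Unified" access: a view shares the same underlying buffer, zero bytes moved.
view = data[:]
assert view.base is data          # same memory, no copy

# "Discrete" access: a copy duplicates the buffer, like a CPU -> VRAM upload.
vram_copy = data.copy()
assert vram_copy.base is None     # separate allocation

# Writes through the view are immediately visible to the original (zero-copy);
# writes to the copy are not, so results would have to be copied back.
view[0] = 42.0
assert data[0] == 42.0
vram_copy[1] = 99.0
assert data[1] != 99.0
```

The analogy is only about who can see which bytes; real VRAM transfers also pay PCIe latency and bandwidth costs on top of the duplication.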
sugar_in_your_tea@sh.itjust.works
on 13 Apr 2024 23:40
Do you have actual numbers to back that up?
The best I’ve found is benchmarks of Apple silicon vs Intel+dGPU, but that’s an apples to oranges comparison. And if I’m not mistaken, Apple made other changes like a larger bus to the memory chips, which again makes comparisons difficult.
I’ve heard about potential benefits, but without something tangible, I’m going to have to assume it’s not the main driver here. If the difference is significant, we’d see more servers and workstations running soldered RAM, but AFAIK that’s just not a thing.
BorgDrone@lemmy.one
on 14 Apr 2024 00:23
The best I’ve found is benchmarks of Apple silicon vs Intel+dGPU, but that’s an apples to oranges comparison.
The thing with benchmarks is that they only show you the performance of the type of workload the benchmark is trying to emulate. That’s not very useful in this case. Current PC software is not built with this kind of architecture in mind, so it was never designed to take advantage of it. In fact, it’s the exact opposite: since transferring data to/from VRAM is a huge bottleneck, software is designed to avoid it as much as possible.
For example: a GPU is extremely good at performing an identical operation on lots of data in parallel, much faster than the CPU can. However, copying the data to VRAM and back may add so much additional time that the whole job still finishes sooner on the CPU; a developer may then choose to run it on the CPU instead, even if the GPU was specifically designed to handle that kind of work. On a system with UMA you would absolutely run this on the GPU.
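That trade-off can be put into a toy cost model. All the throughput numbers below are hypothetical, chosen only to show the shape of the decision, not to match any real hardware:

```python
def best_device(n_bytes: float, cpu_gbps: float, gpu_gbps: float,
                pcie_gbps: float) -> str:
    """Toy model: pick CPU vs GPU for one pass over n_bytes of data.

    Throughputs are in GB/s and entirely made up. The discrete GPU pays
    for a copy to VRAM and a copy back over PCIe; a unified-memory GPU
    effectively has an infinitely fast "bus" because nothing is copied.
    """
    gb = n_bytes / 1e9
    cpu_time = gb / cpu_gbps
    gpu_time = 2 * gb / pcie_gbps + gb / gpu_gbps  # up + compute + down
    return "gpu" if gpu_time < cpu_time else "cpu"

# With a decent CPU path and a slow bus, the copy overhead dominates:
print(best_device(1e9, cpu_gbps=20, gpu_gbps=300, pcie_gbps=16))    # cpu
# Make the copies free (unified memory) and the faster GPU wins:
print(best_device(1e9, cpu_gbps=20, gpu_gbps=300, pcie_gbps=1e12))  # gpu
```

The point of the sketch is just that the same GPU can lose or win depending entirely on whether the data has to cross a bus first.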
The same thing goes for something like AI accelerators. What PC software exists that takes advantage of such a thing?
A good example of what happens if you design software around this kind of architecture can be found here. This is a post by a developer who worked on Affinity Photo. When they designed this software they anticipated that hardware would move towards a unified memory architecture and designed their software based on that assumption.
When they finally got their hands on UMA hardware in the form of an M1 Max that laptop chip beat the crap out of a $6000 W6900X.
We’re starting to see software taking advantage of these things on macOS, but the PC world still has some catching up to do. The hardware isn’t there yet, and the software always lags behind the hardware.
I’ve heard about potential benefits, but without something tangible, I’m going to have to assume it’s not the main driver here. If the difference is significant, we’d see more servers and workstations running soldered RAM, but AFAIK that’s just not a thing.
It’s coming, but Apple is ahead of the game by several years. The problem is that in the PC world no one has a good answer to this yet.
Nvidia makes big, hot, power hungry discrete GPUs. They don’t have an x86 core and Windows on ARM is a joke at this point. I expect them to focus on the server-side with custom high-end AI processors and slowly move out of the desktop space.
AMD has the best credentials for the desktop. They have a decent x86 core and GPU, and they already make APUs. Intel is trying to get into the GPU game but has some catching up to do.
Apple has been quietly working towards this for years. They have their UMA architecture in place, they are starting to put some serious effort into GPU performance and rumor has it that with M4 they will make some big steps in AI acceleration as well. The PC world is held back by a lot of legacy hard and software, but there will be a point where they will have to catch up or be left in the dust.
I understand the scepticism, but without links to what you’ve found, or the particular parts you consider dubious claims (RAM speed can be increased when soldered, higher speeds lead to better performance, etc.), it comes across as “I don’t believe you, because I choose not to believe you”.
sugar_in_your_tea@sh.itjust.works
on 14 Apr 2024 16:48
Yes, and the result from that video (I assume; I skimmed it, but I have watched similar videos) is that the difference is negligible (like 1-10 FPS) and you’re usually better off spending that money on something else.
I look at the benchmarks between the Intel MacBook Pro and the M1 MacBook Pro, and both use soldered RAM, yet the M1 gets so much better performance, even on non-GPU tasks (e.g. memory-heavy unit tests at work went from 3-5min to 45-50sec from latest Intel to M1). Docker build times saw a similar drop. But it’s hard for me to know what the difference is between memory vs CPU changes. I’d have to check, but I’m guessing there’s also the DDR4 to DDR5 switch, which increases memory channels.
The claim is that proximity to the CPU explains it, but I have trouble quantifying that. For me, a 1-10FPS drop isn’t enough to reduce repairability and expandability. Maybe it is for others though, but if that’s the difference, that’s a lot less than the claims they seem to make.
The video has a short section on productivity (i.e. rendering or compiling). That part is probably the most relevant for most people. Check the chapter view in YouTube to jump directly to it.
I think a 2x performance improvement is plausible when comparing non-soldered ram to the Apple silicon, which goes even further and has the memory on the die itself. If, of course, ram is the limiting factor.
The advantages of upgradable, expandable RAM are obvious. But let’s face it: most people don’t need that capability, and even fewer use it.
sugar_in_your_tea@sh.itjust.works
on 15 Apr 2024 03:37
short section on productivity
Looks about the same as the rest. Big gains for HandBrake, pretty much nothing for anything else. And that makes sense, because HandBrake will be doing lots of round trips to the GPU for encoding.
has the memory on the die itself
On the package, not the die. But perhaps that’s what you meant. On die would be closer to a massive cache like on the X3D AMD chips.
The performance improvement seems to be that Apple has a massive iGPU, not anything to do with RAM next to the CPU. So in CPU-only benchmarks, I’d expect the lion’s share of the difference to be CPU design and process node, not the memory.
Also, unified memory isn’t particularly new, APUs have supported it for years. It’s just not well utilized by devs because most users have dGPUs. So I think the main innovation here is Apple committing to it and providing tooling for devs to utilize the unified memory better, like console manufacturers have done.
So I guess that brings a few more questions:
what performance improvements could we see if devs use unified memory in socketed LPDDR memory in laptops?
how would that compare to Apple’s on-package RAM (I think it’s also LPDDR, so more apples to apples?)?
how likely are AMD and Intel to push for massive APUs on laptops?
I guess we’re kind of seeing it with the gaming PC handhelds, like the Steam Deck, Ayaneo, et al., so maybe that’ll become more mainstream.
“unified memory” is an Apple marketing term for what everyone’s been doing for well over a decade. Every single integrated GPU in existence shares memory between the CPU and GPU; that’s how they work. It has nothing to do with soldering the RAM.
You’re right about the bandwidth though: current socketed RAM standards have severe bandwidth limitations which directly limit the performance of integrated GPUs. This again has little to do with being socketed though: LPCAMM supports up to 9.6GT/s, considerably faster than what ships with the latest Macs.
This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.
The only way discrete GPUs can possibly be outcompeted is if DDR starts competing with GDDR and/or HBM in terms of bandwidth, and there’s zero indication of that ever happening. Apple needs to put a whole 128GB of LPDDR in their system to be comparable (in bandwidth) to literally 10-year-old dedicated GPUs: the 780 Ti had over 300GB/s of memory bandwidth with a measly 3GB of capacity. DDR is simply not a good choice for GPUs.
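The bandwidth gap is simple arithmetic: peak bandwidth is roughly transfer rate times bus width. A quick sketch (the specs plugged in are approximate, quoted from memory):

```python
def bandwidth_gbps(mega_transfers_per_s: float, bus_bits: int) -> float:
    """Peak memory bandwidth in GB/s = MT/s * (bus width in bytes) / 1000."""
    return mega_transfers_per_s * (bus_bits / 8) / 1000

# GTX 780 Ti (2013): ~7000 MT/s GDDR5 on a 384-bit bus
print(bandwidth_gbps(7000, 384))  # 336.0 -> the "over 300GB/s" above
# Typical dual-channel DDR5-4800 laptop: 128-bit combined bus
print(bandwidth_gbps(4800, 128))  # 76.8
# M1 Max: LPDDR5-6400 on a 512-bit bus
print(bandwidth_gbps(6400, 512))  # 409.6
```

The formula makes the design constraint concrete: socketed DDR mostly gains bandwidth by raising the transfer rate, while GPUs and Apple’s SoCs get theirs by making the bus very wide, which is exactly what needs the RAM soldered close to the chip.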
“unified memory” is an Apple marketing term for what everyone’s been doing for well over a decade.
Wrong. Unified memory (UMA) is not an Apple marketing term, it’s a description of a computer architecture that has been in use since at least the 1970s. For example, game consoles have always used UMA.
Every single integrated GPU in existence shares memory between the CPU and GPU; that’s how they work.
Again, wrong.
While iGPUs have existed in PCs for a long time, they did not use a unified memory architecture. What they did was reserve a portion of the system RAM for the GPU. For example, on a PC with 512MB RAM and an iGPU, 64MB might have been reserved for the GPU; the CPU then had access to 512 - 64 = 448MB. While they shared the same physical memory chips, they each had a separate address space. If you wanted to make a texture available to the GPU, it still had to be copied into the GPU’s reserved RAM, and the CPU could not access that region directly.
With unified memory, both CPU and GPU share the same address space. Both can access the entire memory. No RAM is reserved purely for the GPU. If you want to make something available to the GPU, nothing needs to be copied, you just need to point to where it is in RAM. Likewise, anything done by the GPU is immediately accessible by the CPU.
Since there is one memory pool for both, you can use RAM more efficiently. If you have a discrete GPU with 16GB VRAM and your app only needs 8GB VRAM, the other 8GB just sits there being useless. Alternatively, if your app needs 24GB VRAM, you can’t run it because your GPU only has 16GB, even if you have lots of system RAM available.
With UMA you can use all the RAM you have for whatever you need it for. On an M2 Ultra with 192GB RAM you can use almost all of that for the GPU (minus a little bit that’s used for the OS and any running apps). Even on a tricked-out PC with a 4090 you can’t run anything that needs more than 24GB VRAM. Want to run something where the GPU needs 180GB of memory? No problem on an M2 Ultra.
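The capacity argument reduces to a simple fit-check. All sizes are in GB and the OS overhead figure is a made-up placeholder, just to illustrate the two models:

```python
def fits_discrete(workload_gb: float, vram_gb: float) -> bool:
    # Discrete GPU: the workload must fit in fixed VRAM; system RAM can't help.
    return workload_gb <= vram_gb

def fits_unified(workload_gb: float, total_ram_gb: float,
                 os_overhead_gb: float = 12) -> bool:
    # Unified memory: the GPU can use whatever the OS and apps aren't using.
    return workload_gb <= total_ram_gb - os_overhead_gb

# A 4090 has 24GB VRAM no matter how much system RAM you add:
assert not fits_discrete(180, vram_gb=24)
# A 192GB unified-memory machine can hand almost all of it to the GPU:
assert fits_unified(180, total_ram_gb=192)
```

The same check also shows the efficiency point above: under the discrete model an 8GB workload strands the other 16GB of a 24GB card, while the unified pool simply returns that memory to the rest of the system.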
It has nothing to do with soldering the RAM.
It has everything to do with soldering the RAM. One of the reasons iGPUs sucked, other than not using UMA, is that GPU performance is almost always limited by memory bandwidth. Compared to VRAM, standard system RAM has much, much less bandwidth, causing iGPUs to be slow.
A high-bandwidth memory bus, like a GPU needs, has a lot of connections and runs at high speeds. The only way to do this reliably is to physically place the RAM very close to the actual GPU. Why do you think GPUs do not have user-upgradable RAM?
Soldering the RAM makes it possible to integrate a CPU and a non-sucking GPU. Go look at the inside of a PS5 or XSX and you’ll see the same thing: an APU with the RAM chips soldered to the board very close to it.
This again has little to do with being socketed though: LPCAMM supports up to 9.6GT/s, considerably faster than what ships with the latest macs.
LPCAMM is a very recent innovation. Engineering samples weren’t available until late last year and the first products will only hit the market later this year. Maybe this will allow for Macs with user-upgradable RAM in the future.
The only way discrete GPUs can possibly be outcompeted is if DDR starts competing with GDDR and/or HBM in terms of bandwidth
What use is high bandwidth memory if it’s a discrete memory pool with only a super slow PCIe bus to access it?
Discrete VRAM is only really useful for gaming, where you can upload all the assets to VRAM in advance and data practically only flows from CPU to GPU and very little in the opposite direction. Games don’t matter to the majority of users. GPGPU is much more interesting to the general public.
Wrong. Unified memory (UMA) is not an Apple marketing term, it’s a description of a computer architecture that has been in use since at least the 1970’s. For example, game consoles have always used UMA.
Apologies, my google-fu seems to have failed me. Search results are filled with only Apple-related results, but I was now able to find stuff from well before, though nothing older than the 1990s.
While iGPUs have existed for PCs for a long time, they did not use a unified memory architecture.
Do you have an example? Every single one I look up has at least optional UMA support. The reserved RAM was a thing, but it wasn’t the GPU’s entire memory; it was just the part reserved for the framebuffer. AFAIK iGPUs have always shared memory like they do today.
It has everything to do with soldering the RAM. One of the reason iGPUs sucked, other than not using UMA, is that GPUs performance is almost limited by memory bandwidth. Compared to VRAM, standard system RAM has much, much less bandwidth causing iGPUs to be slow.
I don’t disagree, I think we were talking past each other here.
LPCAMM is a very recent innovation. Engineering samples weren’t available until late last year and the first products will only hit the market later this year. Maybe this will allow for Macs with user-upgradable RAM in the future.
What use is high bandwidth memory if it’s a discrete memory pool with only a super slow PCIe bus to access it?
Discrete VRAM is only really useful for gaming, where you can upload all the assets to VRAM in advance and data practically only flows from CPU to GPU and very little in the opposite direction. Games don’t matter to the majority of users. GPGPU is much more interesting to the general public.
gestures broadly at every current use of dedicated GPUs. Most of the newfangled AI stuff runs on Nvidia DGX servers, which use dedicated GPUs. Games are a big enough industry for dGPUs to exist in the first place.
User-replaceable RAM is slow, which means you can’t integrate the CPU and GPU in one package. That means a GPU with its own RAM, which has huge disadvantages.
Even a 4090 only has 24GB and slow transfers to/from VRAM. The GPU can only operate on data in VRAM, so anything you need it to work on must first be copied over the relatively slow PCIe bus to the GPU. Then, once it’s done, the results must be copied back over the PCIe bus to system RAM for the CPU to access them. This considerably slows down GPGPU tasks.
Ah yeah, I see. That’s definitely a downside if you work with something where that becomes a factor.
scarabic@lemmy.world
on 14 Apr 2024 16:46
These days I don’t realistically expect my RAM requirements to change over the lifetime of the product. And I’m keeping computers longer than ever: 6+ years where it used to be 1 or 2.
People have argued millions of times on the internet that Apple’s products don’t meet people’s needs and are massively overpriced. Meanwhile they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set in the race-to-the-bottom world of commoditized Windows/Android trash.
sugar_in_your_tea@sh.itjust.works
on 14 Apr 2024 18:05
I upgraded my personal laptop a year or so after I got it (started with 8GB, which was fine until I did Docker stuff), and I’m probably going to upgrade my desktop soon (16GB, which has been fine for a few years, but I’m finally running out). My main complaint about my work laptop is RAM (16GB I think; I’d love another 8-16GB), but I cannot upgrade it because it’s soldered, so I have to wait for our normal cycle (4 years; will happen next year). I upgraded my NAS RAM when I upgraded a different PC as well.
I don’t do it very often, but I usually buy what I need when I build/buy the machine and upgrade 3-4 years later. I also often upgrade the CPU before doing a motherboard upgrade, as well as the GPU.
Meanwhile they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set in the race-to-the-bottom world of commoditized Windows/Android trash.
I might agree if Apple hardware was actually better than alternatives, but that’s just not the case. Look at Louis Rossmann’s videos, where he routinely goes over common failure cases that are largely due to design defects (e.g. display cable being cut, CPU getting fried due to a common board short, butterfly keyboard issues, etc). As in, defects other laptops in a similar price bracket don’t have.
I’ve had my E-series ThinkPad for 6 years with no issues whatsoever. The USB-C charge port is getting a little loose, but that’s understandable since it’s been mostly a kids’ Minecraft device for a couple of years now, and kids are hard on computers. I had my T-series before that for 5-ish years until it finally died of water damage (a lot of water).
Apple products (at least laptops) are designed for aesthetics first, not longevity. They do generally have pretty good performance though, especially with the new Apple Silicon chips, but they source a lot of their other parts from the same companies that provide parts for the rest of the PC market.
If you stick to the more premium devices, you probably won’t have issues. Buy business-class laptops and phones with long software support cycles. For desktops, I recommend buying higher-end components (Gold or Platinum power supply, mid-range or better motherboard, etc.), or buying from a local DIY shop with a good warranty if buying pre-built.
Like anything else, don’t buy the cheapest crap you can, buy something in the middle of the price range for the features you’re looking for.
paraphrand@lemmy.world
on 13 Apr 2024 18:38
Even if they are right, no one cares and it will always be a bad look.
I mean, I have no interest in an 8GB machine, but it’s also fair to say that there definitely are people who are fine with it, and who would like to save the money. Say you’ve got four kids and you’re buying them all laptops – I dunno if that’s the thing parents do these days, or whether kids typically just get by on smartphones or what. And sometimes they get broken or whatnot, and you’re paying for the other expenses associated with those kids. That money adds up.
Apple runs a walled garden, unless things have changed in recent years while I wasn’t watching. They tried opening up to third-party hardware vendors back around 2000 with some third-party PowerPC clone vendors, found that too many users were buying that hardware instead of theirs, and killed off the clone vendors. That means that if you want to use MacOS, you have to buy Apple hardware. And so there’s good reason to have a broad range of offerings from Apple, even some that are higher-end or lower-end than the typical user might want, because Apple is the only option that MacOS users have. If I want to run Linux on a machine with 2GB of memory, I can do it, and if I want to run Linux on a machine with 256GB of memory, I can do it. MacOS users need to have an offering from Apple to do that.
Plus, I assume that these are running some form of solid-state storage, which makes hitting virtual memory a lot less painful than was the case in the past.
paraphrand@lemmy.world
on 13 Apr 2024 18:46
I agree. But we still have to listen to all the bitching.
ABCDE@lemmy.world
on 13 Apr 2024 18:55
We both have 8GB Airs in our house, an M1 and an M2. They run just fine.
The thing is that Apple charges three kidneys per gigabyte over 8 GB.
dual_sport_dork@lemmy.world
on 13 Apr 2024 19:27
If you’ve got four kids and you’re buying them all laptops, I don’t think buying them all Macs and “saving money” by getting cut-down machines with too little memory (or whatever other hobbling Apple may cook up now or later) is exactly the smart play. You would need a very compelling reason to absolutely have to run MacOS to the exclusion of everything else, which, if we’re honest, most people don’t.
A Lenovo IdeaPad Slim, just to pick an example out of a hat that contains many other options, costs half as much as the low-spec 2024 MacBook Air the article is spotlighting while having double the RAM, double the SSD, and, you know, ports. For the cost of an 8GB MacBook Pro you could buy a Legion Slim with an i7 and an RTX 4060 in it and have change left over, a machine which would blow that Mac out of the water.
There are a lot of things you can say about Macbooks, but being a good value for the money is consistently never one of them.
proton_lynx@lemmy.world
on 13 Apr 2024 19:28
collapse
Save money, buy an Apple computer. Choose one.
mariusafa@lemmy.sdf.org
on 13 Apr 2024 19:08
nextcollapse
It’s okay if you run an efficient OS on it, which isn’t the case here.
That’s no justification for selling a >$1,000 MacBook Pro with only 8GB of RAM, though. It’s specifically marketed as a professional-class machine.
Dariusmiles2123@sh.itjust.works
on 14 Apr 2024 12:56
collapse
Yeah, clearly the PRO part with just 8GB of RAM is the problem.
Come on, that’s what I have on my Surface Go 1 from 2019. It runs Linux perfectly and is okay for my needs, but I wouldn’t put such specs on a PRO thing 😅
Rai@lemmy.dbzer0.com
on 13 Apr 2024 19:28
nextcollapse
OSX is waaaay more memory efficient than windows…
mariusafa@lemmy.sdf.org
on 13 Apr 2024 21:36
nextcollapse
Yeah, my benchmark for an efficient OS isn’t Windows either.
morrowind@lemmy.ml
on 14 Apr 2024 19:41
nextcollapse
The recent stats I’ve seen indicate macOS usually uses more RAM.
Anything is way more efficient than Windows. That’s a very low bar (or a high one, but you need to go under it).
Can your macOS run on a router with 32MB of RAM? Or on the most powerful supercomputer? Or both?
disguy_ovahea@lemmy.world
on 13 Apr 2024 19:39
nextcollapse
And it’s not RAM, it’s UM for an SoC. The usage of memory changed with the introduction of Apple Silicon.
billiam0202@lemmy.world
on 13 Apr 2024 21:58
nextcollapse
“Unified” only means there’s not a discrete block for the CPU and a discrete block for the GPU to use. But it’s still RAM- specifically, LPDDR4x (for M1), LPDDR5 (for M2), or LPDDR5X (for M3).
Besides, low-end PCs with integrated graphics have been using unified memory for decades- no one ever said “They don’t have RAM, they have UM!”
disguy_ovahea@lemmy.world
on 13 Apr 2024 22:14
collapse
Yes, that’s true, but it’s still an indicator of an uninformed reporter.
Apple Silicon chips pass data from one dedicated core directly to another without the need to pass through memory, hence the smaller processor cache. There are between 18 and 58 cores in the M3 (model dependent). The architecture works very differently than the conventional CPU/GPU/RAM model.
I can run FCP and Logic Pro and have memory to spare with 16GB of UM. The only thing that pushes me into swap is Chrome. lol
BearOfaTime@lemm.ee
on 14 Apr 2024 00:44
nextcollapse
It’s a pointless distinction.
And in this case, it makes 8gig look even worse.
disguy_ovahea@lemmy.world
on 14 Apr 2024 00:52
collapse
Maybe you’re not familiar with the apps I’m referring to. Final Cut Pro and Logic Pro are professional video and audio workstations.
If I tried to master an export from Adobe Premiere Pro in Pro Tools on PC I’d need 32GB of RAM to prevent stutter. I only use ~12GB of 16GB doing the same on Apple Silicon.
8GB of UM is not for someone running two pro apps at once. It’s for grandma to use for online banking and check her email and Facebook.
billiam0202@lemmy.world
on 14 Apr 2024 06:36
collapse
it’s still an indicator of an uninformed reporter.
My dude, you’re literally in here arguing that because Apple has a blob for both CPU memory and GPU memory that somehow makes that blob “not RAM.” Apple’s design might give fantastic performance, but that’s irrelevant to the fact that the memory on the chip is RAM of known and established standards.
disguy_ovahea@lemmy.world
on 14 Apr 2024 12:40
collapse
Read my other replies to this comment. There’s no GPU. It’s an SoC.
Each power intensive process is given its own dedicated core. The OS is designed specifically to send dedicated processes to the associated core. For example, your CPU isn’t bogged down decrypting data while loading an application.
You can’t compare it to anything else out at this time. Just learn about it, or don’t. Guessing is just a waste of time.
Great topic switch. Also, what century do you live in?
disguy_ovahea@lemmy.world
on 14 Apr 2024 21:18
collapse
The topic is substantiating that 8GB of UM on an Apple Silicon Mac is acceptable for a base model.
I’ve explained how the UM is used strictly as a storage liaison due to the processor having a multitude of dedicated cores, with the ability to pass data directly without utilizing UM.
I don’t know what you want from me, but maybe you should just do your own homework instead of being combative with people who understand something better than you.
I’ve explained how the UM is used strictly as a storage liaison due to the processor having a multitude of dedicated cores, with the ability to pass data directly without utilizing UM.
I really doubt they run apps with cache turned into scratchpad memory.
BearOfaTime@lemm.ee
on 14 Apr 2024 00:42
nextcollapse
Like has been done on laptops with on-board video cards since, well, forever?
disguy_ovahea@lemmy.world
on 14 Apr 2024 01:04
collapse
It’s different. The GPU is broken into several parts and integrated into the SoC along with the CPU’s dedicated processes. Data is passed within the SoC without entering UM. It’s exclusively used as a storage liaison.
You should check out Apple Silicon M-Series. Specs don’t translate to performance in the way conventional PC architecture does. I guarantee you’ll see PC manufacturers going to 2nm SoC configurations soon enough. The performance is undeniable.
disguy_ovahea@lemmy.world
on 14 Apr 2024 21:08
collapse
A CPU performs integer math.
A GPU performs floating-point math.
Those are only two of the 18-52 cores (model dependent) of Apple M chips. The OS is designed around this for maximum efficiency. Most Macs don’t even have a fan anymore.
disguy_ovahea@lemmy.world
on 14 Apr 2024 21:12
collapse
That’s correct. My mistake.
n3m37h@lemmy.dbzer0.com
on 14 Apr 2024 11:07
collapse
Dude, it is just GDDR#, the same stuff consoles use.
PCs have had this ability for over a decade, mate. Apple is just good at marketing.
What’s next? When VRAM overflows it gets dumped into regular RAM? Oh wait, PCs can do that too…
disguy_ovahea@lemmy.world
on 14 Apr 2024 12:47
collapse
With independent CPU and GPU, sure. There’s no SoC that performs anywhere near Apple Silicon.
n3m37h@lemmy.dbzer0.com
on 14 Apr 2024 16:24
collapse
According to benchmarks, the 8700G vs the M3: on average 22% slower single-core and 31% faster multi-core; FP32 is 41% higher than the M3; AI is 54% slower. The 8700G also uses 54% more energy.
What about those stats says AMD can’t compete? The 8700G is an APU, just as the M3 is.
disguy_ovahea@lemmy.world
on 14 Apr 2024 16:45
collapse
I’m talking about practical use performance. I understand your world, you don’t understand mine. I’ve been taking apart and upgrading PCs since the 286. I understand benchmarks. What you don’t understand, is how MacOS uses the SoC in a way where benchmarks =/= real-world performance. I’ve used pro apps on powerful PCs and powerful Macs, and I’m speaking from experience. We can agree to disagree.
n3m37h@lemmy.dbzer0.com
on 14 Apr 2024 17:53
collapse
I grew up with a Tandy 1000 and was always getting yelled at for taking it apart, along with just about every PC we owned after that too.
Benchmarks are indicative of real-world performance for the most part. If they were useless we wouldn’t use them, kinda like userbenchmark.
The one benefit apple does have is owning its own ecosystem where they can modify the silicon/OS/Software to work with each other better.
Does not mean the M3 is the best there is and can’t be touched, that is just misleading
The 8700G is gonna stomp the M3 using Maxon’s software suite just as the M3 will stomp the 8700G using Apple’s software suite.
On top of that, the process node for manufacturing said silicon is different (3nm vs 4nm); that alone allows for a ~20% (give or take) performance difference, just like every process node change in the past decade or so.
I’ll take the loss on the experience part as the only apple product I own is an Apple TV 4k, but there are many nuances you’ve obviously glossed over
Is the M3 a good piece of silicon? Yes
Is it the best at EVERYTHING? Of course not
Should apple give up because they are not the best? Fuck no
disguy_ovahea@lemmy.world
on 14 Apr 2024 18:09
collapse
Man, you’re kinda off the point. This is about how much UM is appropriate for a base model. I’m simply saying the architecture of an SoC utilizes UM as a storage liaison exclusively, since CPU and GPU are cores of the same chip. It simply does not mean the same thing as 8GB of RAM in standard architecture. As a pro app user, 16GB is enough. 8GB is plenty for grandma to check her Facebook and online banking.
n3m37h@lemmy.dbzer0.com
on 14 Apr 2024 18:39
collapse
There’s no SoC that performs anywhere near Apple Silicon.
Am I missing the point really? UM is not a new concept. Specifically look at the PS5/X:SX
Notice the soldered RAM and lack of video card? Kinda like what the M series does.
And when all is said and done, 8GB is not nearly enough, and Apple should be chastised for it, just like Nvidia was when they first decided to make 5 different variations of the 1060, ensuring 4 of those variations would become e-waste in a few short years, and again with the 3050 6GB vs 3050 8GB.
disguy_ovahea@lemmy.world
on 14 Apr 2024 18:44
collapse
They both have independent CPU and GPU. UM is not used to pass from CPU to GPU on an SoC system; it’s exclusively a storage liaison. Therefore it’s used far less than in non-SoC applications.
The CPU and GPU are one chip. Learn about Apple Silicon SoC rather than trying to find a comparison. You won’t find one anywhere yet.
Ghostalmedia@lemmy.world
on 14 Apr 2024 05:44
collapse
Have you used an 8 gig ARM Mac?
I’m pretty brutal on my machines, and my 8 gig M1 really only starts to beach ball when multiple accounts are open and those accounts all have bloated multimedia software running.
My 16 gig machines can handle that use case fine, but the 8 gig machine will occasionally beach ball.
Personally, I won’t buy an 8 gig config again. But I’m a fucking monster that leaves a million bloated things open across multiple active user sessions.
scorpious@lemmy.world
on 13 Apr 2024 20:00
nextcollapse
To be fair, M-series Macs are pretty insanely efficient with memory. Unless you’ve actually used one extensively, I can understand the attitudes here…BUT:
I’ve done broadcast animation for many years, and back in ‘21 delivered an entire season of info/explainer-type pieces for a network show — using Motion, Cinema 4D, and After Effects (+ Ai and Ps) — all of it running on a base-level, first-gen M1 Mini (8/256). Workflow was fast and smooth; even left memory-pig apps running in the background most of the time…not one hiccup. Oh, and everything was delivered in 4k.
So 8GB actually is plenty for most folks…even professionals doing some heavy lifting. Sure, I’d go for 16 next time, but damn, I was/am still impressed. (Maybe it sucks for gaming; I don’t do that so have no clue.)
Sure, 8GB gets the job done but why are Apple selling “professional” grade laptops in this price range that clearly require additional memory to reach peak performance?
PipedLinkBot@feddit.rocks
on 13 Apr 2024 20:20
nextcollapse
I get more because I know I’ll need more. I don’t get less and then complain I should have gotten more even though I knew I couldn’t upgrade later.
Really, Apple just shouldn’t have said what they did and they wouldn’t be in hot water.
freeman@sh.itjust.works
on 13 Apr 2024 21:32
nextcollapse
It doesn’t matter how ‘insanely efficient’ they are. If your tasks need more than 8GB of memory you are going to run out and start swapping to disk.
8GB worth of data is not heavy lifting for professional use.
scorpious@lemmy.world
on 13 Apr 2024 22:10
nextcollapse
…And yet…?
My point is that while of course more is better, 8 sufficed for me…a professional, doing demanding…professional…work.
freeman@sh.itjust.works
on 14 Apr 2024 06:01
collapse
“Sufficed” is not an objective term, but it’s still not a favorable one, especially for machines that cost that much.
Your original point was that Apple’s CPUs are somehow more ‘efficient’ with RAM. That’s misinformation, to put it kindly.
kalleboo@lemmy.world
on 14 Apr 2024 02:53
collapse
It mostly just shows how crazy fast modern SSDs are that they can do swap duties with performance that is acceptable to many people. The SSD in my MacBook Pro can read/write at 5-6 GB/s. That means it can write out the whole 8 GB of memory of one of those smaller machines in under 2 seconds. As long as your current task fits in 8 GB and you’re fine waiting 2 seconds to switch between apps…
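The arithmetic above can be sketched quickly (treating the quoted 5–6 GB/s sequential write figure as an assumption; real swap traffic is rarely one clean sequential write):

```python
# Back-of-envelope: time to page out a base model's entire RAM to SSD.
ram_gb = 8        # memory size of the 8GB base model
ssd_gbps = 5      # assumed sequential write speed in GB/s, per the comment

seconds = ram_gb / ssd_gbps
print(f"Full {ram_gb} GB swap-out at {ssd_gbps} GB/s: {seconds:.1f} s")  # → 1.6 s
```

Which lines up with the commenter’s “under 2 seconds” estimate for a best-case full write-out.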
freeman@sh.itjust.works
on 14 Apr 2024 05:53
collapse
Yes if you don’t run out of ram you won’t face ram performance issues…
I wouldn’t be OK waiting 2 seconds to switch between apps on something the price of a Mac laptop, even the cheapest M1.
woelkchen@lemmy.world
on 14 Apr 2024 00:55
nextcollapse
To be fair, at the price point of Macs, 16GB is easily achievable.
june@lemmy.dbzer0.com
on 13 Apr 2024 21:26
nextcollapse
I was using my 2016 (or so) MacBook Air the other day and getting low memory errors. I thought, wow, this thing only has 8 gb, maybe it’s time to upgrade, just to see this 😐
realitista@lemm.ee
on 14 Apr 2024 00:06
nextcollapse
My 2009 Mac mini had 8GB of RAM. And it wasn’t even very expensive to do when I did it in ~2013. A couple hundred bucks max.
That was 11 years ago; it may very well have been less. Apple’s still probably charging more than that to go from 8 to 16, and I had to buy all 8GB and replace both DIMMs.
COASTER1921@lemmy.ml
on 14 Apr 2024 04:24
collapse
Part of the difference is that the Apple Silicon Macs aggressively use SSD swap to make up for limited memory. But that’s at the expense of the SSD lifespan, which of course isn’t replaceable.
I’d never recommend a Mac, but the prices they charge to get a little more RAM or SSD over base are crazy. The only configurations offering any “value” are the base models with 8GB RAM.
nekusoul@lemmy.nekusoul.de
on 13 Apr 2024 22:44
nextcollapse
Even the PC manufacturers selling “gaming” PCs using integrated graphics aren’t usually this brazen about it.
tsonfeir@lemm.ee
on 13 Apr 2024 23:08
nextcollapse
Apple said some pretty dumb things to defend that 8GB, but let’s not pretend most manufacturers don’t do the same thing.
For years people have known it can’t be upgraded. You know that going in.
No one complains that video cards on (most) laptops can’t be replaced, yet many of them wind up being useless for anything but daily tasks.
purplemonkeymad@programming.dev
on 14 Apr 2024 09:32
collapse
For years people have known it can’t be upgraded. You know that going in.
Not sure that is true; lots of people see the marketing for a MacBook and think that any of them will be enough. Or they see the price difference and think they are getting a good deal, or don’t understand why it exists. I’ve had to tell people: sorry, I know you spent a lot of money on this, but it does not have the storage for what you are wanting to do. Yes, the only way is to buy another one.
Otherwise yea, everyone tries to gaslight customers into thinking they didn’t get ripped off.
Sure, some people buy a computer without knowing anything about the computer.
Unified memory is not user accessible. If you think you’ll need additional memory, it’s a good idea to upgrade now.
They say it right there. Should it be red and flashing? Should there be a confirm button?
If you go into the Apple Store, someone who is trained to help is always available, and various models are typically in stock.
I’d like to firmly repeat that Apple never should’ve said that bullshit. Also, I feel that 16 gigs should be the standard amount for any Apple laptop. They are premium products. Perhaps the Mac Mini could start at 8.
And since you pulled out the gaslight, I’ll call you a misinformed accuser.
Ghostalmedia@lemmy.world
on 14 Apr 2024 01:35
nextcollapse
I bought one of the early M1s and bought into a lot of the early reviewers that claimed 8 was enough on the ARM architecture. Honestly, for most folks, it’s probably fine. For me, it’s not.
My wife and I use the M1 as a multi-account family machine. And we’re both experience design directors, so we both have RAM-hog design apps open under our accounts. The poor little Mac just can’t handle all that abuse with 8 gigs.
Our old ass Intel Mac with 16gig of RAM had no problems keeping a ton of crap open.
The battery life and low heat are absolutely amazing on the M1. That stuff was a monumental upgrade. But we absolutely can’t be lazy and just leave crap open unless it’s actually needed.
The fact that Apple is selling “Pro” machine with 8 gigs is a joke. 8 would be fine for my folks who fart around on Facebook all day, but it’s not enough for a lot of heavy multimedia work.
BreakDecks@lemmy.ml
on 14 Apr 2024 03:07
nextcollapse
8 megs of RAM? I didn’t know they brought back the Macintosh II.
Ghostalmedia@lemmy.world
on 14 Apr 2024 03:59
collapse
lol. Fixed. My brain is broken.
rushaction@programming.dev
on 14 Apr 2024 03:14
nextcollapse
I dunno if you noticed or if that was the joke. But you said “8 megs” three times in your comment when I think you meant to say “8 gigs”. 1 gigabyte ~ 1024 megabytes. Just wanted to let you know in case it wasn’t a joke about how 8 wasn’t enough. That’s all, thank you!
Ghostalmedia@lemmy.world
on 14 Apr 2024 03:58
nextcollapse
Actually, 1 gigabyte (10^9 B) is 1000 megabytes (10^6 B), while one gibibyte (2^30 B) corresponds to 1024 mebibytes (2^20 B). I know that in some circles, 1 GB is treated as 1 GiB, so I don’t blame you. This system of quantities is standardised internationally in order to conform with the SI (mega must mean a million times and not 2^20 times), but many don’t conform to it, such as Microsoft as far as I know.
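For concreteness, the decimal (SI) vs binary (IEC) unit gap works out like this:

```python
# SI (decimal) vs IEC (binary) units for the same byte counts.
GB = 10**9    # gigabyte
MB = 10**6    # megabyte
GiB = 2**30   # gibibyte
MiB = 2**20   # mebibyte

print(GB // MB)    # → 1000 megabytes in a gigabyte
print(GiB // MiB)  # → 1024 mebibytes in a gibibyte
print(GiB - GB)    # → 73741824 (the ~7% gap, in bytes)
```

That ~7% gap is also why an advertised “1 TB” drive shows up as roughly 931 “GB” in tools that actually count in GiB.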
rushaction@programming.dev
on 14 Apr 2024 15:28
collapse
Thank you for the correction and details.
Car@lemmy.dbzer0.com
on 14 Apr 2024 04:00
collapse
I found for most CS-ish tasks 8GB is okay. I also bought an early M1 and haven’t had too many problems outside of running VMs, which I expected. I purchased one of the stocked configurations at an Apple store, so there were slim pickings with 16GB of memory that weren’t like double the price of the machine.
Ghostalmedia@lemmy.world
on 14 Apr 2024 05:24
collapse
Yeah, my guess is 2 accounts is the cause of 90% of my performance issues. One person’s Adobe crap is fine, but two is too much for 8 gigs without the occasional beach ball.
Is Adobe still the standard? When I realized browsers and 3rd party apps render PDFs much quicker than Reader, I started looking for other alternatives to Adobe. I was familiar with the flow of PaintShop Pro and GIMP, so now the very little I did in Photoshop I do in GIMP/Inkscape/a couple other freebie tools. When they acquired Macromedia and killed Flash, I was out of their ecosystem, so my poor knowledge of their products is almost 2 decades old. What are their can’t-live-without products nowadays?
Ghostalmedia@lemmy.world
on 14 Apr 2024 23:53
collapse
Depends what you’re doing, but for branding and print media, Adobe still dominates most shops. If you’re doing UX, then you’re probably in Figma these days.
Ghostalmedia@lemmy.world
on 16 Apr 2024 15:43
collapse
Yeah, Figma is the new standard for UX design. Adobe was trying to buy them for the last couple years because most people no longer use Adobe tools for UX work.
AlecSadler@sh.itjust.works
on 14 Apr 2024 04:57
nextcollapse
I’ll admit I don’t use Macs, so maybe they are more efficient than the Linux and windows machines I work off…
…but I typically use machines with 64GB and recently upgraded my personal machine to 128GB. I still swap about 50GB to my SSD from time to time.
And I’m not doing heavy graphic design or movie editing stuff.
I cannot fathom for the life of me how 8GB would ever be feasible.
thedeadwalking4242@lemmy.world
on 14 Apr 2024 05:17
nextcollapse
How the fuck are you using that much RAM if you aren’t doing “heavy duty” stuff???
AlecSadler@sh.itjust.works
on 14 Apr 2024 05:32
nextcollapse
I just said I’m not doing graphic design or movie editing. I typically have 10 different browser profiles open to separate data/bookmarks, maybe 8 email accounts in tabs and Outlook (if not on Linux), 4-8 VS Code windows, a mix of JetBrains Rider or Visual Studio instances, a smattering of Postman/SQL Server/Azure Data Studio/Thunder Client, among other things like PDFs and documents. And then multiple Docker containers and other locally running servers.
The swap usually comes in when I’m parsing a data file or something.
QuaternionsRock@lemmy.world
on 14 Apr 2024 05:49
nextcollapse
I do not want to see what your desktop looks like lol
AlecSadler@sh.itjust.works
on 14 Apr 2024 06:19
collapse
Hahaha, it stressed me out so I hide all the icons and changed the background to just black.
Excuse me, can I get some more pepper for this troll dish?
jadedwench@lemmy.world
on 14 Apr 2024 15:41
collapse
I do a lot on my M1 air and I haven’t even considered I would have RAM issues with 16GB. Windows, I would be getting 64GB to not be miserable. I don’t run as much as you all the time, but having a container or two going, far too many browser tabs, PDFs, 3-4 intellij projects, discord, teams, and probably other things I am forgetting about is the norm. I even have AutoCAD open sometimes.
The biggest difference is that the Mx is ARM-based, which goes a long way toward getting better performance and battery life. I really need to look up again how Apple manages memory, swap, and performance in general. I just checked Activity Monitor and even with most of the memory showing as used, I don’t even notice. If my laptop were to die tomorrow due to my clumsy fumbling, I am getting another Mac. My only wish is getting Vulkan support. That would be amazing. Not going to hold my breath on that though.
Now, 8GB is a crime and it is not something I would recommend for any laptop/desktop, no matter what it is running. Not saying it wouldn’t work ok on a Mac for someone who only uses it for web browsing, but it is utterly ridiculous that 8GB is even an option these days. This is a dumb hill for Apple to die on and 16 should be the absolute minimum.
I have a debloated W11 VM on my proxmox server that I have used only once and is only there for some unknown emergency. With a little fiddling, I got it to idle under 4GB. I don’t plan to run servers on my laptop and invested enough on a little server rack to give me things like file servers, VMs, more permanent containers, and somehow got talked into making a gaming VM that I use at LAN parties. The 3U case for the main server travels very well.
Personally, I would try to get some of your server stuff off your machine. You can even take a look at Docker Swarm or similar k8s concepts to reduce your container load. RPis are another good choice for some lower-load server operations. I have a little RPi swarm that is powered by PoE+, though I plan on trying k8s on them soon to get some experience. RPis are also small enough that you could throw one in your bag if you needed something portable, and they’re fairly inexpensive. Just a thought, and it may not be possible with your server applications.
AlecSadler@sh.itjust.works
on 14 Apr 2024 16:45
collapse
Hmm, getting server stuff off sounds fun! I have a couple laptops sitting around so it might be fun to even just use those to offload some processes.
I’d love to get my own little server rack or something, not the best timing financially, but that’d be awesome.
I’ll have to look into the RPi thing. Thanks for the ideas!
Ghostalmedia@lemmy.world
on 14 Apr 2024 05:37
nextcollapse
I get the sense that a lot of people here don’t use MacOS.
I have a few ARM and Intel Macs in 8 and 16gig configs, and I do a lot of heavy multimedia work. My 8 gig M1 only really gets into trouble when my partner and I both have an account with files open in bloated creative software. One pro user, and it’s usually fine. 2 active accounts with shitty creative software running, and you get a few beach balls.
AlecSadler@sh.itjust.works
on 14 Apr 2024 05:40
nextcollapse
Interesting to know for sure! I guess I can’t speak to what they’re doing for optimizations first hand, but at the same time…my 128GB cost me like $300 on sale so, I dunno, a wash? Haha.
I’ve tried to become a Mac convert a few times, mostly peer pressure, but I just haven’t been able to do it successfully yet.
Ghostalmedia@lemmy.world
on 14 Apr 2024 05:57
collapse
Yeah, if I’m building a PC, I’ll throw in as much RAM as I can get.
That said, with 16gigs I’m usually not thinking about RAM at all. I’d probably only want to go higher than that if I was living in Adobe Lightroom 24/7.
olympicyes@lemmy.world
on 14 Apr 2024 08:41
nextcollapse
There are a ton of benchmark videos on YouTube. I saw one recently for the new MacBook Air comparing the 8/16/24 GB models. They found that 8GB was significantly slower than 16GB for tasks like exporting video, but there was no difference between 16 and 24 gb.
I get the sense that a lot of people here don’t use MacOS.
I wish that was true.
emptiestplace@lemmy.ml
on 14 Apr 2024 09:33
nextcollapse
Do you understand kernel memory management fundamentals? I’m asking because what you wrote here strongly suggests otherwise - so, unless you’re able to show me I’m wrong, I’m going to stick with my conclusion that this is all incorrect and likely complete bullshit.
AlecSadler@sh.itjust.works
on 14 Apr 2024 11:33
nextcollapse
I don’t think even Eclipse can burn so much memory
lolcatnip@reddthat.com
on 14 Apr 2024 21:19
nextcollapse
Dude, that’s how much RAM I used to have on a super high-end dev box at work with 56 cores. It was very helpful for compiling Chrome. WTF are you doing with a personal machine that needs that much RAM?
AlecSadler@sh.itjust.works
on 14 Apr 2024 22:39
collapse
I mean it’s my personal machine but I am a software engineer consultant/contractor so I use it for work, too.
lolcatnip@reddthat.com
on 15 Apr 2024 15:37
collapse
Ok fair enough. It’s just surprising to see someone say that. The standard-issue dev machine where I work is a laptop with 32 GB.
MacNCheezus@lemmy.today
on 14 Apr 2024 22:30
collapse
I have a five year old MBP here with 16 gigs of RAM and it runs the latest version of macOS. I can run multiple web browsers with dozens of open tabs, VS Code, an LLM, and a video editing app on it, all simultaneously, without breaking a sweat.
IDK what Apple’s secret sauce is but their shit just works better than everyone else’s, that’s a fact.
Veraxus@lemmy.world
on 14 Apr 2024 06:08
nextcollapse
My basic web dev Docker suite uses about 13GB just on its own, which - assuming you were on 16GB (double Apple’s minimum) - wouldn’t leave much for things like browser tabs, which also eat memory for breakfast.
A fast swap is not an argument to short-change on RAM, especially since SSDs have a shorter lifespan than RAM modules. 16GB remains the absolute bare minimum for modern computing, and Apple is making weak, ridiculous excuses to pocket just a few extra bucks per MacBook.
filister@lemmy.world
on 14 Apr 2024 07:33
nextcollapse
Have you seen the price difference between the 8 and 16GB MacBooks? It’s ridiculously expensive.
localhost443@discuss.tchncs.de
on 14 Apr 2024 10:49
collapse
Nah, it’s about £13 retail.
Oh wait, you mean from Apple… It’s £200 from them.
filister@lemmy.world
on 14 Apr 2024 11:27
collapse
Yes, my bad, I meant the difference in price between the 8 and 16GB models. I know RAM has become dirt cheap nowadays, and there’s no excuse for Apple to continue offering an 8GB model, as this is exactly planned obsolescence.
localhost443@discuss.tchncs.de
on 14 Apr 2024 14:05
collapse
Yeah, I was just pointing out the insanity of their pricing, using sarcasm. It’s the main way we communicate over here.
The price difference between the first 2 models, where 8GB of RAM is the only change, is £200.
Post 2025 I’m going to need some solution to replace my windows install which solely runs CAD/CAM software. If it wasn’t for this scumbaggery I’d buy a Mac to replace win10, but at present apple are such a shower of cunts I think I may have to put up with win11.
localhost443@discuss.tchncs.de
on 15 Apr 2024 00:12
collapse
I already run Linux for everything else. Its not an option for my CAD work unfortunately
PlexSheep@infosec.pub
on 14 Apr 2024 08:35
nextcollapse
What do you run that takes that much memory?
olympicyes@lemmy.world
on 14 Apr 2024 08:36
nextcollapse
PS5 has 16GB and it’s a toy.
hector@sh.itjust.works
on 14 Apr 2024 08:40
nextcollapse
Wow! 13GB! I did some heavy stuff on my computer with like a shit ton of Docker servers running together + deployment and I never reached 13GB!
Without disclosing private company information lol what are you doing ;)
ben_dover@lemmy.ml
on 14 Apr 2024 16:18
nextcollapse
not OP, but I have to run the frontend and backend of a project in Docker simultaneously (multiple Postgres and Redis DBs, queues, search index, etc., plus two webservers), plus a few browser tabs and two VSCode instances open, which regularly pushes my machine over 15GB of RAM usage
That is basically my use-case. You add a DB service (or two), DNS, reverse proxy, Redis, Memcached, etc… maybe some containers for additional proprietary backend services like APIs, and then the applications themselves that need those things to run… it adds up FAST. The advantage is that you can have multiple projects all running simultaneously and you can add/remove/swap them pretty easily.
RAM is cheap. There is no excuse for shipping an 8GB computer… even if it’s mostly going to be used for family photos and internet.
Running a suite of services in containers (DBs, DNS, reverse proxy, memcached, redis, elasticsearch, shared services, etc) plus a number of discreet applications that use all those things. My day-to-day usage hovers around 20GB with spikes to 32 (my max allocation) when I run parallelized test suites.
Docker’s memory usage really adds up fast.
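A quick tally shows how a dev stack like the ones described above climbs toward double-digit gigabytes. The service names and per-container figures here are illustrative assumptions, not measurements from anyone’s actual setup:

```python
# Hypothetical resident-memory footprints for a typical web-dev
# container stack, in MB. All figures are illustrative assumptions.
services_mb = {
    "postgres": 600,
    "redis": 150,
    "elasticsearch": 2048,
    "memcached": 100,
    "reverse-proxy": 80,
    "dns": 40,
    "api-backend": 1500,
    "webserver": 900,
}

total_gb = sum(services_mb.values()) / 1024
print(f"{total_gb:.1f} GB across {len(services_mb)} containers")
```

Even this modest single-project list lands over 5 GB before the host OS, browser, and editors take their share; on a real machine, `docker stats` will show the actual per-container usage.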
accideath@lemmy.world
on 14 Apr 2024 09:10
nextcollapse
Playing devils advocate here: As someone who deals with stuff like that, you also wouldn’t buy the base model mac. The average computer user can get by with 8GB just fine and it’s not like you can’t configure Macs with more than that.
That of course doesn’t justify the abhorrent price of the upgrades…
Specal@lemmy.world
on 14 Apr 2024 10:51
nextcollapse
And here I am, putting 16gb in every machine I work on because it’s so damn cheap there’s no reason not to future proof
accideath@lemmy.world
on 14 Apr 2024 10:56
nextcollapse
I mean, same. The difference in price for 8GB and 16GB is negligible, especially if you want dual channel on desktops
exanime@lemmy.today
on 14 Apr 2024 12:22
nextcollapse
For apple, that difference is $200… not negligible I’d say
accideath@lemmy.world
on 14 Apr 2024 12:32
nextcollapse
Oh yea, absolutely. I meant that in regards to the price of memory itself, be it as modules for your desktop PC or the chips itself for soldered solutions. Apple’s markup is bonkers
My girlfriend’s mum wanted to know why her laptop was slow… It was because HP thought that 4GB of RAM was acceptable in 2022 (when the laptop was sold). Granted, RAM wasn’t as cheap then as it is now… Still, I paid £30 for a brand new 8GB DDR4 SO-DIMM; there’s no reason HP couldn’t do that. It’s annoying, the corners these companies cut.
accideath@lemmy.world
on 14 Apr 2024 19:46
collapse
My experience is, that 4GB is just about useable for a bit of web browsing and similar stuff. Even on windows 11. I have an old Surface Pro 4 laying around that, in a pinch, works perfectly fine with 11. Of course, it’s not fast. But it’s totally useable.
Her laptop just wasn’t having it. On Windows 11, Windows was using 3.7GB of RAM and it took about 30 seconds for Task Manager to open. As soon as I upgraded the RAM it was usable.
I checked for any surprising background services or antivirus software and there was nothing really.
accideath@lemmy.world
on 14 Apr 2024 20:35
collapse
That sounds more like the issues Windows has running on an HDD (or maybe eMMC) instead of an SSD… But that wouldn’t explain why it got better when you upgraded the RAM…
It’s not worth trying to understand Windows RAM usage; it will drive any sane person insane. The laptop uses Intel Optane as its main drive, which is slower than an SSD but has much, much lower latency, so it should actually be perfect for the job of being swap. But it shit the bed.
AngryCommieKender@lemmy.world
on 14 Apr 2024 14:59
collapse
I just slap 32GB in every computer I build because the mobos can take 128GB and anything less feels cheap and silly.
PraiseTheSoup@lemm.ee
on 14 Apr 2024 15:04
nextcollapse
The average computer user can get by with 8GB just fine
Hard disagree. The average computer user is idling at 5GB already, because the average computer user is stupid.
accideath@lemmy.world
on 14 Apr 2024 15:44
collapse
That still leaves 3GB for the web browser, and the average user isn’t using anything else anyway. Even on Chrome, that’s quite a few pages.
captainlezbian@lemmy.world
on 14 Apr 2024 15:43
collapse
No they can’t. I ran 8GB of RAM for years and it turns out that’s why my computer sucked.
accideath@lemmy.world
on 14 Apr 2024 15:50
collapse
Maybe you’re not an average user then. Most people just browse the web and maybe manage some photos or fill out a document once in a while. You could do that on 4GB if you wanted to, let alone 8.
I wouldn’t say 4GB is usable for the average consumer. Assuming they’re using Windows 11, that’ll eat about 3.7GB of RAM just idling.
accideath@lemmy.world
on 14 Apr 2024 19:43
nextcollapse
You’re forgetting there, though, that a lot of the RAM Windows (like most modern operating systems) uses while idling is a cache of programs you’re likely to open, and that cache gets cleared if you open something else. That has been a thing since Vista, and was btw one of the reasons Vista was criticized for high memory usage. Windows 11 is very usable with 4GB of RAM if you’re not planning to do anything bigger than browsing the web or editing a Word document.
I’m not forgetting that, but it won’t just drop that RAM; it will want to put it into swap first, and depending on your storage speed that can slow tasks down, making things quite stuttery.
accideath@lemmy.world
on 14 Apr 2024 20:38
collapse
I mean, a (good) SSD is worth quite a lot, even on very old systems. I have an old 2008 MacBook laying around. It’s certainly not fast, but with an SSD it’s totally usable, even on current macOS versions.
Tabs of what? Chrome’s RAM usage is more of a meme than an actual RAM issue; Windows will only allow an application to use so much RAM depending on availability.
108 tabs in Chromium. The RAM usage I mentioned is total usage, including all system and kernel memory but excluding page cache. I also forgot to mention LibreOffice running in the background.
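For anyone curious how that kind of figure is derived: on Linux the same accounting comes from /proc/meminfo, and subtracting the reclaimable page cache looks roughly like this (the values below are made-up sample data, not measurements from any machine in this thread):

```python
# Sketch: compute "used" RAM excluding page cache, the way the figure
# quoted above (total usage minus page cache) is typically derived.
# Sample data; on a real Linux system you would read /proc/meminfo.
sample_meminfo = """\
MemTotal:        8029768 kB
MemFree:          512340 kB
Buffers:          104200 kB
Cached:          2311296 kB
"""

def parse_meminfo(text):
    """Return a dict of field name -> kilobytes."""
    fields = {}
    for line in text.splitlines():
        name, value = line.split(":")
        fields[name] = int(value.split()[0])  # strip the 'kB' suffix
    return fields

m = parse_meminfo(sample_meminfo)
# "Used" excluding reclaimable page cache and buffers:
used_kb = m["MemTotal"] - m["MemFree"] - m["Cached"] - m["Buffers"]
print(f"{used_kb / 1024**2:.1f} GiB used of {m['MemTotal'] / 1024**2:.1f} GiB")
# → 4.9 GiB used of 7.7 GiB
```

The point being: an “8GB machine using 3.7GB at idle” can mean very different things depending on whether cache and buffers are counted.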
dullbananas@lemmy.ca
on 14 Apr 2024 13:57
nextcollapse
My basic web dev Docker suite uses about 13GB just on its own
YIj54yALOJxEsY20eU@lemm.ee
on 14 Apr 2024 15:08
collapse
The people need to know how you use 13GB of ram worth of containers for web dev.
linearchaos@lemmy.world
on 14 Apr 2024 16:47
collapse
Docker is awesome for a lot of things. But it’s not particularly good for RAM.
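If you want to see where those gigabytes actually go, `docker stats --no-stream --format '{{.Name}}: {{.MemUsage}}'` prints per-container usage; summing it up is a few lines (the container names and sizes below are made-up sample output, not a real stack):

```python
# Sum per-container memory from `docker stats`-style output.
# The lines below are hypothetical sample output for illustration.
sample = """\
postgres: 412.3MiB / 15.5GiB
redis: 96.1MiB / 15.5GiB
api: 1.2GiB / 15.5GiB
webpack-dev: 2.8GiB / 15.5GiB
"""

UNITS = {"KiB": 1 / 1024**2, "MiB": 1 / 1024, "GiB": 1.0}

def to_gib(usage):
    """Convert a docker-style size like '412.3MiB' to GiB."""
    for suffix, factor in UNITS.items():
        if usage.endswith(suffix):
            return float(usage[: -len(suffix)]) * factor
    raise ValueError(f"unknown unit in {usage!r}")

# Field [1] is the usage part before the "/ limit" column.
total = sum(to_gib(line.split()[1]) for line in sample.splitlines())
print(f"containers total ≈ {total:.2f} GiB")
# → containers total ≈ 4.50 GiB
```

Run against a real dev stack, a “13GB suite” usually turns out to be a couple of heavyweight containers plus a long tail of small ones.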
rasakaf679@lemmy.ml
on 14 Apr 2024 06:46
nextcollapse
Why tf can’t they sell Macs with upgradable parts?? They’re “so” into renewables and recycling and saving the planet and stuff. Then they should start selling their shit with upgradable parts, even CPUs if possible. Now, Apple fanboys, argue with that. And don’t bullshit me with “the RAM has to be near the SoC for speed”; they can redesign the mobo.
accideath@lemmy.world
on 14 Apr 2024 08:51
nextcollapse
There are legitimate advantages to the RAM being soldered right next to the SoC. However, if anyone could figure out how to create a proprietary RAM module that slots in right next to the SoC (or even an SoC module including RAM) that can be swapped out without any meaningful performance impact, it would be Apple. It’s just that it never would be Apple…
natebluehooves@pawb.social
on 15 Apr 2024 16:01
collapse
The problem is the electrical resistance of the socket. Most of the performance on apple silicon is achieved through extremely high bandwidth, low latency memory. Unfortunately that necessitates a socketless design at the moment, and you can see that happening on the snapdragon X too.
accideath@lemmy.world
on 15 Apr 2024 16:32
collapse
Yeah, not just Snapdragon and Apple. Even Intel and AMD processors usually get paired with higher-bandwidth soldered RAM in many mobile offerings.
And on GPUs, soldered VRAM has been a thing for a loooong time, with HBM being the prime example of what RAM close to the chip can do. AMD’s Vega cards were highly sought after during the mining craze, even though they weren’t that fast in general compute, simply because their memory bandwidth was so far beyond any other consumer card’s…
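The bandwidth gap is easy to put rough numbers on: peak bandwidth is just bus width times transfer rate. A back-of-the-envelope sketch (the figures are commonly cited approximate specs, not measurements):

```python
# Peak memory bandwidth = (bus width in bytes) x (transfer rate in GT/s).
# Figures below are commonly cited approximations, not measured numbers.
def peak_bw_gbs(bus_width_bits, gigatransfers_per_sec):
    return bus_width_bits / 8 * gigatransfers_per_sec

configs = {
    "typical dual-channel DDR5-5600 desktop (128-bit)": peak_bw_gbs(128, 5.6),
    "Apple M2 Max (512-bit LPDDR5-6400)": peak_bw_gbs(512, 6.4),
    "AMD Radeon Vega 64 (2048-bit HBM2)": peak_bw_gbs(2048, 1.89),
}
for name, bw in configs.items():
    print(f"{name}: ~{bw:.1f} GB/s")
```

Roughly 90 GB/s for a socketed desktop versus 400+ GB/s for the soldered wide-bus designs, which is the whole argument for putting the memory next to the chip.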
flop_leash_973@lemmy.world
on 14 Apr 2024 14:48
nextcollapse
Because that gives the user as much or more control over the device as Apple themselves have. One of the fairly consistent things about Apple over the years has been a desire to maintain tight control for themselves over the products they make.
anon_8675309@lemmy.world
on 14 Apr 2024 15:14
nextcollapse
Because then they can’t gaslight people into thinking their 8GB is magical.
phoenixz@lemmy.ca
on 14 Apr 2024 15:59
nextcollapse
There is what they say they are in favor of, and there is what they really are in favor of.
They are in favor of apple getting all the monies, the end
Caiman86@lemmy.world
on 15 Apr 2024 03:17
nextcollapse
They certainly used to. My wife’s 2012 MacBook Pro has upgraded RAM and SSD parts I’ve put in over the years and still runs fine, though it isn’t used much anymore and OS upgrades stopped a while ago.
Their current environmental marketing is pure greenwashing bullshit and their stances on upgradability and repairability are terrible.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 05:04
collapse
It’s basically just greenwashing. They pretend to be into renewables and recycling only when it doesn’t disincentivize people from buying the newest product. Ex: iPhone trade-in for recycling. Yes, they do recover some raw material, but you can only do it if you’re buying a new iPhone with that credit, and it’s probably also an attempt to keep cheap used iPhones off the market.
Sam_Bass@lemmy.world
on 14 Apr 2024 11:59
nextcollapse
Can’t have users getting all uppity with excess memory, after all.
MonkderDritte@feddit.de
on 14 Apr 2024 12:36
nextcollapse
Of course they do.
reverendsteveii@lemm.ee
on 14 Apr 2024 15:35
nextcollapse
Tim Apple be like “We’ve tried charging more money. Have we tried charging more money and delivering less stuff in exchange?”
goatman360@lemmy.world
on 14 Apr 2024 17:04
nextcollapse
Yes, they do, constantly. Yet people still keep buying. I hate that I have to use Apple for my job because the software and interface I need are exclusive to it.
sugar_in_your_tea@sh.itjust.works
on 14 Apr 2024 18:15
nextcollapse
Yup, same. I really don’t like macOS, but that’s what we’ve standardized on. I’m a Linux guy and use Linux at home for everything.
reverendsteveii@lemm.ee
on 14 Apr 2024 18:50
collapse
I really like my MacBook for dev work, and I think that now that macOS is essentially a Linux distro it’s quite nice, but it’s not that much better than the free distros, and it’s getting worse while they get better. Right now the only thing keeping me on a Mac at work is that they gave it to me, and the only thing keeping me on a Mac at home is that it’s already paid for.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:25
collapse
You wanna expand on why you think it’s basically a Linux distro? Last I heard, macOS was more closely based on BSD than on Linux, and that was ages ago. Unless they rewrote it without my knowledge, it really shouldn’t be much like either of the two.
reverendsteveii@lemm.ee
on 15 Apr 2024 13:18
collapse
because I can pop a terminal into zsh and beyond that I don’t really know the taxonomy
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 19:10
collapse
you can do that with WSL though.
All modern terminals are actually terminal emulators, unless you’re sitting in a TTY. It’s pretty trivial to implement a proper UNIX/Linux-like CLI environment.
reverendsteveii@lemm.ee
on 15 Apr 2024 19:26
collapse
okay
KillingTimeItself@lemmy.dbzer0.com
on 16 Apr 2024 02:47
collapse
And for completeness’ sake here: technically Windows implements a “terminal” through CMD. It’s not Linux/UNIX-like at all, but it is still a CLI interface, so.
RememberTheApollo_@lemmy.world
on 14 Apr 2024 20:28
collapse
Lol, audio jacks come to mind. As well as a physical button. And shipping devices without cords or chargers.
phoenixz@lemmy.ca
on 14 Apr 2024 16:02
nextcollapse
Granted, I’m a developer and my dev IDE already uses a good 10+GB; I have probably hundreds of tabs and windows open over 6 desktops… But I’ve got 64GB and I’m considering upgrading to 128, and these clowns think 8 is okay today? My development laptop of like 10 years ago had 8GB.
sugar_in_your_tea@sh.itjust.works
on 14 Apr 2024 18:14
nextcollapse
I’ve been okay with 16 for a while. I use ViM as my editor, and occasionally VSCode. I use a single desktop, but I generally have a half dozen or more tmux tabs for various parts of the project.
That said, I’ve been feeling a bit squeezed with 16GB. The main RAM consumers are:
- Firefox - I frequently have 100 tabs open, so it takes a few GBs RAM
- Docker - running most of our app (a dozen or so microservices) takes 3-4GB if I’m careful about turning stuff off that I don’t need, 5-6 if I’m not
- Teams and Slack - especially during calls, these use a lot
So I think 16GB should be the minimum, and 24GB should be average. I’m going to be adding another 16GB to my personal development machine (hobbies and whatnot), and my work laptop can’t be upgraded (MacBook), but I’ll be upgrading to an M3 or M4 soonish and will request more RAM.
8GB is probably fine if you’re just running a browser and that’s it. If you’re doing anything else, 16GB should be the minimum.
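That budget is easy to sanity-check as arithmetic. A sketch using the rough per-app figures from this comment, plus an assumed ~3GB for the OS and desktop environment (that last figure is my guess, not measured):

```python
# Rough RAM budget for the dev workload described above, in GB.
# Per-app figures are the approximate ones quoted in the comment;
# the "OS + desktop" line is an assumed ballpark.
budget = {
    "OS + desktop": 3.0,
    "Firefox (~100 tabs)": 3.0,
    "Docker microservices": 5.0,   # 3-4 GB if careful, 5-6 if not
    "Teams + Slack (on a call)": 2.5,
    "editor/terminal/misc": 1.0,
}
total = sum(budget.values())
headroom_16, headroom_24 = 16 - total, 24 - total
print(f"workload ≈ {total:.1f} GB; "
      f"headroom on 16 GB: {headroom_16:.1f} GB, on 24 GB: {headroom_24:.1f} GB")
```

Around 14.5 GB before the OS needs to start swapping, which is exactly the “squeezed at 16GB” feeling: it fits, with almost no headroom.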
Buy more memory if you have the financial means to do so. If not, then I’m sorry you’re in that situation.
Yerbouti@lemmy.ml
on 14 Apr 2024 16:12
nextcollapse
My students with the 8GB version struggle to do basic audio work with only a few plugins. This is BS from Apple. Unless you use your computer only for web browsing, in which case you shouldn’t get a stupid Mac in the first place.
To be fair, I have no idea why audio plugins need so much RAM.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:26
nextcollapse
to be fair, apple is the one literally curating this experience so it “just works” only to then fuck it up somehow.
Jentu@lemmy.blahaj.zone
on 14 Apr 2024 23:00
collapse
Apple runs a masterclass in tiering their products so that in every tier but the top ones, you’re giving up something really important. If you spend the least you possibly can on a MacBook, Apple guarantees you’re going to have a very bad time for “doing the bare minimum to be seen with a laptop with an Apple logo on it”. Their whole tier system is an exercise in “how can we get away with fucking up these things just enough so the customer feels it’s necessary to spend a little bit more” at every step of the way. Then they make it unupgradable so you can’t sidestep their crafted feature tiers.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:49
collapse
nvidia also does this. It’s actually insane.
Love spending $200, or however much they charge (it’s still too much), on 16GB of RAM in 2024 because of Apple. Very cool.
dustyData@lemmy.world
on 15 Apr 2024 00:12
collapse
Latency is a bitch. If you want anything to run in real time with near-zero latency, then everything, including those pretty large sample libraries, has to be stored as close to the processor as possible. Compressing/decompressing takes a shit tonne of time and effort, so to keep delay down and fidelity up, you pay in absurd amounts of RAM at the DAW shrine.
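It adds up faster than you’d think once you do the math on uncompressed audio. A sketch (the instrument layout below is hypothetical but plausible for a sampled piano):

```python
# Uncompressed audio size = seconds * sample_rate * channels * bytes_per_sample.
# The instrument below is a made-up but plausible example, not a real library.
def audio_gb(seconds, rate=48_000, channels=2, bytes_per_sample=3):  # 24-bit
    return seconds * rate * channels * bytes_per_sample / 1024**3

# A sampled piano: 88 keys x 8 velocity layers x ~6 s per sample
piano_seconds = 88 * 8 * 6
print(f"one piano instrument ≈ {audio_gb(piano_seconds):.1f} GB if fully loaded")
# → one piano instrument ≈ 1.1 GB if fully loaded
```

One instrument, fully resident, is already over a gigabyte; load a few instruments plus the OS, DAW, and plugins and an 8GB machine is done.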
potentiallynotfelix@iusearchlinux.fyi
on 14 Apr 2024 18:32
nextcollapse
Lmao I’d take my chonky ass dell laptop with expandable ram any day of the week
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:23
collapse
my w520 would win in a fist fight against the latest macbook, hell any of them ever produced.
mhague@lemmy.world
on 14 Apr 2024 20:35
nextcollapse
Isn’t “it’s good enough for most users” a little too close to “it’s good enough to be bought, used for a bit, and then tossed”? Eventually, computers that were adequate for X stop being able to do X. There’s little to no margin, and you can’t upgrade it.
kamen@lemmy.world
on 14 Apr 2024 20:41
nextcollapse
Yeah, sure. Even if what they say about the OS’s resource usage is true, it’s only a fraction of the total. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, doesn’t matter if it’s content creation or software development. Heck, even smartphones these days have this much RAM or more.
I won’t argue, I just won’t buy an Apple product in the near future or probably ever at all.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:22
nextcollapse
Buys [insert price] laptop: top of the line, flagship, custom silicon, built from the ground up to be purpose-specific.
Opens Final Cut Pro: crashes.
Ok…
Retrograde@lemmy.world
on 14 Apr 2024 22:07
collapse
Especially paired with Apple’s 128GB integrated, non-replaceable storage. Whoops, you installed all of Microsoft Office? Looks like you have no room to save any documents :(
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 22:27
collapse
Ah yes, we can’t forget the proprietary drives that use the M.2 connector but aren’t actually NVMe drives; they’re just flash with no onboard controller.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:26
collapse
It’s NVMe in the sense that it’s non-volatile flash, probably even higher quality than most existing NVMe SSDs out there today.
The thing is that it’s literally just the flash, on a card with an M.2 pinout that fits into an M.2 slot. It doesn’t have a storage controller or use any standardized method of communication that already exists. It’s literally a proprietary, non-standard, standard-form-factor SSD.
The controller is integrated onto the main silicon die itself; there is no storage controller on the storage.
Same. And I bet you the price will also go up with less RAM.
mechoman444@lemmy.world
on 14 Apr 2024 20:58
nextcollapse
I mean, it makes sense. The vast majority of people buying Apple computers are loyalists or people who simply need an internet/word-processing machine.
And if you want to develop on Apple, then you have to pay a massive premium for their higher-end hardware.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 04:54
collapse
Their CPUs are actually really good now, when apps are actually optimized for them. Especially in single core, they are very competitive with top Intel or AMD chips while being way more power efficient.
Ex: in Geekbench 5.1 single core, the M2 Max gets 1967 points (85%) compared to 2311 points for the 7950X3D and 2369 for the 14900K. The M2 Max (12 cores (8P + 4E), 12 threads) can draw a maximum of 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts and the 14900K (24 cores (8P + 16E), 32 threads) can draw around 350 watts.
Apple’s GPUs are definitely lacking though, in terms of performance.
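Dividing the quoted numbers gives a rough efficiency picture. It’s crude, since a single-core run never draws the whole package power, so this overstates the gap, but it shows the trend:

```python
# Geekbench single-core points per watt of maximum package power,
# using the figures quoted above. Crude: single-core load never draws
# the full package power, so this only illustrates the trend.
chips = {
    "M2 Max":  (1967, 36),
    "7950X3D": (2311, 250),
    "14900K":  (2369, 350),
}
for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.1f} points/W")
```

Roughly 55 points/W for the M2 Max versus single digits for the desktop chips, which is why the efficiency argument keeps coming up even when raw scores are close.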
mechoman444@lemmy.world
on 15 Apr 2024 22:35
collapse
Ya. Their CPUs are really good. Got to give credit where credit is due.
Blackmist@feddit.uk
on 14 Apr 2024 21:09
nextcollapse
8GB RAM is what my phone has.
Having that in a laptop shows what they think of the people buying their kit. They think you’re only buying it so you can type on Facebook more easily.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:23
nextcollapse
TBF, 8GB of RAM on a phone is actually psychotic. You really shouldn’t be doing all that much on a phone lol.
IthronMorn@sh.itjust.works
on 14 Apr 2024 21:27
nextcollapse
Then what should I be doing on my phone?
Khanzarate@lemmy.world
on 14 Apr 2024 21:33
nextcollapse
Obviously using it as a thin client for this MacBook, duh.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:33
collapse
Nothing that requires 8GB of RAM lol.
I’ve played the entirety of Java Minecraft on an old ThinkPad with 4GB of RAM. It didn’t crash (I don’t use swap).
There literally shouldn’t be anything capable of using that much memory.
greedytacothief@lemmy.world
on 14 Apr 2024 21:51
nextcollapse
Is this bait? Because, like, you could be rendering, simulating, running virtual machines. Lots of stuff that isn’t a web browser also eats RAM.
RippleEffect@lemm.ee
on 14 Apr 2024 22:12
nextcollapse
Web browsers also eat ram.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 22:30
nextcollapse
90% of which can be paged out in the background; it’s not like most people are chronically browsing the web on their phones.
woelkchen@lemmy.world
on 15 Apr 2024 19:37
collapse
it’s not like most people are chronically browsing the web on their phones.
Yes, they do.
KillingTimeItself@lemmy.dbzer0.com
on 16 Apr 2024 02:44
collapse
And it’s also the worst place to do that. If you’re going to be chronically online like me, you should at least keep a clear boundary between something you carry on you at all times and something you merely have regular access to, like my workstation for instance.
Unless you like being horribly depressed or something.
greedytacothief@lemmy.world
on 14 Apr 2024 22:54
collapse
I was trying to mention things that aren’t just web browsers, since the comment seemed to be about programs that use more RAM than they seemingly need to.
Edit: There’s like photogrammetry and stuff that happens on phones now!
RippleEffect@lemm.ee
on 14 Apr 2024 23:38
nextcollapse
And games!
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:51
collapse
Games are probably a better argument, honestly, but even then it’s not a really good experience unless you buy a gaming phone, which I guess is an option. Regardless, the mobile gaming market is actually vile.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:53
nextcollapse
I suppose photo editing would be one? Maybe? I’m not sure how advanced photo editing is on mobile; it’s not like you’re going to load up the entirety of GIMP or something.
As for photogrammetry, I’m not sure that would consume very much RAM. It could; I honestly don’t think it would be that significant.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 04:15
collapse
There’s like photogrammetry and stuff that happens on phones now!
No, the photogrammetry apps all use cloud processing. The LiDAR ones don’t, but that’s only on Apple phones, and the actual mesh quality is pretty bad.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 22:29
nextcollapse
on a phone? Why the fuck would anyone be running virtual machines on a phone?
dustyData@lemmy.world
on 14 Apr 2024 22:39
nextcollapse
My man, have you been to selfhosted? People are using smartphones for all kinds of crazy stuff. They are basically mini ARM computers, particularly the flagships; they can do many things like editing video and rendering digital drawings, and after their use life ends they can host AdGuard, torrent to a NAS, host Nextcloud. You name it.
pythonoob@programming.dev
on 14 Apr 2024 22:55
nextcollapse
Something like the Samsung DeX app, which basically turns your phone into a mini computer with keyboard, mouse, and a monitor, wouldn’t be too bad tbh for most people. Take all your shit with you in your pocket and dock it at home or at work or whatever.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 22:57
nextcollapse
Yeah, I literally self-host a server running like 8 different services; I’m quite acclimated to it by now. A phone is the wrong device for this kind of thing. A Chromebook is going to be a better alternative, and you can probably get one cheaper anyway.
A big problem with phones is that they just aren’t really designed for that kind of use: leave a phone plugged in constantly and it’s going to spicy-pillow itself. And that’s before trying to do it on something that isn’t an Android; I cannot imagine the hell that self-hosting on an Android would be, let alone on an iPhone.
I could see a use case for it as a network relay in the event that you need a hyper-portable node or something. GLHF with the dongling if you need those.
Unfortunately, if you already have a server, it’s going to be better to just spin up a new task on that server, as the cost of running a new device is going to outweigh the cost of using an existing one that’s already running. Also, you can get stuff like a Raspberry Pi or a Le Potato pretty cheap. Not very powerful, but probably more utility, especially given the IO.
dustyData@lemmy.world
on 14 Apr 2024 23:31
collapse
Yeah, god forbid anyone ever does anything suboptimal or worse… for fun 😱
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:32
collapse
I’m not saying you can’t, but, like, you shouldn’t buy a phone with the prospect of turning it into a server. You should sell your old phone, or use it until it dies; that’s probably going to be better in the long run, honestly. You use a laptop? A desktop? An SBC even? All of those can be converted into a server with MUCH longer lifespans and better software support.
Mobile hardware often has a support period of like 2-3 years (although that’s changed recently), and the hardware life expectancy is probably 5 years at most. Meanwhile, desktop hardware, and laptop hardware in particular, can easily last like 10 years. Even longer if you’re OK with running legacy hardware.
My primary mobile laptops are 10 and 12 years old respectively. They’re perfectly fine for what I need. I would NOT want to be using a 10-year-old phone for that.
If you aren’t the type of person buying or owning laptops, you almost certainly do not know what self hosting is.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 04:17
collapse
It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 19:04
collapse
Literally this. Anything other than a phone is going to be more purpose-suited, cheaper, and probably more versatile. You’re spending money on a really expensive screen that you are literally not going to be using. You might as well buy something with a shitty screen, or none at all.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 21:23
collapse
I got a ThinkCentre M700 with an i7-6700, 16GB of RAM and a 256GB SSD for $70 total. It’s really hard to get a phone with anywhere near that value for money.
KillingTimeItself@lemmy.dbzer0.com
on 16 Apr 2024 02:42
collapse
Exactly, even if we’re talking brand-new modern desktop hardware. The sheer benefit of having a SATA port and being able to stuff an 18TB Exos drive in it, for example, will immediately pay for itself compared to what cloud storage would cost, while also not being limited to your internet uplink speed. You could easily run 10-gig networking if you really wanted to, although realistically 2.5GbE is going to be more apt.
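The payback claim is easy to sketch with assumed prices (both the drive cost and the cloud rate below are hypothetical ballpark figures, not quotes from any vendor):

```python
# Months for a local 18 TB drive to pay for itself vs. cloud object storage.
# Both prices are assumed ballpark figures, not real quotes.
drive_tb = 18
drive_cost = 330.0          # assumed one-off drive price, USD
cloud_per_gb_month = 0.023  # assumed standard object-storage rate, USD/GB/month

monthly_cloud = drive_tb * 1000 * cloud_per_gb_month
print(f"cloud: ${monthly_cloud:.0f}/month; "
      f"drive pays for itself in {drive_cost / monthly_cloud:.1f} months")
```

Under those assumptions the drive pays for itself in under a month of equivalent cloud storage, before even counting egress fees or uplink speed.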
greedytacothief@lemmy.world
on 14 Apr 2024 22:59
collapse
Does the JVM count?
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 02:49
collapse
thats really funny, but no.
AdrianTheFrog@lemmy.world
on 15 Apr 2024 04:14
collapse
you could be rendering, simulating, running virtual machines
On a phone? I guess you could, although 4GB is probably enough for any video game that any significant number of people actually play.
woelkchen@lemmy.world
on 15 Apr 2024 08:22
collapse
People use phone apps for photo and video editing these days. The common TikTok kid out there doesn’t use Adobe Premiere on a desktop workstation.
Phone apps often are desktop applications with a specialized GUI these days.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 19:02
collapse
I mean yeah, but even then those aren’t significant filters, and what makes you think TikTok isn’t running a render farm somewhere in China to collect shit tons of data? They’re already collecting the data; they might as well provide a rendering service to make the UI nicer. I don’t use TikTok though, so don’t quote me on it.
Those tools are also all built into TikTok, and I’m pretty sure TikTok doesn’t require 8GB of RAM to open.
woelkchen@lemmy.world
on 15 Apr 2024 19:34
collapse
i mean yeah, but even then those aren’t significant filters, and what makes you think that tiktok isn’t running a render farm somewhere in china to collect shit tons of data?
Pretty sure my Adobe Premiere comparison made it clear I wasn’t talking about the TikTok app itself but 3rd party apps to later upload to online services like TikTok.
Just because you are completely inept at thinking of use cases doesn’t mean they don’t exist.
KillingTimeItself@lemmy.dbzer0.com
on 16 Apr 2024 02:46
collapse
I mean yeah, you could, but then TikTok doesn’t have you in its app, and I’m pretty sure TikTok has a pretty comprehensive editing tool set, otherwise people wouldn’t be making as much edited content on it.
Even then, there are still a lot of people who edit video intended for 9:16 consumption, and they do it on PC, primarily because it’s just a better place to edit things.
IthronMorn@sh.itjust.works
on 15 Apr 2024 15:43
collapse
What about running a chrooted nix install and using VNC to connect to it? While web browsing and playing a background video? Just because you don’t use your RAM doesn’t mean others don’t. And no, I don’t use all my RAM, but a little overhead is nice.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 19:01
collapse
On a phone? I mean, I suppose you could do that, but VNC is not a very slick remote-access tool for anything other than, well, remote access. The latency and speed over Wi-Fi would be a significant problem. I suppose you could stream from your phone to your TV, but again, most TVs today are smart TVs, so that’s literally a non-issue.
My example here was using a computer rather than a phone, to show that even desktop computing tasks don’t really use all that much RAM.
IthronMorn@sh.itjust.works
on 16 Apr 2024 15:53
collapse
Well, then by that logic, since desktop computing tasks don’t really use all that RAM, we shouldn’t need more than 8GB in a desktop ever. Yes, my example was a tad extreme (VNC-ing into your own VM on your phone), but my point was that phones are becoming capable and are replacing traditional computers more and more. A more realistic example: when I was using Samsung DeX the other day, I had 80-ish Chrome tabs open, a video chat, and a terminal SSH’d into my computer fixing it. I liked the overhead of RAM I had above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program, or had to spin something up quickly, without disrupting my flow or lagging out/crashing.
KillingTimeItself@lemmy.dbzer0.com
on 16 Apr 2024 17:48
collapse
Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever.
If this is the logic we’re using, then we shouldn’t have phones at all, since clearly they do nothing more than a computer. Or we shouldn’t have desktops/laptops at all, because clearly they do nothing more than a phone.
I understand that phones are more capable; my point is that they have no reason to be. 99% of what you do on a phone is going to be the same whether you spend $200 on it or $2000.
Yeah, but if you have plenty of RAM on Android, there’s a chance those apps you left in the background will still be running when you go back to them, rather than doing the usual Android thing of just restarting them.
KillingTimeItself@lemmy.dbzer0.com
on 15 Apr 2024 19:13
collapse
Yeah, I get that, but I often only have like 2 apps open on my Android phone (maybe three). And even if you didn’t have enough RAM, there’s no reason Android couldn’t cache old apps to a page file or something; then you don’t need to restart them, just load them back from the page file. Given how fast modern phone storage is, this should be pretty negligible.
macrocephalic@lemmy.world
on 17 Apr 2024 06:38
collapse
My phone was manufactured in 2022, cost under USD250, and has 8gb of ram. New phones generally come with 12gb or more.
KillingTimeItself@lemmy.dbzer0.com
on 14 Apr 2024 21:21
nextcollapse
What a weird title, bro. Of course they argue in favor of it; they sell the fucking hardware that they created. It’d be a little weird if they argued against it after spending billions designing and manufacturing it.
Regardless, I still can’t believe Apple thought an 8GB minimum was OK. Genuinely baffling to me.
jenny_ball@lemmy.world
on 14 Apr 2024 21:46
nextcollapse
I have more RAM on my old GPU. Apple sucks.
dustyData@lemmy.world
on 14 Apr 2024 22:34
collapse
A friend has a phone with more RAM.
jenny_ball@lemmy.world
on 15 Apr 2024 00:20
collapse
All my phones have had more RAM since like 2015.
GlobalMind@lemm.ee
on 14 Apr 2024 22:51
nextcollapse
I also cannot figure out why so many companies are selling them with only a 500GB drive, SSD or HDD.
So they can charge more for an upgrade. Simple business tactics.
Classy@sh.itjust.works
on 15 Apr 2024 00:24
collapse
Don’t forget cloud services!
anhydrous@lemmy.world
on 14 Apr 2024 23:12
nextcollapse
My X220 and T520 each have 16GB. The designed max was actually “only” 8GB, but it turns out 16 GB actually works. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.
My HP Omen 17" was designed for a maximum of 32GB ram. I’m currently running 64GB on it.
Duamerthrax@lemmy.world
on 15 Apr 2024 03:36
nextcollapse
This was also true for Apple computers before they started soldering the RAM in place. I remember going way over spec in my old G4 tower. Hell, I doubt the system would crash if you found larger RAM chips and soldered them in.
Klause@discuss.tchncs.de
on 15 Apr 2024 09:06
collapse
I doubt the system would crash if you found larger ram chips and soldered them in.
You can’t even swap components with official ones from other upgraded models; everything is tied down with verification codes and shit nowadays. So I doubt you could solder in new RAM and get it to work.
Yeah lol, my ThinkCentre with a 6th-gen Intel had only 8GB (I paid under 100€ for it), so I went shopping on a second-hand site to double that. But the prices for a 4, 8, or 16GB DDR4 stick (SODIMM; there seems to be a flood of used ones) were about the same, like 30€ shipping included, so now I’ve got 24GB.
mightyfoolish@lemmy.world
on 15 Apr 2024 04:47
nextcollapse
I get that upgrades help the bottom line, but considering that 8GB of RAM chokes the silicon they are allegedly so proud of… it seems like a slap in the face to their own engineers (and the customer as well, but that is not my point).
drmoose@lemmy.world
on 15 Apr 2024 07:16
nextcollapse
Apple has really been stretching their takes lately. Nice to see some fire under their ass, though it’s not going to matter. Too many ignorant people falling for likeable propaganda.
BilboBargains@lemmy.world
on 15 Apr 2024 07:40
nextcollapse
As engineers, we should never insert proprietary interfaces into our designs. We shouldn’t obfuscate the design.
The motivation for these toxic practices comes from the business side because it’s profitable. These people won’t share the profits with you because they are psychopaths. Ultimately we are making more waste when electronics cannot be upgraded, maintained and repaired. It’s bad for people and it’s bad for the environment.
TheGrandNagus@lemmy.world
on 15 Apr 2024 08:41
collapse
So much stuff in both the hardware and software world really annoys me and makes me think our future is shit the more I think about it.
Things could be so much better. Pretty much everything could be open and standardised, yet it isn’t.
Software can be made in a way that isn’t user-hostile, but that’s not the way of things. Hardware could be repairable and open, without OEMs having to navigate a minefield of IP and patents, much of which shouldn’t have been granted in the first place, or users having no ability to repair or upgrade their devices.
It’s all so tiresome.
rottingleaf@lemmy.zip
on 15 Apr 2024 11:12
collapse
I think Napoleon said something like “the army is commanded by me and the sergeants”?
Well, that’s not true anymore today. All this connectivity and processing power, however inefficiently they may be used, allow the world to be centralized more than it ever could be before. No need to consider what the sergeants think.
(Which also means no Napoleons, cause much more average, grey, unskilled and generally unpleasant and uninteresting people are there now.)
It’s about power and it happened in the last 15 years.
I think it’s a political tendency, very intentional for those making decisions, not a “market failure” and other smartassery. It comes down to elites making laws. I feel they are more similar to Goering than to Hitler all over the world today.
This post may seem nuts, but our daily lives significantly depend on things more complex and centralized in supply chains and expertise than nukes and spaceships.
We don’t need desktop computers which can’t be fully made in, say, Italy, or at least in a few European countries taken together. Yes, this would mean kinda going back to late 90s at best in terms of computing power per PC, but we waste so much of it on useless things that our devices do less now than then.
We trade a lot of unseen security for comfort.
NostraDavid@programming.dev
on 15 Apr 2024 08:14
nextcollapse
I haven’t used 8GB since… 2008 or so? TBF, I’m a power user (as are most people on any Lemmy instance, I presume), but still…
And sure, Mac OS presumably uses less RAM than Windows, but all the applications don’t.
horse@lemmy.world
on 15 Apr 2024 08:28
nextcollapse
There is exactly one reason why they do this: So they can charge you $200 to upgrade it to 16GB and in doing so make the listed price of the device look $200 cheaper than it actually is. Or sometimes $400 if it’s a model where the base model comes with a 256GB SSD (the upgrade to 512GB, the minimum I’d ever recommend, is also $200).
The prices Apple charges for storage and RAM are plain offensive. And I say that as someone who enjoys using their stuff.
Jesus_666@lemmy.world
on 15 Apr 2024 08:55
collapse
That’s why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it’s not worth the extortionate prices for hardware that’s locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.
Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.
Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.
Typing this from an M2 Max MacBook Pro with 32GB, and honestly, this thing puts the “Pro” back in the MBP. It’s insanely powerful; I rarely have to wait for it to compile code, transcode video, or run AI stuff. And it does all of that while sipping battery, without even breaking a sweat. Yes, it’s pretty thin, but it’s by no means underpowered. Apple really is onto something with their M* lineup.
But yeah, selling “Pro” laptops with 8GB in 2024 is very stupid.
egeres@lemmy.world
on 15 Apr 2024 08:51
nextcollapse
I can’t believe I’m reading this in 2024
chemicalwonka@discuss.tchncs.de
on 15 Apr 2024 23:41
collapse
If I ever need a new laptop, I’ll probably get a Framework.
Yeah, it’s a Yoga
The document mentions a lot of US laws. I wonder if they try the same over in the EU.
I’m guessing it wouldn’t hold. But I’m in the US, so I’ll just avoid their phones going forward, and will probably avoid their laptops and whatnot as well just due to a lack of trust.
I keep looking at the Frameworks, because I’m happy with the philosophy, but the problem is that the parts that they went to a lot of trouble to make user-replaceable are the parts that I don’t really care about.
They let you stick a fancy video card on the thing. I’d rather have battery life – I play games on a desktop. If they’d stick a battery there, that might be interesting.
They let you choose the keyboard. I’m pretty happy with current laptop keyboards, don’t really need a numpad, and even if you want one, it’s available elsewhere. I’ve got no use for the LED inserts that you can stick on the thing if you don’t want keyboard there.
They let you choose among sound ports, Ethernet, HDMI, DisplayPort, and various types of USB. Maybe I could see putting in more USB-C than some other vendors have. But the stuff I really want is:
A 100Wh battery. Either built-in, or give me a bay where I can put more internal battery.
A touchpad with three mechanical buttons, like the Synaptics ones that the Thinkpads have.
The fact that they aren’t soldering in the RAM and NVMe is nice in that they’re committing to not charging much more than market rate, so I guess they should get credit for that, but they are certainly not the only vendor to avoid soldering those.
Yeah, ThinkPad used to allow either a CD drive or an extra battery in their T-series. They stopped offering the extra battery and started soldering RAM, so I got the cheaper E-series (might as well save cash if I can get what I want).
I think there’s a market there. Have an option for a hot-swap battery to bring on trips and use the GPU at home. Serious travelers could even bring a spare battery to keep working for longer.
Yes please! And give me the ThinkPad nipple as well. :) If they had those, I’d not bother with even looking at Lenovo. The middle button is so essential to my normal workflow that any other laptop (including my fancy MacBook for work) feels crappy.
I’m guessing the things they made modular are just the low hanging fruit. It’s pretty easy to make a USB-C to whatever port, it’s a bit harder to make a pluggable battery in a slot that can also support a GPU.
I don’t know if I’d recommend it, but if you are absolutely set on having the Thinkpad nipple – I don’t use it, even if I really want the Thinkpad trackpad – the factory that made the original IBM Model M keyboards is still in business somewhere in Kentucky. IIRC the employees bought it or something when IBM stopped making the things. They offer a nipple keyboard, goes by the name of “Endura Pro”. checks Unicomp. That’s the remnants in the US of the IBM business; the Chinese Lenovo purchased the laptops and also do the Trackpoint.
I got one like twenty years back, and while the actual buckling-spring keyswitches on the keyboard are pretty much immune to time, I wore out the switches on the mouse buttons, so I don’t know if I can give a buy recommendation for the mouse-enabled version (though maybe they improved the switches there). But if you really, really like it, that might be worthwhile for you. Last I looked they were still making them.
checks
They’ve got a message up saying that a supplier of a component used in that keyboard went under due to COVID so they suspended production. I don’t know what the status is on that.
www.pckeyboard.com/mm5/merchant.mvc?Screen=CTGY&C…
Keep in mind that this is a very large, heavy keyboard that you could brain someone with; if you’re going to haul it around with a laptop, it’s going to be larger and heavier than the laptop. Mentioning it mostly since I figure that you might use it at some location where you could leave the keyboard.
The thing is, I only like the Trackpoint in a laptop. It’s really nice to scroll while holding the middle mouse button and just shifting my finger. That way, my hand is ready to type, unlike using the trackpad, where I have to move my hands to type, and it works well in my largely keyboard-driven workflow (ViM for text editing, Trackpoint for web browsing).
On a desktop, I have multiple screens and way more real estate, so the Trackpoint isn’t nearly as effective and it’s worth using the mouse instead.
But I honestly don’t use my laptop all that often, so it’s something I’m fine doing without. But all other things being similar, I’ll prefer the Trackpoint since it’s a nice value add.
It’s cool that they’re making those keyboards though. I already have nice mechanical keyboards, so I’m not looking for one, but I would be very interested in a Framework-compatible keyboard with a Trackpoint.
Torx is legitimately useful for small screws, because it’s more resistant to stripping than Phillips.
Now, if they start using Torx security bits or some oddball shapes, then they’re just being obnoxious. But there are not-trying-to-obstruct-the-customer reasons not to use Phillips.
The iFixit kit is a great toolset from that site; it has every type of bit in it.
Got myself an iFixit Mako a while ago. Really nice, even if I mostly just use the Phillips bits.
Right? It’s nice to have the occasional reverse tri head metric upside down weird random bit when you need it.
Does it have triangle bits? Nintendo uses some really unusual driver shapes.
I’ve taken apart so so so many things… sometimes for the right reasons and sometimes for the wrong reasons…my ZuneHD still works. I’ll never ever try to open a Surface product.
Oh, that shit is soldered on…
I mean, I did see that on some laptops, but only those cheap things in €150 range (new) which even use eMMC for storage.
Yup, all Apple laptops have soldered RAM for some years now…
It became pretty common even on higher end laptops when they switched to DDR5, but some manufacturers are starting to go back to socketed RAM.
In my opinion the disadvantages of user-replaceable RAM far outweigh the advantages. The same goes for discrete GPUs. Apple moved away from both, and I expect PC manufacturers to follow Apple’s move in the next decade or so, as they always do.
Here’s how I see the advantages of soldered RAM: lower risk of physical damage, lower energy use, better performance, and a smaller footprint.
The risk of physical damage is so incredibly low already, and energy use of RAM is also incredibly low, so neither of those seem important.
So that leaves performance, which I honestly haven’t found good numbers for. If you have this, I’m very interested, but since RAM speed is rarely the bottleneck in a computer (unless you have specific workloads), I’m going to assume it to be a marginal improvement.
So really, I guess “smaller” is the best argument, and I honestly don’t care about another half centimeter of space, it’s really not an issue.
This is where you’re mistaken. There is one thing that integrated RAM enables that makes a huge difference for performance: unified memory. GPU code is almost always bandwidth-limited, which is why the RAM on a graphics card is soldered on and physically close to the GPU itself; that proximity is needed to meet a GPU’s high bandwidth requirements.
By having everything in one package, CPU and GPU can share the same memory, which means that you eliminate any overhead of copying data to/from VRAM for GPGPU tasks. But there’s more than that, unified memory doesn’t just apply to the CPU and GPU, but also other accelerators that are part of the SoC. What is becoming increasingly important is AI acceleration. UMA means the neural engine can access the same memory as the CPU and GPU, and also with zero overhead.
This is why user-replaceable RAM and discrete GPUs are going to die out. The overhead and latency of copying all that data back and forth over the relatively slow PCIe bus is just not worth it.
Do you have actual numbers to back that up?
The best I’ve found is benchmarks of Apple silicon vs Intel+dGPU, but that’s an apples to oranges comparison. And if I’m not mistaken, Apple made other changes like a larger bus to the memory chips, which again makes comparisons difficult.
I’ve heard about potential benefits, but without something tangible, I’m going to have to assume it’s not the main driver here. If the difference is significant, we’d see more servers and workstations running soldered RAM, but AFAIK that’s just not a thing.
The thing with benchmarks is that they only show you the performance of the type of workload the benchmark is trying to emulate. That’s not very useful in this case. Current PC software is not built with this kind of architecture in mind, so it was never designed to take advantage of it. In fact, it’s the exact opposite: since transferring data to/from VRAM is a huge bottleneck, software is designed to avoid it as much as possible.
For example: a GPU is extremely good at performing an identical operation on lots of data in parallel, and can do so much, much faster than the CPU. However, copying the data to VRAM and back may add so much overhead that running it on the CPU still takes less time overall, so a developer may choose the CPU even though the GPU was specifically designed to handle that kind of work. On a system with UMA you would absolutely run this on the GPU.
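A toy break-even model makes the point concrete. All throughput figures below are made-up, illustrative assumptions, not measurements of any real hardware:

```python
# Toy model of the CPU-vs-GPU decision when PCIe copies are included.
# All throughput figures are illustrative assumptions, not benchmarks.

def cpu_time(data_gb, cpu_gbps=10):
    """Seconds to process the data on the CPU alone."""
    return data_gb / cpu_gbps

def dgpu_time(data_gb, gpu_gbps=100, pcie_gbps=16):
    """Seconds on a discrete GPU: copy to VRAM, compute, copy back."""
    transfers = 2 * data_gb / pcie_gbps  # round trip over PCIe
    compute = data_gb / gpu_gbps
    return transfers + compute

def uma_time(data_gb, gpu_gbps=100):
    """Seconds on a unified-memory GPU: same compute, no copies."""
    return data_gb / gpu_gbps

# For 1 GB of data the GPU is 10x faster at the math, yet the PCIe
# round trip makes the CPU the better choice overall -- unless the
# copies disappear entirely, as with unified memory.
print(cpu_time(1.0), dgpu_time(1.0), uma_time(1.0))
```

With these assumed numbers the discrete GPU loses to the CPU despite being ten times faster at the actual computation, while the zero-copy case wins easily; that is exactly the trade-off the developer above is weighing.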
The same thing goes for something like AI accelerators. What PC software exists that takes advantage of such a thing?
A good example of what happens if you design software around this kind of architecture can be found here. This is a post by a developer who worked on Affinity Photo. When they designed this software they anticipated that hardware would move towards a unified memory architecture and designed their software based on that assumption.
When they finally got their hands on UMA hardware in the form of an M1 Max that laptop chip beat the crap out of a $6000 W6900X.
We’re starting to see software taking advantage of these things on macOS, but the PC world still has some catching up to do. The hardware isn’t there yet, and the software always lags behind the hardware.
It’s coming, but Apple is ahead of the game by several years. The problem is that in the PC world no one has a good answer to this yet.
Nvidia makes big, hot, power hungry discrete GPUs. They don’t have an x86 core and Windows on ARM is a joke at this point. I expect them to focus on the server-side with custom high-end AI processors and slowly move out of the desktop space.
AMD is best positioned on the desktop. They have a decent x86 core and GPU, and they already make APUs. Intel is trying to get into the GPU game but has some catching up to do.
Apple has been quietly working towards this for years. They have their UMA architecture in place, they are starting to put some serious effort into GPU performance and rumor has it that with M4 they will make some big steps in AI acceleration as well. The PC world is held back by a lot of legacy hard and software, but there will be a point where they will have to catch up or be left in the dust.
I understand the scepticism, but without links of what you’ve found or which parts in particular you consider dubious claims (ram speed can be increased when soldered, higher speeds lead to better performance, etc) it comes across as “i don’t believe you, because i choose to not believe you”
LTT has made a comparison video on ram speeds: www.youtube.com/watch?v=b-WFetQjifc
Do you need proof that soldered ram can be made to run faster?
Yes, and the results from that video (I assume; I skimmed it, but I’ve watched similar videos) are that the difference is negligible (like 1-10 FPS) and you’re usually better off spending that money on something else.
I look at the benchmarks between the Intel MacBook Pro and the M1 MacBook Pro, and both use soldered RAM, yet the M1 gets so much better performance, even on non-GPU tasks (e.g. memory-heavy unit tests at work went from 3-5min to 45-50sec from latest Intel to M1). Docker build times saw a similar drop. But it’s hard for me to know what the difference is between memory vs CPU changes. I’d have to check, but I’m guessing there’s also the DDR4 to DDR5 switch, which increases memory channels.
The claim is that proximity to the CPU explains it, but I have trouble quantifying that. For me, a 1-10 FPS difference isn’t enough to justify giving up repairability and expandability. Maybe it is for others, but if that’s the difference, it’s a lot less than the claims they seem to make.
The video has a short section on productivity (i.e. rendering or compiling). That part is probably the most relevant for most people. Check the chapter view in YouTube to jump directly to it.
I think a 2x performance improvement is plausible when comparing non-soldered ram to the Apple silicon, which goes even further and has the memory on the die itself. If, of course, ram is the limiting factor.
The advantages of upgradable, expandable ram are obvious. But let’s face it: most people don’t need and even less use that capability.
Looks about the same as the rest. Big gains for handbrake, pretty much nothing for anything else. And that makes sense, because handbrake will be doing lots of roundtrips to the GPU for encoding.
On the package, not the die. But perhaps that’s what you meant. On die would be closer to a massive cache like on the X3D AMD chips.
The performance improvement seems to be that Apple has a massive iGPU, not anything to do with RAM next to the CPU. So in CPU-only benchmarks, I’d expect the lion’s share of the difference to be CPU design and process node, not the memory.
Also, unified memory isn’t particularly new, APUs have supported it for years. It’s just not well utilized by devs because most users have dGPUs. So I think the main innovation here is Apple committing to it and providing tooling for devs to utilize the unified memory better, like console manufacturers have done.
So I guess that brings up a few more questions.
I guess we’re kind of seeing it with the gaming PC handhelds, like Steam Deck and Ayaneo etc al, so maybe that’ll become more mainstream.
“unified memory” is an Apple marketing term for what everyone’s been doing for well over a decade. Every single integrated GPU in existence shares memory between the CPU and GPU; that’s how they work. It has nothing to do with soldering the RAM.
You’re right about the bandwidth though, current socketed RAM standards have severe bandwidth limitations which directly limit the performance of integrated GPUs. This again has little to do with being socketed though: LPCAMM supports up to 9.6GT/s, considerably faster than what ships with the latest macs.
The only way discrete GPUs can possibly be outcompeted is if DDR starts competing with GDDR and/or HBM in terms of bandwidth, and there’s zero indication of that ever happening. Apple needs to put a whole 128GB of LPDDR in their system to be comparable (in bandwidth) to literally 10-year-old dedicated GPUs: the 780 Ti had over 300GB/s of memory bandwidth with a measly 3GB of capacity. DDR is simply not a good choice for GPUs.
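Peak bandwidth is just transfer rate times bus width, so the gap is easy to sketch. The configurations below are approximate, assumed figures for illustration:

```python
# Peak memory bandwidth in GB/s = transfer rate (GT/s) * bus width in bytes.
# Bus widths and rates below are approximate, assumed figures.

def bandwidth_gbs(gt_per_s, bus_bits):
    return gt_per_s * bus_bits / 8

dual_channel_ddr5 = bandwidth_gbs(6.4, 128)  # typical socketed desktop
wide_soldered_lpddr5 = bandwidth_gbs(6.4, 512)  # M1 Max-class wide bus
gtx_780ti_gddr5 = bandwidth_gbs(7.0, 384)  # 2013-era discrete GPU

print(dual_channel_ddr5)     # ~102 GB/s
print(wide_soldered_lpddr5)  # ~410 GB/s
print(gtx_780ti_gddr5)       # 336 GB/s
```

The point of the sketch: the advantage comes from bus width, not from soldering per se. A socketed standard with a comparably wide bus would close most of the gap.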
Wrong. Unified memory (UMA) is not an Apple marketing term, it’s a description of a computer architecture that has been in use since at least the 1970’s. For example, game consoles have always used UMA.
Again, wrong.
While iGPUs have existed for PCs for a long time, they did not use a unified memory architecture. What they did was reserve a portion of the system RAM for the GPU. For example on a PC with 512MB RAM and an iGPU, 64MB may have been reserved for the GPU. The CPU then had access to 512-64 = 448MB. While they shared the same physical memory chips, they both had a separate address space. If you wanted to make a texture available to the GPU, it still had to be copied to the special reserved RAM space for the GPU and the CPU could not access that directly.
With unified memory, both CPU and GPU share the same address space. Both can access the entire memory. No RAM is reserved purely for the GPU. If you want to make something available to the GPU, nothing needs to be copied, you just need to point to where it is in RAM. Likewise, anything done by the GPU is immediately accessible by the CPU.
Since there is one memory pool for both, you can use RAM more efficiently. If you have a discrete GPU with 16GB VRAM and your app only needs 8GB VRAM, the other 8GB just sits there being useless. Alternatively, if your app needs 24GB VRAM, you can’t run it because your GPU only has 16GB, even if you have lots of system RAM available.
With UMA you can use all the RAM you have for whatever you need it for. On an M2 Ultra with 192GB RAM you can use almost all of that for the GPU (minus a little bit that’s used for the OS and any running apps). Even on a tricked-out PC with a 4090 you can’t run anything that needs more than 24GB VRAM. Want to run something where the GPU needs 180GB of memory? No problem on an M2 Ultra.
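The flexibility argument reduces to a simple capacity check. The sizes here are arbitrary examples, not any particular machine:

```python
# Toy illustration: a fixed CPU/GPU memory split vs one shared pool.
# Sizes are in GB; all numbers are arbitrary examples.

def fits_discrete(cpu_need_gb, gpu_need_gb, sys_ram_gb=32, vram_gb=24):
    """Each workload must fit its own dedicated pool."""
    return cpu_need_gb <= sys_ram_gb and gpu_need_gb <= vram_gb

def fits_unified(cpu_need_gb, gpu_need_gb, total_gb=56):
    """Workloads only need to fit the combined pool."""
    return cpu_need_gb + gpu_need_gb <= total_gb

# A 40 GB GPU workload overflows the 24 GB card even though the
# machine has 56 GB in total; with a unified pool it just fits.
print(fits_discrete(8, 40))  # False
print(fits_unified(8, 40))   # True
```

Same total memory in both cases; only the partitioning differs.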
It has everything to do with soldering the RAM. One of the reasons iGPUs sucked, other than not using UMA, is that GPU performance is almost always limited by memory bandwidth. Compared to VRAM, standard system RAM has much, much less bandwidth, causing iGPUs to be slow.
A high-bandwidth memory bus, like a GPU needs, has a lot of connections and runs at high speeds. The only way to do this reliably is to physically place the RAM very close to the actual GPU. Why do you think GPUs do not have user-upgradable RAM?
Soldering the RAM makes it possible to integrate a CPU and a non-sucking GPU. Go look at the inside of a PS5 or XSX and you’ll see the same thing: an APU with the RAM chips soldered to the board very close to it.
LPCAMM is a very recent innovation. Engineering samples weren’t available until late last year and the first products will only hit the market later this year. Maybe this will allow for Macs with user-upgradable RAM in the future.
What use is high bandwidth memory if it’s a discrete memory pool with only a super slow PCIe bus to access it?
Discrete VRAM is only really useful for gaming, where you can upload all the assets to VRAM in advance and data practically only flows from CPU to GPU and very little in the opposite direction. Games don’t matter to the majority of users. GPGPU is much more interesting to the general public.
Apologies, my google-fu seems to have failed me. Search results are filled with only apple-related results, but I was now able to find stuff from well before. Though nothing older than the 1990s.
Do you have an example? Every single one I look up has at least optional UMA support. Reserved RAM was a thing, but it was the framebuffer being reserved, not the GPU getting its own separate memory. AFAIK iGPUs have always shared memory like they do today.
I don’t disagree, I think we were talking past each other here.
Here’s a link to buy some from Dell: www.dell.com/en-us/shop/…/memory. Here’s the laptop it ships in: www.dell.com/en-au/…/precision-16-7670-laptop. Available since late 2022.
gestures broadly at every current use of dedicated GPUs. Most of the newfangled AI stuff runs on Nvidia DGX servers, which use dedicated GPUs. Games are a big enough industry for dGPUs to exist in the first place.
What kind of disadvantages do you see?
User-replaceable RAM is slow, which means you can’t integrate the CPU and GPU in one package. This means a GPU with its own RAM, which has huge disadvantages.
Even a 4090 only has 24GB, and transfers to/from VRAM are slow. The GPU can only operate on data in VRAM, so anything you need it to work on has to be copied over the relatively slow PCIe bus to the GPU. Then, once it’s done, you need to copy the results back over the PCIe bus to system RAM for the CPU to be able to access them. This considerably slows down GPGPU tasks.
Ah yeah, I see. That’s definitely a downside if you work with something where that becomes a factor.
These days I don’t realistically expect my RAM requirements to change over the lifetime of the product. And I’m keeping computers longer than ever: 6+ years where it used to be 1 or 2.
People have argued millions of times on the internet that Apple’s products don’t meet people’s needs and are massively overpriced. Meanwhile they just keep selling like crazy and people love them. I think the issue comes from having pricing expectations set in the race-to-the-bottom world of commoditized Windows/Android trash.
I upgraded my personal laptop a year or so after I got it (started with 8GB, which was fine until I did Docker stuff), and I’m probably going to upgrade my desktop soon (16GB, which has been fine for a few years, but I’m finally running out). My main complaint about my work laptop is RAM (16GB I think; I’d love another 8-16GB), but I cannot upgrade it because it’s soldered, so I have to wait for our normal cycle (4 years; will happen next year). I upgraded my NAS RAM when I upgraded a different PC as well.
I don’t do it very often, but I usually buy what I need when I build/buy the machine and upgrade 3-4 years later. I also often upgrade the CPU before doing a motherboard upgrade, as well as the GPU.
I might agree if Apple hardware was actually better than alternatives, but that’s just not the case. Look at Louis Rossmann’s videos, where he routinely goes over common failure cases that are largely due to design defects (e.g. display cable being cut, CPU getting fried due to a common board short, butterfly keyboard issues, etc). As in, defects other laptops in a similar price bracket don’t have.
I’ve had my E-series ThinkPad for 6 years with no issues whatsoever. The USB-C charge port is getting a little loose, but that’s understandable since it’s been mostly a kids’ Minecraft device for a couple of years now, and kids are hard on computers. I had my T-series before that for 5-ish years until it finally died from water damage (a lot of water).
Apple products (at least laptops) are designed for aesthetics first, not longevity. They do generally have pretty good performance though, especially with the new Apple Silicon chips, but they source a lot of their other parts from the same companies that provide parts for the rest of the PC market.
If you stick to the more premium devices, you probably won’t have issues. Buy business class laptops and phones with long software support cycles. For desktops, I recommend buying higher end components (Gold or Platinum power supply, mid-range or better motherboard, etc), or buying from a local DIY shop with a good warranty if buying pre built.
Like anything else, don’t buy the cheapest crap you can, buy something in the middle of the price range for the features you’re looking for.
Even if they are right, no one cares and it will always be a bad look.
I’m fine with this.
I mean, I have no interest in an 8GB machine, but it’s also fair to say that there definitely are people who are fine with it, and who would like to save the money. Say you’ve got four kids and you’re buying them all laptops – I dunno if that’s the thing parents do these days, or whether kids typically just get by on smartphones or what. And sometimes they get broken or whatnot, and you’re paying for the other expenses associated with those kids. That money adds up.
Apple runs a walled garden, unless things have changed in recent years while I wasn’t watching. They tried opening up to third-party hardware vendors back around 2000 with some third-party PowerPC vendors, found that too many users were buying that hardware instead of theirs, and killed off the clone vendors. That means that if you want to use MacOS, you have to buy Apple hardware. And so there’s good reason to have a broad range of offerings from Apple, even some that are higher-end or lower-end than the typical user might want, because Apple is the only option that MacOS users have. If I want to run Linux on a machine with 2GB of memory, I can do it, and if I want to run Linux on a machine with 256GB of memory, I can do it. MacOS users need to have an offering from Apple to do that.
Plus, I assume that these are running some form of solid-state storage, which makes hitting virtual memory a lot less painful than was the case in the past.
I agree. But we still have to listen to all the bitching.
We both have 8GB Airs in our house, an M1 and an M2. They run just fine.
The thing is that Apple charges three kidneys per gigabyte over 8 GB.
If you’ve got four kids and you’re buying them all laptops, I don’t think buying them all Macs and “saving money” by getting cut-down machines with too little memory (or whatever other hobbling Apple may cook up now or later) is exactly the smart play. You would need to have a very compelling reason to absolutely have to run MacOS to the exclusion of everything else which if we’re honest, most people don’t.
A Lenovo IdeaPad Slim, just to pick an example out of a hat that contains many other options, costs half as much as the low spec 2024 Macbook Air the article is spotlighting while having double the RAM, double the SSD, and, you know, ports. For the cost of a 8GB Macbook Pro you could buy a Legion Slim with an i7 and an RTX4060 in it and have change left over, a machine which would blow that Mac out of the water.
There are a lot of things you can say about Macbooks, but being a good value for the money is consistently never one of them.
Save money, buy an Apple computer. Choose one.
It’s okay if you run an efficient OS on it, which is not the case here.
That doesn’t help with memory hungry apps though.
There are people who never touch anything but the browser and email. For them the SSD keeping some page files is good enough
That’s no justification for selling a >$1,000 MacBook Pro with only 8GB of RAM, though. It’s specifically marketed as a professional-class machine.
Yeah, clearly the PRO part with just 8GB of RAM is the problem.
Come on that’s what I have on my Surface Go 1 from 2019. It runs Linux perfectly and is okay for my needs, but I wouldn’t put such specs on a PRO thing😅
OSX is waaaay more memory efficient than windows…
Yeah my blueprint of efficient os it isn’t Windows also.
The recent stats I’ve seen indicate macOS usually uses more ram
Anything is way more efficient than Windows. That’s a very low bar (or a high one, but you need to go under it).
Can your macOS run on a router with 32MB RAM? Or on the most powerful supercomputer? Or both?
And it’s not RAM, it’s UM for an SoC. The usage of memory changed with the introduction of Apple Silicon.
“Unified” only means there’s not a discrete block for the CPU and a discrete block for the GPU to use. But it’s still RAM- specifically, LPDDR4x (for M1), LPDDR5 (for M2), or LPDDR5X (for M3).
Besides, low-end PCs with integrated graphics have been using unified memory for decades- no one ever said “They don’t have RAM, they have UM!”
Yes, that’s true, but it’s still an indicator of an uninformed reporter.
Apple Silicon chips pass data from one dedicated core directly to another without needing to go through memory, hence the smaller processor cache. There are between 18 and 58 cores in the M3 (model dependent). The architecture works very differently than the conventional CPU/GPU/RAM model.
I can run FCP and Logic Pro and have memory to spare with 16GB of UM. The only thing that pushes me into swap is Chrome. lol
It’s a pointless distinction.
And in this case, it makes 8gig look even worse.
Maybe you’re not familiar with the apps I’m referring to. Final Cut Pro and Logic Pro are professional video and audio workstations.
If I tried to master an export from Adobe Premiere Pro in Pro Tools on PC I’d need 32GB of RAM to prevent stutter. I only use ~12GB of 16GB doing the same on Apple Silicon.
8GB of UM is not for someone running two pro apps at once. It’s for grandma to use for online banking and check her email and Facebook.
My dude, you’re literally in here arguing that because Apple has a blob for both CPU memory and GPU memory that somehow makes that blob “not RAM.” Apple’s design might give fantastic performance, but that’s irrelevant to the fact that the memory on the chip is RAM of known and established standards.
Read my other replies to this comment. There’s no GPU. It’s an SoC.
BCM2835 is SoC too. And RK3328. And Mali-450 is GPU.
apple.com/…/apple-unveils-m3-m3-pro-and-m3-max-th…
Each power intensive process is given its own dedicated core. The OS is designed specifically to send dedicated processes to the associated core. For example, your CPU isn’t bogged down decrypting data while loading an application.
You can’t compare it to anything else out at this time. Just learn about it, or don’t. Guessing is just a waste of time.
docs.kernel.org/scheduler/sched-capacity.html
Basic priority-based scheduling.
Sent to one of two processors on a PC, or 18-52 dedicated cores in an M chip.
Great topic switch. Also, what century do you live in?
The topic is whether 8GB of UM on an Apple Silicon Mac is acceptable for a base model.
I’ve explained how the UM is used strictly as a storage liaison due to the processor having a multitude of dedicated cores, with the ability to pass data directly without utilizing UM.
I don’t know what you want from me, but maybe you should just do your own homework instead of being combative with people who understand something better than you.
I really doubt they run apps with cache turned into scratchpad memory.
Like has been done on laptops with on-board video cards since, well, forever?
It’s different. The GPU is broken into several parts and integrated into the SoC along with the CPU’s dedicated processes. Data is passed within the SoC without entering UM. It’s exclusively used as a storage liaison.
You should check out Apple Silicon M-Series. Specs don’t translate to performance in the way conventional PC architecture does. I guarantee you’ll see PC manufacturers going to 2nm SoC configurations soon enough. The performance is undeniable.
Soooo Integrated Graphics?
Negative.
apple.com/…/apple-unveils-m3-m3-pro-and-m3-max-th…
So it’s not on same chip with CPU?
A CPU performs integer math.
A GPU performs floating-point math.
Those are only two of the 18-52 cores (model dependent) of Apple M chips. The OS is designed around this for maximum efficiency. Most Macs don’t even have a fan anymore.
There. Is. No. Comparison. In. PC.
A GPU performs integer math.
A CPU performs floating point math.
All four statements are true.
That’s correct. My mistake.
Dude, it is just GDDR#, the same stuff consoles use. PCs have had this ability for over a decade, mate. Apple is just good at marketing.
What’s next? When VRAM overflows it gets dumped into regular RAM? Oh wait, PCs can do that too…
With independent CPU and GPU, sure. There’s no SoC that performs anywhere near Apple Silicon.
According to benchmarks, the 8700G vs the M3: on average 22% slower single-core, 31% faster multi-core, 41% higher FP32, and 54% slower at AI. The 8700G also uses 54% more energy.
What about those stats says AMD can’t compete? The 8700G is an APU just as the M3 is.
I’m talking about practical use performance. I understand your world, you don’t understand mine. I’ve been taking apart and upgrading PCs since the 286. I understand benchmarks. What you don’t understand, is how MacOS uses the SoC in a way where benchmarks =/= real-world performance. I’ve used pro apps on powerful PCs and powerful Macs, and I’m speaking from experience. We can agree to disagree.
I grew up with a Tandy 1000 and was always getting yelled at for taking it apart, along with just about every PC we owned after that too.
Benchmarks are indicative of real-world performance for the most part. If they were useless we wouldn’t use them (kinda like userbenchmark).
The one benefit apple does have is owning its own ecosystem where they can modify the silicon/OS/Software to work with each other better.
That does not mean the M3 is the best there is and can’t be touched; that is just misleading.
The 8700G is gonna stomp the M3 using Maxon’s software suite just as the M3 will stomp the 8700G using Apple’s software suite.
Then also, on top of that, the process node for manufacturing said silicon is different (3nm vs 4nm); that alone allows for a 20% (give or take) performance difference, just like every process node change in the past decade or so.
I’ll take the loss on the experience part as the only apple product I own is an Apple TV 4k, but there are many nuances you’ve obviously glossed over
Is the M3 a good piece of silicon? Yes. Is it the best at EVERYTHING? Of course not. Should Apple give up because they are not the best? Fuck no.
Man, you’re kinda off the point. This is about how much UM is appropriate for a base model. I’m simply saying the architecture of an SoC utilizes UM as a storage liaison exclusively, since CPU and GPU are cores of the same chip. It simply does not mean the same thing as 8GB of RAM in standard architecture. As a pro app user, 16GB is enough. 8GB is plenty for grandma to check her Facebook and online banking.
Am I missing the point really? UM is not a new concept. Specifically look at the PS5/X:SX
pcgamer.com/this-amd-mini-pc-kit-is-likely-made-o…
Notice the soldered RAM and lack of video card? Kinda like what the M series does.
And when all is said and done, 8GB is not nearly enough, and Apple should be chastised for it, just like Nvidia when they first decided to make 5 different variations of the 1060, making sure 4 of those variations would become e-waste in a few short years, and again with the 3050 6GB vs 3050 8GB.
They both have independent CPU and GPU. UM is not used to pass from CPU to GPU on an SoC system; it’s exclusively a storage liaison. Therefore it’s used far less than in non-SoC applications.
The CPU and GPU are one chip. Learn about Apple Silicon SoC rather than trying to find a comparison. You won’t find one anywhere yet.
Have you used an 8 gig ARM Mac?
I’m pretty brutal on my machines, and my 8 gig M1 really only starts to beach ball when multiple accounts are open and those accounts all have bloated multimedia software running.
My 16 gig machines can handle that use case fine, but the 8 gig machine will occasionally beach ball.
Personally, I won’t buy an 8 gig config again. But I’m a fucking monster that leaves a million bloated things open across multiple active user sessions.
To be fair, M-series Macs are pretty insanely efficient with memory. Unless you’ve actually used one extensively, I can understand the attitudes here…BUT:
I’ve done broadcast animation for many years, and back in ‘21 delivered an entire season of info/explainer-type pieces for a network show — using Motion, Cinema 4D, and After Effects (+ Ai and Ps) — all of it running on a base-level, first-gen M1 Mini (8/256). Workflow was fast and smooth; even left memory-pig apps running in the background most of the time…not one hiccup. Oh, and everything was delivered in 4k.
So 8GB actually is plenty for most folks…even professionals doing some heavy lifting. Sure, I’d go for 16 on the next one, but damn I was/am still impressed. (Maybe it sucks for gaming, I don’t do that so have no clue).
It’s clear that the M3 MacBooks are noticeably slower with 8GB of RAM than with 16GB for various tasks, though, including photo & video editing, and 3D rendering.
Sure, 8GB gets the job done but why are Apple selling “professional” grade laptops in this price range that clearly require additional memory to reach peak performance?
Point taken! Clearly more is always better. Don’t have any experience with the M2 or 3.
I’m just adding a personal experience with having the minimum be plenty to get big jobs done.
I get more because I know I’ll need more. I don’t get less and then complain I should have gotten more even though I knew I couldn’t upgrade later.
Really, Apple just shouldn’t have said what they did and they wouldn’t be in hot water.
It doesn’t matter how ‘insanely efficient’ they are. If your tasks need more than 8GB of memory you are going to run out and start swapping to disk.
8GB worth of data is not heavy lifting for professional use.
…And yet…?
My point is that while of course more is better, 8 sufficed for me…a professional, doing demanding…professional…work.
“Sufficed” is not an objective term, but it’s still not a favorable one, especially for machines that cost that much.
Your original point was that Apple’s CPUs are somehow more ‘efficient’ with RAM. That’s misinformation, to put it kindly.
It mostly just shows how crazy fast modern SSDs are that they can do swap duties with performance that is acceptable to many people. The SSD in my MacBook Pro can read/write at 5-6 GB/s. That means it can write out the whole 8 GB of memory of one of those smaller machines in under 2 seconds. As long as your current task fits in 8 GB and you’re fine waiting 2 seconds to switch between apps…
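That back-of-envelope arithmetic can be sketched directly (the 5–6 GB/s figure is the one quoted above, not a measured spec):

```python
# Back-of-envelope: time to page a machine's entire RAM out to SSD swap.
def swap_flush_seconds(ram_gb: float, ssd_write_gb_per_s: float) -> float:
    """Seconds to write ram_gb gigabytes at ssd_write_gb_per_s GB/s."""
    return ram_gb / ssd_write_gb_per_s

# 8 GB at the ~5 GB/s quoted above: "under 2 seconds".
print(swap_flush_seconds(8, 5))  # → 1.6
```

In practice swap traffic is smaller, scattered writes rather than one sequential dump, so the real switch is slower than this best case.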
Yes, if you don’t run out of RAM you won’t face RAM performance issues…
I wouldn’t be ok waiting 2 seconds to switch between apps on something the price of a Mac laptop, even the cheapest M1.
To be fair, at the price point of Macs, 16GB is easily achievable.
Apple is acting like this article does not exist.
I was using my 2016 (or so) MacBook Air the other day and getting low memory errors. I thought, wow, this thing only has 8 gb, maybe it’s time to upgrade, just to see this 😐
My 2009 Mac mini had 8gb of RAM. And it wasn’t even very expensive to do so when I did it in ~2013. Couple hundred bucks max.
Couple hundred bucks for 8 gigs of ram?
Yeah I didn’t even pay a hundred bucks for the 32GB of RAM in my current desktop, and it’s DDR5.
11 years ago. May very well have been less. Apple’s still probably charging more than that to go from 8 to 16, and I had to buy all 8 and replace both DIMMs.
Part of the difference is that the Apple silicon Macs aggressively use SSD swap to make up for limited memory. But that’s at the expense of the SSD lifespan, which of course isn’t replaceable.
I’d never recommend a Mac, but the prices they charge to get a little more RAM or SSD over base are crazy. The only configurations offering any “value” are the base models with 8gb RAM.
Even the PC manufacturers selling “gaming” PCs using integrated graphics aren’t usually this brazen about it.
Apple said some pretty dumb things to defend that 8gb, but let’s not pretend that most manufacturers don’t do the same thing.
For years people have known it can’t be upgraded. You know that going in.
No one complains that video cards on (most) laptops can’t be replaced, yet many of them wind up being useless for anything but daily tasks.
Not sure that is true, lots of people see the marketing for a MacBook and think that any of them will be enough. Or see the price difference and think they are getting a good deal, or don’t understand why that is. I’ve had to tell people, sorry I know you spent a lot of money on this, but it does not have the storage for what you are wanting to do. Yes, the only way is to buy another one.
Otherwise yea, everyone tries to gaslight customers into thinking they didn’t get ripped off.
Sure, some people buy a computer without knowing anything about the computer.
They say it right there. Should it be red and flashing? Should there be a confirm button?
If you go into the Apple Store, someone who is trained to help is always available, and various models are typically in stock.
I’d like to firmly repeat, that Apple never should’ve said that bullshit. Also I feel that 16 gigs should be the standard amount for any Apple laptop. They are premium products. Perhaps the Mac Mini could start at 8.
And since you pulled out the gaslight, I’ll call you a misinformed accuser.
I bought one of the early M1s and bought into a lot of the early reviewers that claimed 8 was enough on the ARM architecture. Honestly, for most folks, it’s probably fine. For me, it’s not.
My wife and I use the M1 as a multi-account family machine. And we’re both experience design directors, so we both have RAM hog design apps open under our accounts. The poor little Mac just can’t handle all that abuse with 8 gigs.
Our old ass Intel Mac with 16gig of RAM had no problems keeping a ton of crap open.
The battery life and low heat are absolutely amazing on the M1. That stuff was a monumental upgrade. But we absolutely can’t be lazy and just leave crap open unless it’s actually needed.
The fact that Apple is selling “Pro” machine with 8 gigs is a joke. 8 would be fine for my folks who fart around on Facebook all day, but it’s not enough for a lot of heavy multimedia work.
8 megs of RAM? I didn’t know they brought back the Macintosh II.
lol. Fixed. My brain is broken.
I dunno if you noticed or if that was the joke. But you said “8 megs” three times in your comment when I think you meant to say “8 gigs”. 1 gigabyte ~ 1024 megabytes. Just wanted to let you know in case it wasn’t a joke about how 8 wasn’t enough. That’s all, thank you!
lol. Apparently my brain is broken.
Actually, 1 gigabyte (10^9^ B) is 1000 megabytes (10^6^ B), while one gibibyte (2^30^ B) corresponds to 1024 mebibytes (2^20^ B). I know that in some circles, 1 GB is treated as 1 GiB, so I don’t blame you. This system of quantities is standardised internationally in order to conform with the SI (mega must mean a million times and not 2^20^ times), but many don’t conform to it, such as Microsoft as far as I know.
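The distinction is easy to check in code:

```python
# SI (decimal) vs IEC (binary) prefixes for bytes.
MB, GB = 10**6, 10**9      # megabyte, gigabyte
MiB, GiB = 2**20, 2**30    # mebibyte, gibibyte

print(GB // MB)            # → 1000 megabytes per gigabyte
print(GiB // MiB)          # → 1024 mebibytes per gibibyte
print(round(GiB / GB, 3))  # → 1.074: a GiB is ~7.4% bigger than a GB
```

That ~7.4% gap is also why a drive sold as “1 TB” shows up as roughly 931 “GB” in operating systems that report in binary units.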
Thank you for the correction and details.
I found for most CS-ish tasks 8GB is okay. I also bought an early M1 and haven’t had too many problems outside of running VMs, which I expected. I purchased one of the stocked configurations at an Apple store, so there were slim pickings with 16GB of memory that weren’t like double the price of the machine.
Yeah, my guess is 2x accounts is the cause of 90% of my performance issues. One person’s Adobe crap is fine, but two is too much for 8gigs without the occasional beach ball.
Is Adobe still the standard? When I realized browsers and 3rd party apps render PDFs much quicker than Reader, I started looking for other alternatives to Adobe. I was familiar with the flow of PaintShop Pro and GIMP, so now the very little I did in Photoshop I do in GIMP/Inkscape/a couple other freebie tools. When they acquired Macromedia and killed Flash, I was out of their ecosystem, so my poor knowledge of their products is almost 2 decades old. What are their can’t-live-without products nowadays?
Depends what you’re doing, but for branding and print media, Adobe still dominates most shops. If you’re doing UX, then you’re probably in Figma these days.
Ooh, Figma looks interesting, thanks!
Yeah, Figma is the new standard for UX design. Adobe was trying to buy them for the last couple years because most people no longer use Adobe tools for UX work.
I’ll admit I don’t use Macs, so maybe they are more efficient than the Linux and windows machines I work off…
…but I typically use machines with 64GB and recently upgraded my personal machine to 128GB. I still swap about 50GB to my SSD from time to time.
And I’m not doing heavy graphic design or movie editing stuff.
I cannot fathom for the life of me how 8GB would ever be feasible.
How the fuck are you using that much RAM if you aren’t doing “heavy duty” stuff???
I just said I’m not doing graphic design or movie editing. I typically have 10 different browser profiles open to separate data / bookmarks, maybe 8 email accounts in tabs and Outlook (if not on Linux), 4-8 VS code windows, a mix of jetbrains rider or visual studio instances, a smattering mix of postman/SQL server/azure data studio/thunder client, among other things like PDFs and documents. And then multiple docker containers and other local running servers.
The swap usually comes in when I’m parsing a data file or something.
I do not want to see what your desktop looks like lol
Hahaha, it stressed me out so I hid all the icons and changed the background to just black.
Excuse me, can I get some more pepper for this troll dish?
I do a lot on my M1 air and I haven’t even considered I would have RAM issues with 16GB. Windows, I would be getting 64GB to not be miserable. I don’t run as much as you all the time, but having a container or two going, far too many browser tabs, PDFs, 3-4 intellij projects, discord, teams, and probably other things I am forgetting about is the norm. I even have AutoCAD open sometimes.
The biggest difference is Mx is arm based, which goes a long way into getting better performance and battery life. I really need to look up again how Apple manages memory, swap, and performance in general. I just checked Activity Monitor and even with most of the memory showing as used, I don’t even notice. If my laptop were to die tomorrow due to my clumsy fumbling, I am getting another Mac. My only wish is getting Vulkan support. That would be amazing. Not going to hold my breath on that though.
Now, 8GB is a crime and it is not something I would recommend for any laptop/desktop, no matter what it is running. Not saying it wouldn’t work ok on a Mac for someone who only uses it for web browsing, but it is utterly ridiculous that 8GB is even an option these days. This is a dumb hill for Apple to die on and 16 should be the absolute minimum.
I have a debloated W11 VM on my proxmox server that I have used only once and is only there for some unknown emergency. With a little fiddling, I got it to idle under 4GB. I don’t plan to run servers on my laptop and invested enough on a little server rack to give me things like file servers, VMs, more permanent containers, and somehow got talked into making a gaming VM that I use at LAN parties. The 3U case for the main server travels very well.
Personally, I would try and get some of your server stuff off your machine. You can even take a look at Docker Swarm or similar k8s concepts to reduce your container load. RPis are another good choice for some lower-load server operations. I have a little RPi swarm that is powered by PoE+, though I plan on trying k8s on them soon to get some experience. RPis are also small enough that you could throw one in your bag if you needed something portable, and are fairly inexpensive. Just a thought, and may not be possible with your server applications.
Hmm, getting server stuff off sounds fun! I have a couple laptops sitting around so it might be fun to even just use those to offload some processes.
I’d love to get my own little server rack or something, not the best timing financially, but that’d be awesome.
I’ll have to look into the RPi thing. Thanks for the ideas!
For me, it’s huddle (the conf call thing of slack), zoom, and a few Google sheets. Very easy to get to OOM killer
With 64GiB or more of RAM?
That VM has 16.
I get the sense that a lot of people here don’t use MacOS.
I have a few ARM and Intel Macs in 8 and 16gig configs, and I do a lot of heavy multimedia work. My 8 gig M1 only really gets into trouble when my partner and I both have an account with files open in bloated creative software. One pro user, and it’s usually fine. 2 active accounts with shitty creative software running, and you get a few beach balls.
Interesting to know for sure! I guess I can’t speak to what they’re doing for optimizations first hand, but at the same time…my 128GB cost me like $300 on sale so, I dunno, a wash? Haha.
I’ve tried to become a Mac convert a few times, mostly peer pressure, but I just haven’t been able to do it successfully yet.
Yeah, if I’m building a PC, I’ll throw in as much RAM as I can get.
That said, with 16gigs I’m usually not thinking about RAM at all. I’d probably only want to go higher than that if I was living in Adobe Lightroom 24/7.
There are a ton of benchmark videos on YouTube. I saw one recently for the new MacBook Air comparing the 8/16/24 GB models. They found that 8GB was significantly slower than 16GB for tasks like exporting video, but there was no difference between 16 and 24 gb.
I wish that was true.
Do you understand kernel memory management fundamentals? I’m asking because what you wrote here strongly suggests otherwise - so, unless you’re able to show me I’m wrong, I’m going to stick with my conclusion that this is all incorrect and likely complete bullshit.
You do you.
You seem particular in a way that is breathtakingly unfun.
It’s worth mentioning that Windows will use as much RAM as possible just because it can, and leave available what it considers “reasonable”.
Then WHAT ARE YOU DOING?
Code code code
I don’t think even Eclipse can burn so much memory
Dude, that’s how much RAM I used to have on a super high-end dev box at work with 56 cores. It was very helpful for compiling Chrome. WTF are you doing with a personal machine that needs that much RAM?
I mean it’s my personal machine but I am a software engineer consultant/contractor so I use it for work, too.
Ok fair enough. It’s just surprising to see someone say that. The standard-issue dev machine where I work is a laptop with 32 GB.
I have a five year old MBP here with 16 gigs of RAM and it runs the latest version of macOS. I can run multiple web browsers with dozens of open tabs, VS Code, an LLM, and a video editing app on it, all simultaneously, without breaking a sweat.
IDK what Apple’s secret sauce is but their shit just works better than everyone else’s, that’s a fact.
My basic web dev Docker suite uses about 13GB just on its own, which - assuming you were on 16GB (double Apple’s minimum) - wouldn’t leave much for things like browser tabs, which also eat memory for breakfast.
A fast swap is not an argument to short-change on RAM, especially since SSDs have a shorter lifespan than RAM modules. 16GB remains the absolute bare minimum for modern computing, and Apple is making weak, ridiculous excuses to pocket just a few extra bucks per MacBook.
Have you seen the price difference between the 8 and 16GB MacBooks? It is ridiculously expensive.
Nah, it’s about £13 retail.
Oh wait, you mean from Apple… It’s £200 from them.
Yes, my bad, I meant the difference in price between the 8 and 16GB models. I know that RAM has become dirt cheap nowadays and there isn’t any excuse for Apple to continue offering an 8GB model, as this is exactly planned obsolescence.
Yeah I was just pointing out the insanity of their pricing, using sarcasm. Its the main way we communicate over here.
The price difference between the first 2 models, where 8GB of RAM is the only change, is £200. Post 2025 I’m going to need some solution to replace my Windows install, which solely runs CAD/CAM software. If it wasn’t for this scumbaggery I’d buy a Mac to replace Win10, but at present Apple are such a shower of cunts I think I may have to put up with Win11.
What a fucking choice…
You know there’s a third way…
I already run Linux for everything else. Its not an option for my CAD work unfortunately
What do you run that takes that much memory?
PS5 has 16GB and it’s a toy.
Wow! 13GB! I did some heavy stuff on my computer with like a shit ton of Docker servers running together + deployment and I never reached 13GB!
Without disclosing private company information lol what are you doing ;)
not OP, but I have to run the frontend and backend of a project in docker simultaneously (multiple postgres and redis dbs, queues, search index, etc., plus two webservers), plus a few browser tabs and two VSCode instances open; it regularly pushes my machine over 15GB of RAM usage
pretty much like this
That was a fun song, t4t.
That is basically my use-case. You add a DB service (or two), DNS, reverse proxy, Redis, Memcached, etc… maybe some containers for additional proprietary backend services like APIs, and then the application themselves that need those things to run… it adds up FAST. The advantage is that you can have multiple projects all running simultaneously and you can add/remove/swap them pretty easily.
RAM is cheap. There is no excuse for shipping a 8GB computer… even if it’s mostly going to be used for family photos and internet.
Running a suite of services in containers (DBs, DNS, reverse proxy, memcached, redis, elasticsearch, shared services, etc) plus a number of discrete applications that use all those things. My day-to-day usage hovers around 20GB with spikes to 32 (my max allocation) when I run parallelized test suites.
Docker’s memory usage really adds up fast.
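If you want to see where it all goes, `docker stats --no-stream` prints per-container usage. A small sketch that totals a column shaped like that output (the container names and numbers below are made up for illustration):

```python
# Totals the memory column of output shaped like:
#   docker stats --no-stream --format "{{.Name}} {{.MemUsage}}"
def total_mib(stats_output: str) -> float:
    units = {"KiB": 1 / 1024, "MiB": 1, "GiB": 1024}
    total = 0.0
    for line in stats_output.strip().splitlines():
        used = line.split()[1]  # e.g. "1.2GiB" from "db 1.2GiB / 15.6GiB"
        for suffix, factor in units.items():
            if used.endswith(suffix):
                total += float(used[: -len(suffix)]) * factor
    return total

sample = """db 1.2GiB / 15.6GiB
redis 256MiB / 15.6GiB
elasticsearch 2.5GiB / 15.6GiB"""
print(round(total_mib(sample), 1))  # → 4044.8 MiB, before the apps themselves
```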
Playing devils advocate here: As someone who deals with stuff like that, you also wouldn’t buy the base model mac. The average computer user can get by with 8GB just fine and it’s not like you can’t configure Macs with more than that.
That of course doesn’t justify the abhorrent price of the upgrades…
And here I am, putting 16gb in every machine I work on because it’s so damn cheap there’s no reason not to future proof
I mean, same. The difference in price for 8GB and 16GB is negligible, especially if you want dual channel on desktops
For apple, that difference is $200… not negligible I’d say
Oh yea, absolutely. I meant that in regards to the price of memory itself, be it as modules for your desktop PC or the chips itself for soldered solutions. Apple’s markup is bonkers
That’s because apple is a greedy grabby company who wants all your money. The easiest solution is to stop buying their products
My girlfriend’s mum wanted to know why her laptop was slow… It was because HP thought that 4GB of RAM was acceptable in 2022 (when the laptop was sold). Granted, RAM wasn’t as cheap then as it is now… Still, I paid £30 for a brand new 8GB DDR4 SODIMM; there’s no reason HP couldn’t do that. It’s annoying, the corners these companies cut.
My experience is that 4GB is just about useable for a bit of web browsing and similar stuff. Even on Windows 11. I have an old Surface Pro 4 laying around that, in a pinch, works perfectly fine with 11. Of course, it’s not fast. But it’s totally useable.
Her laptop just wasn’t having it on Windows 11: Windows was using 3.7GB of RAM and it took about 30 seconds for Task Manager to open. As soon as I upgraded the RAM it was usable.
I checked for any surprising background services or anti virus software and there was nothing really
That sounds more like issues Windows would have running on an HDD (or maybe eMMC) instead of an SSD… But that wouldn’t explain why it got better when you upgraded the RAM…
It’s not worth trying to understand Windows RAM usage; it will drive any sane person insane. The laptop uses Intel Optane as its main drive, which is slower than an SSD but has much, much lower latency, so it should actually be perfect for the job of being swap. But it shit the bed.
I just slap in 32GB on every computer I build because the MoBos can take 128GB and anything less feels cheap and silly.
Hard disagree. The average computer user is idling at 5gb already because the average computer user is stupid.
Still leaves 3gb for the web browser and the average user isn’t using anything else anyways. And even on chrome that’s quite a few pages.
No they can’t. I ran 8gb of ram for years and it turns out that that’s why my computer sucked
Maybe you’re not an average user then. Most people just browse the web and maybe manage some photos or fill out a document once in a while. You could do that on 4GB if you wanted to, let alone 8.
I wouldn’t say 4gb is usable for the average consumer. Using the assumption they’re using windows 11 that’ll eat 3.7 ish GB of ram just idling.
You’re forgetting, though, that a lot of the RAM that Windows (and most modern operating systems) uses while idling is a cache of programs you’re likely to open, and that gets cleared if you open something else. That has been a thing since Vista and was btw one of the reasons why Vista was criticized for high memory usage. Windows 11 is very useable with 4GB of RAM if you’re not planning to do something bigger than browsing the web or editing a Word document.
I’m not forgetting that, but it won’t just clear that RAM; it will want to put it into swap, and depending on your storage speed that can slow tasks down, making things quite stuttery.
I mean, a (good) SSD is worth quite a lot, even on very old systems. I have an old 2008 MacBook laying around. It’s certainly not fast but with an SSD it’s totally useable, even on current macOS versions.
Oh for sure, I remember buying my first SSD and booting windows in under 10 seconds and being like whaaaat.
I am starting to think maybe I am a ram hog.
How? I have 108 tabs open and still use 2.67GB of RAM.
Tabs of what? Chrome’s RAM usage is more of a meme than an actual RAM issue; Windows will only allow an application to use so much RAM depending on RAM availability.
108 tabs in chromium. Mentioned RAM usage is total RAM usage including all system and kernel, but excluding page cache. Forgot to mention libreoffice in background.
Skill issue
average webdev
The people need to know how you use 13GB of ram worth of containers for web dev.
Docker is awesome for a lot of things. But it’s not particularly good for RAM.
Why tf can’t they sell Macs with upgradable parts?? They are “so” into renewables and recycling and saving the planet and stuff. Then they should start selling their shit with upgradable parts. Even CPUs if possible. Now Apple fanboys will argue with that. And don’t bullshit me with “the RAM has to be near the SoC for faster performance”; they can redesign the mobo.
There are legitimate advantages of the RAM being soldered right next to the SoC. However, if anyone could figure out how to create a proprietary RAM module that slots in right next to the SoC (or even just an SoC module including RAM), can be swapped out, and doesn’t have any meaningful performance impact, it would be Apple. It’s just that it never would be Apple…
The problem is the electrical resistance of the socket. Most of the performance on apple silicon is achieved through extremely high bandwidth, low latency memory. Unfortunately that necessitates a socketless design at the moment, and you can see that happening on the snapdragon X too.
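The bandwidth half of that is just transfer rate times bus width, so the real win comes from how wide a bus you can route when the memory packages sit next to the SoC. A rough sketch (the two configurations below are illustrative examples, not any specific machine’s spec):

```python
# Peak theoretical memory bandwidth: transfers/s * bus width in bytes.
def peak_gb_per_s(mega_transfers_per_s: int, bus_width_bits: int) -> float:
    return mega_transfers_per_s * 1e6 * (bus_width_bits / 8) / 1e9

# A 128-bit on-package LPDDR5-6400 setup:
print(round(peak_gb_per_s(6400, 128), 1))  # → 102.4 GB/s
# Dual-channel socketed DDR5-5600 (also 128 bits total):
print(round(peak_gb_per_s(5600, 128), 1))  # → 89.6 GB/s
```

At equal bus width the gap is modest; the bigger Apple parts mostly widen that bus (the Max-class chips run a roughly 512-bit interface), which is impractical with socketed SODIMMs, and the socket’s latency and signal-integrity penalties come on top of that.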
Yea, not just snapdragon and apple. Even intel and amd processors usually get paired with higher bandwidth soldered ram on many mobile offerings.
And on GPUs soldered VRAM has been a thing for a loooong time, with HBM memory being the prime example for what RAM close to the chip can do. AMD’s Vega cards were highly sought after during the mining craze, even though they weren’t that fast in general computing, simply because their memory bandwidth was so far beyond any other consumer card’s…
Because that gives the user as much or more control over the device as Apple themselves have. One of the fairly consistent things about Apple over the years has been a desire to maintain tight control for themselves over the products they make.
Because then they can’t gaslight people into thinking their 8GB is magical.
There is what they say they are in favor of, and there is what they really are in favor of.
They are in favor of apple getting all the monies, the end
They certainly used to. My wife’s 2012 MacBook Pro has upgraded RAM and SSD parts I’ve put in over the years and still runs fine, though it isn’t used much anymore and OS upgrades stopped a while ago.
Their current environmental marketing is pure greenwashing bullshit and their stances on upgradability and repairability are terrible.
It's basically just greenwashing. They pretend to be into renewables and recycling only when it doesn't disincentivize people from buying the newest product. Ex: iPhone trade-in for recycling. Yes, they do recover some raw material, but you can only do it if you're buying a new iPhone with that credit, and it's probably also an attempt to keep cheap used iPhones off of the market.
Can't have users getting all uppity with excess memory, after all.
Of course they do.
Tim Apple be like “We’ve tried charging more money. Have we tried charging more money and delivering less stuff in exchange?”
Yes, they do, constantly. Yet people still keep buying. I hate that I have to use Apple for my job because the software and interface are exclusive.
Yup, same. I really don’t like macOS, but that’s what we’ve standardized on. I’m a Linux guy and use Linux at home for everything.
I really like my macbook for dev work, and I think that now that macos is essentially a linux distro it’s quite nice, but it’s not that much better than the free distros and it’s getting worse while they get better. Right now the only thing keeping me on a mac at work is that they gave it to me and the only thing keeping me on a mac at home is that it’s already paid for.
you wanna expand on why you think it’s basically a linux distro? Last i heard macos was more closely based on BSD than it was linux, and this was ages ago. Unless they rewrote it without my knowledge it really shouldn’t be anything like either one of the two.
because I can pop a terminal into zsh and beyond that I don’t really know the taxonomy
you can do that with WSL though.
All modern terminals are actually terminal emulators, unless you’re sitting in TTY. It’s pretty trivial to implement a proper UNIX/linux like CLI environment.
okay
and for completeness' sake here: technically Windows implements a "terminal" through CMD. It's not Linux/UNIX-like at all, but it is still a CLI interface, so.
Lol, audio jacks come to mind. As well as a physical button. And shipping devices without cords or chargers.
Granted, I'm a developer and my dev IDE already uses a good 10+GB, and I have probably hundreds of tabs and windows open over 6 desktops… But I got 64GB, I'm considering upgrading to 128, and these clowns think 8 is okay today? My development laptop of like 10 years ago had 8GB.
I've been okay with 16 for a while. I use Vim as my editor, and occasionally VSCode. I use a single desktop, but I generally have a half dozen or more tmux tabs for various parts of the project.
That said, I've been feeling a bit squeezed with 16GB lately; the main RAM consumers add up fast.
So I think 16GB should be the minimum, and 24GB should be average. I’m going to be adding another 16GB to my personal development machine (hobbies and whatnot), and my work laptop can’t be upgraded (MacBook), but I’ll be upgrading to an M3 or M4 soonish and will request more RAM.
8GB is probably fine if you’re just running a browser and that’s it. If you’re doing anything else, 16GB should be the minimum.
Most people are similar to him.
I have 16GB and I have to run the shit I dev on in local k8s. I sometimes have to close Teams and my browser just to free up enough RAM.
Buy more memory, if you have the financial means to do so. If not then I’m sorry you’re in that situation
My students with the 8gb version struggle to do basic audio work with only a few plugins. This is BS from apple. Unless you use your computer only for web browsing, in which case you shouldn’t get a stupid mac in the first place.
To be fair I have no idea why audio plugins need so much ram
to be fair, apple is the one literally curating this experience so it “just works” only to then fuck it up somehow.
Apple has a masterclass of tiering their products in just a way so that in every tier but the upper tiers, you’re giving up something really important. If you spend the least you possibly can on a MacBook, apple guarantees you’re going to have a very bad time for “doing the bare minimum to be seen with a laptop with an apple logo on it”. Their whole tier system is an exercise in “how can we get away with fucking up these things just enough so the customer feels like it is necessary to spend a little bit more” every step of the way. Then they make it unupgradable so you can’t sidestep their crafted feature tier system.
nvidia also does this. It’s actually insane.
Love spending 200 USD on 16GB of ram in 2024 because of apple, very cool, or however much they charge, it’s still too much.
Latency is a bitch. If you want anything to run on real time with zero latency, then it means everything, including those pretty large sample data, has to be stored as close to the processor as possible. Compressing/decompressing takes a shit tonne of time and effort, and to keep both delay down and fidelity up, you have to pay in absurd amounts of RAM to the DAW shrine.
Lmao I’d take my chonky ass dell laptop with expandable ram any day of the week
my w520 would win in a fist fight against the latest macbook, hell any of them ever produced.
Isn't "it's good enough for most users" a little too close to "it's good enough to be bought, used for a bit, and then tossed"? Eventually, computers that were adequate for X stop being able to do X. There's little to no margin, and you can't upgrade it.
Yeah, sure. Even if what they say about the OS resource usage is true, it's only a fraction of the total usage. A lot of multiplatform software will use the same resources regardless of the OS. Many apps eat RAM for breakfast, doesn't matter if it's content creation or software development. Heck, even smartphones these days have this much RAM or more.
I won’t argue, I just won’t buy an Apple product in the near future or probably ever at all.
buys [insert price] laptop, top of the line, flagship, custom silicon, built ground up to be purpose specific.
Opens final cut pro: crashes
ok…
Especially paired with Apple’s 128gb integrated, non replaceable hard drives. Whoops you installed all of Microsoft office? Looks like you have no room to save any documents :(
ah yes, we can't forget the proprietary, controller-less drives that use the m.2 form factor but aren't actually NVMe drives; they're just flash.
No way. It isn’t NVME?!?!
it's "NVMe" in the sense that it's non-volatile flash, probably even higher quality than most existing NVMe SSDs out there today.
The thing is that it's literally just the flash, on a card with an m.2 pinout that fits into an m.2 slot. It doesn't have a storage controller or any standardized method of communication, even though those already exist. It's literally a proprietary, non-standard SSD in a standard form factor.
The controller is integrated into the SoC die itself; there is no storage controller on the storage card.
Yet another reason to never go back to Apple
apple, we innovate where no one else does, because for some reason, we like doing that.
Apple, we innovate where “everyone” else has already done.
Fixed it for you.
Apple, we innovate things, sometimes, for reasons.
Same. And I bet you the price will also go up with less ram.
I mean. It makes sense. The vast majority of people buying apple computers are loyalists or people that simply need an Internet/word processor.
And if you want to develop in apple then you have to spend a massive premium for their higher end hardware.
Their CPUs are actually really good now, when the apps are actually optimized for them. Especially in single core, they are very competitive with top Intel or AMD chips while being way more power efficient.
ex: in Geekbench 5.1 single-core, the M2 Max gets 1967 points (85%) compared to 2311 from the 7950X3D and 2369 from the 14900K. The M2 Max (12 cores (8P + 4E), 12 threads) draws a maximum of 36 watts, while the 7950X3D (16 cores, 32 threads) can draw around 250 watts and the 14900K (24 cores (8P + 16E), 32 threads) can draw around 350 watts.
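Running those quoted figures through a quick points-per-watt calculation (peak package power, so this is only a ballpark comparison, not a proper benchmark):

```python
# Rough perf-per-watt from the figures quoted above. Peak power draw
# flatters nobody precisely; it's just an order-of-magnitude check.
chips = {
    "M2 Max":  (1967, 36),   # (Geekbench 5.1 single-core score, max watts)
    "7950X3D": (2311, 250),
    "14900K":  (2369, 350),
}

for name, (score, watts) in chips.items():
    print(f"{name}: {score / watts:.1f} points per watt")
```

Even granting that single-core load never hits the full package power budget, the efficiency gap is roughly an order of magnitude.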
Apple’s GPUs are definitely lacking though, in terms of performance.
Ya. Their CPUs are really good. Got to give credit where credit is due.
8GB RAM is what my phone has.
Having that in a laptop shows what they think of people buying their kit. They think you’re only buying it so you can type easier on Facebook.
TBF 8gb of ram on a phone is actually psychotic. You really shouldn’t be doing all that much on a phone lol.
Then what should I be doing on my phone?
Obviously using it as a thin client for this MacBook, duh.
nothing that requires 8GB of ram lol.
I've played the entirety of Java Minecraft on an old ThinkPad with 4GB of RAM. It didn't crash (I don't use swap).
There literally shouldn’t be anything capable of using that much memory.
Is this bait? Because like, you could be rendering, simulating, running virtual machines. Lots of stuff that aren’t web browsers also eat ram
Web browsers also eat ram.
90% of which can be paged in the background, it’s not like most people are chronically browsing the web on their phones.
Yes, they do.
and it’s also the worst place to do that. If you’re going to be chronically online like me, you should at least give it clear boundaries between something you carry on you at all times, and something that you regularly have access to, like my workstation for instance.
Unless you like being horribly depressed or something.
I was trying to mention things that weren’t just web browsers. Since it seemed the comment was about programs that use more ram than they seemingly need to.
Edit: There’s like photogrammetry and stuff that happens on phones now!
And games!
games are probably a better argument honestly, but even at that point, it’s not a really good experience. Unless you buy a gaming phone, which i guess is an option. Regardless the mobile gaming market is actually vile.
i suppose photo editing would be one? Maybe? I’m not sure how advanced photo editing would be on mobile, it’s not like you’re going to load up the entirety of GIMP or something.
As for photogrammetry, I'm not sure that would consume very much RAM. It could, but I honestly don't think it would be that significant.
No, the photogrammetry apps all use cloud processing. The LIDAR ones don’t, but that’s only for Apple phones and the actual mesh quality is pretty bad.
on a phone? Why the fuck would anyone be running virtual machines on a phone?
My man, have you been to selfhosted? People are using smart phones for all kinds of crazy stuff. They are basically mini ARM computers. Particularly the flagships, they can do many things like editing video, rendering digital drawings, after they end their use life they can host adguards, do torrent to NAS, host nextcloud. You name it.
Something like the Samsung DeX app, which basically turns your phone into a mini computer with a keyboard, mouse, and monitor, wouldn't be too bad tbh for most people. Take all your shit with you in your pocket and dock it at home or at work or whatever.
yeah, i literally selfhost a server, running like 8 different services. I’m quite acclimated to it by now. Using a phone for this kind of thing is the wrong device. A chromebook is going to be a better alternative. You can probably get those cheaper anyway.
A big problem with phones is that they just aren’t really designed for that kind of thing, you leave a phone plugged in constantly and it’s going to spicy pillow itself. Let alone even trying to do that on something that isn’t an android. I cannot imagine the hell that self hosting on an android would be, let alone on an iphone.
I could see a usecase for it as a network relay in the event that you need a hyper portable node or something. GLHF with the dongling if you need those.
Unfortunately, if you already have a server, it's going to be better to just spin up a new task on that server, since the cost of running a new device is going to outweigh the cost of using an existing one that's already running. Also, you can get stuff like a Raspberry Pi or Le Potato for pretty cheap: not very powerful, but probably more utility, especially given the I/O.
Yeah, god forbids anyone ever does anything suboptimal or worse…for fun 😱
i'm not saying that you can't, but like, you shouldn't buy a phone with the prospect of turning it into a server. You should sell your old phone, or use it until it dies; that's probably going to be better in the long run honestly. You use a laptop? A desktop? An SBC even? All of those can be converted into a server with MUCH longer lifespans and better software support.
Mobile hardware often has a support period of like 2-3 years; although that's changed recently, the hardware life expectancy is probably more like 5 years at most. Meanwhile, desktop and laptop hardware can easily last like 10 years, even longer if you're okay with running legacy hardware.
My primary laptops are 10 and 12 years old respectively. They're perfectly fine for what I need. I would NOT want to be using a 10-year-old phone for that.
If you aren’t the type of person buying or owning laptops, you almost certainly do not know what self hosting is.
It sounds a lot more cost effective to get a used mini-pc than a flagship phone for any sort of server stuff.
literally this, anything other than a phone is going to be more purpose suited. cheaper, and probably more versatile. You’re spending money on a really expensive screen that you are literally not going to be using. You might as well buy something with a shitty screen, or none at all.
I got a ThinkCentre M700 with an i7-6700, 16gb of ram and a 256gb SSD for $70 total. It’s really hard to get a phone with anywhere near that value for money.
exactly, even if we're talking buying brand new modern desktop hardware. The sheer benefit of having a SATA port and being able to stuff an 18TB Exos drive in it, for example, will immediately pay for itself compared to what cloud storage would cost, while also not being limited by your internet uplink speed. You could easily run 10GbE if you really wanted to, although realistically 2.5GbE is going to be more apt.
Does the JVM count?
thats really funny, but no.
On a phone? I guess you could, although 4gb is probably enough for any video game that any amount of people use.
People use phone apps for photo and video editing these days. The common TikTok kid out there doesn’t use Adobe Premiere on a desktop workstation.
Phone apps often are desktop applications with a specialized GUI these days.
i mean yeah, but even then those aren’t significant filters, and what makes you think that tiktok isn’t running a render farm somewhere in china to collect shit tons of data? They’re already collecting the data, might as well provide a rendering service to make the UI nicer, but i don’t use tiktok so don’t quote me on it.
Those are also all built into tiktok, and im pretty sure tiktok doesn’t require 8GB of ram to open.
Pretty sure my Adobe Premiere comparison made it clear I wasn’t talking about the TikTok app itself but 3rd party apps to later upload to online services like TikTok.
Just because you are completely unable to think of use cases doesn't mean they don't exist.
i mean yeah, you could, but then TikTok doesn't have you on its app, and I'm pretty sure TikTok has a pretty comprehensive editing tool set; otherwise people wouldn't be making as much edited content on it.
even then, there are still a lot of people that do edit video intended for 9:16 consumption, and they do it on PC. Primarily because it’s just a better place to edit things.
What about running a chrooted nix install and using a vnc to connect to it? While web browsing and playing a background video? Just because you don’t use your ram doesn’t mean others don’t. And no, I don’t use all my ram, but a little overhead is nice.
on a phone? I mean i suppose you could do that, but VNC is not a very slick remote access tool for anything other than, well, remote access. The latency and speed over WIFI would be a significant problem, i suppose you could stream from your phone to your TV, but again, most TVs that exist today are smart TVs so literally a non issue.
my example here was using a computer rather than a phone, to show that even desktop computing tasks, don’t really use all that much ram.
Well, then by that logic, since desktop computing tasks don’t really use all that ram: we shouldn’t need more than 8GB in a desktop ever. Yes, my example was a tad extreme, vnc-ing into your own VM on your phone, but my point was rather phones are becoming capable and replacing traditional computers more and more. A more realistic example is when I was using Samsung Dex the other day I had 80ish chrome tabs open, a video chat, and a terminal ssh’d into my computer fixing it. I liked the overhead of ram I had above me. Was I even close to 12GB? No. But it gave me room if I wanted another background program or had to spin something up quickly without disrupting my flow or lagging out/crashing.
if this is the logic we’re using, then we shouldn’t have phones at all. Since clearly they do nothing more than a computer. Or we shouldn’t have desktops/laptops at all. Because clearly they do nothing more than a phone.
I understand that phones are more capable, my point is that they have no reason to be more capable. 99% of what you do on a phone is going to be the same whether you spend 200 dollars on it, or 2000.
Yeah, but if you have plenty of RAM on Android, there’s a chance those apps you left in the background will still be running when you go back to them, rather than doing the usual Android thing of just restarting them.
yeah i get that, but i often only have like 2 apps open on my android phone (maybe three). And even if you didn’t have enough ram there’s no reason android can’t cache old apps to page file or something. Then you don’t need to restart them, just load it from page. Given how fast modern phone storage is likely to be, this should be pretty negligible.
My phone was manufactured in 2022, cost under USD250, and has 8gb of ram. New phones generally come with 12gb or more.
what a weird title bro, of course they argue in favor of it, they sell the fucking hardware that they created. Be a little weird if they just argued against it after spending billions designing and manufacturing it.
Regardless, i still can’t believe apple thought 8GB minimum was ok, genuinely baffling to me.
i have more ram on my old gpu apple sucks
A friend has a phone with more ram.
all my phones have more ram since like 2015
I also cannot figure out why so many companies are still selling them with only a 500GB drive, SSD or HDD.
So they can charge more for an upgrade. Simple business tactics.
Don’t forget cloud services!
My X220 and T520 each have 16GB. The designed max was actually “only” 8GB, but it turns out 16 GB actually works. I replaced the RAM modules myself without asking Lenovo for permission. Those models came out in 2011.
My HP Omen 17" was designed for a maximum of 32GB ram. I’m currently running 64GB on it.
This was also true for Apple computers before they started soldering the ram in place. I remember going way over spec in my old G4 tower. Hell, I doubt the system would crash if you found larger ram chips and soldered them in.
You can’t even swap components with official ones from other upgraded models. Everything is tied down with verification codes and shit nowadays. So I doubt you could solder in new ram and get it to work.
Yeah lol, my ThinkCentre with a 6th-gen Intel had only 8GB (I paid under 100€ for it), so I went shopping on a second-hand site to double that. But the price for a 4, 8, or 16GB DDR4 SO-DIMM stick (there seems to be a flood of used ones) was about the same, like 30€ shipping included, so now I've got 24GB.
I get upgrades help the bottom line but considering that 8GB of RAM chokes the silicon they are allegedly so proud of… seems like a slap in the face to their own engineers (and the customer as well but that is not my point).
Like the upper management and C-suite give a fuck about any of their employees.
does that mean people wont be able to use chrome in their macs?
One tab only.
Apple has been really stretching their takes lately. Nice to see some fire under their ass, though it's not going to matter. Too many ignorant people falling for likeable propaganda.
As engineers, we should never insert proprietary interfaces into our designs. We shouldn’t obfuscate the design.
The motivation for these toxic practices comes from the business side because it’s profitable. These people won’t share the profits with you because they are psychopaths. Ultimately we are making more waste when electronics cannot be upgraded, maintained and repaired. It’s bad for people and it’s bad for the environment.
So much stuff in both the hardware and software world really annoys me and makes me think our future is shit the more I think about it.
Things could be so much better. Pretty much everything could be open and standardised, yet it isn’t.
Software can be made in a way that isn’t user-hostile, but that’s not the way of things. Hardware could be repairable and open, without OEMs having to navigate a minefield of IP and patents, much of which shouldn’t have been granted in the first place, or users having no ability to repair or upgrade their devices.
It’s all so tiresome.
I think Napoleon said something similar to “the army is commanded by me and the sergeants”?
Well, that's not true anymore today. All this connectivity and processing power, however inefficiently they're used, allow the world to be centralized more than it ever could be before. No need to consider what the sergeants think.
(Which also means no Napoleons, cause much more average, grey, unskilled and generally unpleasant and uninteresting people are there now.)
It’s about power and it happened in the last 15 years.
I think it’s a political tendency, very intentional for those making decisions, not a “market failure” and other smartassery. It comes down to elites making laws. I feel they are more similar to Goering than to Hitler all over the world today.
This post may seem nuts, but our daily lives significantly depend on things more complex and centralized in supply chains and expertise than nukes and spaceships.
We don’t need desktop computers which can’t be fully made in, say, Italy, or at least in a few European countries taken together. Yes, this would mean kinda going back to late 90s at best in terms of computing power per PC, but we waste so much of it on useless things that our devices do less now than then.
We trade a lot of unseen security for comfort.
I haven’t used 8GB since… 2008 or so? TBF, I’m a power user (as are most people on any Lemmy instance, I presume), but still…
And sure, macOS presumably uses less RAM than Windows, but the applications don't.
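This is easy to check for yourself. A quick sketch using Python's stdlib `resource` module to see how much resident memory a process actually ends up holding (the unit difference between Linux and macOS is a real gotcha, noted in the comment):

```python
import resource
import sys

def peak_rss_mib() -> float:
    """Peak resident set size of the current process, in MiB.

    ru_maxrss is reported in kibibytes on Linux but in bytes on macOS.
    """
    rss = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
    divisor = 1024 * 1024 if sys.platform == "darwin" else 1024
    return rss / divisor

baseline = peak_rss_mib()
blob = bytearray(100 * 1024 * 1024)  # touch ~100 MiB so it counts toward RSS
print(f"baseline ~{baseline:.0f} MiB, after allocation ~{peak_rss_mib():.0f} MiB")
```

The same app allocates the same working set whichever OS it runs on; the OS can only shave off its own overhead, not the app's.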
There is exactly one reason why they do this: So they can charge you $200 to upgrade it to 16GB and in doing so make the listed price of the device look $200 cheaper than it actually is. Or sometimes $400 if it’s a model where the base model comes with a 256GB SSD (the upgrade to 512GB, the minimum I’d ever recommend, is also $200).
The prices Apple charges for storage and RAM are plain offensive. And I say that as someone who enjoys using their stuff.
That’s why I dropped them when my mid-2013 MBP got a bit long in the tooth. Mac OS X, I mean OS X, I mean macOS is a nice enough OS but it’s not worth the extortionate prices for hardware that’s locked down even by ultralight laptop standards. Not even the impressive energy efficiency can save the value proposition for me.
Sometimes I wish Apple hadn’t turned all of their notebook lines into MacBook Air variants. The unibody MBP line was amazing.
Typing this from a M2 Max Macbook Pro with 32GB, and honestly, this thing puts the “Pro” back in the MBP. It’s insanely powerful, I rarely have to wait for it to compile code, transcode video, or run AI stuff. It also does all of that while sipping battery, it’s not even breaking a sweat. Yes, it’s pretty thin, but it’s by no means underpowered. Apple really is onto something with their M* lineup.
But yeah, selling “Pro” laptops with 8GB in 2024 is very stupid.
I can’t believe I’m reading this in 2024
of course they will, it is for profit