Etterra ,

Wait, that’s it? Seriously?

potentiallynotfelix ,

And it probably won’t be able to be upgraded by the user, which should be the bare minimum.

Reverendender ,

And it probably absolutely guaran-fucking-teed won’t be able to be upgraded by the user, which should be the bare minimum.

Petter1 ,

You lose performance by making RAM upgradeable. I hope the new RAM design, where you can install RAM as if it were soldered in, is coming soon:

www.tomshardware.com/…/what-is-camm2

Valmond ,

You lose performance if you can’t upgrade.

M500 ,

It’s because AI needs a lot of RAM. I think Apple did not expect or plan for AI, which shows in the fact that only the latest Pro phone can run Apple Intelligence. It’s because that phone has enough RAM.

Now they will boost RAM across the board because Apple Intelligence will not run well without it.

Depending on pricing, I may actually buy a MacBook in 2025.

I’ve wanted one since the M1, but I’ve held out until 16 GB was the starting amount of RAM.

cm0002 ,

Or you could get just about any other non-Mac system that lets you upgrade RAM easily when you need to…

Just stop supporting Apple’s soldered-in BS.

bamboo ,

I hate to be the bearer of bad news, but most thin and light laptops have had soldered RAM for many years now. There are exceptions, but they’re few and far between.

TheGrandNagus , (edited )

Bad news: literally all current CPU gen laptops use soldered RAM.

All of them. Every single one. No exceptions.

Hopefully that’ll change, but as it stands right now, if you want newest gen, you cannot get replaceable RAM.

And even before current gen, the vast majority of Windows laptops were soldered too.

E: idk why you’re downvoting, it’s true lol.

AGuyAcrossTheInternet ,

I really don't know where you're looking because I only see that in business-class laptops and even then not all of them have soldered RAM.

And I'm already counting the ones with only one expansion slot among the soldered bunch.

Of course, if you paid attention only to HP, Dell and Lenovo, then I'd see why you'd think so. But beyond those brands, you don't have that soldered nonsense everywhere.
At the very least, you have things like Clevo, Framework and the like to sell you laptops without soldered RAM.

I bet there are even websites that let you filter laptop models without soldered RAM. Personally, I only know about Germany-based websites like that, though.

TheGrandNagus , (edited )

You are looking at previous-gen platforms.

E.g. for Framework, you’re looking at APUs like the 7840U, which is not current gen. It’s two generations old. (7840U/Phoenix > 8840U/Hawk Point > AI 9 365 (awful naming btw, AMD)/Strix Point).

Like I said, all current CPU gen laptops cannot use SODIMM. I really hope that changes though.

And let me be clear here, I’m not exaggerating for effect; I do not mean most of them. I do not mean the vast majority of them. I do not mean practically all of them. I literally mean all of them. 100% of them. Every single one that exists.

AMD, Intel, and Qualcomm do not currently have compatibility with SODIMM on their newest gen mobile CPUs.

I hope that changes, and I expect it eventually will, but as it stands right now, no you cannot have SODIMM modules if you are buying any laptop with the newest gen CPUs.

AGuyAcrossTheInternet ,

Well fudge me sideways. Every day is a school day.

They've all got LPDDR5, so yeah, you're unfortunately right. It feels kinda weird having to consider the 7000 and 8000 series last gen already, true as it is.

forgotaboutlaye ,

I know it’s not a like-for-like comparison, but the Pixel 9 Pro that launched this month has 16 GB of RAM.

stoy ,

I remember back in the early 2000s when I saw a PDA with a 232 MHz CPU and 64 MB of RAM, and I realized how far technology had come since I got my computer with a 233 MHz CPU and 64 MB of RAM…

Obviously different architectures, but damn, that felt strange…

narc0tic_bird ,

Yup. Currently, the iPhone 15 Pro is the only model which has 8 GB of RAM, with the regular iPhone 15 having 6 GB. All iPhone 16 models (launching next month) will still only have 8 GB according to rumors, which happens to be the bare minimum required to run Apple Intelligence.

Giving the new models only 8 GB seems a bit shortsighted and will likely mean that more complex AI models in future iOS versions won’t run on these devices. It could also mean that these devices won’t be able to keep a lot of apps ready in the background while running an AI model in between.

16 GB is proper future-proofing on Google’s part (unless they lock new software features behind newer models anyway down the road), and Apple will likely only gradually increase memory on their devices.

filister ,

Pretty much what NVIDIA is doing with their GPUs: refusing to provide an adequate, future-proof amount of VRAM on their cards. That’s planned obsolescence in action.

TheGrandNagus , (edited )

And like Apple, Nvidia has no shortage of fanboys who insist the pitiful amounts of (V)RAM are enough. The marketing sway those two companies have is incredible.

It’s a complete joke that Sapphire had an 8GB version of the R9 290X, what, 11 years ago or something? And yet Nvidia is still selling 8GB cards now, for exorbitant prices, and people lap it up.

CheeseNoodle ,

The current GPU situation actually has me curious about AMD’s upcoming Halo APU chips. They’re likely going to be pretty expensive relative to their potential GPU-equivalent performance, but if they work out similar to the combined price of a CPU and GPU, then it might be worthwhile, as they use onboard RAM as their VRAM. Probably a crazy idea, but one I look forward to theory-building around in spring when they release.

Petter1 ,

This happens when you sell your hardware as a DRM key to use your software (i(Pad)OS, macOS etc., and CUDA).

tankplanker ,

If you were being cynical, you could say it’s planned obsolescence: when the new AI feature set rolls out, you’ll have to get the new phone for it.

nous ,

I would say it’s more so they can advertise a lower price, but then they expect you to get the more expensive ones, as the bare minimum is just not enough.

tankplanker ,

For the base model, yeah, but Apple loves charging a packet for more memory, so I don’t see it for the top-of-the-range models. It would be typical of them to only offer 16 GB with the increased storage as well, just to bump the price up.

narc0tic_bird ,

I think they got caught with their pants down when everybody started doing AI and they were like “hey, we have this cool VR headset”. Otherwise they would’ve at least prepared the regular iPhone 15 (6 GB) to be ready for Apple Intelligence. Every (Apple Silicon) device with 8 GB or more gets Apple Intelligence, so M1 iPads from 2021 get it as well, for example, even though the M1’s NPU is much weaker than some of the NPUs in unsupported devices with less RAM.

They are launching their AI (or at least everything under the “Apple Intelligence” umbrella) with iOS 18.1, which won’t even release alongside the new iPhones. It’ll be US-only (or at least English-only), with several of the features announced at WWDC still missing/coming later, and it’s unclear how they’ll proceed in the EU.

Petter1 ,

I bet the next non-Pro iPhone will be one of the best-selling iPhones of all time. Or the SE, if it supports Apple’s “AI”. I think they planned it this way so they’d have an explanation, unlike when they tried to sell new hardware for Stage Manager.

tankplanker ,

With how polished Apple’s mobile AI was at launch, compared to Gemini on Android, which at launch couldn’t even do basics like timers, I suspect Apple had it in the works for far longer, and it would not have been a total surprise.

Also, you’re describing the situation at launch for new hardware; the software will evolve every year going forward, and the requirements will likely increase with it. If I’m buying a flagship phone right now, I want it to last at least 3 years of updates, if not 5. The phone has to be able to cope with what is a very basic requirement: enough RAM.

This isn’t some NPU thing; it’s just basic common sense that more RAM is better for this, something the flagship iPhones could have benefited from for a while now.

narc0tic_bird ,

I’m not sure if you’re agreeing or disagreeing with me here. Either way, hardware has a substantially longer turnaround time compared to software. The iPhone 15 would’ve been in development years before release (I’m assuming they’re developing multiple generations in parallel, which is very likely the case) and keep in mind that the internals are basically identical to the iPhone 14 Pro, featuring the same SoC.

AI and maybe AAA games like Resident Evil aside, 6 GB seems to work very well on iPhones. If I had a Pixel 6/7/8 Pro with 12 GB and an iPhone 12/13/14 Pro (or 15) with 6 GB, I likely wouldn’t notice the difference unless I specifically counted the number of recent applications I could reopen without them reloading. 6 GB keeps plenty of recent apps in memory on iOS.

But going with 8 GB in the new models, knowing that AI is a thing and that the minimum requirement for their first series of models is already 8 GB, isn’t too reassuring. I’m sure these devices will get 5-8 years of software updates, but new AI features might be reduced or not present at all on these models.

When talking about “AI” in this context I’m talking about everything new under the “Apple Intelligence” umbrella, like LLMs and image generators. They’ve done what you’d call “AI” nowadays for many years on their devices, like photo analysis, computational photography, voice isolation, “Siri” recommendations etc.

bruhduh ,

Some phones have had 24 GB for two years already.

HauntedCupcake ,

Whoa, that’s like 32GB of Windows RAM. Seems excessive to me tbh

cmnybo ,

Considering that RAM is shared with the GPU, it’s still not enough.

floofloof ,

It’s OK - for an extra $400 they’ll sell you one with an extra $50 worth of RAM.

JohnDClay ,

It doesn’t even cost them that.

scottmeme ,

$800 for $30 of RAM

PoopMonster ,

“yeah but that’s like having 32GB on a lame PC”

freeman ,

I remember an Apple fanboy arguing that this made things better!

cmnybo ,

It does make some things better, but there are a number of downsides too. The biggest downside is that it’s not practical to make the memory socketed because of the speed that’s required.
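
To put rough numbers on the speed point, here’s a back-of-the-envelope sketch (the speed grades are illustrative, not from any specific machine):

```python
# Peak bandwidth = transfer rate (MT/s) x bus width (bytes).
# Illustrative speed grades for soldered LPDDR5X vs. socketed DDR5 SODIMMs.
BUS_WIDTH_BYTES = 128 // 8  # 128-bit memory bus, typical for thin laptops

def peak_gb_per_s(transfers_mt_s: int) -> float:
    """Theoretical peak bandwidth in GB/s for a given transfer rate."""
    return transfers_mt_s * BUS_WIDTH_BYTES / 1000

print(f"Soldered LPDDR5X-8533:     ~{peak_gb_per_s(8533):.0f} GB/s")
print(f"Socketed DDR5-5600 SODIMM: ~{peak_gb_per_s(5600):.0f} GB/s")
```

The short, fixed traces of soldered memory are what make the higher transfer rates feasible; standards like CAMM2 are an attempt to get most of that speed back while staying replaceable.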

Omgboom ,

Linux requirements: CPU (optional)

frunch ,

Paper Linux, computing done olde world style 📜 🤖

JeeBaiChow ,

…is this actually 16 GB, or 8 GB feeling like 16 GB, as per previous statements?

AlphaOmega ,

Welcome to 2010

Kalcifer ,

I upgraded from 8GB to 16GB like 2 months ago.

Kazumara ,

Welcome to 2010 to you as well then!

Toes ,

Why are Apple products always so anemic on memory?

Munkisquisher ,

Greed. It lowers the advertised price, but once you spec it decently, you’ve added a grand in extras.

tal ,

Price discrimination based on memory loadout is real, but it’s not specific to Apple, either.

luves2spooge , (edited )

Because there are two types of Mac users:

  • People who buy them with their own money because they’re trendy and just use them as glorified internet browsers. 16 GB is plenty.
  • People using them professionally, so their company is paying and Apple can overcharge for the necessary memory upgrade.

aStonedSanta ,

I have an M2 with 8 GB, and it’s plenty. It’s just a browsing/Discord/streaming box, basically.

bamboo ,

This, pretty much. I don’t care that a maxed-out MBP is $6000 or whatever; my employer pays for that.

nicepool ,

Isn’t Apple the company that charges $5k+ for 16GB? All while intentionally deprecating the hardware within 2 years. /s

I’ve had to support their products on a professional level for over a decade. I will NEVER buy an Apple product.

cm0002 ,

I’ve had to support their products on a professional level for over a decade.

Their enterprise stuff…can only be described as a quintessential example of an ill-conceived, horrendously executed fiasco, so utterly devoid of utility and coherence that it defies all logic and reasonable expectation. It stands as a paragon of dysfunction, a conflagration of conceptual failures so intense and egregious that it resembles a blazing inferno of pure, unadulterated refuse. It is, in every conceivable sense, a searing, molten heap of garbage—hot, steaming, and reeking with the unmistakable stench of profound ineptitude and sheer impracticality.

Lenis_78 ,

That was… poetry!

masterspace ,

Let me know how many thousands of dollars it’s going to cost for a MAX variant of the chip that can run three external monitors like it’s 2008.

SteveFromMySpace ,

Wasn’t that only the M1 specifically that lacked that feature?

Not saying it was acceptable, but pretty sure all chips after have supported 3+ monitors.

carleeno ,

My last job issued me an M2 Air that could only power 1 external monitor. Was annoying as hell.

masterspace , (edited )

Nope. All base Mx Series Macs can only support a single external monitor in addition to their internal one.

Pro Series are professional enough that Apple deems your work worthy of using two (2) external monitors.

Max Series are the only ones that have proved their Maximum enough to Apple to let them use 3 monitors.

It’s honestly absurd. And none of them support DisplayPort’s alt mode, so they can’t daisy-chain between monitors, and they max out at 3, whereas an equivalent Windows or Linux machine could do 6 over the same Thunderbolt 3 connection.

Windows and Linux machines also support sub-pixel text rendering, so text looks far better on 1080p and 1440p monitors.

I have to use macOS for work, and while I’ve come to accept many parts and even like some, their external monitor support is just mind-numbingly bad.

tal ,

I guess you could get an eGPU. Probably not cheaper than just giving Apple their pound of flesh, though.

brbposting ,

sub-pixel text rendering, so text looks far better on 1080p and 1440p monitors.

Why would you need that? Buy an Ultra Pro Retina Max Display and please get the stand if you don’t want Apple to go out of business.

narc0tic_bird ,

What you’re describing as “DisplayPort alt mode” is DisplayPort Multi-Stream Transport (MST). Alt mode is the ability to pass native DisplayPort stream(s) via USB-C, which all M chip Macs are capable of. MST is indeed unsupported by M chip hardware, and it’s not supported in macOS either way - even the Intel Macs don’t support it even though the hardware is capable of it.

MST is nice for a dual WQHD setup or something (or dual UHD@60 with DisplayPort 1.4), but attempt to drive multiple (very) high resolution and refresh rate displays and you’ll be starved for bandwidth very quickly. Daisy-chaining 6 displays might technically be possible with MST, but each of them would need to be set to a fairly low resolution for today’s standards. Macs that support more than one external display can support two independent/full DisplayPort 1.4 signals per Thunderbolt port (as per the Thunderbolt 4 spec), so with a proper Thunderbolt hub you can connect two high resolution displays via one port no problem.

I agree that even base M chips should support at least 3 simultaneous displays (one internal and two external, or 3 external in clamshell mode), and they should add MST support for the convenience of connecting to USB-C hubs that use MST with two (lower-resolution) monitors, plus proper sub-pixel font anti-aliasing on these low-DPI displays (which macOS was perfectly capable of in the past, but they removed it). Just for the convenience of being able to use any random hub you stumble across and have it “just work”, not because it’s necessarily ideal.

But your comparison is blown way out of proportion. “Max” Macs support the internal display at full resolution and refresh rate (120 Hz), 3 external 6K 60Hz displays and an additional display via HDMI (4K 144 Hz on recent models). Whatever bandwidth is left per display when daisy-chaining 6 displays to a single Thunderbolt port on a Windows machine, it won’t be anywhere near enough to drive all of them at these resolutions.
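
To put rough numbers on the bandwidth point, here’s a back-of-the-envelope sketch (assuming a DP 1.4 HBR3 link and uncompressed 8-bit RGB with CVT-R2-style blanking; DSC would change the math):

```python
# How much DisplayPort 1.4 bandwidth each display gets in an MST daisy chain.
DP14_EFFECTIVE_GBITS = 25.92  # 4 lanes x 8.1 Gbit/s, minus 8b/10b overhead

def stream_gbits(width: int, height: int, hz: int, bpp: int = 24) -> float:
    """Approximate bandwidth of one uncompressed stream, ~5% blanking overhead."""
    return width * height * hz * bpp * 1.05 / 1e9

uhd60 = stream_gbits(3840, 2160, 60)   # ~12.5 Gbit/s, so dual UHD@60 just fits
fhd60 = stream_gbits(1920, 1080, 60)   # ~3.1 Gbit/s

print(f"UHD@60 needs ~{uhd60:.1f} Gbit/s of the {DP14_EFFECTIVE_GBITS} available")
print(f"Split 6 ways: ~{DP14_EFFECTIVE_GBITS / 6:.1f} Gbit/s per display "
      f"(FHD@60 needs ~{fhd60:.1f})")
```

Six displays on one link leaves roughly 1080p-class bandwidth per panel, which is exactly the “fairly low resolution for today’s standards” situation described above.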

mrvictory1 ,

It’s about time.

daddy32 ,

…in 2030.

reddig33 ,

“640k ought to be enough for anyone.”

cupcakezealot ,

i could run wing commander 2, ultima underworld, and lands of lore. who needs any more?

tal , (edited )

EMACS: Eight Megabytes And Constantly Swapping

henfredemars ,

It’s amazing that with transparent huge pages in Linux I can have memory pages bigger than that entire 640k.
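
For the curious, a minimal sketch to check that on a Linux box (assuming a kernel that exposes the THP size via sysfs):

```python
# Compare one transparent huge page against the fabled 640K. Linux only.
from pathlib import Path

thp = Path("/sys/kernel/mm/transparent_hugepage/hpage_pmd_size")
huge_page = int(thp.read_text())  # typically 2 MiB (2,097,152 bytes) on x86-64
conventional = 640 * 1024         # the classic 640K of conventional memory

print(f"One huge page:      {huge_page // 1024} KiB")
print(f"640K system memory: {conventional // 1024} KiB")
print(f"A single page is {huge_page / conventional:.1f}x that machine's entire RAM")
```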

tal ,

My CPU’s L1 cache is larger than that 640K system’s main memory.
