The most exciting 2024 tech isn't AI (www.spacebar.news)
from corbin@infosec.pub to technology@lemmy.world on 08 Jan 2024 16:20
https://infosec.pub/post/6927393

2024 might be the breakout year for efficient ARM chips in desktop and laptop PCs.

#technology


c0mbatbag3l@lemmy.world on 08 Jan 2024 16:29 next collapse

“The most exciting tech isn’t the thing that currently exists and is being improved and integrated daily, it’s this other thing we don’t even know for sure will maybe happen.”

FaceDeer@kbin.social on 08 Jan 2024 17:58 next collapse

"Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level. This new chip design might end up with comparable capabilities to the existing chip design!"

Yeah, there was no need to try to hype this up as the biggest thing ever.

richieadler@lemmy.myserv.one on 08 Jan 2024 18:30 collapse

Forget about the possibility that we may finally have developed machines that think, that comprehend the world in a way similar to how humans do and can communicate with us on our level.

That isn’t what’s happening with “AI” right now.

FaceDeer@kbin.social on 08 Jan 2024 18:59 next collapse

Which is why I said “possibility.” I knew picky people would jump on the comment like this.

c0mbatbag3l@lemmy.world on 08 Jan 2024 19:02 collapse

You clearly don’t work in a field where it’s cutting swaths through workflows and taking up serious slack.

You can describe your problem to it in plain English, so it does communicate on our level. It comprehends its training data in much the way a human comprehends lived experience, and it assimilates new data in the same manner. It’s not truly “reasoning”, but it’s leagues ahead of anything we had even four years ago, and it’s only going to grow from here.

Commercial ventures are finding new use cases every day, and to people in IT it’s hilarious in the same way that people who thought the Internet was a fad were hilarious.

richieadler@lemmy.myserv.one on 08 Jan 2024 22:07 collapse

I object to your characterization that current AI “thinks”. It does nothing of the sort.

c0mbatbag3l@lemmy.world on 08 Jan 2024 22:43 collapse

I literally said “it’s not truly reasoning” to clarify that while it’s drawing on its training data in the same way you draw on your experiences when making new decisions, it can’t really create original thought.

Once again lemmy proves reading comprehension is too damn hard.

richieadler@lemmy.myserv.one on 08 Jan 2024 22:49 collapse

I thought you were the one saying AI thinks, but it was someone else. Apologies for that.

OTOH you can take your sarcasm and insert it rectally.

corbin@infosec.pub on 09 Jan 2024 07:22 collapse

Right, it’s less exciting now because it’s already here. I’m not expecting radically improved GPT models or whatever in 2024, probably just more iteration. The most exciting stuff there might be local AI tech becoming more usable, like we’ve seen with stable diffusion.

sir_reginald@lemmy.world on 09 Jan 2024 21:19 collapse

I’m just expecting performance optimisations, especially for local LLMs. Right now there are models as good as GPT-4 (Goliath 120B), but they require two RTX 4090s to run.

The models that require less powerful equipment are not as good, of course.

But hopefully, given enough time, good-enough models will be able to run on mid-range hardware.
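As a rough rule of thumb (my assumption; real runtimes vary with context length, KV-cache size, and offloading), the VRAM a local model needs scales with parameter count times bits per weight:

```python
def vram_gib(params_billions: float, bits_per_weight: float,
             overhead: float = 1.2) -> float:
    """Rough VRAM estimate: weight bytes plus ~20% for KV cache/activations."""
    weight_bytes = params_billions * 1e9 * bits_per_weight / 8
    return weight_bytes * overhead / 2**30

# A 120B model at fp16 vs. 4-bit quantization:
print(round(vram_gib(120, 16)))  # ≈ 268 GiB — far beyond two 24 GB cards
print(round(vram_gib(120, 4)))   # ≈ 67 GiB — still needs multi-GPU or offload
```

This is why quantization (and better quantization schemes) is the main lever for getting big models onto mid-range hardware.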

eager_eagle@lemmy.world on 08 Jan 2024 16:31 next collapse

Can’t wait. I recently bought a firewall that gets noticeably warm on idle, even with a little case that has a heat sink. We need more energy efficient PCs.

jlh@lemmy.jlh.name on 08 Jan 2024 17:11 next collapse

This entire article just to hype up Qualcomm releasing a new CPU? I haven’t seen any evidence to suggest that this new Qualcomm CPU won’t be trash like all the other ones.

ARM on PC isn’t happening any time soon. They’re not more efficient than x86 CPUs at all.

Here’s a speed comparison between Qualcomm and AMD’s best cpus from last year. Same TDP.

cpu-monkey.com/…/compare_cpu-qualcomm_snapdragon_…

Here’s Jim Keller, the father of both AMD Ryzen and the Apple M1, saying that ARM is not necessarily more efficient than x86:

chipsandcheese.com/…/arm-or-x86-isa-doesnt-matter…

The only reason why Apple was able to make a successful ARM CPU was because they control the entire OS and the entire supply chain, and they have super expensive exclusivity contracts with TSMC (because they literally make 50% of all phones in the world).

AMD’s x86 CPUs are actually faster and more efficient than Apple’s ARM CPUs on the same 5nm process node, but Apple is consistently 2 years ahead when it comes to silicon manufacturing, because of their TSMC deals.

Qualcomm doesn’t have any of that, and there is no way their CPUs are going to be so much better than AMD’s that people are going to be willing to put up with ISA incompatibilities. Windows on ARM has been a flop.

Servers are at least a more reasonable place to see ARM chips, because all the software is open-source and all the major cloud vendors are making their own CPUs.

Nothing against ARM, or alternative ISAs in general, people just don’t understand that x86 vs ARM is not about power efficiency at all, it’s about supply chains and software compatibility.

corbin@infosec.pub on 08 Jan 2024 17:40 next collapse

The SQ3 was a custom design only for Surface tablets, I’m not sure it’s representative of Qualcomm’s future generally-available hardware. Early benchmarks on the Snapdragon Elite are much more promising but TDP and other important details are still missing.

You’re definitely right that software vertical integration is the missing piece. We’re starting to see a little bit of that in the PC ecosystem (e.g. windows using the AI core on newer CPUs/SoCs for live camera and mic effects) but more needs to happen there.

jlh@lemmy.jlh.name on 08 Jan 2024 19:12 collapse

That’s true. I haven’t looked that closely at QC’s most recent chips, just pointing out that they’re usually slower/hotter/more expensive.

It’s good to see competition, but people should manage their expectations. They’re gonna have to be a lot faster/more efficient than the AMD 7840U in order to make running ARM worth it on PC.

It’ll be a fight, and in 2025 they’ll have to compete with Zen 5, too.

helenslunch@feddit.nl on 08 Jan 2024 17:49 next collapse

The only reason why Apple was able to make a successful ARM CPU was because they control the entire OS and the entire supply chain

One has to assume similar efforts are being undertaken with Qualcomm, Intel, Google, Microsoft, etc.

I don’t think anyone thinks slapping an ARM processor in a Windows laptop is going to suddenly make them more efficient.

MonkderZweite@feddit.ch on 09 Jan 2024 18:09 collapse

I don’t think anyone thinks slapping an ARM processor in a Windows laptop is going to suddenly make them more efficient.

I say most think exactly that.

Lojcs@lemm.ee on 08 Jan 2024 19:31 next collapse

Here’s a speed comparison between Qualcomm and AMD’s best cpus from last year. Same TDP.

AMD’s chip runs at 28 watts and is built on 4 nm; QC’s runs at 7 watts and is built on 5 nm. They are not equivalent.

AMD’s x86 CPUs are actually faster and more efficient than Apple’s ARM CPUs on the same 5nm process node, but Apple is consistently 2 years ahead when it comes to silicon manufacturing, because of their TSMC deals.

Comparing the AMD 7840U Pro (4 nm, 28 W) with the Apple M2 Pro 10-core (5 nm, 28 W), AMD is 7% faster in single-core and 10% faster in multi-core. It’s unclear how it would look if they were on the same node. Feels like they’d be about the same.
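The objection above is really about performance per watt, not raw scores. With hypothetical benchmark numbers (the real ones are behind the cpu-monkey link), the normalization is just:

```python
def perf_per_watt(score: float, watts: float) -> float:
    """Normalize a benchmark score by package power."""
    return score / watts

# Hypothetical illustration: even if the 28 W chip scores ~2x higher in
# absolute terms, the 7 W chip can still win on efficiency.
print(round(perf_per_watt(2000, 28), 1))  # ≈ 71.4 points/W
print(round(perf_per_watt(1200, 7), 1))   # ≈ 171.4 points/W
```

Which is why comparing chips at very different power budgets says little about the ISA itself.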

jlh@lemmy.jlh.name on 09 Jan 2024 16:58 collapse

I think the QC chip is 28 watts too. They use the same chassis as the Intel chip.

That is a good point that AMD’s node is technically slightly newer, even though they are both 5nm class. TSMC’s N4P is claimed to be up to 5% faster or 10% more power efficient than N5P. So, fair enough, they’re about even.

techradar.com/…/the-future-of-leading-edge-chips-…

tomshardware.com/…/tsmc-announces-n4p-process-a-r…

Lojcs@lemm.ee on 09 Jan 2024 22:13 collapse

I think the chassis choice is just to keep it consistent. The SQ3 is apparently based on the 8cx Gen 3, which runs at 7 watts. The site you linked says the SQ3 has 4 medium and 4 small cores, but judging by how they run at the same frequencies as the 8cx’s 4 large and 4 medium cores, and by the two chips’ benchmark scores being pretty much the same, I think it’s safe to say they’re the same chip. At the very least, if the SQ3 pulled 4x the power to produce the same result, Microsoft would just use the 8cx Gen 3.

frezik@midwest.social on 09 Jan 2024 21:45 next collapse

You shouldn’t trust TDP numbers. They’re most useful for getting a ballpark idea of what size cooler you’ll need for a given chip (and even then, Noctua has their own rating system for matching coolers to chips). AMD, in particular, reinvents their TDP formula regularly and plays with the numbers to get the output they want for comparison purposes.

Anyway, I’d be fine if ARM ends up being only on par with x86. It’s still a way out of the insanity of the x86 architecture and opens up so many more companies who can make chips.

mryessir@lemmy.sdf.org on 09 Jan 2024 22:09 collapse

My X13s with Linux, at 250 nits brightness while browsing via WLAN and playing music from the browser via Bluetooth, uses 5–8 W in total.

geekworking@lemmy.world on 08 Jan 2024 17:26 next collapse

One of the hurdles to ARM is that you need to recompile and maintain a separate version of every piece of software for the different processors.

This is a much easier task for a tightly controlled ecosystem like Mac than the tons of different suppliers Windows ecosystem. You can do some sort of emulation to run non-native stuff, but at the cost of the optimization that you were hoping to gain.

Another OS variation also adds a big cost/burden to enterprise customers where they need to manage patches, security, etc.

I would expect to see more inroads in non-corporate areas following Apple success, but not any sort of explosion.
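The recompile-per-architecture point above also shows up at the packaging level: a downloader or installer has to map the host’s reported architecture to a matching prebuilt binary. A sketch (`mytool` and the artifact-name scheme are invented for illustration):

```python
import platform

# Map the machine's reported architecture to a release-artifact suffix,
# illustrating why projects must build and ship one binary per ISA.
ARTIFACT_SUFFIX = {
    "x86_64": "amd64",   # Linux/macOS x86
    "AMD64": "amd64",    # Windows x86
    "aarch64": "arm64",  # Linux ARM
    "arm64": "arm64",    # macOS/Windows ARM
}

def artifact_for(machine: str) -> str:
    try:
        return f"mytool-linux-{ARTIFACT_SUFFIX[machine]}.tar.gz"
    except KeyError:
        raise RuntimeError(
            f"no prebuilt binary for {machine}; emulation or source build needed"
        )

print(artifact_for(platform.machine()))
```

The `RuntimeError` branch is the enterprise pain point: anything without a native build falls back to emulation, with the performance cost mentioned above.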

originalucifer@moist.catsweat.com on 08 Jan 2024 17:43 next collapse

microsoft has spent the last few years rebuilding their shit to work on ARM. no idea how far they've come, but you will absolutely see windows on arm for the enterprise.

frezik@midwest.social on 09 Jan 2024 22:10 collapse

Apple has the benefit of having done architecture transitions a few times already. Microsoft has been trying to get everyone out of the “Program Files (x86)” directory for over a decade.

originalucifer@moist.catsweat.com on 09 Jan 2024 22:25 collapse

apple doesn't have the burden of being backwards compatible for 3 decades and able to run on most commoditized hardware.

apple undoubtedly has it easier than a company that's actually in use in most of the business world.

frezik@midwest.social on 10 Jan 2024 00:15 collapse

Uhh, one reason they don’t is that they have made the switch twice. Even if they didn’t have to deal with any other third party, they still had to convince Adobe, and Adobe doesn’t want to do shit if they don’t have to.

qjkxbmwvz@lemmy.sdf.org on 08 Jan 2024 18:15 collapse

On the other hand, a completely open ecosystem works well too: Linux on ARM feels exactly like Linux on x86/64 in my experience. Granted, this is for headless stuff (an RPi and an Orange Pi, both ARM, both running Debian), but really the only difference is the bootloader situation.

helenslunch@feddit.nl on 08 Jan 2024 17:46 next collapse

AI is currently limited in application (and legislation). I think when we start seeing it in things like document ecosystems like Google Workspace or Microsoft Office, or in operating systems like Windows 12 and Android, that’s when we’ll start seeing what it’s really capable of.

Also open-source applications that aren’t necessarily limited by laws or corporate optics.

Things like creating helper bots that aid in troubleshooting, or “assistants” that can draft/send emails, create calendar events, answer questions based on emails, etc.

But yeah, in its current state it is mostly just a glorified search engine.

melroy@kbin.melroy.org on 08 Jan 2024 17:57 collapse

ow yea 2024 will definitely be the year where AI gets integrated into all those products.

melroy@kbin.melroy.org on 08 Jan 2024 17:56 next collapse

I really hope so... Those x86 architecture chips are killing me.

blazera@kbin.social on 08 Jan 2024 18:16 next collapse

It says a few times that x86 is decades old... but so is ARM? I don't know what's supposed to be game-changing about it.

abhibeckert@lemmy.world on 08 Jan 2024 20:59 next collapse

I’m guessing you’ve never used an ARM Mac.

They don’t look all that fast on GeekBench (more on that further down) but in real-world usage they are incredibly fast. As in, an entry-level 13" school homework laptop will have performance on par with a high-end gaming PC with a thousand-watt PSU.

I don’t have a high end gaming PC to compare, but I do have a mid-range one and I’ve stopped using it… my laptop is so much faster, quieter, cooler, that even though the PC has more games… I just put up with the modest selection (about half the games I own) that run on a Mac. It’s not just gaming either… I’m also able to compile software perfectly fast, I can run docker with a dozen containers open at the same time without breaking a sweat (this is particularly impressive on the Mac version of Docker which uses virtual machines instead of running directly on the host), and stable diffusion generates images in about 20 seconds or so with typical generation settings.

The best thing though is I can do all of that on a tiny battery that lasts almost an entire day under heavy load and multiple days under normal load. I’ve calculated the average power draw with typical use is somewhere around 3 watts for the entire system including the screen. It’s hard to believe, especially considering how fast it is.
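That back-of-the-envelope figure is just battery capacity divided by runtime. A sketch with assumed numbers (both the ~52 Wh capacity and the ~17 h runtime are illustrative, not measured):

```python
def avg_power_watts(battery_wh: float, runtime_h: float) -> float:
    """Average system draw implied by draining a full battery over a runtime."""
    return battery_wh / runtime_h

# Assumed figures: a ~52 Wh laptop battery lasting ~17 h of light use
print(round(avg_power_watts(52.0, 17.0), 1))  # ≈ 3.1 W for the whole system
```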

On the modest GeekBench score Apple ARM processors have - it’s critical to understand GeekBench is designed to test very short bursts and avoid thermal throttling. Intel’s recent i9 processors, with good cooling, will thermal throttle after about 12 seconds and GeekBench is designed to avoid hitting that number by doing much shorter bursts than that. Apple’s processors not only take far longer to thermal throttle, they also “throttle” by reducing performance to barely lower than full speed.

But even worse than that - one of the ways Apple achieves incredible battery life is they don’t run the processors at high clock rates for short bursts. The CPU starts slow and ramps up to full speed when you keep it under high load. So something quick, like loading a webpage, won’t run at full speed and therefore GeekBench also isn’t running at full speed either.

A third difference, and probably the biggest one, is Apple’s processor has very fast memory and also massive memory caches which are even faster. Again that often doesn’t show up on CPU benchmark because it’s not really measuring compute power. But real world software spends a massive amount of time just reading and writing to memory and those operations are fast on Apple’s ARM processors.

You really can’t trust the benchmarks when you’re comparing completely different processors. You need to try real world usage, and the real world usage difference is game changing. Trust me, when proper fast processors (not just a laptop running with a phone CPU) are available on PCs, everyone will realise Mac users were right - ARM is way better than x86. This isn’t like AMD vs Intel. It’s more like HDD vs SSD.

0ddysseus@lemmy.world on 08 Jan 2024 21:34 next collapse

Haha “entry level school homework Mac” Hahahahaha Sure thing Richy Rich

abhibeckert@lemmy.world on 08 Jan 2024 21:52 collapse

The Mac I use is a few years old and available secondhand for under $500. You can get the same CPU/GPU in an iPad which is available, brand new, for $600. I think that’s a reasonable price for a school computer.

SupraMario@lemmy.world on 09 Jan 2024 01:05 next collapse

You can buy a PC running Linux or Windows that stomps that for the same price, new.

Macs are overpriced for what you get.

[deleted] on 09 Jan 2024 02:42 collapse

.

[deleted] on 09 Jan 2024 00:18 next collapse

.

sir_reginald@lemmy.world on 09 Jan 2024 02:52 collapse

they are getting downvoted because they said MacBooks are “entry-level school laptops”, which I find hilarious.

MacBooks are a luxury; you pay way more for the same specs (with more battery life, I’ll grant you that).

[deleted] on 09 Jan 2024 02:59 collapse

.

xaxl@lemmy.world on 09 Jan 2024 03:04 collapse

Personally, I downvoted once I read that somehow one of these laptops is on par with a high-end gaming PC, which is simply not true at all.

joshhsoj1902@lemmy.ca on 09 Jan 2024 03:00 next collapse

I work on an ARM Mac; it’s fine. If you’re just doing light work on it, it works great! Like any other similarly priced laptop would.

Under load, or doing work outside what it is tuned for, it doesn’t perform spectacularly.

It’s a fine laptop, the battery life is usually great. But as soon as you need to use the x86 translation layer, performance tanks, battery drains, it’s not a great time.

Things are getting better, and for a light user, it works great, but I’m much more excited about modern x86 laptop processors for the time being.

KeenFlame@feddit.nu on 09 Jan 2024 05:05 collapse

All Apple products I’ve bought have had their performance artificially destroyed by firmware

Not doing that again

Ever

Have fun when this computer breaks on purpose so you buy a new one

frezik@midwest.social on 09 Jan 2024 21:59 collapse

X86 has an incredible amount of cruft built up to support backwards compatibility all the way back to the 8086. ARM isn’t free of cruft, but it’s nowhere on the same level. Most of that isn’t directly visible to customers, though.

What is visible is that more than three companies can license and manufacture them. The x86 market has one company that owns it, another who licenses it but also owns the 64-bit extensions, and a third one who technically exists but is barely worth talking about. It’s also incredibly difficult to optimize, and the people who know how already work for one of the main two companies (arguably only one at this point). Even if you could legally license it as a fourth player, you couldn’t get people who could design an x86 core that’s worth a damn.

Conversely, ARM cores are designed by CS students all the time. That’s the real advantage to end users: far more companies who can produce designs. If one of them fails the way Intel has of late, we’re not stuck with just one other possibility.

Hypx@kbin.social on 08 Jan 2024 18:18 next collapse

This is just a repeat of the same old pro-RISC myths from decades ago. There is very little performance difference between x86 and any RISC based CPU, at least when pertaining to the ISA itself. Apple merely has the advantage of having far more resources available for CPU development than their competitors.

frezik@midwest.social on 09 Jan 2024 22:13 collapse

Modern x86 is a CISC outer layer around a RISC inner core. It didn’t hang on this long by ignoring RISC, but by assimilating it. RISC really did change everything, but not by the way everyone thought.

Grass@sh.itjust.works on 09 Jan 2024 03:29 next collapse

Wake me up when RISC-V has performance parity and more software

fubarx@lemmy.ml on 09 Jan 2024 05:03 next collapse

How long before indie devs can make their own custom processor chips?

stealth_cookies@lemmy.ca on 09 Jan 2024 05:05 next collapse

Now? FPGAs have been a thing for decades and are the closest thing I can see to getting custom chips made without massive investments.

fubarx@lemmy.ml on 09 Jan 2024 05:10 collapse

Yup. But was thinking more of ultra-small-run ARM or RISC-V processors. Be cool if we ever get there.

stardreamer@lemmy.blahaj.zone on 09 Jan 2024 10:02 collapse

You can build a RISC core using an FPGA. Plenty of people have done that.

Performance will probably be an issue.
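For a sense of scale: the control logic of a simple, unpipelined RISC-style core is small enough to sketch in a few lines. A toy register machine in Python (the ISA here is invented for illustration, not RISC-V):

```python
def run(program, max_steps=10_000):
    """Fetch-decode-execute loop for a toy three-operand RISC-flavored ISA."""
    regs = [0] * 8            # eight general-purpose registers, r0..r7
    pc = 0                    # program counter
    for _ in range(max_steps):
        if pc >= len(program):
            break
        op, a, b, c = program[pc]
        if op == "halt":
            break
        if op == "addi":      # ra = rb + immediate
            regs[a] = regs[b] + c
        elif op == "add":     # ra = rb + rc
            regs[a] = regs[b] + regs[c]
        elif op == "bne":     # branch to instruction c if ra != rb
            if regs[a] != regs[b]:
                pc = c
                continue
        pc += 1
    return regs

# Sum 1..5 into r1: r2 counts up to the loop bound in r3
prog = [
    ("addi", 3, 0, 5),        # r3 = 5 (loop bound; r0 is always 0)
    ("addi", 2, 2, 1),        # r2 += 1
    ("add", 1, 1, 2),         # r1 += r2
    ("bne", 2, 3, 1),         # loop while r2 != r3
    ("halt", 0, 0, 0),
]
print(run(prog)[1])           # → 15
```

An FPGA version replaces this loop with combinational decode logic and a register file, which is exactly the kind of project CS students do; the hard part sir_reginald describes below is everything beyond this (pipelining, caches, branch prediction, physical design).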

sir_reginald@lemmy.world on 09 Jan 2024 21:14 collapse

Do you know how a CPU is designed? Even the simple RISC CPUs we studied in college were crazy hard to understand, and those were very simple, old processors.

A modern processor with performance that can match modern CPUs is no task for one indie dev, at all.

You need a team of professionals in the field, a huge budget, and the technology to manufacture it, which you would probably end up outsourcing to one of the big manufacturers anyway because the fabrication capability is so rare.

So the answer to your question is never, unless you’re expecting low performance CPUs based on FPGAs.

bamboo@lemm.ee on 09 Jan 2024 06:40 next collapse

It would be fascinating to see Qualcomm, NVIDIA, AMD, Mediatek, and possibly others all competing to build the best ARM SoCs for windows devices, especially after so many years of Intel stagnating and Apple eating their lunch with their ARM SoCs.

akrot@lemmy.world on 09 Jan 2024 07:28 collapse

competing to build the best ARM SoCs for windows devices

You mean desktop, and not Windows? Because if anything, Windows is becoming a botnet device. I hope Linux support works out of the box.

bamboo@lemm.ee on 09 Jan 2024 07:44 collapse

Windows arm devices boot with UEFI, so standard ARM UEFI images should work, just like on x86. I would bet drivers should be alright too, since these ARM SoCs will likely be similar to ones used in Linux SBCs and Android devices.
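Since the point above hinges on standard UEFI boot, here is a quick sanity check from a running Linux system: the kernel exposes `/sys/firmware/efi` only when the machine booted via UEFI rather than a vendor-specific bootloader (a minimal sketch):

```python
import os

def booted_via_uefi() -> bool:
    """True if the running Linux kernel was booted through UEFI firmware."""
    # The kernel creates this sysfs directory only on UEFI boots.
    return os.path.isdir("/sys/firmware/efi")

if __name__ == "__main__":
    print("UEFI boot" if booted_via_uefi() else "legacy/unknown boot")
```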

TheGrandNagus@lemmy.world on 09 Jan 2024 06:49 next collapse

This is a Qualcomm marketing piece.

And no, the most exciting 2024 tech won’t be a CPU with similar or lower performance to other comparable CPUs on the market, with the added benefit of less software compatibility.

GiddyGap@lemm.ee on 09 Jan 2024 19:00 next collapse

2024 might be the year I win a million dollars.

Something_Complex@lemmy.world on 09 Jan 2024 22:13 collapse

I will tell you how to win 2 million if you give me 1

Floshie@lemmy.blahaj.zone on 10 Jan 2024 01:11 next collapse

Omg, the porting of games would be awful

Bishma@discuss.tchncs.de on 08 Jan 2024 16:45 next collapse

Does anyone else worry that the rise of personal computers using super custom SOCs is going to have negative effects on our abilities to build our own machines?

chemicalwonka@discuss.tchncs.de on 08 Jan 2024 18:19 next collapse

But one detail that we cannot forget is that with the spread of the ARM architecture to PCs and laptops, we will probably see an increase in fully locked hardware. We don’t need the expansion of the ARM architecture to PCs if it doesn’t come with hardware and software freedom.

smileyhead@discuss.tchncs.de on 08 Jan 2024 20:26 collapse

Ah yes, let’s welcome the “one device, one operating system” myth to the desktop, with people choosing hardware because of a software feature that could just as well be installable. Welcome the expiration date on computers called “years of software support”, and welcome the overall unfriendliness toward alternative systems.

Performance and efficiency are one side of the coin. But let me remind you that Qualcomm (along with Google) is the reason we cannot have lifetime updates for our phones, ROM builds need to be specific to each model, and making a phone with anything but Android is nearly impossible.

I’ll take ARM over x86, but I’ll take AMD/Intel over Qualcomm thousand times more.