Survey shows most people wouldn't pay extra for AI-enhanced hardware | 84% of people said no (www.techspot.com)
from ForgottenFlux@lemmy.world to technology@lemmy.world on 18 Jul 2024 19:07
https://lemmy.world/post/17691799

Companies are going all-in on artificial intelligence right now, investing millions or even billions into the area while slapping the AI initialism on their products, even when doing so seems strange and pointless.

Heavy investment and increasingly powerful hardware tend to mean more expensive products. To discover if people would be willing to pay extra for hardware with AI capabilities, the question was asked on the TechPowerUp forums.

The results show that over 22,000 people, a massive 84% of the overall vote, said no, they would not pay more. More than 2,200 participants said they didn’t know, while just under 2,000 voters said yes.

#technology


Telorand@reddthat.com on 18 Jul 2024 19:21 next collapse

…just under 2,000 voters said “yes.”

And those people probably work in some area related to LLMs.

It’s practically a meme at this point:

Nobody:

Chip makers: People want us to add AI to our chips!

ozymandias117@lemmy.world on 18 Jul 2024 22:27 collapse

The even crazier part to me is some chip makers we were working with pulled out of guaranteed projects with reasonably decent revenue to chase AI instead

We had to redesign our boards and they paid us the penalties in our contract for not delivering so they could put more of their fab time towards AI

nickwitha_k@lemmy.sdf.org on 19 Jul 2024 02:56 collapse

That’s absolutely crazy. Taking the Chicago School MBA philosophy to things as time-consuming and expensive to set up as silicon production.

BlackLaZoR@kbin.run on 18 Jul 2024 19:28 next collapse

There's really no point unless you work in specific fields that benefit from AI.

Meanwhile every large corpo tries to shove AI into every possible place they can. They'd introduce ChatGPT to your toilet seat if they could

x4740N@lemm.ee on 18 Jul 2024 19:37 next collapse

Imagining a ChatGPT toilet seat made me feel uncomfortable

Davel23@fedia.io on 18 Jul 2024 19:42 next collapse

https://www.youtube.com/watch?v=cjTd2nLDr9Y&t=11s

Lost_My_Mind@lemmy.world on 18 Jul 2024 19:59 collapse

Aw maaaaan. I thought you were going to link that youtube sketch I can’t find anymore. Hide and go poop.

BlackLaZoR@kbin.run on 18 Jul 2024 21:15 next collapse

Don't worry, if Apple does it, it will sell like fresh cookies worldwide

SeaJ@lemm.ee on 18 Jul 2024 21:41 next collapse
Arbiter@lemmy.world on 18 Jul 2024 22:46 collapse

Idk, they can’t even sell VR.

Odo@lemmy.world on 18 Jul 2024 22:18 collapse
br3d@lemmy.world on 18 Jul 2024 19:40 next collapse

“Shits are frequently classified into three basic types…” and then gives 5 paragraphs of bland guff

Krackalot@discuss.tchncs.de on 18 Jul 2024 19:53 next collapse

With how much scraping of reddit they do, there’s no way it doesn’t try ordering a poop knife off of Amazon for you.

catloaf@lemm.ee on 18 Jul 2024 23:35 collapse

It’s seven types, actually, and it’s called the Bristol scale, after the Bristol Royal Infirmary where it was developed.

br3d@lemmy.world on 19 Jul 2024 05:21 collapse

I know. But I was satirising GPT’s bland writing style, not providing facts

Lost_My_Mind@lemmy.world on 18 Jul 2024 19:57 next collapse

Which would be appropriate, because with AI, there’s nothing but shit in it.

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 18 Jul 2024 20:22 collapse

Someone did a demo recently of AI acceleration for 3D upscaling (think DLSS/AMD’s equivalent) and it showed a nice boost in performance. It could be useful in the future.

I think it’s kind of like ray tracing. We don’t have a real use for it now, but eventually someone will figure out something that it’s actually good for and use it.

nekusoul@lemmy.nekusoul.de on 18 Jul 2024 22:13 next collapse

AI acceleration for 3d upscaling

Isn’t that not only similar to, but exactly what DLSS already is? A neural network that upscales games?

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 18 Jul 2024 22:29 collapse

But instead of relying on the GPU to power it, the dedicated AI chip did the work. Like it had its own distinct chip on the graphics card that would handle the upscaling.

I forget who demoed it, and searching for anything related to “AI” and “upscaling” gets buried with just what they’re already doing.

barsoap@lemm.ee on 18 Jul 2024 23:09 collapse

That’s already the nvidia approach, upscaling runs on the tensor cores.

And no, it’s not something magical, it’s just matrix math. AI workloads are lots of convolutions on gigantic, low-precision, floating point matrices. Low-precision because neural networks are robust against random perturbation, and more rounding is exactly that, random perturbations; there’s no point in spending electricity and heat on high precision if it doesn’t make the output any better.
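To make that concrete, here’s a toy NumPy sketch (a crude uniform quantizer, purely illustrative, not any vendor’s actual FP8 format) showing how little 8-bit rounding moves a layer’s output:

```python
import numpy as np

rng = np.random.default_rng(0)

# A toy "layer": 256 inputs -> 64 outputs with a ReLU.
W = rng.normal(0, 0.1, (64, 256)).astype(np.float32)
x = rng.normal(0, 1.0, 256).astype(np.float32)

def quantize(a, bits=8):
    # Crude uniform symmetric quantizer -- illustrative only.
    scale = np.abs(a).max() / (2 ** (bits - 1) - 1)
    return np.round(a / scale) * scale

y_full = np.maximum(W @ x, 0)
y_low = np.maximum(quantize(W) @ quantize(x), 0)

rel_err = np.linalg.norm(y_full - y_low) / np.linalg.norm(y_full)
print(f"relative output error at 8 bits: {rel_err:.3%}")  # typically well under 1%
```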

The kicker? Those tensor cores are less complicated than ordinary GPU cores. For general-purpose hardware, and that also includes consumer-grade GPUs, it’s way more sensible to make sure the ALUs can deal with 8-bit floats and leave everything else the same. That stuff is going to be standard by the next generation of even potatoes: every SoC with an included GPU has enough oomph to sensibly run reasonable inference loads. And with “reasonable” I mean actually quite big; as far as I’m aware, e.g. Firefox’s inbuilt translation runs on the CPU, the models are small enough.

Nvidia OTOH is very much in the market for AI accelerators and figured it could corner the upscaling market and sell another new generation of cards by making their software rely on those cores even though it could run on the other cores. As AMD demonstrated, their stuff also runs on nvidia hardware.

What’s actually special sauce in that area are the RT cores, that is, accelerators for ray casting through BSP trees. That’s indeed specialised hardware, but those things are nowhere near fast enough to compute enough rays for even remotely tolerable outputs, which is where all that upscaling/denoising comes into play.

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 19 Jul 2024 00:53 next collapse

Nvidia’s tensor cores are inside the GPU; this was outside the GPU, but on the same card (the PCB looked like an abomination). If I remember right, in total it used slightly less power but performed about 30% faster than normal DLSS.

AdrianTheFrog@lemmy.world on 19 Jul 2024 06:13 collapse

from the articles I’ve found it sounds like they’re comparing it to native…

fuckwit_mcbumcrumble@lemmy.dbzer0.com on 19 Jul 2024 01:03 collapse

Found it.

neowin.net/…/powercolor-uses-npus-to-lower-gpu-po…

I can’t find a picture of the PCB though; that might have been a pre-reveal leak, and now that it’s revealed, good luck finding it.

AdrianTheFrog@lemmy.world on 19 Jul 2024 06:11 collapse

Having to send full frames off of the GPU for extra processing has got to come with some extra latency/problems compared to just doing it actually on the gpu… and I’d be shocked if they have motion vectors and other engine stuff that DLSS has that would require the games to be specifically modified for this adaptation. IDK, but I don’t think we have enough details about this to really judge whether its useful or not, although I’m leaning on the side of ‘not’ for this particular implementation. They never showed any actual comparisons to dlss either.

As a side note, I found this other article on the same topic where they obviously didn’t know what they were talking about and mixed up frame rates and power consumption. It’s very entertaining to read

The NPU was able to lower the frame rate in Cyberpunk from 263.2 to 205.3, saving 22% on power consumption, and probably making fan noise less noticeable. In Final Fantasy, frame rates dropped from 338.6 to 262.9, resulting in a power saving of 22.4% according to PowerColor’s display. Power consumption also dropped considerably, as it shows Final Fantasy consuming 338W without the NPU, and 261W with it enabled.

nekusoul@lemmy.nekusoul.de on 19 Jul 2024 09:38 collapse

I’ve been trying to find some better/original sources [1] [2] [3] and from what I can gather it’s even worse. It’s not even an upscaler of any kind, it apparently uses an NPU just to control clocks and fan speeds to reduce power draw, dropping FPS by ~10% in the process.

So yeah, I’m not really sure why they needed an NPU to figure out that running a GPU at its limit has always been wildly inefficient. Outside of getting that investor money of course.

AdrianTheFrog@lemmy.world on 19 Jul 2024 15:54 collapse

Ok, I guess it’s just kinda similar to dynamic overclocking/underclocking with a dedicated NPU. I don’t really see why a tiny $2 microcontroller or just the CPU can’t accomplish the same task though.

AdrianTheFrog@lemmy.world on 19 Jul 2024 05:56 collapse

We have plenty of real uses for ray tracing right now, from blender to whatever that avatar game was doing to lumen to partial rt to full path tracing, you just can’t do real time GI with any semblance of fine detail without RT from what I’ve seen (although the lumen sdf mode gets pretty close)

although the rt cores themselves are more debatably useful, they still give a decent performance boost most of the time over “software” rt

Zatore@lemm.ee on 18 Jul 2024 19:36 next collapse

Most people won’t pay for it because a lot of AI stuff is done cloud side. Even stuff that could be done locally is done in the cloud a lot. If that wasn’t possible, probably more people would want the hardware. It makes more sense for corporations to invest in hardware.

helenslunch@feddit.nl on 18 Jul 2024 22:57 collapse

a lot of AI stuff is done cloud side.

If it’s done in the cloud then there’s no need for them to buy “AI-accelerated hardware”

_haha_oh_wow_@sh.itjust.works on 18 Jul 2024 19:39 next collapse

“enhanced”

independantiste@sh.itjust.works on 18 Jul 2024 19:42 next collapse

Personally I would choose a processor with AI capabilities over a processor without, but I would not pay more for it

Godort@lemm.ee on 18 Jul 2024 19:42 next collapse

This is one of those weird things that venture capital does sometimes.

VC is injecting cash into tech right now at obscene levels because they think that AI is going to be hugely profitable in the near future.

The tech industry is happily taking that money and using it to develop what they can, but it turns out the majority of the public don’t really want the tool if it means they have to pay extra for it. Especially in its current state, where the information it spits out is far from reliable.

cheese_greater@lemmy.world on 18 Jul 2024 20:19 next collapse

I don’t want it outside of heavily sandboxed and limited scope applications. I don’t get why people want an agent of chaos fucking with all their files and systems they’ve cobbled together

Fiivemacs@lemmy.ca on 18 Jul 2024 20:55 collapse

NDAs also legally prevent you from using this forced garbage. Companies are going to get screwed over by other companies; capitalism is gonna implode, hopefully

Tenthrow@lemmy.world on 18 Jul 2024 21:48 next collapse

I have to endure a meeting at my company next week to come up with ideas on how we can wedge AI into our products because the dumbass venture capitalist firm that owns our company wants it. I have been opting not to turn on video because I don’t think I can control the cringe responses on my face.

TipRing@lemmy.world on 18 Jul 2024 22:08 collapse

Back in the 90s in college I took a Technology course, which discussed how technology has historically developed, why some things are adopted and other seemingly good ideas don’t make it.

One of the things that is required for a technology to succeed is public acceptance. That is why AI is doomed.

SkyeStarfall@lemmy.blahaj.zone on 19 Jul 2024 01:01 collapse

AI is not doomed; LLMs, or consumer AI products, might be

In industry, AI is and will be used (though probably not LLMs, still, except in a few niche use cases)

TipRing@lemmy.world on 19 Jul 2024 03:28 collapse

Yeah, I mean the AI being shoveled at us by techbros. Actual ML stuff is currently and will continue to be useful for all sorts of not-sexy but vital research and production tasks. I do task automation for my job and I use things like transcription models and OCR; my company uses smart sorting based on rapid image recognition and other really cool ways of getting computers to do things that humans are bad at. It’s things like LLMs that just aren’t there - yet. I have seen very early research on AI that is trained to actually understand language and learns by context. It’s years away, but eventually we might see AI that really can do what the current AI companies are claiming.

cyborganism@lemmy.ca on 18 Jul 2024 19:46 next collapse

I don’t mind the hardware. It can be useful.

What I do mind is the software running on my PC sending all my personal information, screenshots, and keystrokes to a corporation that will use all of it for profit to build a user profile, send targeted advertisements, and potentially use it against me.

Lost_My_Mind@lemmy.world on 18 Jul 2024 19:56 next collapse

84% said no.

16% punched the person asking them for suggesting such a practice. So they also said no. With their fist.

shonn@lemmy.world on 18 Jul 2024 20:04 next collapse

I wouldn’t even pay less.

catloaf@lemm.ee on 18 Jul 2024 23:36 collapse

I would pay less, and then either use it for dumb stuff or just not use it at all.

OhmsLawn@lemmy.world on 18 Jul 2024 20:12 next collapse

I honestly have no idea what AI does to a processor, and would therefore not pay extra for the badge.

If it provided a significant speed improvement or something, then yeah, sure. Nobody has really communicated to me what the benefit is. It all seems like hand waving.

originalucifer@moist.catsweat.com on 18 Jul 2024 20:27 next collapse

what they mean is that they are putting in dedicated processors or other hardware just to run an LLM. it doesn’t speed up anything other than the faux-AI tool they are implementing.

LLMs require a ton of math that is better suited to video processors than the general purpose cpu on most machines.

tal@lemmy.today on 18 Jul 2024 22:54 collapse

I honestly have no Idea what AI does to a processor

Parallel processing capability. CPUs historically worked with mostly-non-massively-parallelizable tasks; maybe you’d use a GPU if you wanted that.

I mean, that’s not necessarily “AI” as such, but LLMs are a neat application that uses them.

On-CPU video acceleration does parallel processing too.

Software’s going to have to parallelize if it wants to get much by way of performance improvements, anyway. We haven’t been seeing rapid exponential growth in serial computation speed since the early 2000s. But we can get more parallel compute capacity.
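As a toy illustration of why that kind of work parallelizes so well: every output row of a matrix product depends only on the matching row of the left operand, so the rows can be computed completely independently, which is the same property GPUs exploit across thousands of hardware lanes. A quick NumPy sketch:

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

A = np.random.rand(512, 512).astype(np.float32)
B = np.random.rand(512, 512).astype(np.float32)

def one_row(i):
    # Row i of A @ B needs only row i of A -- no shared state.
    return A[i] @ B

with ThreadPoolExecutor() as pool:
    C = np.stack(list(pool.map(one_row, range(A.shape[0]))))

# Same result as the serial product, up to float32 rounding.
assert np.allclose(C, A @ B, rtol=1e-3, atol=1e-3)
```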

TheEntity@lemmy.world on 18 Jul 2024 20:27 next collapse

And what do the companies take away from this? “Cool, we just won’t leave you any other options.”

Wooki@lemmy.world on 18 Jul 2024 23:11 collapse

Plenty of companies offer sane, normal solutions and make bank in the process

catloaf@lemm.ee on 18 Jul 2024 23:33 collapse

History has shown that not to be the case.

Wooki@lemmy.world on 19 Jul 2024 09:17 collapse

Plenty of history shows it is.

Kraiden@kbin.run on 18 Jul 2024 20:59 next collapse

someone tried to sell me a fucking AI fridge the other day. Why the fuck would I want my fridge to "learn my habits?" I don't even like my phone "learning my habits!"

Ragnarok314159@sopuli.xyz on 18 Jul 2024 23:57 next collapse

And it would improve your life zero. That is what is absurd about LLMs in their current iteration: they provide almost no benefit to a vast majority of people.

All a learning model would do for a fridge is send you advertisements for whatever garbage food is on sale. Could it make recipes based on what you have? Tell it you want to slowly get healthier and have it assist with grocery selection?

Nah, fuck you and buy stuff.

captainlezbian@lemmy.world on 19 Jul 2024 16:22 collapse

Exactly, it’s entirely about extra monetization. They all think in terms of hype and money, never in terms of life improvement.

I’d actually love AI to control something like a home assistant setup by learning how I like things and predicting change (mind you I still need to get it set up at all). But most people don’t even want a smart home.

Make something that makes the unpleasant parts of life easier and people will be happy with it

Zron@lemmy.world on 19 Jul 2024 00:00 next collapse

Why does a fridge need to know your habits?

It has to keep the food cold all the time. The light has to come on when you open the door.

What could it possibly be learning

1995ToyotaCorolla@lemmy.world on 19 Jul 2024 00:06 next collapse

Hi Zron, you seem to really enjoy eating shredded cheese at 2:00am! For your convenience, we’ve placed an order for 50lbs of shredded cheese based on your rate of consumption. Thanks!

variants@possumpat.io on 19 Jul 2024 01:00 next collapse

We also took the liberty of canceling your health insurance to help protect the shareholders from your abhorrent health expenses in the far future

rottingleaf@lemmy.world on 19 Jul 2024 05:03 collapse

If your fridge spies on you, certain people can have better insights into the healthiness of your food habits, how organized you are, how often things go bad and are thrown out, what medicine (that needs to be kept cold) you put there, and how often you use it.

That will then affect your insurances, your credit rating, and possibly many other ratings other people are interested in.

SmackemWittadic@lemmy.world on 19 Jul 2024 07:42 collapse

I wish products followed your lead and had no AI features, 1995 Toyota Corolla :/

Kraiden@kbin.run on 19 Jul 2024 11:23 collapse

I think you're being sarcastic, but I unironically agree. Cars and fridges can, and should, stay dumb, with the notable exception of battery management systems in electric vehicles. That's the single acceptable use case for a car IMHO.

SmackemWittadic@lemmy.world on 19 Jul 2024 13:47 next collapse

Oh I absolutely agree, some things don’t need to be “smart”.

Imagine if someone put a microchip in a potato peeler claiming that it would add features like “sensing the amount of pressure applied to the potato to ensure clean peels”. The reason they haven’t done that is that data would only benefit the user, and they can’t think of a way to have it benefit the company’s profit margins.

captainlezbian@lemmy.world on 19 Jul 2024 16:14 collapse

I think CarPlay is a wonderful feature. My car should absolutely allow syncing up to my phone. I don’t think it should do telemetry or anything like that though. But I think internal process monitoring should also be a thing: display error codes, show me that a tire is low, monitor a battery, etc. But the manufacturer shouldn’t get that info. My car shouldn’t know my sex life, and the manufacturer definitely shouldn’t

njm1314@lemmy.world on 19 Jul 2024 01:01 next collapse

So it can see what you like to eat, then it can tell your grocery store, then your grocery store can raise the prices on those items. That’s the point. It’s the same thing with those memberships and coupon apps. That’s the end goal.

rottingleaf@lemmy.world on 19 Jul 2024 05:05 collapse

They can see what you like to eat by what you’re buying, LOL. No, not this.

A fridge can give them information on how you eat.

JackbyDev@programming.dev on 19 Jul 2024 03:02 collapse

  1. Know when you’re about to put groceries in so it makes the fridge colder so the added heat doesn’t make things go bad.
  2. Know when you don’t use it and let it get a tiny bit warmer to save a teeny bit of power. (The vast majority of power is cooling new items, not keeping things cold though.)
  3. Tell you where things are?
  4. Ummm… Maybe give you an optimized layout of how to store things?
  5. Be an attack vector on your home’s wifi
  6. Wait, no, uh,
  7. Push notifications
  8. Do you not have phones?
upside431@lemmy.world on 19 Jul 2024 00:39 next collapse

To remind you when you should go buy groceries haha

jballs@sh.itjust.works on 19 Jul 2024 02:16 next collapse
explodicle@sh.itjust.works on 19 Jul 2024 02:38 next collapse

<img alt="" src="https://sh.itjust.works/pictrs/image/3775f32a-551c-42d1-b471-ef2cf630620d.png">

I still want this fridge. (Source)

trollblox_@programming.dev on 19 Jul 2024 02:54 next collapse

always xkcd

AdrianTheFrog@lemmy.world on 19 Jul 2024 05:28 next collapse

it doesn’t seem all that hard to make, as long as you don’t mind the severely reduced flexibility in capacity and glass bottles shattering against each other at the bottom

TheGrandNagus@lemmy.world on 19 Jul 2024 11:09 collapse

Not to mention the increased expense, loudness, greater difficulty cleaning, and many more points of failure!

Kraiden@kbin.run on 19 Jul 2024 11:26 collapse

Now THIS I could get behind! Still not AI though. It's a very dumb timer system that would be very useful. 1950s tech could do this!

jubilationtcornpone@sh.itjust.works on 19 Jul 2024 03:59 next collapse

I’m still pissed about the fact that I can’t buy a reasonably priced TV that doesn’t have WiFi. I should never have left my old LG Plasma bolted to the wall of my previous house when I sold it. That thing had a fantastic picture and doubled as a space heater in the winter.

cestvrai@lemm.ee on 19 Jul 2024 08:00 collapse

Projector gang checking in 🤓📽️

Everything alright here?

You can always join us in the peaceful realm of select input.

(there are still WiFi-free options)

Kraiden@kbin.run on 19 Jul 2024 11:28 collapse

what's the affordable option for daytime viewing with the curtains open?

Apollo42@lemmy.world on 19 Jul 2024 12:18 collapse

Audio description.

fruitycoder@sh.itjust.works on 19 Jul 2024 07:12 collapse

I want AI in my fridge for sure. Grocery shopping sucks. Forgetting how old something was sucks. Letting all the cool out to crawl around to see what I have sucks.

I want my fridge to be like the Sims: just get deliveries or pick up the order, fill it out, and get told what ingredients I have. Bonus points if it can just tell me what recipes I can cook right now, even better if I can ask for a time frame.

That would be sick!

Still not going to give ecorp all of my data or put some half-baked internet of stings device on my WiFi for it. But it would be cool.

Kraiden@kbin.run on 19 Jul 2024 11:33 next collapse

Ye, that'd be sick! And that's also not what was being sold! This fridge did none of that. What exactly made it "AI" I didn't bother to find out, but I work in IT. I guarantee it wasn't this. Also, not convinced I want my fridge to be able to spend my money for me. I want to be able to have a Ramen month if I need/want to

fruitycoder@sh.itjust.works on 19 Jul 2024 17:00 collapse

Automatic spending definitely takes next level of trust for sure!

Tinks@lemmy.world on 19 Jul 2024 14:05 next collapse

Absolutely this. There IS a scenario in which I would love a “smart” or “AI” fridge, but it’s gotta be damn impressive to even be worth my time.

It needs to know everything in my fridge, how long it’s been there, and its expiration date, and I want it to build grocery lists for me based on what is low, and let me know ahead of time that I should use something up that’s going bad soon. Bonus points if it recommends some options for how to do that based on my tastes. And I want to do this without having to manually input or remove everything.

But we’re still SO far from being able to do this reliably, let alone at any kind of acceptable price point, and yet fridge makers keep shoving out dumb fridges with a screen on them and calling them “smart”. I hate it.

fruitycoder@sh.itjust.works on 19 Jul 2024 17:00 collapse

For sure playing ads on my fridge or just spying on me aren’t “smart” at all to me.

technocrit@lemmy.dbzer0.com on 19 Jul 2024 16:45 collapse

Would you be willing to destroy the whole planet in order make millions of these fridges?

fruitycoder@sh.itjust.works on 19 Jul 2024 17:09 collapse

A couple planets! /s

I would be willing to never have one in my lifetime just to see climate change slowed to a rate nature can naturally adapt to and people can afford to adjust to, honestly.

I don’t foresee it being any worse than food waste and wasted grocery trips are for me.

Computer vision, a couple of services, a DB, and network access can be pretty lightweight. Any extra voice, natural language interface, etc. is probably overkill, and without special hardware (and the ecological cost of that) not worth it from an energy use standpoint.

All speculation of course

tal@lemmy.today on 18 Jul 2024 21:56 next collapse

That’s kind of abstract. Like, nobody pays purely for hardware. They pay for the ability to run software.

The real question is, would you pay $N to run software package X?

Like, go back to 2000. If I say “would you pay $N for a parallel matrix math processing card”, most people are going to say “no”. If I say “would you pay $N to play Quake 2 at resolution X and fps Y and with nice smooth textures,” then it’s another story.

I paid $1k for a fast GPU so that I could run Stable Diffusion quickly. If you asked me “would you pay $1k for an AI-processing card” and I had no idea what software would use it, I’d probably say “no” too.

Grimy@lemmy.world on 18 Jul 2024 22:17 next collapse

Yup, the answer is going to change real fast when the next Oblivion with NPCs you can talk to needs this kind of hardware to run.

[deleted] on 18 Jul 2024 22:47 next collapse

.

tal@lemmy.today on 18 Jul 2024 22:47 collapse

I’m still not sold that dynamic text generation is going to be the major near-term application for LLMs, much less in games. Like, don’t get me wrong, it’s impressive what they’ve done. But I’ve also found it to be the least-practically-useful of the LLM model categories. Like, you can make real, honest-to-God solid usable graphics with Stable Diffusion. You can do pretty impressive speech generation in TortoiseTTS. I imagine that someone will make a locally-runnable music LLM model and software at some point if they haven’t yet; I’m pretty impressed with what the online services do there. I think that there are a lot of neat applications for image recognition; the other day I wanted to identify a tree and seedpod. Someone hasn’t built software to do that yet (that I’m aware of), but I’m sure that they will; the ability to map images back to text is pretty impressive. I’m also amazed by the AI image upscaling that Stable Diffusion can do, and I suspect that there’s still room for a lot of improvement there, as that’s not the main goal of Stable Diffusion. And once someone has done a good job of building a bunch of annotated 3d models, I think that there’s a whole new world of 3d.

I will bet that before we see that becoming the norm in games, we’ll see LLMs regularly used for either pre-generated speech synth or in-game speech synthesis, so that the characters say text (which might be procedurally generated, isn’t just static pre-recorded samples, but isn’t necessarily generated from an LLM). Like, it’s not practical to have a human voice actor cover all possible phrases with static recorded speech that one might want an in-game character to speak.

Grimy@lemmy.world on 19 Jul 2024 05:28 next collapse

I think it’s coming pretty fast. There’s already a mod for Skyrim that lets you talk to your companion. People are spending hours talking to LLMs and roleplaying; the first triple-A game to incorporate it is going to be a massive hit imo. I’m actually surprised no one’s been coming out with visual novels using them; it seems like a perfect use case.

It’s definitely going to be used first for making the content of the game like you said though.

AdrianTheFrog@lemmy.world on 19 Jul 2024 06:23 collapse

there are some local genai music models, although I don’t know how good they are yet as I haven’t tried any myself (stable audio is one, but I’m sure there are others)

also minor linguistic nitpick but LLM stands for ‘large language model’ (you could maybe get away with it for pixart and sd3 as they use t5 for prompt encoding, which is an llm, i’m sure some audio models with lyrics use them too), the term you’re looking for is probably ‘generative’

QuarterSwede@lemmy.world on 20 Jul 2024 02:40 collapse

This. Apple is doing it the right way, avoiding the term AI and instead focusing on what benefits it brings in iOS 18. Other companies need to figure out what problem people need to solve and what AI would do to solve it. Instead they’re trying to cram it into everything, and people are largely nonplussed about it.

peopleproblems@lemmy.world on 18 Jul 2024 22:28 next collapse

AI in Movies: “The only Logical solution, is the complete control/eradication of humanity.”

AI in Real Life: “Dave, I see you only have beer, soda, and cheese in your fridge. I am concerned for your health. I can write you a reminder to purchase better food choices.” Dave: “THESE AI ARE EVIL, I WILL NEVER SUBMIT TO YOUR POWER!”

[deleted] on 18 Jul 2024 22:59 collapse

.

peopleproblems@lemmy.world on 18 Jul 2024 23:31 next collapse

Yeah this is probably more likely. It’s just so depressing

Zorque@lemmy.world on 19 Jul 2024 01:57 next collapse

Please drink verification can.

fruitycoder@sh.itjust.works on 19 Jul 2024 07:25 collapse

I think South Park’s vision is still the worst: AI so human you can fall in love with it, but it still tries to manipulate you to sell you things.

metaStatic@kbin.earth on 18 Jul 2024 22:53 next collapse

this goes to show just how far the current grift has gone.

AI enhanced hardware? Jesus Fuck take all my money that's amazing.

Dedicated LLM chatbot hardware? Die in a fire for even suggesting this is AI.

helenslunch@feddit.nl on 18 Jul 2024 22:54 next collapse

Show me a practical use for AI and I’ll show you the money. Genmoji ain’t it.

Give me a virtual assistant that actually functions and I will give you A LOT of money…

nobleshift@lemmy.world on 18 Jul 2024 23:18 next collapse

Pepper Potts levels of capability with local storage and I’ll just hand you my card.

brbposting@sh.itjust.works on 19 Jul 2024 01:31 next collapse

40% of translators report having lost income due to it :0

helenslunch@feddit.nl on 19 Jul 2024 01:55 collapse

I don’t need a translator 🤷

fruitycoder@sh.itjust.works on 19 Jul 2024 07:19 collapse

I use an LLM fine-tuned on medical stuff for minor medical questions or to prep for medical appointments (getting on the same page as the doc can save some serious time; say the wrong thing and they’ll get hung up on it for a year lol).

I really want to combine it with something like Fasten Health so I can go over my medical data on my own machines faster. Pipe dream rn because getting that data from the docs is a pain in the dick, but it still would be cool to me.

Sam_Bass@lemmy.world on 18 Jul 2024 23:27 next collapse

It’s bad enough they shove it on you on some websites. Really not interested in being their lab rats

JackbyDev@programming.dev on 19 Jul 2024 02:56 collapse

✨chat assistants✨

magiccupcake@lemmy.world on 18 Jul 2024 23:29 next collapse

Most people have pretty decent ai hardware already in the form of a gpu.

Sure dedicated hardware might be more efficient for mobile devices, but that’s already done better in the cloud.

Nomecks@lemmy.ca on 18 Jul 2024 23:51 next collapse

It’s not really done better in the cloud if you can push the compute out to the device. When you can leverage edge hardware you save bandwidth fees and a ton of cloud costs. It’s faster in the cloud because you can leverage a cluster with economies of scale, but any AI company would prefer the end-user to pay for that compute instead, if they can service requests adequately.

AdrianTheFrog@lemmy.world on 19 Jul 2024 05:50 collapse

Yeah, you also have to deal with the latency with the cloud, which is a big problem for a lot of possible applications

PriorityMotif@lemmy.world on 19 Jul 2024 00:53 collapse

The Google Coral TPU has been around for years and it’s cheap. Works well for object detection.

docs.frigate.video

There’s a lot of use cases in manufacturing where you can do automated inspection of parts as they go by on a conveyor, or have a robot arm pick and place parts/boxes/pallets etc.

Those types of systems have been around for decades, but they can always be improved.
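For a sense of what that looks like in code, here’s a rough sketch of the usual tflite-runtime pattern for the Coral (the model filename is hypothetical; any Edge-TPU-compiled detection model follows the same shape):

```python
import numpy as np
import tflite_runtime.interpreter as tflite

# Load an Edge-TPU-compiled model through the libedgetpu delegate.
interpreter = tflite.Interpreter(
    model_path="ssd_mobilenet_v2_edgetpu.tflite",  # hypothetical filename
    experimental_delegates=[tflite.load_delegate("libedgetpu.so.1")],
)
interpreter.allocate_tensors()
inp = interpreter.get_input_details()[0]

# A zeroed frame standing in for a real camera image.
frame = np.zeros(inp["shape"], dtype=inp["dtype"])
interpreter.set_tensor(inp["index"], frame)
interpreter.invoke()

# SSD-style detectors emit boxes/classes/scores as separate outputs.
boxes = interpreter.get_tensor(interpreter.get_output_details()[0]["index"])
print(boxes.shape)
```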

Fedizen@lemmy.world on 19 Jul 2024 00:41 next collapse

30% of people will believe literally anything. 16% means even half of the deranged people aren’t interested.

ipkpjersi@lemmy.ml on 19 Jul 2024 00:50 collapse

Interested or not, more hardware is going to be “AI-enhanced” and believe it or not, it’s going to cost more.

This is our future.

snek_boi@lemmy.ml on 19 Jul 2024 00:43 next collapse

I agree that we shouldn’t jump immediately to AI-enhancing it all. However, this survey is riddled with problems, from selection bias to external validity. Heck, even internal validity is a problem here! How does the survey account for social desirability bias, sunk cost fallacy, and anchoring bias? I’m so sorry if this sounds brutal or unfair, but I just hope to see fewer validity threats. I think I’d be less frustrated if the title could be something like “TechPowerUp survey shows 84% of 22,000 respondents don’t want AI-enhanced hardware”.

t00l@lemmy.world on 19 Jul 2024 00:52 next collapse

They want you to buy the hardware and pay for the additional energy costs so they can deliver clippy 2.0, the watching-you-wank-edition.

radivojevic@discuss.online on 19 Jul 2024 02:02 next collapse

If you unbend him, clippy could be very useful 🍆📎

aphlamingphoenix@lemm.ee on 19 Jul 2024 03:06 next collapse

God damn you.

Plopp@lemmy.world on 19 Jul 2024 08:25 collapse

Sound advice.

Ad4mWayn3@lemmy.world on 22 Jul 2024 13:55 collapse

I hate that I understood this

Petter1@lemm.ee on 19 Jul 2024 04:17 collapse

Well, NPUs are not on par with modern GPUs. A general GPU has more power than most NPUs, but when you look at what the electricity costs, you see that NPUs are way more efficient at AI tasks (which are not only chatbots).

Steve@startrek.website on 19 Jul 2024 00:56 next collapse

Any “ai” hardware you buy today will be obsolete so fast it will make your dick bleed

iopq@lemmy.world on 19 Jul 2024 13:20 collapse

it will just be not as fast as the newer stuff

technocrit@lemmy.dbzer0.com on 19 Jul 2024 16:43 collapse

it will just be ~~not as fast~~ even more slow as the newer stuff

ClamDrinker@lemmy.world on 19 Jul 2024 01:05 next collapse

Depends on what kind of AI enhancement. If it’s just more things nobody needs and solves no problem, it’s a no brainer. But for computer graphics for example, DLSS is a feature people do appreciate, because it makes sense to apply AI there. Who doesn’t want faster and perhaps better graphics by using AI rather than brute forcing it, which also saves on electricity costs.

But that isn’t the kind of thing most people on a survey would even think of, since the benefit is readily apparent and doesn’t even need to be explicitly sold as “AI”. They’re most likely thinking of the kind of products where the manufacturer put an “AI powered” sticker on it because their stakeholders told them it would increase their sales, or it allowed them to overstate the value of a product.

Of course people are going to reject white collar scams if they think that’s what “AI enhanced” means. If legitimate use cases with clear advantages are produced, it will speak for itself and I don’t think people would be opposed. But obviously, there are a lot more companies that want to ride the AI wave than there are legitimate use cases, so there will be quite some snake oil being sold.

AdrianTheFrog@lemmy.world on 19 Jul 2024 05:33 collapse

well, i think a lot of these cpus come with a dedicated npu, idk if it would be more efficient than the tensor cores on an nvidia gpu for example though

edit: whatever npu they put in does have the advantage of being able to access your full cpu ram though, so I could see it might be kinda useful for things other than custom zoom background effects

yamanii@lemmy.world on 19 Jul 2024 12:04 collapse

But isn’t RAM slower than a GPU’s VRAM? Last year people were complaining that suddenly local models were very slow on the same GPU, and it was found out it was because a new Nvidia driver automatically turned on a setting letting the GPU dump everything into RAM if it filled up. That made people trying to run bigger models very annoyed, since a crash (so you could try again with lower settings) would be preferable to the increased generation time that regular RAM added.

AdrianTheFrog@lemmy.world on 19 Jul 2024 15:39 collapse

RAM is slower than GPU VRAM, but that extreme slowdown is due to the bottleneck of the PCIe bus that the data has to go through to get to the GPU.
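For rough scale, comparing spec-sheet numbers (ballpark figures, not benchmarks):

```python
# Theoretical PCIe 4.0 x16 throughput vs. a high-end card's VRAM bandwidth.
pcie4_x16 = 31.5  # GB/s
gddr6x = 1008     # GB/s, e.g. an RTX 4090's memory bus
print(f"VRAM is ~{gddr6x / pcie4_x16:.0f}x faster than the PCIe link")  # ~32x
```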

bitwolf@lemmy.one on 19 Jul 2024 01:33 next collapse

No, but I would pay good money for a freely programmable FPGA coprocessor.

If the AI chip is implemented as one and is useful for other things, I’m sold.

profdc9@lemmy.world on 19 Jul 2024 02:24 collapse

I think manufacturers need to get a lot more creative about simplified computing. The RPi Pico’s GPIO engine is powerful yet simple, and a good example of what is possible with some good application analysis and forethought.

JackbyDev@programming.dev on 19 Jul 2024 02:55 next collapse

Which part of the Pico are you referring to specifically? Never heard the term “GPIO engine” before. Is that sort of like the USB stack but for GPIO?

phlegmy@sh.itjust.works on 19 Jul 2024 04:13 next collapse

I think they meant PIO (programmable IO). It’s like a small processor tied to some of the IO pins. There’s a very small set of instructions and some state machines.
It can be used to implement your own IO protocols without worrying about the issues that come with bit-banging from the cpu.
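For a flavor of it, here’s a rough MicroPython sketch modeled on the official rp2 blink examples (GPIO 25 is the Pico’s onboard LED); once started, the state machine toggles the pin with zero CPU involvement:

```python
import rp2
from machine import Pin

@rp2.asm_pio(set_init=rp2.PIO.OUT_LOW)
def blink():
    wrap_target()
    set(pins, 1)              # LED on
    set(x, 31)                # delay loop counter
    label("high")
    nop()              [31]   # 32 cycles per nop, 32 times
    jmp(x_dec, "high")
    set(pins, 0)              # LED off
    set(x, 31)
    label("low")
    nop()              [31]
    jmp(x_dec, "low")
    wrap()

# At a 2 kHz state-machine clock this comes out to roughly one blink per second.
sm = rp2.StateMachine(0, blink, freq=2000, set_base=Pin(25))
sm.active(1)
```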

profdc9@lemmy.world on 20 Jul 2024 04:17 collapse

The GPIO engine is a simple state machine that can be programmed to implement high-speed data transfer, digital video output, and many other purposes. It is one of the best and most innovative features on the Pico.

bruhduh@lemmy.world on 19 Jul 2024 03:02 next collapse

I have a few Pi Picos but I didn’t know about it. Can you please elaborate? I’ve been using them just like any other ESP32/STM32/ESP8266 I have

jj4211@lemmy.world on 19 Jul 2024 11:28 collapse

Problem for the big market is that it’s hardly profitable. In fact, make things too easily multipurpose and you undercut your specialized-device opportunities. Why buy a $500 smart device that requires a monthly subscription when you could get a $100 device with a popular solution preloaded on it?

Like when the WRT54G came out back in the day and OpenWRT basically drove Cisco to buy out Linksys and neuter the “home router” to stop it displacing expensive products in the business sector. The WRT54G was the best product for the market, but not the best product to exist for vendor profitability.

FiniteBanjo@lemmy.today on 19 Jul 2024 02:44 next collapse

People already aren’t paying for them; Nvidia’s main source of income right now is industry use, not consumer parts.

ryannathans@aussie.zone on 19 Jul 2024 05:26 next collapse

I just want good voice to text that runs on my own phone offline

HerrBeter@lemmy.world on 19 Jul 2024 06:48 collapse

FUTO Keyboard & voice input?

photonic_sorcerer@lemmy.dbzer0.com on 19 Jul 2024 07:27 next collapse

Fr, this one is great

franklin@lemmy.world on 19 Jul 2024 14:31 collapse

Yup, it uses a local version of Whisper AI and it is FANTASTIC. Swipe texting is not quite as good as Gboard, but voice recognition is next level.
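For anyone curious what running Whisper locally looks like outside a keyboard app, a minimal sketch with the open-source whisper package (the audio filename is hypothetical):

```python
# pip install openai-whisper  (also needs ffmpeg on the PATH)
import whisper

model = whisper.load_model("base")           # small enough to run on a CPU
result = model.transcribe("voice_note.wav")  # hypothetical recording
print(result["text"])
```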

EliteDragonX@lemmy.world on 19 Jul 2024 05:43 next collapse

This is yet another dent in the “exponential growth AGI by 2028” argument I see popping up a lot. Despite what the likes of Kurzweil, Musk, etc. would have you believe, AI is severely overhyped and will take decades to fully materialise.

You have to understand that most of what you read about is mainly if not all hype. AI, self driving cars, LLM’s, job automation, robots, etc are buzzwords that the media loves to talk about to generate clicks. But the reality is that all of this stuff is extremely hyped up, with not much substance behind it.

It’s no wonder that the vast majority of people hate AI. You only have to look at self-driving cars being unable to handle fog and rain after decades of research, or dumb LLMs (still dumb after all this time) to see why. The only real things that have progressed quickly since the 80s are cell phones, computers, etc. Electric cars, self-driving cars, stem cells, AI, etc. have all not progressed nearly as rapidly. And even the electronics stuff is slowing down soon due to the end of Moore’s Law.

cestvrai@lemm.ee on 19 Jul 2024 07:49 next collapse

There is more to AI than self driving cars and LLMs.

For example, I work at a company that trained a deep learning model to count potatoes in a field. The computer can count so much faster than we can, it’s incredible. There are many useful, but not so glamorous, applications for this sort of technology.

I think it’s more that we will slowly piece together bits of useful AI while the hyped areas that can’t deliver will die out.

jj4211@lemmy.world on 19 Jul 2024 11:21 next collapse

Machine vision is absolutely the most slam-dunk case where “AI” does work and has practical applications. However, it was doing so a few years before the current craze. Basically the current craze was driven by ChatGPT, with people overestimating how far that will go in the short term because it almost acts like a human conversation, and that seemed so powerful.

AA5B@lemmy.world on 19 Jul 2024 12:01 collapse

That’s why I love AI: I know it’s been a huge part of phone camera improvements in the last few years.

I seem to get more use out of voice assistants because I know how to speak their language, but if language processing noticeably improves, that will be huge

Motion detection and person detection have been a revolution in cheap home cameras by very reliably flagging video of interest, but there’s always room for improvement. More importantly, I want to be able to do that processing in real time, on a device that doesn’t consume much power

khaleer@sopuli.xyz on 19 Jul 2024 13:24 next collapse

AI camera generation isn’t camera improvement.

AA5B@lemmy.world on 19 Jul 2024 14:11 collapse

When my phone takes a clearer picture in darker situations and catches a recognizable action shot of my kid across a soccer field, it’s a better camera. It doesn’t matter whether the improvements were hardware or software, or even how true to life in some cases; it’s a better camera

Apple has done a great job of not only making cameras physically better, but integrating LiDAR for faster focus, image composition across multiple lenses, improved low light pictures, and post-processing to make dramatically better pictures in a wide range of conditions

technocrit@lemmy.dbzer0.com on 19 Jul 2024 16:39 collapse

None of what you’re describing is anything close to “intelligence”. And it’s all existed before this nonsense hype cycle.

technocrit@lemmy.dbzer0.com on 19 Jul 2024 16:37 next collapse

So… A machine is “intelligent” because it can count potatoes? This sort of nonsense is a huge part of the problem.

[deleted] on 19 Jul 2024 17:15 collapse

.

EliteDragonX@lemmy.world on 19 Jul 2024 19:04 collapse

That’s nice and all, but that’s nowhere close to a real intelligence. That’s just an algorithm that has “learned” what a potato is.

captainlezbian@lemmy.world on 19 Jul 2024 12:55 collapse

Idk robots are absolutely here and used. They’re just more Honda than Jetsons. I work in manufacturing and even in a shithole plant there are dozens of robots at minimum unless everything is skilled labor.

nadram@lemmy.world on 19 Jul 2024 13:06 collapse

I might be wrong but those do not make use of AI do they? It’s just programming for some repetitive tasks.

captainlezbian@lemmy.world on 19 Jul 2024 14:30 collapse

They do use machine learning these days, in the nicer kind, but I misinterpreted you. I interpreted you as saying that robots were an example of hype like AI is, not that using AI in robots is hype. The ML in robots is stuff like computer vision to sort defects, detect expected variations, and other similar tasks. It’s definitely far more advanced than back in the day, but it’s still not what people think.

daniskarma@lemmy.dbzer0.com on 19 Jul 2024 06:35 next collapse

I would pay for a power-efficient AI expansion card, so I can self-host AI services easily without needing a 3000€ GPU that consumes 10 times more than the rest of my PC.

AA5B@lemmy.world on 19 Jul 2024 11:56 next collapse

I would consider it a reason to upgrade my phone a year earlier than otherwise. I don’t know what AI will stick as useful, but most likely I’ll use it from my phone, and I want there to be at least a chance of on-device AI rather than “all your data are belong to us” AI

eleitl@lemm.ee on 19 Jul 2024 12:12 collapse

I will be looking into AMD Strix Halo’s performance as a poor man’s GPU to run LLMs and some scientific codes locally.

Blackmist@feddit.uk on 19 Jul 2024 11:16 next collapse

One of our helpdesk told me about his amazing idea for our software the other day.

“We should integrate AI into it…”

“Right? And have it do what?”

“Uh, I don’t know”

This from the same man who came up with an idea for orange juice pumped directly into your home, and you pay with crypto.

And the scary thing is, I can imagine these things coming out of the mouths of people in actual positions of power, where laughing at them might actually get people fired…

Draegur@lemm.ee on 19 Jul 2024 11:56 next collapse

This fucker seriously proposed Brawndo on tap. Except perishable. Jesus fucking Christ.

Toribor@corndog.social on 19 Jul 2024 12:25 next collapse

orange juice pumped directly into your home, and you pay with crypto

[Furiously taking notes]

rottingleaf@lemmy.world on 19 Jul 2024 13:55 collapse

who came up with an idea for orange juice pumped directly into your home

That may not be as cool, but a pneumatic city-wide mail system would be cool. Too expensive and hard to maintain, not even talking about the pests and bacteria that would live there, but imagine ordering a milkshake with some fries and in 10 minutes hearing “thump”, opening that little door in the wall of your apartment, and seeing a package there (it’ll be a mess inside though).

bluewing@lemm.ee on 19 Jul 2024 11:44 next collapse

Remember when the IoT was very new? There were similar grumblings of “Why would I want to talk to my refrigerator?” And now more and more things are just IoT connected for no reason.

I suspect AI will follow a similar path into the consumer mainstream.

yamanii@lemmy.world on 19 Jul 2024 11:53 collapse

IoT became very valuable, for them at least, as data collecting devices.

ChapulinColorado@lemmy.world on 20 Jul 2024 04:11 collapse

I feel like the local only devices do have a place in the home automation sector (e.g. home assistant compatible with no cloud integrations).

Most vendors want to lock you into their crappy cloud system that will someday be offline and render things useless however.

Draegur@lemm.ee on 19 Jul 2024 11:55 next collapse

I would pay extra to make sure that there is no AI anywhere near my hardware.

todd_bonzalez@lemm.ee on 19 Jul 2024 12:35 next collapse

People don’t want the hardware if the software sucks.

Why would I need a GPU if the only games that exist to play on it are the equivalent of WildTangent malware games?

If AI matures into something that people actually like, you’ll get a different answer here.

exanime@lemmy.world on 19 Jul 2024 13:11 next collapse

AI for IT companies is looking more and more like 3D was for the movie industry

All fanfare and overhype: a small handful of examples that do seem like a solid step forward, and millions of others that are just a polished turd. Massive investment for something the market has not demanded.

foggianism@lemmy.world on 19 Jul 2024 14:09 collapse

It’s just a gimmick, a new “feature” to justify higher product prices.

thermal_shock@lemmy.world on 19 Jul 2024 16:05 collapse

barely a feature, just a buzzword

OCATMBBL@lemmy.world on 19 Jul 2024 15:42 next collapse

Why would I pay more for x company to have a robot half ass the work of all the employees they’re gonna cut?

Wogi@lemmy.world on 19 Jul 2024 16:34 collapse

So the trades have been unknowingly fucking with AI for decades, because of the time-honored tradition of fucking with apprentices.

A lot of forums are filled with absolutely unhinged advice, and sprinkled in there is some good advice. If you know what you’re doing, you can spot the bullshit.

But if you don’t know anything about it, the advice seems perfectly reasonable. There’s a skill in giving unhinged advice. Literally you can’t get your master cert without convincing at least one apprentice to ask where the board stretcher is.

Do I actually have a dedicated vise for Vaseline when I run a tap cycle or is that old timer bullshit? HOW WOULD YOU POSSIBLY KNOW??

MrAlternateTape@lemm.ee on 19 Jul 2024 16:44 next collapse

I have no clue why anybody thought I would pay more for hardware that goes along with some stupid trend that will blow up in our faces sooner or later.

I don’t get the AI hype. I see a lot of companies very excited, but I don’t believe it can deliver even 30% of what people seem to think.

So no, definitely not paying extra. If I can, I will buy stuff without AI bullshit. And if I cannot, I will simply not upgrade for a couple of years since my current hardware is fine.

In a couple of years either the bubble is going to burst, or they really have put in the work to make AI do the things they claim it will.

A_Random_Idiot@lemmy.world on 19 Jul 2024 16:51 next collapse

I guarantee most of this AI bullshit is nothing but a backdoor to harvest more user info, anyway.

T156@lemmy.world on 19 Jul 2024 17:06 next collapse

It just doesn’t really do anything useful from a layman’s point of view, besides being a TurboCyberQuantum buzzword.

I’ve apparently got AI hardware in my tablet, but as far as I’m aware, I’ve never/mostly never actually used it, nor had much of a use for it. Off the top of my head, I can’t think of much that would make use of that kind of hardware, aside from some relatively technical software that is almost as happy running on a generic CPU. Opting for AI capabilities would be paying extra for something I’m not likely to ever make use of.

And the actual stuff that might make use of AI is pretty much abstracted out so far as to be invisible. Maybe the autocorrecting feature on my tablet keyboard is in fact powered by the AI hardware, but from the user perspective, nothing has really changed from the old pre-AI keyboard, other than some additions that could just be a matter of getting newer, more modern hardware/software updates, instead of any specific AI magic.

slurpeesoforion@startrek.website on 19 Jul 2024 17:19 next collapse

AI enhanced anything is pointless when everything seems intended to separate you from your money.

cmrn@lemmy.world on 19 Jul 2024 20:15 next collapse

I still don’t understand how the buzzword of AI 10x’d all these valuations, when it’s always either: a) exactly what they’ve been doing before, now with a fancy new name b) deliberately shoehorning AI in, in ways with no practical benefit

dinckelman@lemmy.world on 22 Jul 2024 00:26 collapse

Isn’t that the entire point behind what most business people do? The whole goal is to upsell some schmuck by speaking too fast, and mentioning a lot of words that don’t really mean anything. Except the difference now is that the business person in this case is the leadership behind most of the tech industry

nayminlwin@lemmy.ml on 20 Jul 2024 03:04 next collapse

Can’t help but think of it as a scheme to steal the consumers’ compute time and offload AI training to their hardware…

sagrotan@lemmy.world on 20 Jul 2024 07:05 collapse

They’ll pay for it. When the tech companies decide it’s a thing to make money off of and advertise it, all the good ants will buy, buy, buy, and the rest of the time they will work, work, work for it.