Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec (www.businessinsider.com)
from L4s@lemmy.world to technology@lemmy.world on 25 Sep 2023 20:00
https://lemmy.world/post/5724337

Running AI is so expensive that Amazon will probably charge you to use Alexa in future, says outgoing exec::In an interview with Bloomberg, Dave Limp said that he “absolutely” believes that Amazon will soon start charging a subscription fee for Alexa

#technology


autotldr@lemmings.world on 25 Sep 2023 20:00 next collapse

This is the best summary I could come up with:


The emerging generation of “superhuman” AI models are so expensive to run that Amazon might charge you to use its Alexa assistant one day.

In an interview with Bloomberg, outgoing Amazon executive Dave Limp said that he “absolutely” believes that Amazon could start charging a subscription fee for Alexa, and pointed to the cost of training and running generative artificial intelligence models for the smart speaker’s new AI features as the reason why.

Limp said that the company had not discussed what price it would charge for the subscription, adding that “the Alexa you know and love today is going to remain free” but that a future subscription-based version is “not years away.”

Generative AI models require huge amounts of computing power, with analysts estimating that OpenAI’s ChatGPT costs $700,000 a day or more to run.

Limp, Amazon’s senior VP of devices and services, announced he would step down from his role at the company after 13 years a month before the launch of the new products.

Insider’s Ashley Stewart reported that former Microsoft exec Panos Panay is expected to replace Limp.


The original article contains 298 words, the summary contains 182 words. Saved 39%. I’m a bot and I’m open source!

db2@sopuli.xyz on 25 Sep 2023 20:57 collapse

Dave Limp 💀

Mojojojo1993@lemmy.world on 25 Sep 2023 20:04 next collapse

I use Alexa as a way to use an old speaker system. I wouldn’t pay to use any “smart” speaker systems. They are pretty dumb and I’ve already paid once

Kalkaline@leminal.space on 25 Sep 2023 20:09 next collapse

Alexa is so bad though. Who’s going to pay for that?

LazaroFilm@lemmy.world on 25 Sep 2023 20:38 next collapse

“By the way, you can now pay for Alexa AI option if you want me to reply in a slightly smarter way, but I will still cut you off with ads and other useless things. To activate AlexaAI say activate”

db2@sopuli.xyz on 25 Sep 2023 20:56 next collapse

“Welcome to the PiHole, Alexa.”

spitfire@infosec.pub on 26 Sep 2023 07:41 collapse

Just made the switch to NextDNS. For $2/month I get a lot of the same features but also on my phone when not on WiFi. Still love my pihole though!

FireTower@lemmy.world on 25 Sep 2023 20:58 next collapse

*to the same degree of intelligence as you’ve previously experienced. (Ps if you don’t we’re making Alexa have a room temp IQ)

JonEFive@midwest.social on 26 Sep 2023 03:45 collapse

“No”

“I heard ‘activate’. Thank you! Your credit card will be charged $129 annually. To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.”

bobs_monkey@lemm.ee on 27 Sep 2023 03:46 collapse

To cancel, please log on to the website because there’s no way we’re letting you get out of this mess the same way we got you into it.

Unless you’re in California

SpaceNoodle@lemmy.world on 25 Sep 2023 20:52 next collapse

Still better than Siri …

glimpseintotheshit@sh.itjust.works on 26 Sep 2023 15:33 collapse

Siri was always shit but somehow managed to devolve even further lately. I never trusted her to do more than turning lights on or off, but now this shit happens:

Me: Siri, turn off the lights in the living room

Siri: OKAY, WHICH ROOM? BATHROOM, BEDROOM, KITCHEN, HALLWAY, LIVING ROOM?

Imagine living in a mansion with this cunt

Staple_Diet@aussie.zone on 27 Sep 2023 02:30 collapse

I use Google to turn on my TV by saying ‘turn on TV’, easily done. But then when I ask it to adjust the volume it asks me which TV… I only have one TV online, and it’s the one it just turned on.

glimpseintotheshit@sh.itjust.works on 28 Sep 2023 00:34 collapse

Jeez… at least Siri has an excuse to some degree because of Apple’s privacy pledge

tdawg@lemmy.world on 25 Sep 2023 23:22 next collapse

Guess we’ll find out when they finally pull the trigger

pyldriver@lemmy.world on 25 Sep 2023 23:45 next collapse

Mine can’t ever seem to tell the difference between on and off if there is any sound in my house

atx_aquarian@lemmy.world on 26 Sep 2023 03:32 collapse

I use "turn on ___" and "kill ___". Much more reliable matches.

JonEFive@midwest.social on 26 Sep 2023 03:42 next collapse

Do you change the names of all your devices to people names? The living room lamp is Steve, the bedroom fan is Maryanne…

SpaceNoodle@lemmy.world on 26 Sep 2023 15:44 collapse

Which one’s Bill?

InnerScientist@lemmy.world on 27 Sep 2023 05:24 collapse

Good to see people already training ai to kill.

scarabic@lemmy.world on 26 Sep 2023 21:06 collapse

But he acknowledged that Alexa will need to drastically improve before that happens.

I get tired of the outrage-headline game.

NocturnalMorning@lemmy.world on 25 Sep 2023 20:17 next collapse

AI is being touted as the solution to everything these days. It’s really not, and we are going to find that out the hard way.

Valmond@lemmy.mindoki.com on 25 Sep 2023 20:40 next collapse

Hey, that’s only because Amazon, Google and Microsoft (et al) just don’t have the money to make it good!!

So what about 9.99 a month?

4.99 if you pay up front for a year?

Euh, or how much can you cough up, like for a year or at least for Q4, I’m literally on a bad roll here.

NocturnalMorning@lemmy.world on 25 Sep 2023 21:08 collapse

I’m not going to buy into a subscription model for something I’ve already paid for. This subscription model crap is complete bullshit.

They even tried to do it with heated seats recently. Like installing heated seats in your car, but disabling them in software. It’s crazy that companies think they can get away with this.

slumberlust@lemmy.world on 25 Sep 2023 22:46 next collapse

While I agree with you, they are 💯 going to get away with it, because your average consumer just doesn’t care.

Stumblinbear@pawb.social on 25 Sep 2023 23:08 collapse

I think there’s a massive difference between unlocking a feature that’s already there and requires no maintenance and a cloud-based service that demands 24/7 uptime and constant developer support, as well as ongoing feature development

KairuByte@lemmy.dbzer0.com on 25 Sep 2023 21:53 next collapse

I get what you’re saying, but voice assistants are one of the main places LLMs belong.

HughJanus@lemmy.ml on 26 Sep 2023 00:58 collapse

Yes, but so much more. An actually useful assistant that could draft emails, set reminders appropriately, create automations, etc. would be worth A LOT of money to me.

whofearsthenight@lemm.ee on 26 Sep 2023 23:44 collapse

I think if there ends up actually being a version of AI that is privacy focused and isn’t screwing over creators it’d be so much less controversial. Also, everyone (including me) is really, really fucking sick of hearing about it all of the time in the same way that everyone is/was sick of hearing about the blockchain. As in: “Bro your taco stand needs AI/the blockchain.”

HughJanus@lemmy.ml on 27 Sep 2023 00:10 collapse

You wouldn’t need any kind of special training for this. Just the ability to do simple things like make calendar appointments, draft emails/responses, and set reminders based on time/locations/etc. It really doesn’t seem very complicated but as far as I know no one has figured out how to do it yet. All the existing “assistants” are so bad that I don’t even bother trying to use them anymore. They can’t even do something simple like turning on a light with any degree of reliability.

Hoomod@lemmy.world on 26 Sep 2023 00:29 next collapse

If IBM actually manages to convert COBOL into Java like they’re advertising, they’ll end up killing their own cash cow

So much still runs on COBOL

antibios@lemmy.ml on 26 Sep 2023 23:19 collapse

It’s not even A.I. either

Nindelofocho@lemmy.world on 25 Sep 2023 20:23 next collapse

I’m actually surprised companies haven’t tried to charge for voice assistants already, considering pretty much everything you say to them gets sent to some service somewhere

kescusay@lemmy.world on 25 Sep 2023 20:29 next collapse

Yeah… The moment they do that is the moment I turn off and disconnect every smart speaker I own, take them to the electronics recycling place, and start building out an open-source smart home setup.

I use commercial smart speakers because they’re easy and cheap. The moment they stop being one or the other is the moment they stop being in my home.

SmoothLiquidation@lemmy.world on 25 Sep 2023 20:46 next collapse

I shut mine off a while back when I was sure they were advertising stuff based on things we were talking about in the same room. We were discussing moving the chairs out of the office and the next time we went to play music she wanted to sell us new ones.

It might be a total coincidence but screw that.

[deleted] on 25 Sep 2023 20:59 collapse

.

Buelldozer@lemmy.today on 25 Sep 2023 23:11 collapse

and start building out an open-source smart home setup.

Why are you waiting? Home Assistant is here now and it works pretty damn well for running a smart home.

At this point, all Alexa does is act as a voice enabler for my HA setup, and even that will be going away soon. They’re working hard on a localized VA of their own. It’s actually usable NOW, it just doesn’t have wake-word support yet, so you have to PTT (Push To Talk) on something for HA to know you want to talk to it.

Hasuris@sopuli.xyz on 25 Sep 2023 20:34 collapse

They could be handing those out for free and they’d still rot on the shelves. People just don’t know what to do with them and I am not certain Amazon does.

Charge for them? That’s metaverse idea quality.

can@sh.itjust.works on 25 Sep 2023 22:16 collapse

They were pretty much giving away Google Home Minis in Canada and I still decided against it. See them at thrift stores all the time.

greenskye@lemm.ee on 25 Sep 2023 20:59 next collapse

Well so far my Google Smart speakers have two functions:

  • Voice activated timer
  • Wi-Fi speaker that I only ever cast to with my phone, never actually talk to them.

Don’t think I’ll miss that if they decide to charge for it.

lps2@lemmy.ml on 25 Sep 2023 21:08 next collapse

My Google homes have gotten progressively worse over the years. Half the time it will say it’s setting a timer but nope, no timer. Recently I’ll tell it to play music and it will reply that I don’t have any devices with that feature… they’re all Google homes or Chromecast which absolutely play music. Really like the hardware but the software is utter shit

Stumblinbear@pawb.social on 25 Sep 2023 23:10 next collapse

I hate when it’s playing music and I tell it to shut the fuck up, then it decides to turn off every actively going alarm in the house instead of turning off the goddamn music playing on the one it literally responded in. This happens most mornings.

criticalimpact@lemm.ee on 26 Sep 2023 00:23 collapse

They also removed the ability to link third-party list applications so now saying “Add X to shopping list” just sends it into some nether realm where the item is never to be seen ever again.

bilbofraggins@lemmy.sdf.org on 26 Sep 2023 03:54 collapse

Mine also finds my phone, its primary use.

ShittyRedditWasBetter@lemmy.world on 25 Sep 2023 21:31 next collapse

Alexa isn’t nearly good enough to pay for. I basically use it for timers, math, weather, and conversions.

ViewSonik@lemmy.world on 25 Sep 2023 22:05 next collapse

Yep, it used to be much better. There was SO much potential with it too. I wish there was a smart speaker with integration into ChatGPT. I’d love to stand in the shower and ask it shit

nexusband@lemmy.world on 25 Sep 2023 22:09 next collapse

Building your own dystopia I see…

ViewSonik@lemmy.world on 25 Sep 2023 22:10 collapse

Not following

deranger@lemmy.world on 25 Sep 2023 22:34 collapse

You can do this with a Siri shortcut.

It still falls short because LLMs aren’t smart, they’re just approximately not wrong most of the time. I thought it would be a lot cooler than it actually is.

CoderKat@lemm.ee on 26 Sep 2023 02:06 collapse

Yeah, they’re all pretty disappointing. I’d love to have something that feels like how movies portray digital assistants. Movie assistants never misunderstand you or say “I’m sorry, I couldn’t recognize your voice”. I’ve mostly used the Google one and it’s so bad at doing what I feel like is feasible even with inaccuracy.

Eg, I’ve tried to tell my assistant to like a song that was currently playing on YTM but could not find a voice command that worked (and some commands backfired by making it skip to the next song). I’ve had very poor success with getting assistant to cast something to my Chromecast with my voice. It sometimes works, but it fails or gets it wrong so often that it’s not worth the time.

Sometimes I use it for rewinding (e.g., “ok google, rewind 30 seconds”) because many apps don’t have granular rewind buttons and tracking on the track bar is way too inaccurate. But lol, it’s so slow! It takes a few seconds to figure out what I said (so I have to ask it to rewind more than I wish) and it seems every app is unoptimized for rewinding, as it usually takes several seconds of loading.

It can’t really do any kind of research either. You basically can just ask it to google things and it sometimes is able to extract the meaningful part from simple questions. It’s a far way from how Hollywood thinks a digital assistant will work.

foggy@lemmy.world on 25 Sep 2023 21:55 next collapse

…charge me to use Alexa?

I already avoid it like the plague.

reversebananimals@lemmy.world on 25 Sep 2023 22:11 next collapse

They’re already trying this, sort of.

theverge.com/…/amazon-echo-show-8-photos-edition-…

They know charging for total access will cause a riot, so instead they’re enshittifying the whole experience and holding access to the current non-shit experience hostage with monthly fees.

mvirts@lemmy.world on 25 Sep 2023 22:13 next collapse

Nah, once ML inference and training chips are purpose-built, it’ll all run on-device. AI models are the mainframes of today

tinfox@lemmy.world on 25 Sep 2023 22:34 next collapse

Ok. I’ll be the weirdo. If it’s actually useful, I would pay for it.

Not if it’s just the parlor trick that it currently is.

theragu40@lemmy.world on 25 Sep 2023 23:48 collapse

This is the killer for all this shit right now as far as I’m concerned. All of it lives squarely in “huh…neat” territory. I have yet to see anything I felt was truly necessary. Until that happens, paying is a non starter for me.

JonEFive@midwest.social on 26 Sep 2023 03:40 collapse

This is why I’m so confused by Amazon’s approach. I know they’ve already sunk millions if not billions of dollars into this, so why has the user experience not improved in the last 8 years?

I’m not going to buy things with my voice when just getting the lights to turn off or music to play can be an infuriating endeavor. Speech recognition has stagnated.

The third party integrations are just so clunky too. They could have made money by selling licenses to businesses in order to access the service, but again, they haven’t improved that experience at all.

The “Alexa, let me talk to Domino’s” or “Alexa, ask LG to turn off the TV” is just stupidly cumbersome. Why can’t you set up preferred providers? I don’t have to say “ask Spotify to play music”, I just say “play music”, so we know it’s possible. It would be trivial to implement other preferred service providers compared to the overall scale of Alexa.

theragu40@lemmy.world on 26 Sep 2023 03:52 collapse

I don’t know if you’re in IT at all, but the really crazy thing is that as half baked as Alexa stuff feels…a ton of AWS’s offerings feel the exact same way. Their marketing material is great, and I do believe their engineers are passionate and have the right intentions. But none of it feels “finished”. It all feels like an elaborate beta test. Things don’t work, documentation is out of date or just plain wrong, it’s impossible to get actual expert support from Amazon directly.

AWS is their biggest money maker and even that is a cobbled together, confusing pile half the time. Sometimes feels like everything is a house of cards.

JonEFive@midwest.social on 26 Sep 2023 05:51 collapse

It’s weird to me that a company of this size is just that inept. It’s like once they have enough momentum, nothing can stop them.

Chreutz@lemmy.world on 26 Sep 2023 07:00 collapse

The same goes for Google, and to some degree MS.

theragu40@lemmy.world on 26 Sep 2023 11:26 collapse

It’s really true. I’m actually annoyed that MS is starting to feel this way, particularly with some Azure related services. MS was always the one you could count on to at least be stable, well tested internally, and predictable. At least in comparison to Google and Amazon. But it feels like they have been leaving some of that behind with their cloud stuff as CI/CD becomes more prevalent.

galaxies_collide@lemmy.world on 25 Sep 2023 22:37 next collapse

So they get massive amounts of free data for Machine Learning, but want to charge users for supplying it?

tempest@lemmy.ca on 25 Sep 2023 23:07 next collapse

That’s often the case. They can have their cake and eat it too. Shareholders would expect nothing less.

HughJanus@lemmy.ml on 26 Sep 2023 01:02 next collapse

It’s like charging you for cable and then shoving ads down your throat.

billiam0202@lemmy.world on 26 Sep 2023 03:36 collapse

It’s like charging you for Prime Video and then shoving ads down your throat.

HughJanus@lemmy.ml on 26 Sep 2023 12:50 collapse

It’s like RAAAAIIAAIIIN on your wedding day

ChaoticNeutralCzech@feddit.de on 26 Sep 2023 10:35 next collapse

I think the data is probably less valuable than people think, especially if the users expect an AI response whenever a data point can be collected from them.

thrakkerzog@lemmy.world on 26 Sep 2023 23:19 collapse

I wake my echo and then grind coffee beans.

DemBoSain@midwest.social on 25 Sep 2023 22:58 next collapse

Alexa has a feature where you tell it you’re leaving the house and it will listen for smoke detectors or breaking glass, alerting you through your phone if it detects something. Amazon is putting that behind a paywall next year.

TurboDiesel@lemmy.world on 26 Sep 2023 03:58 next collapse

Google did that with Nest Aware years ago as well. It’s super annoying.

pineapplelover@lemm.ee on 26 Sep 2023 04:00 collapse

Let me in your house and I’ll observe it for you for less money

art@lemmy.world on 25 Sep 2023 23:41 next collapse

We need to move AI from the cloud to our own hardware running in our homes. Free, open source, privacy focused hardware. It’ll eventually be very affordable.

pyldriver@lemmy.world on 25 Sep 2023 23:44 next collapse

God I wish, I would just love local voice control to turn my lights and such on and off… but noooooooooooo

Otkaz@lemmy.world on 26 Sep 2023 00:05 next collapse

www.home-assistant.io

pyldriver@lemmy.world on 27 Sep 2023 12:58 collapse

I have Home Assistant, but have not heard anything good about Rhasspy. I just want to control lights and be able to use it to play music and set timers. That being said, I run Home Assistant right now and can control it with Alexa and Siri, but… I would like local only

Kolanaki@yiffit.net on 26 Sep 2023 03:35 collapse

I have that with just my phone, using Wiz lights and IFTTT. It’s the only home automation I even have because it’s the only one I found that doesn’t necessarily need a special base station like an Alexa or Google Home.

AA5B@lemmy.world on 26 Sep 2023 14:52 collapse

But you want a local base station, else there’s no local control. You want to use local-only protocols like Z-Wave, Zigbee, Thread, Bluetooth, etc.; even though they require a base station, that base station is what gives you a local-only way of controlling things.

Matter promises a base station may no longer be necessary for smart devices to control each other, but it is rolling out very slowly

I also wonder what I’ll be able to do with the Thread radio in the iPhone 15 Pro

Kolanaki@yiffit.net on 26 Sep 2023 20:37 collapse

The base stations are what uses the cloud/AI shit. The setup I have doesn’t even require an Internet connection or wifi; it’s entirely bluetooth. Why in the hell would I want a base station that costs money, is controlled by Amazon or Google, and requires an Internet connection for my local shit?

I don’t want a piece of hardware that does nothing but act like a fucking middleman for no good reason.

AA5B@lemmy.world on 27 Sep 2023 03:02 next collapse

I’m a huge fan of Home Assistant. You might look into it

[deleted] on 27 Sep 2023 14:23 collapse

.

foggenbooty@lemmy.world on 27 Sep 2023 14:23 collapse

That is not necessarily true. Some base stations use the internet, yes, but not all. For example, a Philips Hue hub does not require internet access, nor does Lutron Caseta. As the other person posted, Home Assistant is the absolute best (IMO) way to do everything locally without the internet.

Your system, while it might work for you, does not scale well due to the limited range and reliability of Bluetooth. You’d likely be better off to adopt a more robust protocol like Z-wave, or ZigBee and get a hub that you have full control over.

[deleted] on 26 Sep 2023 00:36 next collapse

.

Soundhole@lemm.ee on 26 Sep 2023 01:38 next collapse

That’s already here. Anyone can run AI chatbots similar to, but not as intelligent as, ChatGPT or Bard.

Llama.cpp and koboldcpp allow anyone to run models locally, even with only a CPU if there’s no dedicated graphics card available (although more slowly). And there are numerous open source models available that can be trained for just about any task.

Hell, you can even run llama.cpp on Android phones.

This has all taken place in just the last year or so. In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

Zetta@mander.xyz on 26 Sep 2023 02:06 next collapse

Yes, and you can run a language model like Pygmalion AI locally on koboldcpp and have a naughty AI chat as well. Or non-sexual roleplay

Soundhole@lemm.ee on 26 Sep 2023 02:19 collapse

Absolutely and there are many, many models that have iterated on and surpassed Pygmalion as well as loads of uncensored models specifically tuned for erotic chat. Steamy role play is one of the driving forces behind the rapid development of the technology on lower powered, local machines.

Chreutz@lemmy.world on 26 Sep 2023 06:58 next collapse

Never underestimate human ingenuity

When they’re horny

das@lemellem.dasonic.xyz on 26 Sep 2023 12:03 next collapse

And where would one look for these sexy sexy AI models, so I can avoid them, of course…

Soundhole@lemm.ee on 26 Sep 2023 14:58 collapse

Huggingface is where the models live. Anything that’s uncensored (and preferably based on llama 2) should work.

Some popular suggestions at the moment might be HermesLimaRPL2 7B and MythomaxL2 13B for general roleplay that can easily include nsfw.

There are lots of talented people releasing models everyday tuned to assist with coding, translation, roleplay, general assistance (like chatgpt), writing, all kinds of things, really. Explore and try different models.

General rule: if you don’t have a dedicated GPU, stick with 7B models. Otherwise, the bigger the better.

Zetta@mander.xyz on 26 Sep 2023 14:45 collapse

Which models do you think beat Pygmalion for erotic roleplay? Curious for research haha

Soundhole@lemm.ee on 26 Sep 2023 15:05 collapse

Hey, I replied below to a different post with the same question, check it out.

Zetta@mander.xyz on 26 Sep 2023 15:38 collapse

Oh I see, sorry for the repeat question. Thanks!

Soundhole@lemm.ee on 27 Sep 2023 04:20 collapse

lol nothing to be sorry about, I just wanted to make sure you saw it.

MaxHardwood@lemmy.ca on 26 Sep 2023 12:26 next collapse

GPT4All is a neat way to run an AI chat bot on your local hardware.

Soundhole@lemm.ee on 26 Sep 2023 15:09 collapse

Thanks for this, I haven’t tried GPT4All.

Oobabooga is also very popular and relatively easy to run, but it’s not my first choice, personally.

teuast@lemmy.ca on 26 Sep 2023 23:13 collapse

it does have a very funny name though

scarabic@lemmy.world on 26 Sep 2023 21:07 next collapse

Don’t these models require rather a lot of storage?

art@lemmy.world on 26 Sep 2023 23:56 next collapse

Storage is getting cheaper every day and the models are getting smaller with the same amount of data.

scarabic@lemmy.world on 27 Sep 2023 00:26 collapse

I’m just curious - do you know what kind of storage is required?

Soundhole@lemm.ee on 27 Sep 2023 04:13 collapse

13B quantized models, generally the most popular for home computers with dedicated gpus, are between 6 and 10 gigs each. 7B models are between 3 and 6. So, no, not really?

It is relative so, I guess if you’re comparing that to an atari 2600 cartridge then, yeah, it’s hella huge. But you can store multiple models for the same storage cost as a single modern video game install.
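Those figures line up with simple arithmetic: file size is roughly parameter count times bits per weight. As a back-of-envelope sketch (the ~4.5 bits/weight figure is an assumption typical of common 4-5-bit quantization schemes; real files vary by format), with a hypothetical helper:

```python
def quantized_size_gb(params_billion: float, bits_per_weight: float = 4.5) -> float:
    """Rough on-disk size of a quantized model in GB:
    parameters * bits-per-weight / 8 bits-per-byte."""
    total_bytes = params_billion * 1e9 * bits_per_weight / 8
    return total_bytes / 1e9

# A 7B model at ~4.5 bits/weight is about 3.9 GB on disk;
# a 13B model is about 7.3 GB, consistent with the 6-10 gig range above.
print(round(quantized_size_gb(7), 1))   # ~3.9
print(round(quantized_size_gb(13), 1))  # ~7.3
```

Quantizing to fewer bits (or picking a smaller model) shrinks the file roughly proportionally, which is why several models fit in the footprint of one AAA game install.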

scarabic@lemmy.world on 27 Sep 2023 18:39 collapse

Yeah that’s not a lot. I mean… the average consumer probably has 10GB free on their boot volume.

It is a lot to download, if we’re talking about ordinary consumers. Not unheard of though - some games on Steam are 50GB+

So okay, storage is not prohibitive.

teuast@lemmy.ca on 26 Sep 2023 23:14 collapse

In five to ten years, imo, AI will be everywhere and may even replace the need for mobile Internet connections in terms of looking up information.

You’re probably right, but I kinda hope you’re wrong.

Soundhole@lemm.ee on 27 Sep 2023 04:14 collapse

Why?

teuast@lemmy.ca on 27 Sep 2023 08:33 collapse

Call it paranoia if you want. Mainly I don’t have faith in our economic system to deploy the technology in a way that doesn’t eviscerate the working class.

Soundhole@lemm.ee on 27 Sep 2023 15:05 collapse

Oh, you are 100% justified in that! It’s terrifying, actually.

But what I am envisioning is using small, open source models installed on our phones that can answer questions or just keep us company. These would be completely private, controlled by the user only, and require no internet connection. We are already very close to this reality, local AI models can be run on Android phones, but the small AI “brains” that are best for phones are still pretty stupid (for now).

Of course, living in our current Capitalist Hellscape, it’s hard not to imagine that going awry to the point where we’ll all ‘rent’ AI from some asshole who spies on everything we do, censors the AI for our own ‘protection’, or puts ads in there somehow. But I guess I’m a dreamer.

a1studmuffin@aussie.zone on 26 Sep 2023 11:29 next collapse

It’s the year of the voice for Home Assistant. Given their current trajectory, I’m hopeful they’ll have a pretty darn good replacement for the most common use cases of Google Home/Alexa/Siri in another year. Setting timers, shopping list management, music streaming, doorbell/intercom management. If you’re on the fence about a Nabu Casa subscription, pull the trigger as it helps them stay independent and not get bought out or destroyed by commercial interests.

AA5B@lemmy.world on 26 Sep 2023 14:40 collapse

Thumbs up for Nabu Casa and Home Assistant!

I haven’t yet played with the local voice stuff but have been following it with interest. Actually, now that Raspberry Pis are starting to become available again, I’m on the fence between buying a few more vs finding something with a little more power, specifically for voice processing

foggenbooty@lemmy.world on 27 Sep 2023 14:26 collapse

Get something with a little more power. Pis are reaching a price point where they don’t make sense these days. You can get an Intel N100 system on AliExpress/Amazon pretty cheap now, and I’ve got mine running Proxmox hosting all kinds of stuff.

AA5B@lemmy.world on 26 Sep 2023 14:33 next collapse

While you may have points against Apple and how effective Siri may be, with this latest generation of products even the watch has enough processing power to do voice processing on device. No ads. No cloud services

whofearsthenight@lemm.ee on 26 Sep 2023 23:30 collapse

Pretty much. If you want a voice assistant right now, Siri is probably the best in terms of privacy. I bought a bunch of Echos early, then they got a little shitty but I was in, and now I just want them out of my house except for one thing: music. Spotify integration makes for easy multi-room audio in a way that doesn’t really work as well on the other platform I’ll consider (Apple/Siri) and basically adds Sonos-like functionality for a tiny fraction of the price. The Siri balls and AirPlay are just not as good, and of course don’t work as well with Spotify.

But alexa is so fucking annoying that at this point I mostly just carry my phone (iPhone) and talk to that even though it’s a little less convenient because I’m really goddamned tired of hearing “by the way…”

captain_aggravated@sh.itjust.works on 27 Sep 2023 00:24 collapse

I do wonder how much of those voice assistants could run on-device. Most of what I use Bixby for (I know. I KNOW.) is setting timers. I think simple things like that can run entirely on the phone. It’s got a shocking amount of processing in it.

DigitalFrank@lemmy.world on 25 Sep 2023 23:48 next collapse

I already don’t use it, you don’t have to sell me on it.

KingThrillgore@lemmy.ml on 26 Sep 2023 01:39 next collapse

How to make me go back to buying shit in person, by Amazon.com

topinambour_rex@lemmy.world on 26 Sep 2023 05:01 next collapse

I have doubts Alexa is using AI, seeing how dumb it is, but well…

DarienGS@lemmy.world on 26 Sep 2023 22:06 collapse

From the article:

Amazon has bet big on AI, with the company unveiling a new, AI-powered version of Alexa alongside updated versions of its Echo Frames and Carrera smart glasses last week.

Treczoks@lemmy.world on 26 Sep 2023 06:49 next collapse

So they expect people to pay for being spied on and seriously data-mined?

BirdyBoogleBop@lemmy.dbzer0.com on 26 Sep 2023 06:53 next collapse

Yes and people will pay for it.

DragonTypeWyvern@literature.cafe on 26 Sep 2023 07:48 next collapse

Sometimes I can’t help thinking some people deserve to be taken advantage of.

Esqplorer@lemmy.zip on 26 Sep 2023 12:30 collapse

I don’t know about that. They never delivered on Smart Home promises and the only truly useful thing my Google AI does is to give me the forecast. Otherwise it’s just a wifi speaker.

If they finally integrate Bard, I would actually consider paying for the service.

dragonflyteaparty@lemmy.world on 26 Sep 2023 18:07 collapse

We have a Google Mini. They listen to my six-year-old request the song Poopy Bum Bum all day. The ads get interesting.

lloram239@feddit.de on 26 Sep 2023 08:00 next collapse

Just FYI, Alexa records everything you say to it; you can listen to it here:

www.amazon.com/gp/help/customer/display.html?node…

Natanael@slrpnk.net on 27 Sep 2023 10:32 collapse

And much of it can be listened to by staff that are hired to label it to train the model.

gingerwolfie@lemmy.world on 26 Sep 2023 09:34 next collapse

At this point, with so many tech giants introducing ads to their services and increasing subscription prices, I think we can expect some kind of subscription fee to access assistants with AI/LLM capability. It would make sense to offer a ‘basic’ version of these services for free since people have already invested in the hardware, but I wouldn’t be surprised if these companies suddenly block us from using the smart functionality unless we pay.

OrangeCorvus@lemmy.world on 26 Sep 2023 10:34 next collapse

Good luck, I guess? Got the first Google home, at first it was great, I was asking it tons of questions. Then the questions stopped, used it for turning on the lights and other automations. Then I installed Home Assistant and the only command Google Home got was to set a timer to know when to pull things out of the oven. Eventually I stopped doing that.

At the moment all Google/Nest Homes have their mic cut off, I only use them to stream music in my house from my NAS via Plex. So yeah…

B1ackmsth@lemmy.ca on 26 Sep 2023 18:09 next collapse

I still use mine for voice commands with home assistant. Works great.

tony@lemmy.hoyle.me.uk on 27 Sep 2023 09:52 collapse

All mine to is turn lights on and off… very occasionally they might be used to find a phone, or set a reminder, but I wouldn’t miss it if that went.

I wondered if I was unusual in not using the voice features much, but according to this thread it seems I’m not.

5BC2E7@lemmy.world on 26 Sep 2023 15:54 next collapse

Alexa is more like a telemarketer disguised as an assistant. Every interaction is followed by a “by the way…”. It’s a shit experience so I stopped using mine.

Corkyskog@sh.itjust.works on 26 Sep 2023 16:00 next collapse

Alexa was designed explicitly for that purpose. They lose money on every Echo sold, the whole idea was they would make money selling you stuff. Turns out people would rather use their Echo to check the weather, get recipes, etc. rather than voice shop.

hightrix@lemmy.world on 26 Sep 2023 17:55 next collapse

I just can’t see a use case for voice shopping. There are almost zero instances where I want to buy something without having a visual of that thing in front of me at time of purchase.

I could possibly see something like “buy another stick of deodorant”, but even then I want to see if there are deals or some other options and would want to check the price at a minimum.

Seems like yet another MBA idea.

OpenPassageways@lemmy.zip on 26 Sep 2023 19:10 next collapse

It’s really only good for re-ordering things you’ve already ordered. It will let you know that it found something in your order history and then you can decide whether you want to order again.

hightrix@lemmy.world on 26 Sep 2023 19:27 collapse

And this makes sense, but I’d still want to check prices to make sure that my $3 deodorant didn’t get discontinued and priced at $30/stick.

GamingChairModel@lemmy.world on 26 Sep 2023 22:57 collapse

Well, you think this way because you’ve seen what happened to Amazon in the past 10 years. 10 years ago, when they were getting ready to launch the Echo, Amazon was a great retailer that people trusted. Now, after a decade of sellers gaming listings and reviews and Amazon customer service deteriorating, we’ve been trained not to trust Amazon’s defaults.

SpaceCowboy@lemmy.ca on 26 Sep 2023 23:56 collapse

Yeah, it seems the execs who had the idea for Alexa never used Amazon for shopping. It’s a shit shopping site full of scammy products. I’d never buy anything from them without checking out the prices, reviews, etc.

Cort@lemmy.world on 26 Sep 2023 18:35 next collapse

Ha, I use mine almost exclusively as a light switch. I don’t have to get out of bed to turn off my lights or turn on my fan. I’m sure they’re losing a bunch of money on me

maniclucky@lemmy.world on 26 Sep 2023 21:51 next collapse

It’s a great lightswitch. Also, thermostat adjustor (my husband is very particular and changes it about a dozen times a day).

LittleTrollInAHut@lemmy.world on 27 Sep 2023 10:41 collapse

And they also have smart lights and phone apps that do the same thing.

SpaceCowboy@lemmy.ca on 26 Sep 2023 23:53 next collapse

It’s also good for setting a timer. But yeah, I’m not buying shit from it.

kelargo@lemmy.world on 27 Sep 2023 10:17 collapse

I use mine to stream spotify.

o0oradaro0o@lemmy.world on 26 Sep 2023 16:10 next collapse

Setting all my Alexas to UK English got rid of all the marketing “by the ways.” I still regret going with the Alexa ecosystem, but at least for now there is a workaround for the most rage-inducing part of it.

locuester@lemmy.zip on 27 Sep 2023 07:16 collapse

By the way, did you know that you can find out more about telemarketing with an audio book from audible on the subject. Would you like to hear a preview of that now?

Illuminostro@lemmy.world on 26 Sep 2023 16:19 next collapse

They won’t be charging me, because I don’t buy shit from Amazon, and don’t use their spy platform.

sentinelthesalty@lemmy.world on 26 Sep 2023 17:33 next collapse

Rip Bozo, no one will sub for your knock-off Siri. Aside from a certain subset of consoomer cattle.

satans_crackpipe@lemmy.world on 26 Sep 2023 19:24 collapse

I love when satire.

FidiFadi@lemmy.world on 26 Sep 2023 21:25 next collapse

Has Amazon considered making a stationary AI?

TechAnon@lemm.ee on 27 Sep 2023 14:51 collapse

You just typed that question on one. See: GPT4All You can download many models and run them locally. They were about 5-16GB in size the last time I downloaded one. Pretty slow if you don’t have a hefty GPU, but it works!

wer2@lemm.ee on 26 Sep 2023 23:38 next collapse

My Home Assistant Voice is getting really close to displacing Alexa.

stevedidwhat_infosec@infosec.pub on 27 Sep 2023 00:15 collapse

Same. I’ve already got an entire setup between gpt with customizable system level prompting capabilities and it uses custom voice models I’ve trained over at eleven labs

Now I just gotta slap my lil monsters phat ass into a raspberry pi and then destroy the fuck out of my Alexa devices and ship em to Jeff bozo

StubbornCassette8@feddit.nl on 27 Sep 2023 00:56 collapse

Can you share details? Been thinking of doing this with a new PC build. Curious what your performance and specs are.

Silentrizz@lemmy.world on 27 Sep 2023 01:08 next collapse

+1 interest

stevedidwhat_infosec@infosec.pub on 27 Sep 2023 12:22 collapse

You shouldn’t need anything really, all the components run via cloud services so you just need a network connection.

That’s why it’ll run just fine on a cheap pi model

Essentially the script in Python just sends api requests directly to OpenAI and returns the AI response. Next I just pass that response to the elevenlabs api and play that audio binary stream via any library that supports audio playback.

(That last bit is what I’ll have to toy around with on a pi but, I’m not worried about finding a suitable option, there’s lots of libraries out there)
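For the curious, that pipeline can be sketched in a few lines. This is a minimal sketch, not the poster’s actual script: the endpoints follow the public OpenAI and ElevenLabs REST APIs, but the model name, default prompt, and environment-variable names here are assumptions.

```python
import json
import os
import urllib.request

OPENAI_URL = "https://api.openai.com/v1/chat/completions"
ELEVEN_URL = "https://api.elevenlabs.io/v1/text-to-speech/{voice_id}"


def build_chat_payload(system_prompt, user_text, model="gpt-3.5-turbo"):
    """Build the OpenAI chat request body (pure function, no network)."""
    return {
        "model": model,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_text},
        ],
    }


def _post_json(url, headers, payload):
    """POST a JSON payload and return the raw response bytes."""
    req = urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json", **headers},
    )
    with urllib.request.urlopen(req, timeout=30) as resp:
        return resp.read()


def ask_llm(user_text, system_prompt="You are a helpful home assistant."):
    """Send the prompt to OpenAI and return the assistant's reply text."""
    raw = _post_json(
        OPENAI_URL,
        {"Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}"},
        build_chat_payload(system_prompt, user_text),
    )
    return json.loads(raw)["choices"][0]["message"]["content"]


def speak(text, voice_id):
    """Turn the reply into audio bytes via the ElevenLabs TTS endpoint."""
    return _post_json(
        ELEVEN_URL.format(voice_id=voice_id),
        {"xi-api-key": os.environ["ELEVENLABS_API_KEY"]},
        {"text": text},
    )  # audio bytes (e.g. MP3), playable with any audio library
```

On a Pi, the bytes from `speak()` can just be written to a file and handed to whatever playback library is available.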

StubbornCassette8@feddit.nl on 28 Sep 2023 02:27 collapse

Oh wait, I think I misunderstood. I thought you had local language models running on your computer. I have seen that be discussed before with varying results.

Last time I tried running my own model was in the early days of the Llama release and ran it on an RTX 3060. The speed of delivery was much slower than OpenAI’s API and the material was way off.

It doesn’t have to be perfect, but I’d like to do my own API calls from a remote device phoning home instead of OpenAI’s servers. Using my own documents as a reference would be a plus to, just to keep my info private and still accessible by the LLM.

Didn’t know about Elevenlabs. Checking them out soon.

Edit because writing is hard.

stevedidwhat_infosec@infosec.pub on 28 Sep 2023 12:32 collapse

That could be fun! I’ve made and trained my own models prior but I find that getting the right amount of data (in terms of both size and diversity to ensure features are orthogonal out of the gate) can be pretty tough.

If you don’t get that right balance of size and diversity in your data, that efficacy upper limit is gonna be way lower than you’d like, but you might have some good data sets laying around I got no clue ^_^

Lemmy know how it goes!

WindowsEnjoyer@sh.itjust.works on 27 Sep 2023 00:05 next collapse

We upgraded our technologies so much that it’s becoming unsustainable for us, so we are increasing prices.

Random_user@lemmy.world on 27 Sep 2023 00:58 next collapse

All I want Alexa to do is turn my lights on and off, set timers, and show me my own pictures. And it can BARELY do that without fucking it up. Everyone I know wants the same; they expect nothing more from it. The “AI” features of Alexa aren’t needed or wanted by anyone I’ve talked to about it.

EmperorHenry@discuss.tchncs.de on 27 Sep 2023 02:27 next collapse

They’re never going to be satisfied with the amount of money they have.

phoneymouse@lemmy.world on 27 Sep 2023 03:38 collapse

That’s capitalism. Endless growth. Grew 1000%? Grow more. More. More. More.

EmperorHenry@discuss.tchncs.de on 27 Sep 2023 04:18 collapse

In a real free market, all the banks that destroyed the economy through fraud wouldn’t have gotten bailouts, they would’ve had to “pull themselves up by their bootstraps” like everyone else had to.

atyaz@reddthat.com on 27 Sep 2023 10:48 next collapse

In a real free market, the banks would have gotten too big to fail and we would have bailed them out (ask me how I know this)

dustyData@lemmy.world on 27 Sep 2023 12:41 collapse

No true capitalism…

gearheart@lemm.ee on 27 Sep 2023 02:42 next collapse

They’re using the public to train AI. Then charging the public for the AI it trained.

coffeebiscuit@lemmy.world on 27 Sep 2023 10:13 next collapse

Now you know why it’s called a smart device.

Cosmos7349@lemmy.world on 27 Sep 2023 13:56 collapse

can’t be that smart if it’s using me for training

[deleted] on 27 Sep 2023 11:09 next collapse

.

Turun@feddit.de on 27 Sep 2023 15:37 collapse

Yes?

I’ll happily join in bashing Amazon for plenty of reasons, but training an AI is a step that adds significant value to the material. Just like any other product that is an effort that people are willing to pay for.

negativeyoda@lemmy.world on 27 Sep 2023 02:58 next collapse

They thought people would be like “Alexa, buy me a ton of shit on Amazon” but people just use it for timers and the weather

butterflyattack@lemmy.world on 27 Sep 2023 05:44 next collapse

Yeah, can you imagine trusting that thing to buy stuff for you? Bit scary, actually.

vaultdweller013@sh.itjust.works on 27 Sep 2023 10:06 next collapse

“Alexa, release the Roombas”

30 roombas make their way out from under my bed

barsoap@lemm.ee on 27 Sep 2023 19:18 collapse

They had those “re-order product” physical buttons for a while which you were supposed to glue to your washing machine so you could reorder when your detergent ran out.

Besides legal issues (at least over here all they could do is put things in your shopping cart) apparently the primary customers of those buttons were hardware hackers, turning them into all kinds of stuff.

rockandsock@lemm.ee on 27 Sep 2023 04:34 next collapse

Oh no!

I’ll just have to install a weather app and use the timer on my stove instead of using Alexa.

locuester@lemmy.zip on 27 Sep 2023 07:07 collapse

Exactly. I never did find another use for that thing. Had one 2014-2022. It didn’t survive my last move. Was voted off the island.

Shard@lemmy.world on 27 Sep 2023 08:44 next collapse

youtu.be/nwPtcqcqz00?si=QNG8ElAz7yZDGQUx

All I picture is this.

PipedLinkBot@feddit.rocks on 27 Sep 2023 08:45 collapse

Here is an alternative Piped link(s):

https://youtu.be/nwPtcqcqz00?si=QNG8ElAz7yZDGQUx

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

WuTang@lemmy.ninja on 27 Sep 2023 10:51 next collapse

Let’s see if they will make real money with their AI, now.

virtualbriefcase@lemm.ee on 27 Sep 2023 10:56 next collapse

We’re seeing this all over the tech and tech adjacent space. Can’t grow forever at a loss, especially not with increased interest rates and a potential economic downturn.

My guess, if you want to have decent services we’re going to end up needing to pick few (or a suite of the basics) to pay for on a monthly basic and cut out all the “free” stuff that is/will get enshittified.

SitD@feddit.de on 27 Sep 2023 11:30 next collapse

in my eyes they put themselves in an awkward position by garnering a reputation of always collecting more user data than justified, and at this point i assume they do the same with paid products as it’s an industry norm. however I’m not ok with it and will never pay when the product doesn’t respect privacy. the saying used to be “if you don’t pay, you’re the product”, but it is increasingly shifting to: you’re the product and also you have to pay so that our shareholders can experience more infinite growth

JustZ@lemmy.world on 27 Sep 2023 11:31 next collapse

Yo ho, yo ho.

phillaholic@lemm.ee on 27 Sep 2023 12:57 collapse

Bubble about to pop

Hazdaz@lemmy.world on 27 Sep 2023 12:01 next collapse

I don’t understand this. Hasn’t Intel or Nvidia (or someone else) been making claims about their next CPUs having AI functionality built-in?

tb_@lemmy.world on 27 Sep 2023 12:46 next collapse

You can record and edit videos on your own devices, but that doesn’t mean it’s suddenly free for Netflix or YouTube to stream their videos to you.

Surely a local version of Alexa could be developed, but that development would come with its own costs.
Some things simply can’t be done locally, such as a web search. Often your route calculations for a map application are also done in the cloud.

ours@lemmy.film on 27 Sep 2023 12:53 collapse

Having “AI functionality” doesn’t mean they can just get rid of their big/expensive models they use now.

If they are anything like Open AI’s LLM, it requires very beefy machines with a ton of expensive RAM.

Hazdaz@lemmy.world on 27 Sep 2023 12:59 collapse

Well that’s exactly what I was thinking when these companies were making these claims… like HOW could they possibly handle this locally on a CPU or GPU when there must be a massive database that (I assume) is constantly being updated? Didn’t make sense.

EDIT: this entire website can go fuck off. You ask a simple question about some reasonably new tech, and you get downvoted for having the audacity to learn some new stuff. People on here are downright pathetic.

ours@lemmy.film on 27 Sep 2023 13:09 next collapse

“AI” doesn’t use databases per se, they are trained models built from large amounts of training data.

Some models run fine on small devices (like the model running on phones to make better pictures) but others are huge like Open AI’s LLM.

Hazdaz@lemmy.world on 27 Sep 2023 13:34 collapse

training data.

Wouldn’t that data be stored in some kind of database?

ours@lemmy.film on 27 Sep 2023 14:11 next collapse

No, the data will influence the model.

Traces of the data may be found in the model itself (e.g. AI-generated images outputting mangled author signatures from the original works used during training), but not in the traditional form of a database. You can’t directly retrieve the data back in its original form, even if some models can be coerced into doing something similar.

It’s basically a statistical model built from training data.

The training of these huge models also cost a fortune. Allegedly in the millions of $ in a data and processing-intensive process.

TipRing@lemmy.world on 27 Sep 2023 15:57 next collapse

The training data isn’t stored in the model. You can take an existing model and fine tune it on a whole bunch of additional data and the model size won’t change.
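A toy illustration of that point (just a two-parameter linear fit trained by gradient descent, nothing like a real LLM): however much data you train on, the “model” is still the same two numbers.

```python
def train_linear(xs, ys, lr=0.01, steps=2000):
    """Fit y = w*x + b by gradient descent; the trained 'model' is just (w, b)."""
    w, b = 0.0, 0.0
    n = len(xs)
    for _ in range(steps):
        grad_w = sum(2 * (w * x + b - y) * x for x, y in zip(xs, ys)) / n
        grad_b = sum(2 * (w * x + b - y) for x, y in zip(xs, ys)) / n
        w -= lr * grad_w
        b -= lr * grad_b
    return w, b


# Train on 10 points, then on 10x as many: the model size doesn't change.
xs_small = [x for x in range(10)]
xs_large = [x / 10 for x in range(100)]
model_small = train_linear(xs_small, [3 * x + 1 for x in xs_small])
model_large = train_linear(xs_large, [3 * x + 1 for x in xs_large])

assert len(model_small) == len(model_large) == 2  # same parameter count
```

The training sets influence *what* the two parameters are, not *how many* there are; that’s the sense in which a model isn’t a database of its training data.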

Dark_Arc@social.packetloss.gg on 27 Sep 2023 16:02 collapse

The other answers are a bit confusing…

Yes, that’s in a database.

However, you can think of it like a large library of books on how best to tune a ukulele. There might be a lot of information on how to tune the ukulele and a lot of skill needed to put all that knowledge to use, but the ukulele, once tuned, is quite small and portable.

blazeknave@lemmy.world on 27 Sep 2023 16:17 collapse

You’re right. Run an llm locally adjacent to your application sandboxes and local user apps and your office will lower its heating bills.

hark@lemmy.world on 27 Sep 2023 12:16 next collapse

Running AI may be currently expensive, but the hardware will continue to improve and get cheaper. If they institute a subscription fee and people actually pay for it, they’ll never remove that fee even after it becomes super cheap to run.

Elderos@lemmings.world on 27 Sep 2023 13:40 next collapse

That is sort of the issue when mixing good conscience with capitalism. Either the goods are valued at what we’re willing to pay, or they’re valued at what we think the profit margin of the business should be, but mixing the two ultimately leads us to fall for PR crap. Businesses are quick to gather sympathy when the margins are low, but then as soon as they own a part of the market it turns into raising the price as much as they possibly can.

That being said, Amazon became what it is because Bezos was hell bent on not rug pulling customers, at least in the early years, so it is possible they would decrease prices eventually to gain market advantage, that’s their whole strategy.

barsoap@lemm.ee on 27 Sep 2023 19:12 collapse

but the hardware will continue to improve and get cheaper.

Eh. I mean, sure, the likes of A100s will invariably get cheaper because they’re overpriced AF, but there isn’t really that much engineering going into those things hardware-wise: accelerating massive chains of fmas is quite a smaller challenge than designing a CPU or GPU. Meanwhile Moore’s law is – well, maybe not dead, but a zombie. In the past, advances in manufacturing meant a lower price per transistor; that hasn’t been true for a while now, and the physics of everything aren’t exactly getting easier – they’re now battling quantum uncertainty in the lithography process itself.

Where there might still be significant headway to be made is by switching to analogue, but, eeeh. Reliability. Neural networks are rather robust against small perturbations, but it’s not like digital systems can’t make use of that by reducing precision, and controlling precision is way harder in analogue. Everything is harder there; it’s an arcane art.


tl;dr: Don’t expect large leaps, especially not multiple. This isn’t a naughts “buy a PC twice as fast at half the price two years later” kind of situation, AI accelerators are silicon like any other they already make use of the progress we made back then.

[deleted] on 27 Sep 2023 13:50 next collapse

.

RaoulDook@lemmy.world on 27 Sep 2023 14:34 collapse

I never got the appeal of those things even ignoring how their design is the antithesis of privacy. It just seems dumb to talk to the computer box, like it’s a thing to talk to when it’s just a microphone and software. I simply prefer direct, precise, and silent control of devices

GladiusB@lemmy.world on 27 Sep 2023 15:07 next collapse

It’s very sci fi. Star Trek amongst many others from the 80s. If you are old enough then you would remember that this was the stuff of fantasy. I can see why it appeals to people with disabilities and possibly kids for homework or something. But I am 1000 percent with you on the privacy part. No thanks.

eronth@lemmy.world on 27 Sep 2023 15:26 collapse

It’s good for hands/device free control. Setting timers while cooking by simply saying “set a timer” or controlling lights from across the room without fiddling with a phone or remote.

ram@bookwormstory.social on 27 Sep 2023 16:30 collapse

“Set a timer” and “set an alarm” are the only two I ever found useful, personally. I stopped using Google Assistant because it legitimately stopped understanding me correctly and I got frustrated with it.

kamen@lemmy.world on 27 Sep 2023 16:05 next collapse

Something tells me that they’ll still listen to you for free.

blazeknave@lemmy.world on 27 Sep 2023 16:15 next collapse

As someone at a company still using free AI credits in their commercial products and hasn’t figured out how he’s going to price the shit when the credits are up… this AI market looks a lot like Uber subsidies…

PeterPoopshit@lemmy.world on 27 Sep 2023 18:58 collapse

Like a year or two from now, probably any AI stuff that isn’t self-hosted is going to be 100% inaccessible to normal people due to cost. It’s just a question of how hard they’re going to fight to get the currently free-to-download LLM models off the internet once this happens.

blazeknave@lemmy.world on 28 Sep 2023 03:05 collapse

That’s a pretty accurate encompassing statement. Well done.

Octavio@lemmy.world on 27 Sep 2023 17:00 collapse

I no longer use mine for much but to control one light with an illogical switch placement. I could pretty much replace her with The Clapper™️