QuadratureSurfer@lemmy.world
on 06 Jul 12:35
Someone just got the AWS bill.
crunchy@lemmy.dbzer0.com
on 06 Jul 14:12
That’s got to be it. Cloud compute is expensive when you’re not being funded in Azure credits. Once the dust settles from the AI bubble bursting, most of the AI we’ll see will probably be specialized agents running small models locally.
Retro_unlimited@lemmy.world
on 07 Jul 22:42
I have been using a program called GPT4All; you can download many models and run them locally. It prompts you at startup asking whether you want to share data or not. I select no and use it offline anyway.
Cloud compute is gonna be cheap compared to the API costs for LLMs they use/offer.
TrumpetX@programming.dev
on 06 Jul 13:06
Well shit, I’ve been on vacation, and I signed up with Cursor a month ago. Not allowed at work, but for side projects at home in an effort to “see what all the fuss is about”.
So far, the experience has been rock solid, but I assume when I get home that I’ll be unpleasantly surprised.
I’ve primarily used claude-4-sonnet in Cursor and was surprised to see a message telling me it would start costing extra above and beyond my subscription. This was prolly after 100 queries or so. However, switching to “auto” instead of a specific model continues to not cost anything, and that still uses claude-4-sonnet when it thinks it needs to. The main difference I’ve noticed is it’s actually faster, because it’ll sometimes hit cheaper/dumber APIs to address simple code changes.
It’s a nice toy that does improve my productivity quite a bit and the $20/month is the right price for me, but I have no loyalty and will drop them without delay if it becomes unusable. That hasn’t happened yet.
I mean yeah? I wasn’t counting in detail, it’s an estimate.
Previously you got 500 requests a month and then it’d start charging you, even on “auto.” So the current charging scheme seems to be encouraging auto use so they can use cheaper LLMs when they make sense (honestly a good thing).
SreudianFlip@sh.itjust.works
on 07 Jul 17:53
In the English language, specifically North American dialects, this is a form of idiom.
Confused_Emus@lemmy.dbzer0.com
on 07 Jul 21:52
That’s not an idiom, it’s just an elided word.
SreudianFlip@sh.itjust.works
on 07 Jul 22:03
Well, we can argue over the niceties of the word “idiom,” but since it refers to the way the word is pronounced in specific regions of North America, it meets one of the definitions of idiom.
Elision refers more to the absence of an understood word, such as saying ‘my bad’.
My bad, elision can also refer to slurring syllables together, so it’s both.
Confused_Emus@lemmy.dbzer0.com
on 07 Jul 22:06
An elision is the absence of a sound or syllable in a word. An idiom is an entire phrase or expression that does not mean what it literally says.
There’s no argument here, you’re just wrong.
No, it isn’t both.
SreudianFlip@sh.itjust.works
on 07 Jul 22:25
I dunno, cf. 1.b definition of idiom in the OED: dialect usage, and 2.a is dialect usage for effect. Maybe the definition is changing with the ages, or your usage is overly strict.
Confused_Emus@lemmy.dbzer0.com
on 07 Jul 22:44
Idiom. Elide. It’s really not that confusing. Idioms are about meaning, elision is about sound.
SreudianFlip@sh.itjust.works
on 08 Jul 00:47
Hm, I guess an encyclopedia article is more relevant than a dictionary definition, so sure. I was using the looser secondary definition… in this case an elision that references a dialect in order to call up regional relevance to the opinion expressed.
Jesusaurus@lemmy.world
on 06 Jul 13:09
Hopefully (?) this is the start of a trend and people might begin to realize how all those products are not worth their price and AI is an overhyped mess made to hook users before exploiting them…
cley_faye@lemmy.world
on 07 Jul 06:42
The whole industry is projecting something like negative $200B for next year. They know it’s not worth the price.
Are you a software engineer who has made use of these and similar tools?
If not, this is epic level armchairing.
The tools are definitely hyped, but they are also incredibly functional. They have many problems, but they also work and achieve their intended purpose.
I have a rough idea of their efficiency as I’ve used them, not in professional settings but I wager it would not be too different.
My point is more that it feels like the rugs are finally starting to get pulled.
This tech is functional, as you said; it works to a point, and that point is enough for a sizeable number of people. But I doubt that the price most people are paying now is enough to cover the cost of answering their queries.
Now that some people, especially younger devs or people who never worked without those tools, are dependent on them, they can go ahead and charge more.
But it’s not too late, so I’m hoping it will make some people more aware of that kind of scheme and that they will stop feeding the AI hype in general.
Glitchvid@lemmy.world
on 07 Jul 01:54
Imagine the price hikes when they need to get that return on hundreds of billions they’ve poured into these models, datacenters and electricity.
I’m still running Qwen32b-coder on a Mac mini. Works great, a little slow, but fine.
I’m somewhat tech savvy; how do I run an LLM locally? Any suggestions? And how do I know my local data is safe?
Check out LM Studio (lmstudio.ai); you can pair it with the Continue extension for VS Code (docs.continue.dev/getting-started/overview).
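For what it’s worth, LM Studio’s local server speaks the OpenAI chat-completions API (on port 1234 by default), so you can script it with nothing but the standard library. A minimal sketch, assuming the server is running with a model loaded; the model name “local-model” is a placeholder, not a real identifier:

```python
import json
import urllib.request

def build_chat_request(prompt, model="local-model",
                       base_url="http://localhost:1234/v1"):
    """Build an OpenAI-style chat-completion request for a local LM Studio server.

    Nothing here leaves your machine: the request targets localhost, and the
    model name is just whatever you have loaded locally.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
        "temperature": 0.2,
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# With the LM Studio server running, send it like so:
# with urllib.request.urlopen(build_chat_request("Explain list comprehensions")) as r:
#     print(json.load(r)["choices"][0]["message"]["content"])
```

Since the endpoint is OpenAI-compatible, the same request shape also works from Continue or any OpenAI client library pointed at the local base URL.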
More like they just got their Anthropic bill.
Has anyone here had rate limiting hit them?
“Prolly”
I was questioning the use of the word “prolly”
it means “probably” 🤗
Nah, you should find a new bone to pick.
Sounds like chargeback territory.
Ah they’re learning from the “unlimited” mobile carriers.
“Unlimited” until you meet your limit, then throttled.
Common People
<img alt="" src="https://sh.itjust.works/pictrs/image/822aa830-283c-4c0f-a60c-82605badfcc4.gif">