So Far, AI Is a Money Pit That Isn't Paying Off (gizmodo.com)
from FlyingSquid@lemmy.world to technology@lemmy.world on 11 Oct 2023 14:41
https://lemmy.world/post/6652576

#technology

threaded - newest

bappity@lemmy.world on 11 Oct 2023 14:42 next collapse

if A.I. dies out because capitalism I will wheeze

WrittenWeird@lemmy.world on 11 Oct 2023 14:56 collapse

The current breed of generative “AI” won’t ‘die out’. It’s here to stay. We are just in the early Wild-West days of it, where everyone’s rushing to grab a piece of the pie, but the shine is starting to wear off and the hype is juuuuust past its peak.

What you’ll see soon is the “enshittification” of services like ChatGPT as the financial reckoning comes, startup variants shut down by the truckload, and the big names put more and more features behind paywalls. We’ve gone past the “just make it work” phase, now we are moving into the “just make it sustainable/profitable” phase.

In a few generations of chips, the silicon will have made progress in catching up with the compute workload, and cost per task will drop. That’s the innovation to watch for now: who will dethrone Nvidia and its H100?

lurch@sh.itjust.works on 11 Oct 2023 15:28 next collapse

It can totally die out tho. If people stop using it, it will fade to nothingness, like a Flash browser game.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:32 next collapse

Deleted comment

deranger@sh.itjust.works on 11 Oct 2023 15:46 collapse

It doesn’t even exist in the medical field, stop lying

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:48 collapse

Deleted comment

deranger@sh.itjust.works on 11 Oct 2023 15:55 collapse

devices marketed in the United States

Not a list of devices in use

None of this image reading tech you refer to exists in actual hospitals yet

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:59 collapse

Deleted comment

Sanctus@lemmy.world on 11 Oct 2023 15:45 next collapse

Flash games did not die out because people stopped playing them. The smart phone was created and this changed the entire landscape of small game development.

o0joshua0o@lemmy.world on 11 Oct 2023 15:49 collapse

Steve Jobs killed Flash. It was premeditated.

bitteorca@artemis.camp on 11 Oct 2023 15:57 next collapse

Flash deserved to die

Sanctus@lemmy.world on 11 Oct 2023 16:13 collapse

It was atrocious compared to what we have now. But god fucking dammit I love those games. They mean more to me than a lot of AAA studios.

FaceDeer@kbin.social on 11 Oct 2023 17:07 collapse

If it had been killed without an adequate replacement (eg. mobile gaming) then people wouldn't have let Flash die. There are open-source flash players.

Szymon@lemmy.ca on 11 Oct 2023 16:25 collapse

Flash games didn’t die on their own; the technology was purposefully killed off via similar corporate requirements to maximize profits.

kirklennon@kbin.social on 11 Oct 2023 16:40 collapse

It died because Safari for iPhone supported only open web standards. Flash was also the leading cause of crashes on the Mac because it was so poorly-written. It was also a huge security vulnerability and a leading vector for malware, and Adobe just straight up wasn't able to get it running well on phones. Flash games were also designed with the assumption of a keyboard and mouse so many could never work right on touchscreen devices.

Szymon@lemmy.ca on 11 Oct 2023 17:16 collapse

There you go, lots of reasons, courtesy of this person here

GenderNeutralBro@lemmy.sdf.org on 11 Oct 2023 15:48 next collapse

This is why I, as a user, am far more interested in open-source projects that can be run locally on pro/consumer hardware. All of these cloud services are headed down the crapper.

My prediction is that in the next couple years we’ll see a move away from monolithic LLMs like ChatGPT and toward programs that integrate smaller, more specialized models. Apple and even Google are pushing for more locally-run AI, and designing their own silicon to run it. It’s faster, cheaper, and private. We will not be able to run something as big as ChatGPT on consumer hardware for decades (it takes hundreds of gigabytes of memory at minimum), but we can get a lot of the functionality with smaller, faster, cheaper models.

WrittenWeird@lemmy.world on 11 Oct 2023 15:56 next collapse

Definitely. I have experimented with image generation on my own mid-range RX GPU and though it was slow, it worked. I have not tried the latest driver update that’s supposed to accelerate those tools dramatically, but local AI workstations with dedicated silicon are the future. CPU, GPU, AIPU?
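
For anyone curious what “running it locally” actually looks like, here’s a minimal sketch using the Hugging Face diffusers library (the checkpoint ID and device setup are just illustrative assumptions; on AMD cards you’d typically install the ROCm build of PyTorch, which still exposes the GPU as “cuda”):

```python
# Minimal local text-to-image sketch using Hugging Face diffusers.
# Illustrative only: the checkpoint ID is an assumption, and on AMD cards
# you'd install the ROCm build of PyTorch rather than the CUDA one.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",   # any Stable Diffusion checkpoint works here
    torch_dtype=torch.float16,          # halves VRAM use vs float32
)
pipe = pipe.to("cuda")  # ROCm builds of PyTorch also expose the GPU as "cuda"

image = pipe("a watercolor painting of a lighthouse at dawn").images[0]
image.save("lighthouse.png")
```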

nodsocket@lemmy.world on 11 Oct 2023 16:39 next collapse

Wait, you guys don’t already have hundreds of gigabytes of memory?

GenderNeutralBro@lemmy.sdf.org on 11 Oct 2023 16:50 next collapse

Technically I could upgrade my desktop to 192GB of memory (4x48). That’s still only about half the amount required for the largest BLOOM model, for instance.

To go beyond that today, you’d need to move beyond the Intel Core or AMD Ryzen platforms and get something like a Xeon. At that point you’re spending 5 figures on hardware.

I know you’re just joking, but figured I’d add context for anyone wondering.

p03locke@lemmy.dbzer0.com on 11 Oct 2023 17:45 collapse

Don’t worry about the RAM. Worry about the VRAM.

nodsocket@lemmy.world on 11 Oct 2023 18:11 collapse

Google drive is my swap space

FaceDeer@kbin.social on 11 Oct 2023 17:09 collapse

Hundreds of gigabytes of memory in consumer PCs is not decades away. There are already motherboards that accept 128 GB.

GenderNeutralBro@lemmy.sdf.org on 11 Oct 2023 17:46 collapse

You’re right, I shouldn’t say decades. It will be decades before that’s standard or common in the consumer space, but it could be possible to run on desktops within the next generation (~5 years). It’d just be very expensive.

High-end consumer PCs can currently support 192GB, and that might increase to 256 within this generation when we get 64GB DDR5 modules. But we’d need 384 to run BLOOM, for instance. That requires a platform that supports more than 4 DIMMs, e.g. Intel Xeon or AMD Threadripper, or 96GB DIMMs (not yet available in the consumer space). Not sure when we’ll get consumer mobos that support that much.
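
For a rough sense of where those numbers come from, here’s a back-of-the-envelope sketch (weights only, using BLOOM’s published 176B parameter count; real usage adds activations, KV cache, and framework overhead, which is how you end up needing something like 384GB at fp16):

```python
# Rough estimate of the memory needed just to hold a model's weights.
def weight_memory_gb(num_params: float, bytes_per_param: float) -> float:
    return num_params * bytes_per_param / 1024**3

bloom_params = 176e9  # BLOOM's published parameter count

print(f"fp16: {weight_memory_gb(bloom_params, 2):.0f} GB")    # ~328 GB before overhead
print(f"int8: {weight_memory_gb(bloom_params, 1):.0f} GB")    # ~164 GB
print(f"int4: {weight_memory_gb(bloom_params, 0.5):.0f} GB")  # ~82 GB
```

Quantization is also how the smaller local models squeeze onto ordinary hardware.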

_number8_@lemmy.world on 11 Oct 2023 16:02 collapse

GPT already got way shittier from the version we all saw when it first came out to the heavily curated, walled garden version now in use

autotldr@lemmings.world on 11 Oct 2023 14:45 next collapse

This is the best summary I could come up with:


A new report from the Wall Street Journal claims that, despite the endless hype around large language models and the automated platforms they power, tech companies are struggling to turn a profit when it comes to AI.

Github Copilot, which launched in 2021, was designed to automate some parts of a coder’s workflow and, while immensely popular with its user base, has been a huge “money loser,” the Journal reports.

OpenAI’s ChatGPT, for instance, has seen an ever declining user base while its operating costs remain incredibly high.

A report from the Washington Post in June claimed that chatbots like ChatGPT lose money pretty much every time a customer uses them.

Platforms like ChatGPT and DALL-E burn through an enormous amount of computing power and companies are struggling to figure out how to reduce that footprint.

To get around the fact that they’re hemorrhaging money, many tech platforms are experimenting with different strategies to cut down on costs and computing power while still delivering the kinds of services they’ve promised to customers.


The original article contains 432 words, the summary contains 172 words. Saved 60%. I’m a bot and I’m open source!

OldWoodFrame@lemm.ee on 11 Oct 2023 14:54 next collapse

Yeah, so far. It’s still super early for the modern incarnation of AI that actually has a chance to pay off: LLMs.

This isn’t like Bitcoin, where there’s huge hype for a pretty small market opportunity. We all realize the promise; we are just still figuring out how to get rid of hallucinations and make it consistent and tuned to a certain business usage.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:11 next collapse

Deleted comment

deranger@sh.itjust.works on 11 Oct 2023 15:17 collapse

It’s not “paying off” as this isn’t implemented anywhere, thus not making money.

I think you’re way off the mark and buying into the hype. That’s my opinion as an electronic medical record software analyst.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:21 collapse

Deleted comment

deranger@sh.itjust.works on 11 Oct 2023 15:38 collapse

I’m in cardiology - radiology is right across the hall. We’re both under ancillary services. Lab techs are not rad techs, and rad techs don’t read films. You don’t work in this field, do you?

None of this is implemented so none of it is paying off.

Also nobody thinks this will take their jobs, because it looks like Theranos to us, in that it’s very hyped in tech and ridiculous to those of us in the medical field.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:47 collapse

Deleted comment

deranger@sh.itjust.works on 11 Oct 2023 15:52 collapse

Dude, I am an Epic analyst. We’re a 10 star organization (ie cutting edge adoption of features).

I don’t know how to tell you how wrong you are. None of this is even remotely near production.

Helping with SlicerDicer queries is not reading a film. This is a ridiculous comparison.

Again, even language processing features are not remotely near production. It’s not even in any proof of concept environments.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 15:59 collapse

Deleted comment

Kbin_space_program@kbin.social on 11 Oct 2023 16:11 collapse

Well, and also navigating the minefield of LLMs absolutely having copyrighted material in them that wasn't paid for or licensed. E.g., DALL-E can produce a full image of Fresh Cut Grass, a character owned by Critical Role.

And that the stuff they produce isn't copyrightable.

FaceDeer@kbin.social on 11 Oct 2023 17:04 collapse

And that the stuff they produce isn't copyrightable.

Even if that were true, is there no value in public domain art resources?

Kbin_space_program@kbin.social on 11 Oct 2023 17:57 collapse

Not to the companies looking to use AI.

FaceDeer@kbin.social on 11 Oct 2023 19:22 collapse

Exhibit A, Disney, a giant megacorp whose most famous works are literally founded on public domain material.

Bear in mind that public domain is not like a copyleft license; it's not "viral." If I make a movie and the Mona Lisa shows up in it, that movie's copyright is still mine even though there's a public domain element in it. It's even easier with unique AI-generated stuff because you can't even tell what's public domain and what isn't.

Kbin_space_program@kbin.social on 11 Oct 2023 20:28 collapse

Something has to be ownable to be public domain. AI-produced items are un-ownable, since the AI is the owner, but it can't own them since it's legally a "tool".

FaceDeer@kbin.social on 11 Oct 2023 21:28 collapse

You are deeply confused about what "public domain" means. Something that is un-ownable (in an intellectual property sense) is public domain.

You may be referring to the Thaler v. Perlmutter case when you say "AI is the owner?" That's a widely misunderstood case that's gone through quite the game of telephone in the media. The judge in it ruled that an AI cannot own copyright, but that doesn't mean that AI-produced art is uncopyrightable. Just that AIs aren't people, from a legal perspective, and you need to be a legal person to own copyright. If Thaler had claimed copyright for himself, as a person, things might have gone differently. But he didn't.

iwenthometobeafamilyman@lemmy.world on 11 Oct 2023 14:59 next collapse

Deleted comment

Potatos_are_not_friends@lemmy.world on 11 Oct 2023 15:24 next collapse

So much fucking this.

Every cash grab right now around AI is just a frontend for a chatGPT API. And every investor who throws money at them is the mark. And now they’re crying a river.

Nougat@kbin.social on 11 Oct 2023 15:34 next collapse

Never mind that LLMs are a far cry from AI.

Lmaydev@programming.dev on 11 Oct 2023 16:02 next collapse

They are literally AI, neural networks specifically. As are path finding algorithms for games.

People just don’t get what AI is. Any program that simulates intelligence is AI.

You’re likely thinking of general AI from sci-fi.

QuaternionsRock@lemmy.world on 11 Oct 2023 17:37 collapse

the capability of computer systems or algorithms to imitate intelligent human behavior

I don’t know about you, but I would consider writing papers/books/essays/etc. (even bad ones) and code (even with mistakes) intelligent human behavior, and they’re pretty good at imitating it.

trashgirlfriend@lemmy.world on 11 Oct 2023 17:22 next collapse

IRL, people are doing some amazing things with generative AI, esp in 2D graphic art.

Woah, shiny bland images that are a regurgitation of stolen artwork!!!

ComradeBunnie@aussie.zone on 11 Oct 2023 20:21 next collapse

It’s also helped me find the names of several books and films that have been rattling around in my mind, some for decades, which actually made me very happy because not remembering that sort of thing drives me a little mad.

I’m stuck on two books that it can’t work out - both absolute trash pulp fiction, one that I stopped reading because it was so terrible and the other that was so bad but I actually wouldn’t mind reading again.

Oh well, can’t have it all.

FLX@lemmy.world on 12 Oct 2023 11:47 collapse

people are doing

No they ain’t doing shit, they just prompt

[deleted] on 11 Oct 2023 15:05 next collapse

.

TWeaK@lemm.ee on 11 Oct 2023 15:22 next collapse

Sounds like the internet in the 90s.

1bluepixel@lemmy.world on 11 Oct 2023 15:45 next collapse

It also reminds me of crypto. Lots of people made money from it, but the reason why the technology persists has more to do with the perceived potential of it rather than its actual usefulness today.

There are a lot of challenges with AI (or, more accurately, LLMs) that may or may not be inherent to the technology. And if issues cannot be solved, we may end up with a flawed technology that, we are told, is just about to finally mature enough for mainstream use. Just like crypto.

To be fair, though, AI already has some very clear use cases, while crypto is still mostly looking for a problem to fix.

iopq@lemmy.world on 11 Oct 2023 16:20 next collapse

I’m still trying to transfer $100 from Kazakhstan to me here. By far the lowest fee option is actually crypto since the biggest difference is the currency conversion. If you have to convert anyway, might as well only pay 0.30% on both ends

demesisx@infosec.pub on 11 Oct 2023 16:35 collapse

Look into DJED on Cardano. It’s WAY cheaper than ETH (but perhaps not cheaper than some others). A friend of mine sent $10,000 to Thailand for less than a dollar in transaction fees. To 1bluepixel: Sounds like a use-case to me!

FaceDeer@kbin.social on 11 Oct 2023 17:01 next collapse

Layer-2 rollups for Ethereum are also way cheaper than the base layer, this page lists the major ones.

demesisx@infosec.pub on 11 Oct 2023 18:14 collapse

Hmm.

You still have to deal with ETH fees just to get the funds into the rollup. I admit that ETH was revolutionary when it was invented, but the insane fee market makes it a non-starter, and the accounts model is just a preposterously bad (and actually irreparably broken) design decision for a decentralized network; it makes Ethereum nearly impossible to parallelize, since the main chain is required for state, and the contracts that run on it are non-deterministic.

FaceDeer@kbin.social on 11 Oct 2023 19:17 collapse

There are exchanges where you can buy Ether and other tokens directly on a layer 2, once it's on layer 2 there are no further fees to get it there.

Layer 2 rollups are a way to parallelize things, the activity on one layer 2 can proceed independently of activity on a different layer 2.

I have no idea why you think contracts on Ethereum are nondeterministic, the blockchain wouldn't work at all if they were.

demesisx@infosec.pub on 12 Oct 2023 01:54 collapse

I think that because it’s true. Smart contracts on Ethereum can fail and still charge the wallet. Because of the open-ended nature of Ethereum’s design, a wallet can be empty when the contract finally executes, causing a failure. This doesn’t happen in Bitcoin and other UTXO chains like Ergo and Cardano (where all transactions must have both inputs and outputs accounted for FULLY to execute). UTXO boasts determinism, while the accounts model can fail due to an empty wallet. Determinism makes concurrency harder for sure… but at least your entire chain isn’t one gigantic unsafe state machine. Ethereum literally is, by definition, non-deterministic.
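
A toy sketch of the difference I’m describing (nothing like real chain code, purely illustrative):

```python
# Toy sketch (not real chain code) of why one model is deterministic
# and the other can fail at execution time.

# UTXO-style: a transaction explicitly lists the coins it spends and the
# coins it creates, so its validity is fixed by its own contents plus the
# unspent outputs it references, known before it is ever submitted.
def validate_utxo_tx(inputs: list[int], outputs: list[int], fee: int) -> bool:
    return sum(inputs) == sum(outputs) + fee

# Account-style: validity depends on the account balance *at execution time*,
# which other transactions may have drained in the meantime, so the same
# transaction can succeed or fail (and still burn gas) depending on ordering.
def execute_account_tx(balances: dict[str, int], sender: str, amount: int) -> bool:
    if balances.get(sender, 0) < amount:
        return False  # looked fine when signed, fails when finally executed
    balances[sender] -= amount
    return True
```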

oroboros@sh.itjust.works on 11 Oct 2023 17:07 collapse

If only I had some money to transfer somewhere :(

demesisx@infosec.pub on 11 Oct 2023 16:25 next collapse

Crypto found a problem to fix. The reason the problem remains: everything is run by that problem, so crypto was astroturfed to death by the parties that run the current financial system and by the enemy of their enemy (who’s a friend): opportunistic scammers like SBF and Do Kwon.

p03locke@lemmy.dbzer0.com on 11 Oct 2023 17:40 next collapse

No, this isn’t crypto. Crypto and NFTs were trying to solve problems that already had solutions, with worse solutions, and hidden in the messaging was that rich people wanted to get poor people to freely gamble away their money in an unregulated market.

AI has real, tangible benefits that are already being realized by people who aren’t part of the emotion-driven ragebait engine. Stock images are going to become extinct in several years. People can make at least a baseline image of what they want, no matter their artistic ability. Musicians are starting to use AI tools. ChatGPT makes it easy to generate low-effort but otherwise time-consuming letters and responses like item descriptions, or HR responses, or other common draft responses. Code AI engines allow programmers to present reviewable solutions in real-time, or at least something to generate and tweak. None of this is perfect, but it’s good enough for 80% of the work that can be modified after the initial pass.

Things like chess AI have existed for decades, and LLMs are just extensions of the existing generative AI technology. I dare you to tell Chess.com that “AI is a money pit that isn’t paying off”, because they would laugh their fucking asses off, as they are actively pouring even more money and resources into Torch.

The author here is a fucking idiot. And he didn’t even bother to change the HTML title (“Microsoft’s Github Copilot is Losing Huge Amounts of Money”) from its original focus of just Github Copilot. Clickbait bullshit.

PipedLinkBot@feddit.rocks on 11 Oct 2023 17:40 next collapse

Here is an alternative Piped link(s):

starting to use AI tools

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

Revonult@lemmy.world on 11 Oct 2023 18:24 collapse

I totally agree. However, I do feel like the market around AI is inflated like NFTs and Crypto. AI isn’t a bust, there will be steady progress at universities, research labs, and companies. There is too much hype right now, slapping AI on random products and over promising the current state of technology.

p03locke@lemmy.dbzer0.com on 11 Oct 2023 22:15 next collapse

slapping [Technology X] on random products and over promising the current state of technology

A tale as old as time…

Still waiting on those “self-driving” cars.

instamat@lemmy.world on 12 Oct 2023 08:19 collapse

Self driving will be available next year.*

*since 2014

DudeDudenson@lemmings.world on 13 Oct 2023 03:52 collapse

I love how suddenly companies started advertising things as AI that would have been called a chatbot a year ago. I saw a news article headline the other day that said that judges were going to significantly improve the time they take to render judgments by using AI.

Reading the content of the article, they went on to explain that they would use it to draft the documents. It’s like they’ve never heard of templates.

thecrotch@sh.itjust.works on 11 Oct 2023 20:37 collapse

Let’s combine AI and crypto, and migrate it to the cloud. Imagine the PowerPoints middle managers will make about that!

Lmaydev@programming.dev on 11 Oct 2023 15:59 collapse

Or computers decades before that.

Many of these advances are incredibly recent.

And many of the things we use day to day are AI-powered without people even realising.

elbarto777@lemmy.world on 11 Oct 2023 17:40 collapse

AI powered? Like what?

Lmaydev@programming.dev on 11 Oct 2023 18:31 next collapse

fusionchat.ai/…/10-everyday-ai-applications-you-d…

Some good examples here.

Most social media uses it. Video and music streaming services. SatNav. Speech recognition. OCR. Grammar checks. Translations. Banks. Hospitals. Large chunks of internet infrastructure.

The list goes on.

elbarto777@lemmy.world on 12 Oct 2023 21:03 collapse

Got it. Thanks.

TWeaK@lemm.ee on 11 Oct 2023 20:56 next collapse

The key fact here is that it’s not “AI” as conventionally thought of in all the scifi media we’ve consumed over our lifetimes, but AI in the form of a product that tech companies of the day are marketing. It’s really just a complicated algorithm based off an expansive dataset, rather than something that “thinks”. It can’t come up with new solutions, only re-use previous ones; it wouldn’t be able to take one solution for one thing and apply that to a different problem. It still needs people to steer it in the right direction, and to verify its results are even accurate. However AI is now probably better than people at identifying previous problems and remembering the solution.

So, while you could say that lots of things are “powered by AI”, you can just as easily say that we don’t have any real form of AI just yet.

elbarto777@lemmy.world on 12 Oct 2023 21:05 collapse

Oh, but those pattern recognition examples are about machine learning, right? Which I guess is a form of AI.

TWeaK@lemm.ee on 12 Oct 2023 21:37 collapse

Perhaps, but at best it’s still a very basic form of AI, and maybe shouldn’t even be called AI. Before things like ChatGPT, the term “AI” meant a full blown intelligence that could pass a Turing test, and a Turing test is meant to prove actual artificial thought akin to the level of human thought - something beyond following mere pre-programmed instructions. Machine learning doesn’t really learn anything, it’s just an algorithm that repeatedly measures and then iterates to achieve an ideal set of values for desired variables. It’s very clever, but it doesn’t really think.
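
To illustrate what I mean by “repeatedly measures and then iterates”, here’s a toy sketch with made-up numbers (not how any particular product works):

```python
# Minimal sketch of the "measure, then iterate toward better values" loop:
# fitting y = w*x to made-up data with plain gradient descent.
data = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]  # made-up (x, y) pairs, roughly y = 2x
w = 0.0    # the "desired variable" being tuned
lr = 0.02  # learning rate

for _ in range(500):
    # measure: how wrong is the current w, and in which direction?
    grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
    # iterate: nudge w to reduce the error
    w -= lr * grad

print(round(w, 2))  # ends up near 2, with no "understanding" involved
```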

elbarto777@lemmy.world on 13 Oct 2023 09:14 collapse

I have to disagree with you on the machine learning definition. Sure, the machine doesn’t think in those circumstances, but it’s definitely learning, if we go by your description of what it does.

Learning is a broad concept, sure. But say, if a kid is learning to draw apples and later succeeds at drawing apples without help, we could say that the kid achieved “that ideal set of values.”

TWeaK@lemm.ee on 13 Oct 2023 09:31 collapse

Machine learning is a simpler type of AI than an LLM, like ChatGPT or AI image generators. LLMs incorporate machine learning.

In terms of learning to draw something, after a child learns to draw an apple they will reliably draw an apple every time. If AI “learns” to draw an apple it tends to come up with something subtly unrealistic, e.g. the apple might have multiple stalks. It fits the parameters it’s learned about apples, parameters which were prescribed by its programming, but it hasn’t truly understood what an apple is. Furthermore, if you applied the parameters it learned about apples to something else, it might completely fail to understand it altogether.

A human being can think and interconnect their thoughts much more intricately; we go beyond our basic programming and often apply knowledge learned in one area to something completely different. Our understanding of things is much more expansive than AI’s. AI currently has the basic building blocks of understanding, in that it can record and recall knowledge, but it lacks the full web of interconnections between different pieces and types of knowledge that human beings develop.

elbarto777@lemmy.world on 13 Oct 2023 09:35 collapse

Thanks. I understood all that. But my point is that machine learning is still learning, just like machine walking is still walking. Can a human being be much better at walking than a machine? Sure. But that doesn’t mean that the machine isn’t walking.

Regardless, I appreciate your comment. Interesting discussion.

Aceticon@lemmy.world on 12 Oct 2023 08:28 collapse

Automated mail sorting has been using AI to read post codes from envelopes for decades, only back then, pre-hype, it was just called Neural Networks.

That tech is almost 3 decades old.

elbarto777@lemmy.world on 12 Oct 2023 21:02 collapse

But was it using neural networks or was it using OCR algorithms?

DudeDudenson@lemmings.world on 13 Oct 2023 03:53 next collapse

I love people who talk about AI that don’t know the difference between an LLM and a bunch of if statements

Aceticon@lemmy.world on 13 Oct 2023 08:23 collapse

At the time I learned this at Uni (back in the early 90s) it was already NNs, not algorithms.

(This was maybe a decade before OCR became widespread)

In fact, a coursework project I did there was recognition of handwritten numbers with a neural network. The thing was amazingly good (our implementation actually had a bug and it still managed to be almost 90% correct on a test data set, so it somehow mostly worked its way around the bug), and it was a small NN with no need for massive training sets (which is the main difference between Large Language Models and more run-of-the-mill neural networks), at a time when algorithmic number and character recognition were considered a very difficult problem.
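
(For a sense of how small that kind of task is by today’s standards, here’s a rough sketch of the same idea with an off-the-shelf library; the layer size, dataset choice, and accuracy figure are just illustrative assumptions:)

```python
# A small neural network recognising handwritten digits from scikit-learn's
# tiny built-in dataset (1,797 8x8 images): no massive training set required.
from sklearn.datasets import load_digits
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

X, y = load_digits(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

net = MLPClassifier(hidden_layer_sizes=(32,), max_iter=1000, random_state=0)
net.fit(X_train, y_train)
print(f"test accuracy: {net.score(X_test, y_test):.2f}")  # typically well above 0.9
```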

Back then Neural Networks (and other stuff like Genetic Algorithms) were all pretty new and using it in automated mail sorting was recent and not yet widespread.

Nowadays you have it doing stuff like face recognition, built-in on phones for phone unlocking…

elbarto777@lemmy.world on 13 Oct 2023 09:16 collapse

Very interesting. Thanks for sharing!

Potatos_are_not_friends@lemmy.world on 11 Oct 2023 15:23 next collapse

These are the same kind of people who go, “We spent money on Timmy’s clothes for over two years and it’s not paying off.”

Bro, AI is an investment.

bluGill@kbin.social on 11 Oct 2023 15:34 collapse

It is a risky investment. Taking care of your kid is something we have done enough times that we understand the risks and the payoff, and most parents can make a reasonable prediction. (A few kids will "turn 21 in prison doing life without parole" - but most turn out okay and return love to their parents and attempt to improve society - though you may not agree with their definition of improving society.)

I have no idea if the current faults with AI will be solved or not. That is a risk you are taking. It is useful for some things, but we don't know how useful.

iopq@lemmy.world on 11 Oct 2023 17:39 collapse

There’s also the “not in prison, but mostly just lives at home and smokes weed” money pit of children

My childhood friend ended up this way and I’ve given up on him

treadful@lemmy.zip on 11 Oct 2023 15:32 next collapse

People are literally paying monthly subscriptions for access to a bunch of these things.

ripcord@kbin.social on 11 Oct 2023 17:33 collapse

Did you read the article? The problem hasn't been getting some people to pay for some things, it's that the things that are available so far are losing loads of money. Or at least, that's the premise.

alienanimals@lemmy.world on 11 Oct 2023 15:35 next collapse

AI isn’t paying off if you’re too dumb to figure out how to use the many amazing tools that have come about.

BolexForSoup@kbin.social on 11 Oct 2023 15:39 next collapse

I was going to say... I use AI transcription tools for video editing and AI upscaling, and Resolve dropped an incredible AI green screen tool that makes keying effortless. I also use AI to repair audio, as of 6 months ago lol. I don't think I've gone more than 48hrs without using an AI tool professionally.

NegativeLookBehind@kbin.social on 11 Oct 2023 15:43 next collapse

I wonder if “AI not paying off” in the context of this article actually means “Companies haven’t been able to lay off a bunch of their staff yet, like they’re hoping to do”

ripcord@kbin.social on 11 Oct 2023 17:32 collapse

If anyone read the article you'd know what they meant, and it wasn't either of the things you two mentioned.

NegativeLookBehind@kbin.social on 11 Oct 2023 17:37 next collapse

Yea I didn’t read it. But isn’t it safe to assume that this is a major goal for many companies?

ripcord@kbin.social on 11 Oct 2023 19:15 collapse

Read the article

NegativeLookBehind@kbin.social on 11 Oct 2023 19:21 collapse

I can’t read :(

ripcord@kbin.social on 11 Oct 2023 19:26 collapse

I'm sorry to - hey wait a minute

alienanimals@lemmy.world on 11 Oct 2023 17:42 collapse

I know exactly what the journalist meant. They meant to get more clicks with some click bait headline and a bad article that will make them look extremely stupid in the future.

TimewornTraveler@lemm.ee on 13 Oct 2023 03:27 collapse

It’s about MASSIVE CARBON FOOTPRINT and a waning userbase

Semi-Hemi-Demigod@kbin.social on 11 Oct 2023 15:50 next collapse

AI is a lot more like the Internet than it is like Facebook. It's a set of techniques you can use to create tools. These are incredibly useful tools, but you're not going to make Facebook money off of them because the techniques are pretty easy to replicate and the genie is out of the bottle.

What the tech bros are looking for is a way to control access to AI so they can be a chokepoint. Like if Craftsman could charge for every single time you used their tool to make something. For one very recent example, see what happened to Unity. Creating chokepoints and then collecting rent is the modern corporate feudal strategy, but that won't work if everybody with an AWS account and enough money can spin up an LLM and start training it.

RickyRigatoni@lemmy.ml on 11 Oct 2023 15:59 next collapse

What is this AI tool to repair audio? Would it be able to fix poorly compressed audio?

BolexForSoup@kbin.social on 11 Oct 2023 16:45 collapse

Yes, I use it all the time. Adobe Audio Enhance. It’s the flagship feature of their upcoming podcast app, but you can use it in the browser currently. If you have an Adobe subscription, it doesn’t charge extra or anything. It’s only for spoken word though, not music. If you throw music at it, though, you get some pretty wild stuff as it tries to create words out of the sounds.

To further answer your question, yes, it is actually very good with highly compressed audio. I regularly feed it Zoom audio to make it more intelligible. Obviously there are always limits, but I assure you it can do more than you can manually 85% of the time, and by a large margin. My only frustration is that it’s a simple slider, you can’t really fine-tune it, but it’s still incredibly effective and I often use it as a first pass on the original audio file before I even start editing.

_number8_@lemmy.world on 11 Oct 2023 16:03 next collapse

AI stem splitting for songs is magical as well

[deleted] on 11 Oct 2023 16:26 next collapse

.

mPony@kbin.social on 11 Oct 2023 16:27 collapse

@BolexForSoup can you recommend a good quality Upscaler ?

BolexForSoup@kbin.social on 11 Oct 2023 16:48 collapse

Topaz Labs makes a decent one. You’ll need to do a lot of trial and error to kind of find your own favorite settings for baking, but as far as cost and efficacy go, there aren’t a lot better out right now.

They do a watermark-free version you can test with. I think it also only lets you do a couple of minutes of video at a time. But frankly it’s incredibly processor-intensive, so you will only want to test a 15-20s clip at a time anyway.

stealth_cookies@lemmy.ca on 11 Oct 2023 17:02 collapse

The problem here is that AI in the media has become synonymous with generalized LLMs, while other “AI” applications have been in place for many years doing more specific things that have more obvious use cases that can be more easily commercialised.

usualsuspect191@lemmy.ca on 11 Oct 2023 15:42 next collapse

Can something be a money pit and pay off? I feel like not paying off is part of the definition of a money pit… Or was the headline written by AI

art@lemmy.world on 11 Oct 2023 16:20 collapse

I think they mean that it’s costing companies a lot of money to operate but their returns aren’t high enough to justify the costs.

Lugh@futurology.today on 11 Oct 2023 16:04 next collapse

It should also worry investors that open-source AI is only months behind the big tech leaders. I looked into AI voice cloning lately. There are a few really pricey options, like $25 a month for a couple of hours of voice cloning.

However, there’s already an open-source version of what they’re selling.

RanchOnPancakes@lemmy.world on 11 Oct 2023 18:07 next collapse

That’s how this works. Blow through VC money to try and “strike gold”, fail, change the model to become “profitable”, move on to the next scam.

InternetTubes@lemmy.world on 11 Oct 2023 18:44 next collapse

Well, they don’t want to do the one thing needed to make it successful: transparency. Maybe it can’t be.

btaf45@lemmy.world on 12 Oct 2023 03:54 collapse

So far what I’ve seen from AI is that it lies and lies and lies. It lies about history. It lies about science. It lies about politics. It lies about case law. It lies about programming libraries. Maybe this will all be fixed some day, or maybe it will just get worse. Until then, the only thing I would trust it on is something for which there is no wrong answer.

RagingRobot@lemmy.world on 12 Oct 2023 05:15 collapse

I never ask it things I don’t know. I don’t think that’s really what it’s useful for. It’s really good at combining words though. So it can write a better sentence than I could. Better in the sense that it’s easier for others to understand what my thoughts are if I feed them in as input. Since they were my thoughts originally, I can spot the bullshit pretty fast.

Nobody@lemmy.world on 11 Oct 2023 19:24 next collapse

Who could have predicted writing bullshit-y papers for kids in school wasn’t a billion dollar business?

eltrain123@lemmy.world on 11 Oct 2023 19:40 next collapse

Do people really not understand that we are in the early stages of ai development? The first time most people were made aware of LLMs was, like, 6 months ago. What ChatGPT can do is impressive for a self contained application, but is far from mature enough to do the things people are complaining it can’t do.

The point the industry is trying to warn about is that this technology is past its infancy and moving into, from a human comparison standpoint, childhood or adolescence. But it iterates significantly faster than humans, so the time when it can do the types of things people are bitching about is years, not decades, away.

If you think businesses have sunk this much money and effort into AI and didn’t do a cost-benefit analysis that stretched out decades, you are being naive or disingenuous.

flumph@programming.dev on 11 Oct 2023 23:34 next collapse

If you think businesses have sunk this much money and effort into AI and didn’t do a cost-benefit analysis that stretched out decades, you are being naive or disingenuous.

Are you kidding? We literally just watched the same bubble inflate and burst in companies that rushed to get their piece of the Metaverse and NFT cash grab. I worked at a SaaS company that decided to add AI features because it was in the news and Azure offered it as a service. There was zero financial analysis done, just like for every other feature they added.

I’m sure Microsoft has a plan since they invested heavily. But even Google is playing catch-up like they did with GCP.

atetulo@lemm.ee on 12 Oct 2023 00:05 collapse

AI is actually useful.

The metaverse and NFTs aren’t.

Your analogy is not a 1:1 representation of the situation and only serves to distract from the topic at hand.

jj4211@lemmy.world on 12 Oct 2023 01:05 collapse

But there is a similarity, the hype pulls in all sorts of companies to blindly add buzzwords without even knowing how it might possibly apply to their product, even if it were the perfect realization of the ideal.

Yes AI techniques obviously have utility. 90% of the spend is by companies that don’t even know what that utility might be. With that much noise, it’s hard to keep track of the value.

atetulo@lemm.ee on 12 Oct 2023 01:07 collapse

But there is a similarity, the hype pulls in all sorts of companies to blindly add buzzwords without even knowing how it might possibly apply to their product

Yes, I see what you are saying. I guess we can add ‘blockchain’ to that list, then.

atetulo@lemm.ee on 12 Oct 2023 00:05 next collapse

Do people really not understand that we are in the early stages of ai development?

Yes. Top post in this thread is someone cheering that AI won’t replace people in hollywood.

Just give it time. Remember how poor voice recognition and translation software was at first?

MargotRobbie@lemmy.world on 12 Oct 2023 00:15 collapse

Top post in this thread is someone cheering that AI won’t replace people in hollywood.

I really like how I’m just “someone” here now.

Stabbitha@lemmy.world on 12 Oct 2023 05:35 collapse

To be fair, who pays attention to user names?

MargotRobbie@lemmy.world on 12 Oct 2023 09:37 collapse

I do. 🥺

vrighter@discuss.tchncs.de on 12 Oct 2023 03:13 collapse

pretty much all improvements aren’t “better tech”, but just “bigger tech”. Reducing their footprint is an unsolved problem (just like it has always been with neural networks, for decades)

WhiteHawk@lemmy.world on 12 Oct 2023 07:52 collapse

Optimization is a problem that cannot be “solved” by definition, but a lot of work is being done on it with some degree of success

macallik@kbin.social on 11 Oct 2023 19:40 next collapse

What I don't like about the article is that the phrasing 'paying off' can apply to making investors money OR having worthwhile use cases. AI has created plenty of use cases from language learning to code correction to companionship to brainstorming, etc.

It seems ironic that a consumer-facing website is framing things from a skeptical "But is it making rich people richer?" perspective

xantoxis@lemmy.world on 11 Oct 2023 23:07 collapse

In my case, I still want to know if it’s not making rich people richer, because a) fuck rich people, and b) I don’t want to buy into things that will disappear in a year when the hype dies down. As a “consumer” my purchasing decisions impact my life, and the actions of the wealthy affect that more than you’d like.

kromem@lemmy.world on 11 Oct 2023 22:50 next collapse

Great, now factor in the cost of the data collection you’d need if you weren’t subsidizing usage that you’re effectively getting free RLHF from…

The one thing that’s been pretty much a guarantee over the last 6 months is that if there’s a mainstream article with ‘AI’ in the title, there’s going to be idiocy abound in the text of it.

MargotRobbie@lemmy.world on 11 Oct 2023 22:56 next collapse

Oh surprise surprise, looks like generative AI isn’t going to fulfill Silicon Valley and Hollywood studios’ dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

As I said here before, generative AIs are not a universal solution to everything that has ever existed like they are hyped up to be, but neither are they useless. At the end of the day, they are ultimately tools. Complex, powerful, useful tools, but tools nonetheless. A good artist can create better work faster with the help of a diffusion model, the same way LLM code generation can help a good programmer finish their project faster and better. (I think). All of these AI models are trained on data from everyone on the Internet, which is why I think it’s reasonable that everyone should have access to these generative AI models for the benefit of humanity and not profit, and not just those who took other people’s work for free to train the models. In other words, these generative AI models should belong to everyone.

And here lies my distaste for Sam Altman: OpenAI was founded as a nonprofit for the benefit of humanity, but at the first chance of money he immediately started venture capitalisting and put everything from GPT-2 onwards under lock and key for money, and now it looks like they are being crushed under the weight of their own operating costs while groups like Facebook and Stability catch up with actual open models. I will not be sad if "Open"AI fails.

(For as much crap as I give Zuck for the other awful things they do, I do admire their commitment to open source.)

I have to admit, playing with these generative models is pretty fun.

FLX@lemmy.world on 11 Oct 2023 23:20 next collapse

A powerful tool maybe, but useless

If your drill needs a nuclear plant and a monthly subscription to drill a hole, it’s a shitty tool

warbond@lemmy.world on 11 Oct 2023 23:57 collapse

Going to have to disagree with you there. I’ve gotten plenty of use out of chat GPT in multiple scenarios. I find it difficult to imagine what exactly you think is useless about it because it seems so indispensable to me at this point.

FLX@lemmy.world on 12 Oct 2023 10:19 collapse

Indispensable, nothing less. lmao

Have fun when they decide to multiply the price x10 and you are too dependent to have an alternative, or when it becomes stupid or malevolent 👍

warbond@lemmy.world on 12 Oct 2023 11:17 next collapse

Sorry, I’m not sure I understand how that makes it useless. I get the feeling that you just want to feel smug, so if it makes you feel better go ahead, I guess.

FLX@lemmy.world on 12 Oct 2023 11:29 collapse

Because it’s too fragile and not ready to be used at scale without causing massive damage

Not useless for now (even if I’d like to know more about the domains where it’s really “indispensable”), but as useless as a drill with a dead battery the day they decide to cut it.

I don’t find it future-proof, as impressive as some results are

DocRekd@lemm.ee on 12 Oct 2023 11:42 collapse

Nowadays LLMs can be run on consumer hardware, so the “dead battery” analogy falls short here too.

FLX@lemmy.world on 12 Oct 2023 12:12 collapse

With the same efficiency? I’m interested in an example.

Why is everyone using these crappy SaaS services then?

AdrianTheFrog@lemmy.world on 13 Oct 2023 03:23 next collapse

Llama 2 and its derivatives, mostly. Simple local ui available here.

Not as good as ChatGPT 3.5 in my experience. Just kinda falls apart on anything too complex, and is a lot more likely to get things wrong.

I tried it out using the ‘Open-Orca/OpenOrcaxOpenChat-Preview2-13B’ 4-bit 32g model. It’s surprisingly fast to generate. It seems significantly faster than ChatGPT on my 3060. (with ExLlama)

There are also some models tuned specifically to actually answer your requests instead of the ‘As an AI language model’ kind of stuff.

Edit: just tried a newer model and it’s a lot better. (dolphin-2.1-mistral-7b)
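
For anyone who wants to try it, a rough sketch of what loading one of these locally looks like with the plain transformers library (the exact repo path is an assumption, and quantized GPTQ/ExLlama setups load differently and need far less VRAM):

```python
# Rough sketch of loading a local model with the transformers library.
# The repo path for the Dolphin model mentioned above is an assumption;
# GPTQ/ExLlama-quantized variants load differently and need far less VRAM.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ehartford/dolphin-2.1-mistral-7b"  # assumed path for the model above
tok = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",  # spreads layers across GPU/CPU as memory allows
)

prompt = "Write one sentence explaining why local models matter."
inputs = tok(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=60)
print(tok.decode(out[0], skip_special_tokens=True))
```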

DocRekd@lemm.ee on 15 Oct 2023 00:00 collapse

For the same reason SaaS is popular in general: yes, you could get a VPS, install all the needed software on it, and keep it up to date, or you could pay a company to do all that for you.

guacupado@lemmy.world on 12 Oct 2023 23:28 collapse

You sound like the people who thought credit cards would never replace cash.

FLX@lemmy.world on 13 Oct 2023 11:45 collapse

And you sound like the people who thought cryptos would replace credit cards ;)

atetulo@lemm.ee on 12 Oct 2023 00:02 next collapse

Hm. I think you should zoom out a bit and try to recognize that AI isn’t stagnant.

Voice recognition and translation programs took years before they were appropriate for real-world applications. AI is also going to require years before it’s ready. But that time is coming. We haven’t reached a ‘ceiling’ for AI’s capabilities.

MargotRobbie@lemmy.world on 12 Oct 2023 00:29 collapse

Breakthrough technological development can usually be described as a sigmoid function (S-shaped curve): while there is exponential progress in the beginning, it usually hits a climax, then slows down and plateaus until the next breakthrough.

There are certain problems that are not possible to resolve with the current level of technology, for which development progress has slowed to a crawl, such as level 5 autonomous driving (by the way, better public transport is a way less complex solution), and I think we are hitting the limit of what transformer-based generative AI can do, since training has become more and more expensive for smaller and smaller gains, whereas hallucination seems to be an inherent problem that is ultimately unfixable with the current level of technology.

cyberpunk_sunbear@lemmy.zip on 12 Oct 2023 01:04 collapse

One thing that I think gives AI the potential to deviate from that S model is that it can be honed against itself to magnify improvements. The better it gets, the better the next gen can get.

vrighter@discuss.tchncs.de on 12 Oct 2023 03:10 collapse

That is a studied, documented, surefire way to very quickly destroy your model. It just does not work that way. If you train an LLM on the output of another LLM (or itself) it will implode.

barsoap@lemm.ee on 12 Oct 2023 23:54 collapse

Also, at best it’s a refinement, not a new sigmoid. So are new hardware/software designs for even faster dot products, or advancements in network topology within the current framework. T3 networks would be a new sigmoid, but so far all we know is why our stuff fundamentally doesn’t scale to the realm of AGI, and the wider industry (and even much of the AI research going on in practice) absolutely doesn’t care, as there are still refinements to be had on the current sigmoid.

batmangrundies@lemmy.world on 12 Oct 2023 04:20 next collapse

There was a smallish VFX group here that was attached to a volume screen company. They employed something like 20 people I think? So pretty small.

But the volume screen company employed a guy who could do an adequate enough job with generative tools instead, and the company folded. The larger VFX company they partnered with had 200 employees; they recently cut to 50.

In my field, a team leader in 2018 could earn about 180,000 AUD P/A. Now those jobs are advertised for 130,000 AUD, because new models can do ~80% of the analysis with human accuracy.

AI is already folding companies and cutting jobs. It’s not in the news maybe, but as industries shift to compete with smaller firms leveraging AI it will cascade.

I had/have my own company, we were attached to Metropolis which unfortunately folded. I think that had a role to play in the job cuts as well. Luckily for me I wasn’t overleveraged, but I am packing up and changing careers for sure.

MargotRobbie@lemmy.world on 12 Oct 2023 04:42 collapse

Generative AI can make each individual artist/writer/programmer much more efficient at their job, but if the shareholders and executives get their way and only big companies have access to this technology, this increased productivity will instead be used to reduce headcount and make the remaining people do more work on tighter deadlines, instead of helping everyone work less, do better work, and be happier.

This is the reason I think democratizing generative AI via local models is important, because as your example shows, it levels the playing field between small and big players, and helps people work less while making more cool stuff.

batmangrundies@lemmy.world on 12 Oct 2023 04:59 collapse

A big problem in Aus is the industry culture. They don’t care about using technology to improve results. They only care about cutting costs, even if the final product doesn’t meet the previous standard.

And we’ve seen that with VFX across the globe, the overall quality dropped drastically. Because studios play silly buggers to weasel out of paying VFX companies what they are due.

From what I hear, even DNEG is in trouble, and was even before the strike.

It’s a race to the bottom it seems.

My honest hope for the film industry is likely the same as yours. That we have smaller productions with access to better post due to improvements in AI-driven compositing software and so on.

But it’s likely that a role that was earning $$$ before is devalued significantly. And while I’m an unabashed anti-capitalist, I think a lot of folks misunderstand what this sudden downward pressure on income can do. Cost of living increasing while wages shrink is an awful combination

I’m 35, left a six figure job, folding my company and starting an electrician’s apprenticeship. To give you an idea around what my views about AI are. And of course this is as an Australian. We have a garbage white collar work culture anyway.

I think there will be a net improvement. But I worry that others will fail to adapt quickly. Too many are writing off AI as this thing that already came and went, but the tools have just landed, and we don’t yet have workflows that correctly implement and leverage them.

nickwitha_k@lemmy.sdf.org on 12 Oct 2023 07:37 next collapse

This is exactly why the SAG-AFTRA and WGA strikes have been vitally important, I think. Without pressure on industry, as we’ve seen across the board in the US for the last near half-century, fewer and fewer things that should improve lives are allowed to do so.

lloram239@feddit.de on 12 Oct 2023 09:40 next collapse

And we’ve seen that with VFX across the globe, the overall quality dropped drastically.

The joke is that the cost increased drastically too. Modern VFX movies are far more expensive than older movies, while also looking worse. As what the studio bosses really want isn’t cheap movies, but movies they can control and micro-manage. VFX makes it much easier to rerender a scene with some changes than a practical shot where you have to rebuild the whole thing from scratch. The end result of that is of course a lack of planning ahead, endless reshoots, overworked VFX studios and bad results, but the bosses got what they wanted, so that’s ok.

AdrianTheFrog@lemmy.world on 13 Oct 2023 03:07 collapse

It’s crazy that with current economic systems, tools that make people work more efficiently have such a negative impact on society.

nickwitha_k@lemmy.sdf.org on 12 Oct 2023 08:18 collapse

Oh surprise surprise, looks like generative AI isn’t going to fulfill Silicon Valley and Hollywood studios’ dream of replacing artists, writers, and programmers with computers to maximize value for the poor, poor shareholders. Oh no!

It really is incredible how much this rhymes with the crypto hype. To be fair, the technology does actually have uses but, as someone in the latter category, after I saw it in action, I quickly felt less worried about my job prospects.

Fortunately, enough people in charge of staffing seem to have listened to people with technical knowledge to not make my earlier prediction (mass layoffs directly due to LLMs, followed by mass, panicked re-hirings when said LLMs ruined the business) come true. But the worry itself, along with the RTO pushes (not to mention exploitation of contractors and H1B holders), really underscores how desperately the industry needs to get organized. Hopefully, what’s going on in the games industry with IATSE gets more traction and more of my colleagues on the same page, but that’s one area where I’m not as optimistic as I’d like to be - I’ll just have to cheer on SAG, WGA, and UAW for the time being.

(For as much crap as I give Zuck for the other awful things they do, I do admire their commitment to open source.)

Absolutely agreed. There’s a surprising amount of good in the open source world that has come from otherwise ethically devoid companies. Even Intuit donated the Argo project, which has evolved from a cool workflow tool to a toolkit with far more. There is always the danger of EEE, however, so, we’ve got to stay vigilant.

xantoxis@lemmy.world on 11 Oct 2023 23:04 next collapse

Goood. Gooooooooooood.

Smacks@lemmy.world on 12 Oct 2023 02:14 next collapse

AI is a tool to assist creators, not a full on replacement. Won’t be long until they start shoving ads into Bard and ChatGPT.

BeautifulMind@lemmy.world on 13 Oct 2023 00:53 collapse

AI is a tool to assist plagiarize the work of creators

Fixed it

LOL OK, it’s a super-powerful technology that will one day generate tons of labor very quickly, but none of that changes that in order to train it to be able to do that, you have to feed it the work of actual creators, and for any of that to be cost-feasible, the creators can’t be paid for their inputs.

The whole thing is predicated on unpaid labor, stolen property.

2ncs@lemmy.world on 13 Oct 2023 01:52 collapse

At what line does it become stolen property? There are plenty of tools which artists use today that use AI. Those AI tools they are using are more than likely trained on some creation without payment. It seems the data it’s using isn’t deemed important enough for that to be an issue. Google has likely scraped billions of images from the Internet for training on Google Lens and there was not as much of an uproar.

Honestly, I’m just curious if there is an ethical line and where people think it should be.

kibiz0r@lemmy.world on 13 Oct 2023 03:44 collapse

Well see, it shaves off a fraction of the creative work’s statistical signal, and deposits it into this vector database that we created…

PipedLinkBot@feddit.rocks on 13 Oct 2023 03:44 collapse

Here is an alternative Piped link(s):

Well see, it shaves off a fraction of the creative work’s statistical signal, and deposits it into this vector database that we created…

Piped is a privacy-respecting open-source alternative frontend to YouTube.

I’m open-source; check me out at GitHub.

mojo@lemm.ee on 12 Oct 2023 02:46 next collapse

Silicon Valley, as usual, thinking these things are as big as the invention of the internet and trying to get their money in there first. AI was and still is a massive game changer, but nothing can live up to the hype they generate by throwing a stupid amount of money at these things. They didn’t learn their lesson after crypto or the “metaverse” either lol. I see AI being a tool, an incredibly useful one. That also means there are a lot of jobs it simply can’t do. It can’t replace artists, but artists can use it as a tool to help them work off of things.

Aceticon@lemmy.world on 12 Oct 2023 08:15 next collapse

Ever since the Internet Bubble crashed around 2000, the business community in the Valley has been repeatedly trying to pump up a new bubble, starting with what they called Web 2.0, which started being hyped maybe even before the dust settled on the crash of the first Tech bubble.

And if you think about it, it makes sense: the biggest fortunes ever made in Tech are still from companies which had their initial growth back then, such as Google, Amazon and even Paypal (Microsoft and Apple being maybe the most notable exceptions, both predating it).

snek@lemmy.world on 12 Oct 2023 08:32 next collapse

So far I’ve only seen AI being used to fire employees that a company totally absolutely still needs but just doesn’t want to pay wages to. Companies are dumb as fuck, that’s my conclusion, but what else can you expect from organizations run by ladder-climbing CEO figures?

bane_killgrind@lemmy.ml on 12 Oct 2023 12:04 collapse

There’s utility in keeping workers desperate, it depresses wages.

Think about the coordinated tech layoffs that happened and now the tech industry has a labor surplus.

Saves them money.

snek@lemmy.world on 12 Oct 2023 12:47 collapse

The company that laid me off is paying me 4 months of wages without any work; this is the severance package I get in exchange for them not being forced to try to find me another suitable position in the company or “prove” that there are no tasks at the company that I could do with my existing skill set (that’s how the law works where I live).

Was it really worth it?

bane_killgrind@lemmy.ml on 12 Oct 2023 21:34 next collapse

My point was it’s not dumb, it’s malicious and self serving.

It’s not worth it.

Chocrates@lemmy.world on 12 Oct 2023 23:51 collapse

Are you in the states? I’m surprised you have those protections.

snek@lemmy.world on 13 Oct 2023 01:43 collapse

No, Sweden

lloram239@feddit.de on 12 Oct 2023 09:22 next collapse

Things will live up to the hype and easily surpass it. That’s not the issue. The issue is that people take the world of today and imagine how much better/faster/richer they could become if they had AI. The crux is by the time they have AI, everybody else has it too. Thus it loses its competitive advantage. It just raises the baseline.

If I had to create the thousands of images I have generated with AI three years ago, it would have cost thousands if not millions of dollars, a gigantic, almost insurmountable task. But that doesn’t mean they have any value today. Everybody can produce similar images with a few clicks.

The whole point of AI is after all that it makes work that used to be difficult and expensive, cheap and easy, and nobody is going to pay huge amounts of money for a task that has become trivial.

guacupado@lemmy.world on 12 Oct 2023 23:26 collapse

What I’m curious about is what’s going to happen to all these companies that went all-in on building data centers when they weren’t doing it previously. Places like Meta and Amazon are huge enough that it’s always been a sound investment, but with this hype there are other companies trying to set up server farms with no real prize in sight.

barsoap@lemm.ee on 12 Oct 2023 23:40 collapse

I mean A100s don’t exactly break that quickly and they’re specialised enough hardware so that they will continue to be able to rent them out. They’re also overpriced AF though which might cut into the bottom line but they’re probably not going to end up with a giant loss, I don’t really doubt they will break even. Opportunity costs are stellar, but OTOH there’s so much billionaire capital floating around screaming for opportunities to park itself in that macro-economically it’s negligible. Also I’m not exactly in the habit of crying about billionaires having a low ROI.

AtmaJnana@lemmy.world on 12 Oct 2023 09:50 next collapse

Tough shit. It’s the Next Big Thing so everyone has to have it. It doesn’t matter that it’s not useful for most use cases (yet).

Kanda@reddthat.com on 12 Oct 2023 11:42 next collapse

Wait, R&D doesn’t research and develop dollar bills into existence?

aesthelete@lemmy.world on 13 Oct 2023 01:29 next collapse

You’d think at this point that investors would wait for a thing to fill out the question mark second step in their business plan before investing in it, but you’d be way, way wrong.

Every new tech company comes to the investor panel with:

  1. build an expensive-to-run new tool and give it away to end users for free

  2. ???

  3. profit!

And somehow they keep falling for it.

punkwalrus@lemmy.world on 13 Oct 2023 02:30 next collapse

Because people assume all these investors know what they are doing. They don’t. Now, some investors are good, but they usually don’t go for shit like this. A lot of investors are VCs, rich upper-class twits who can afford to lose money. Pure and simple. It’s like a bunch of lotto winners telling people they know how to pick numbers, betting outside bets once in a while, getting lucky, and having selective bias.

Plus, they have enough money to hedge their bets. For example, say you invest $1mil each in companies A, B, C, D, E, and F. All lose everything except A and B, which earn you $3mil each. You put in $6mil, got back $6mil. You broke even, tell people you knew what you were doing because you picked A and B, and conveniently never mention the rest. Then rich twits invest in what YOU invest in. So you invest in H, others invest in H because you did, which drives up the value. Now magnify this by a lot of investors, hundreds of letters, and it’s all like some weird game of luck and timing.

But a snapshot in time leads to your “2. ???” point. Many know this is a confidence game, based on luck, charm, and timing. Some just stumble through it, and others are fleeced, but who cares? Daddy’s got money.

Money works different for rich people. It’s truly puzzling.

quackers@lemmy.blahaj.zone on 13 Oct 2023 04:20 collapse

They sure as hell are doing a good job of making me reliant on AI though. Soon I’ll probably be paying $200 a month because I can’t remember how to do things without AI. I think that’s the plan anyway.

aesthelete@lemmy.world on 13 Oct 2023 06:09 collapse

Soon I’ll probably be paying $200 a month because I can’t remember how to do things without AI.

Sounds like a problem TBH, I’d get that checked out by a professional.

quackers@lemmy.blahaj.zone on 14 Oct 2023 00:08 collapse

Okay, I’ll ask ChatGPT

jimbo@lemmy.world on 13 Oct 2023 04:26 collapse

Have they not tried simply asking the AI how to make it profitable?