Yes, AI will eventually replace some workers. But that day is still a long way off (www.theguardian.com)
from Powderhorn@beehaw.org to technology@beehaw.org on 11 May 14:55
https://beehaw.org/post/19937253

Anyone firing employees because they thought that AI would do their jobs in 2025 should be fired. It really doesn’t take much research to see that AI isn’t at the place where it’s replacing people – yet. And business managers – particularly in small and mid-sized companies – who think it is had better think again.

At best, generative AI platforms are providing an enhanced version of search, so that instead of sifting through dozens of websites, lists and articles to figure out how to choose a great hotel in Costa Rica, fix a broken microwave oven or translate a phrase from Mandarin to English, we simply ask our chatbot a question and it provides the best answer it finds. These platforms are getting better and more accurate and are indeed useful tools for many of us.

But these chatbots are nowhere near replacing our employees.

It’s somewhat akin to claiming that now that we have hammers, carpenters aren’t needed.

#technology


Dark_Arc@social.packetloss.gg on 11 May 15:04

I don’t think LLMs will ever replace a single worker.

30p87@feddit.org on 11 May 15:15

Meanwhile companies keep pushing “AI” (as in, LLMs integrated with image/video generation, STT and TTS, networking, file generation and reading, etc.), while traditional, useful ML – built for one purpose and fulfilling that purpose well – sinks into irrelevancy.

Opinionhaver@feddit.uk on 11 May 15:29

It almost certainly already has replaced several.

Also, AI is not synonymous with LLM.

Dark_Arc@social.packetloss.gg on 12 May 04:51

> It almost certainly already has replaced several.

Has it actually replaced them?

Sure, maybe some people have lost their jobs, but I don’t think they’ve really been replaced.

It’s closer to laying someone off without replacement … because everything I’ve seen has suggested AI not only can’t do the work, but it also doesn’t improve the productivity of workers using it in any meaningful way.

> Also, AI is not synonymous with LLM.

I’d argue that’s all AI means anymore, if it has any meaning left at all.

jarfil@beehaw.org on 12 May 05:36

A lot of people have been working tedious and repetitive “filler” jobs.

  • Computers replaced a lot of typists, drafters, copyists, calculators, filers, clerks, etc.
  • LLMs are replacing receptionists, secretaries, call center workers, translators, slop “artists”, etc.
  • AI Agents are in the process of replacing aides, intermediate administrative personnel, interns, assistants, analysts, spammers, salespeople, basic customer support, HR personnel, etc.

In the near future, AI-controlled robots are going to start replacing low-skilled labor, then intermediate-skilled workers.

“AI” has come to mean machines doing what used to require a human to perform. It’s a moving goalpost: once a goal is achieved, we call it an “algorithm” and move to the next one, and again, and again.

Right now, LLMs are at the core of most AI, but AI has already moved past that, to “AI Agents”, which is a fancy way of saying “a loop of an LLM and some other tools”. There are already talks of moving past that too, the next goalpost.
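That “loop of an LLM and some other tools” can be sketched in a few lines. Everything here is illustrative: `fake_llm` stands in for a real model API, and the calculator is a made-up tool registry, not any particular framework.

```python
# Minimal "AI agent" loop: the model picks a tool, the tool runs,
# and the result is fed back in until the model produces a final answer.

def fake_llm(history):
    """Stand-in for a real model API: requests the calculator once,
    then finishes with the tool's result."""
    if not any(role == "tool" for role, _ in history):
        return {"action": "calculator", "input": "6*7"}
    return {"action": "finish", "input": history[-1][1]}

TOOLS = {"calculator": lambda expr: str(eval(expr))}  # toy tool registry

def run_agent(question, llm=fake_llm, max_steps=5):
    history = [("user", question)]
    for _ in range(max_steps):
        step = llm(history)                            # ask the model what to do
        if step["action"] == "finish":
            return step["input"]                       # model is done
        result = TOOLS[step["action"]](step["input"])  # run the chosen tool
        history.append(("tool", result))               # feed the result back in
    return None                                        # gave up after max_steps
```

Real agents swap `fake_llm` for an actual API call and add more tools (search, file access, etc.), but the control flow is essentially this loop.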

Dark_Arc@social.packetloss.gg on 14 May 13:17

feddit.org/post/12430949

and even those people can’t be replaced.

jarfil@beehaw.org on 14 May 23:46

One of the worst possible examples ever: Klarna is a payment processor; people don’t call to get the same answer the system is already giving them, they call to negotiate something about their money. AIs are at a troubleshooting level – at best some very basic negotiation – nowhere near handling people actually concerned about their money… much less in 2023.

Seems like Klarna fell hook, line, and sinker for the hype. Tough luck – you need to know the limits.

Opinionhaver@feddit.uk on 12 May 08:01

The term artificial intelligence is broader than many people realize. It doesn’t refer to a single technology or a specific capability, but rather to a category of systems designed to perform tasks that would normally require human intelligence. That includes everything from pattern recognition, language understanding, and problem-solving to more specific applications like recommendation engines or image generation.

When people say something “isn’t real AI,” they’re often working from a very narrow or futuristic definition - usually something like human-level general intelligence or conscious reasoning. But that’s not how the term has been used in computer science or industry. A chess-playing algorithm, a spam filter, and a large language model can all fall under the AI umbrella. The boundaries of AI shift over time: what once seemed like cutting-edge intelligence often becomes mundane as we get used to it.

So rather than being a misleading or purely marketing term, AI is just a broad label we’ve used for decades to describe machines that do things we associate with intelligent behavior. The key is to be specific about which kind of AI we’re talking about - like “machine learning,” “neural networks,” or “generative models” - rather than assuming there’s one single thing that AI is or isn’t.

<img alt="" src="https://feddit.uk/pictrs/image/4339ad4a-1ab7-4ee4-8be9-094441de3ed4.webp">

Ledericas@lemm.ee on 12 May 08:07

AI can replace management; they’re about the least useful people in a company.

sculd@beehaw.org on 11 May 16:02

Companies that care about quality cannot replace workers with LLM.

The problem is that some “executives” think they can cut costs with “AI”, and they are trying.

Midnitte@beehaw.org on 11 May 16:07

They’re going to / already are replacing workers – the problem is they’re going to make someone else do more work checking and fixing the output.

In the end, there won’t be any cost savings (or frankly, even any “productivity”) - just another tool companies pay for because every other company uses it.

sculd@beehaw.org on 11 May 16:32

That’s the problem. I am already seeing AI slop in my area of work, and it usually needs heavy cleanup. In the end, it’s not saving any time.

VagueAnodyneComments@lemmy.blahaj.zone on 11 May 15:31

LLMs do not enhance search. Search is worse than it has ever been. Pure non-techie drivel.

jarfil@beehaw.org on 12 May 05:05

Whose LLMs?

Content farms and SEO experts have been polluting search results for decades. Search LLMs have leveled the playing field: any trash a content farm LLM can spit out, a search LLM can filter out.

Basically, this:

<img alt="" src="https://beehaw.org/pictrs/image/85fcba9b-ee78-478e-8bd9-57bb03887e05.webp">

VagueAnodyneComments@lemmy.blahaj.zone on 12 May 05:29

This is not accurate. You shouldn’t spread corpo propaganda.

jarfil@beehaw.org on 12 May 05:40

Can you elaborate? It does match my personal experience, and I’ve been on both ends of the trash flinging.

msage@programming.dev on 12 May 10:51

What is accurate then?

VagueAnodyneComments@lemmy.blahaj.zone on 12 May 15:13

The premise that AI enhances search is false. The stated barriers to “AI” adoption for small businesses are dated and false. The statement that LLMs and associated technology will become more accurate and reliable is false.

The accurate statement in the article, that AI has no impact on earnings or hours, is from an outside source.

So you see there is nothing of value provided by the article itself, because the article is propaganda designed to convince you that LLMs have a productive future and are presently useful for applications such as search. These are both lies. The article is lying.

msage@programming.dev on 12 May 17:38

Yeah, but the image seems pretty spot on, and that’s what you replied to.

VagueAnodyneComments@lemmy.blahaj.zone on 12 May 18:39

The image is boomer humor for people who don’t understand what is going on

msage@programming.dev on 13 May 06:12

It would be, except I’ve seen millennials do the exact same thing.

apotheotic@beehaw.org on 12 May 08:33

Which search llm is filtering out any of the content farm and seo stuff?

jarfil@beehaw.org on 12 May 09:10

All of them. The moment they summarize results, the chaff gets filtered out automatically. That doesn’t mean what’s left is necessarily true – just like publishing a paper doesn’t mean it wasn’t p-hacked – but all the boilerplate used for padding content and SEO is gone.

Starting with Google’s AI Overview, all the way to chatbots in “research” mode, or AI agents, they return the original “bullet points” the content was generated from.

p03locke@lemmy.dbzer0.com on 11 May 23:58

Oh, geesh, where have I heard this pattern before with technology? Is it self-driving cars? Remember when the sky was falling and everybody said car transportation was going to change overnight, because a bunch of fucking college students figured out how to win a self-driving car race? Where the hell is my fully-autonomous Level 5 self-driving car that was going to replace all of the truck driving jobs? Oh, what’s that? Are all of the jobs still here?

Technology moves the needle. It does not jam it all the way to the other side. Stop spending trillions of dollars on pipe dreams, only to have to force expectations back to reality ten years later.

Powderhorn@beehaw.org on 12 May 00:58

You can have your Level 5 car just as soon as we’ve tackled jet packs and fusion reactors. Priorities!

civilcoder@lemm.ee on 12 May 19:13

See you in the metaverse, mate, where we all hang out on the blockchain.

RickRussell_CA@beehaw.org on 12 May 03:52

With respect to the article, it’s wrong. AI help desk is already a thing. Yes, it’s terrible, but human help desk was already terrible. Businesses are ABSOLUTELY cutting out tier 1 call center positions.

LLMs are exceptionally good at language translation, which should be no surprise as that kind of statistical chaining is right up their alley. Translators are losing jobs. AI Contract analysis & legal blacklining are going to put a lot of junior employees and paralegals out of business.

I am very much an AI skeptic, but I also recognize that people who do the things LLMs are already pretty good at are in real trouble. As AI tools get better at more stuff, that target list of jobs will grow.

I_am_10_squirrels@beehaw.org on 12 May 18:04

Cutting junior employees today sounds like a great option, until 10 years down the road you realize you don’t have any experienced people to backfill the senior employees who are leaving.

RickRussell_CA@beehaw.org on 21 May 19:35

Capitalism does an extremely poor job of planning beyond the next accounting period.

Korhaka@sopuli.xyz on 12 May 08:55

I have worked with people who could be replaced with a small bash script. It’s more a question of when and how many.