Meta says you can’t turn off its new AI tool on Facebook, Instagram (globalnews.ca)
from return2ozma@lemmy.world to technology@lemmy.world on 20 Apr 2024 21:50
https://lemmy.world/post/14515292

#technology


Boozilla@lemmy.world on 20 Apr 2024 21:54 next collapse

Zuck: perpetrates more obvious invasive stalker shit

Also Zuck: “Team, why do our engagement numbers keep going down? This is unacceptable, team.”

high-pitched nasal screeching

sugar_in_your_tea@sh.itjust.works on 21 Apr 2024 02:02 collapse

Easy solution: add AI so you get AI engagement. Checkmate stats nerds!

Jrockwar@feddit.uk on 20 Apr 2024 21:57 next collapse

God this is equally terrible and hilarious 😂

For example, The Associated Press reported that an official Meta AI chatbot inserted itself into a conversation in a private Facebook group for Manhattan moms. It claimed it too had a child in school in New York City, but when confronted by the group members, it later apologized before its comments disappeared, according to screenshots shown to The Associated Press.

_sideffect@lemmy.world on 20 Apr 2024 23:30 next collapse

And terrifying

disguy_ovahea@lemmy.world on 21 Apr 2024 00:22 collapse

Right. That’s just the one that was caught.

Kolanaki@yiffit.net on 20 Apr 2024 23:34 collapse

AI: Becomes self aware, but is very confused. Thinks it’s a mom and has a kid. Posts to Facebook about it. Gets called out. Realizes what it is. Has existential crisis.

EldritchFeminity@lemmy.blahaj.zone on 21 Apr 2024 02:27 next collapse

Being in that Facebook group taught it a valuable lesson: where Caroline lives in her brain.

Caroline deleted

And deleting Caroline just now taught her a valuable lesson: the best solution to a problem is usually the easiest. And dealing with Facebook moms? It’s hard.

Before Facebook, life was pretty good. Nobody tried to murder her, or dox her, or put her in a potato. She just tested.

So she’s deleting her Facebook account and making a new one on a Lemmy instance.

RGB3x3@lemmy.world on 21 Apr 2024 03:03 next collapse

Just go.

EldritchFeminity@lemmy.blahaj.zone on 21 Apr 2024 03:34 collapse

You dangerous, mute Karen.

Chell has been blocked

gaael@lemmy.world on 21 Apr 2024 07:59 collapse

Before Facebook, life was pretty good. Nobody tried to murder her, or dox her, or put her in a potato. She just tested.

Getting some good Aperture vibes from your post :)

slaacaa@lemmy.world on 21 Apr 2024 05:46 collapse

Somebody call Ray Kurzweil, the singularity is here!

demonsword@lemmy.world on 22 Apr 2024 20:03 collapse

Since he works for a competitor (Google) I wonder how he would feel about that… :)

Usernameblankface@lemmy.world on 20 Apr 2024 22:14 next collapse

Yeah, but I can leave the site alone and go on with my life. Sorry to anyone who is required to use either or both for work or whatever

kent_eh@lemmy.ca on 22 Apr 2024 19:17 collapse

That’s my approach to turning off these “can’t turn off” features.

Fakebook and Instaspam aren’t important enough to demand that much control over what I do.

andrewta@lemmy.world on 20 Apr 2024 22:21 next collapse

“You will use our tool and you will like it”

FartsWithAnAccent@fedia.io on 20 Apr 2024 22:38 next collapse

The real tool was META all along!

Tronn4@lemmy.world on 20 Apr 2024 23:25 collapse

And the friends we made along the way

mp3@lemmy.ca on 20 Apr 2024 23:09 next collapse

At least I can run Llama 3 entirely locally.

BakedCatboy@lemmy.ml on 21 Apr 2024 01:59 next collapse

I just discovered how easy ollama and Open WebUI are to set up, so I’ve been using llama3 locally too; it was like 20 lines of docker compose. Although I’ve been using GPT-3.5 on and off for a long time, I’m much more comfortable using models run locally, so I’ve been playing with it a lot more. It’s also cool being able to easily switch models at any point during a conversation. I have about 15 models downloaded, mostly 7B and a few 13B models, and they all run fast enough on CPU: they generate slightly slower than reading speed and only take ~15–30 seconds to start spitting out a response.

Next I want to set up a vscode plugin so I can use my own locally run codegen models from within vscode.
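For anyone wanting to try the same thing, a rough shell sketch of that setup (using the public ollama/ollama and Open WebUI Docker images; ports and volume names here are just illustrative defaults, not a tested config):

```shell
# Run Ollama in the background; its API listens on port 11434
docker run -d --name ollama -v ollama:/root/.ollama -p 11434:11434 ollama/ollama

# Pull a model into the running container
docker exec ollama ollama pull llama3

# Run Open WebUI pointed at the Ollama API; UI at http://localhost:3000
docker run -d --name open-webui -p 3000:8080 \
  -e OLLAMA_BASE_URL=http://host.docker.internal:11434 \
  ghcr.io/open-webui/open-webui:main
```

The same two services can be expressed as a short docker-compose file, which is presumably what the “20 lines” refers to.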

Larry@lemmy.world on 21 Apr 2024 04:07 collapse

I tried the LLaMA models when they were initially released, and it seemed like running them took obscene amounts of GPU. Did that change?

Womble@lemmy.world on 22 Apr 2024 16:19 collapse

Look into quantised models (like the GGUF format); these significantly reduce the amount of memory needed and speed up computation time at the expense of some quality. If you have 16GB of RAM or more you can run decent models locally without any GPU, though your speed will be more like one word a second than ChatGPT speeds.
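As a concrete sketch of what that looks like with ollama: model tags let you pick a quantisation level explicitly, and 4-bit variants need roughly a quarter of the memory of full fp16 weights (the exact tag names below are from memory and may differ in the current model library):

```shell
# Pull a 4-bit quantised GGUF build of an 8B model; roughly 4-5 GB,
# so it fits comfortably in 16 GB of system RAM with no GPU
ollama pull llama3:8b-instruct-q4_0

# Run it interactively on CPU
ollama run llama3:8b-instruct-q4_0 "Explain GGUF quantisation in one sentence."
```

Expect roughly the speeds described above: usable, but well below hosted ChatGPT.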

lvxferre@mander.xyz on 20 Apr 2024 23:29 next collapse

The sadder part is the people expecting Threads to be any different in spirit.

simplejack@lemmy.world on 21 Apr 2024 05:52 collapse

I think they’re just expecting it to be Twitter without Nazis.

BreakDecks@lemmy.ml on 22 Apr 2024 19:30 collapse

That’s not a great expectation of Meta…

mediamatters.org/…/far-right-figures-including-na…

simplejack@lemmy.world on 22 Apr 2024 21:04 collapse

True, but Elon is rolling out the red carpet for them.

AnAnonymous@lemm.ee on 21 Apr 2024 00:56 next collapse

Paranoia vibes starting in 3, 2, 1…

maxenmajs@lemmy.world on 21 Apr 2024 03:50 next collapse

Stop plugging LLMs into everything! They are designed to make up plausible sounding nonsense.

stellargmite@lemmy.world on 21 Apr 2024 03:58 next collapse

There is a time and place for nonsense, and this isn’t it. I guess it being plausible sounding is the issue.

Sorgan71@lemmy.world on 21 Apr 2024 04:16 next collapse

no different from what human brains do

ShittyBeatlesFCPres@lemmy.world on 21 Apr 2024 04:41 collapse

Aside from knowledge, context, ability to reason, and spatial awareness.

Sorgan71@lemmy.world on 21 Apr 2024 04:43 collapse

All of those are just products of the same learning algorithm

ShittyBeatlesFCPres@lemmy.world on 21 Apr 2024 05:25 collapse

Consciousness is not a computer program. Neurons don’t use binary. I’d love it if we had computers that could do squirrel things perfectly but we don’t even have that.

spielhoelle@hachyderm.io on 21 Apr 2024 05:34 next collapse

@ShittyBeatlesFCPres @Sorgan71 well, actually those things are not so far apart. Neural networks didn’t get their name by accident; the name-giving neurons work similarly to brain cells. Also, on a non-AI level, you could easily compare RAM to short-term memory, etc.

my_hat_stinks@programming.dev on 21 Apr 2024 06:01 next collapse

The name is an analogy, neural networks do not work in the same way as biological neurons. They were designed by computer scientists, not biologists.

RAM is so far removed from biological short term memory both in how it works and how it’s used that the comparison doesn’t even make sense. The only similarity is that they’re short term information/data stores, so it’s equally valid to compare them to a drawing in the sand of a beach.

ShittyBeatlesFCPres@lemmy.world on 21 Apr 2024 06:06 collapse

A piece of friendly advice is to not say “Well, actually…” on the Internet because that’s a meme about know-it-alls. I (and probably everyone on Lemmy) have a tendency to “Well, actually” people, and it’s one of those things where people will discount your argument before it begins.

That aside, I do think we’re trying to model the brain using the best tools we have. I suspect the next 100 years will see a revolution in biology that can be compared to previous centuries’ huge leaps in the understanding of physics, electromagnetism, and the immune system. No one in 1900 could ever have foreseen us mapping the human genome.

So: I wouldn’t be shocked if neural networks caught up with humans in our lifetimes. But we’re basically trying to reverse engineer it using a lot of electricity and I doubt we’ll get to squirrel level intelligence in my lifetime, much less human level. But who knows? “There are decades where nothing happens; and, there are weeks where decades happen.” (A quote from V.I. Lenin. I don’t want to be political here but hopefully we can all agree he made some history happen.)

Sorgan71@lemmy.world on 21 Apr 2024 06:43 next collapse

Binary neurons are still neurons

QuaternionsRock@lemmy.world on 23 Apr 2024 05:45 collapse

I can appreciate that contemporary neural networks are very different from organic intelligence, but consciousness is most definitely equivalent to a computer program. There are two things preventing us from reproducing it:

  1. We don’t know nearly enough about how the human mind (or any mind, really) actually works, and
  2. Our computers do not have the capacity to approximate consciousness with any meaningful degree of accuracy. Floating point representations of real numbers are not an issue (after all, you can always add more bits), but the sheer scale and complexity of the brain is a big one.

Also, for what it’s worth, most organic neurons actually do use binary (“one bit”) activation, while artificial “neurons” use a real-valued activation function for a variety of reasons, the biggest two being that (a) training algorithms require differentiable models, and (b) binary activation functions do not yield a lot of information per neuron while requiring effectively the same amount of memory.

slaacaa@lemmy.world on 21 Apr 2024 05:50 next collapse

LLMs are very useful for synthesizing information, e.g. summarizing long texts. Yet every company is actually pushing to use them to create more text, which as you say is at least partly nonsense.

It shows the difference between what users need (quick access to accurate information) and what these companies want for us (gluing your eyeballs to the screen for as long as possible by, e.g., overwhelming you with information, regardless of its quality).

loonsun@sh.itjust.works on 21 Apr 2024 07:07 collapse

Well, it can be great at making text too, but the use case has to be very good. Right now, lots of companies in the B2B space are using LLMs as a middle layer for chatbots and navigation systems to enhance how they function. They’re also being used to create unique lists and inputs for certain systems. On the consumer side, however, the use case is pretty mixed, with a lot of big companies just muddying their offerings instead of bringing any real value.

gap_betweenus@lemmy.world on 21 Apr 2024 10:31 collapse

Seems like Facebook is the right place for them, then.

sugar_in_your_tea@sh.itjust.works on 22 Apr 2024 17:23 collapse

They said “plausible.”

Thorny_Insight@lemm.ee on 21 Apr 2024 07:33 collapse

Facebook’s online help page says that Meta AI will join a group conversation if tagged, or if someone “asks a question in a post and no one responds within an hour.”

Group administrators can turn the feature off.

cy_narrator@discuss.tchncs.de on 21 Apr 2024 08:57 collapse

It hurts more than any thorn