autotldr@lemmings.world
on 16 Oct 2023 15:35
This is the best summary I could come up with:
Of all Elon Musk’s exploits — the Tesla cars, the SpaceX rockets, the Twitter takeover, the plans to colonize Mars — his secretive brain chip company Neuralink may be the most dangerous.
Former Neuralink employees as well as experts in the field alleged that the company pushed for an unnecessarily invasive, potentially dangerous approach to the implants that can damage the brain (and apparently has done so in animal test subjects) to advance Musk’s goal of merging with AI.
The letter warned that “AI systems with human-competitive intelligence can pose profound risks to society and humanity” and went on to ask: “Should we develop nonhuman minds that might eventually outnumber, outsmart, obsolete and replace us?”
If the intravascular approach can restore key functioning to paralyzed patients, and also avoids some of the safety risks that come with crossing the blood-brain barrier, such as inflammation and scar tissue buildup in the brain, why opt for something more invasive than necessary?
Which perhaps helps make sense of the company’s dual mission: to “create a generalized brain interface to restore autonomy to those with unmet medical needs today and unlock human potential tomorrow.”
Watanabe believes Neuralink prioritized maximizing bandwidth because that serves Musk’s goal of creating a generalized BCI that lets us merge with AI and develop all sorts of new capacities.
The original article contains 3,109 words, the summary contains 219 words. Saved 93%. I’m a bot and I’m open source!
_number8_@lemmy.world
on 16 Oct 2023 15:43
another smaller horror of this: there will 1000% be ads. zero percent chance there will not be ads. even worse, people will understandably be wary of ads at first, so services will be advertised as ad-free. then they’ll become subscriptions with ads. just like everything else.
For your comfort and convenience, this update deactivates the part of your brain that makes you think you dislike advertisements, among other bug fixes and improvements.
superduperenigma@lemmy.world
on 16 Oct 2023 20:21
then they’ll become subscriptions with ads
The credit card on file has expired. Please update your payment information within 72 hours or you will lose access to your brain function.
Sudomeapizza@lemm.ee
on 17 Oct 2023 08:39
The function of an ad is to manipulate your behavior. If an implant can do this directly, ads aren't needed anymore.
tdawg@lemmy.world
on 16 Oct 2023 15:45
Personally, I’m a fan of us further researching biological advancements for aiding people. Stuff like genetic engineering is amazing. I like to imagine there’s a future where you could get a yearly shot of anti-cancer bio-bots that basically clean your system, or a shot for anti-senescence that repairs DNA and resets the clock on your cells.
richieadler@lemmy.myserv.one
on 16 Oct 2023 16:00
They will sell you biochips with expiration dates and the updates will be increasingly expensive. If you don’t pay, they’ll dissolve and you’ll end up as a drooling idiot at 50.
SatanicNotMessianic@lemmy.ml
on 16 Oct 2023 16:55
Great! Most people feel the same way!
The problem is that most of us who go into scientific research involving human subjects have about a decade’s worth of increasingly specialized education on a specific subject, have worked in a junior capacity in study design, execution, and analysis, and were generally not billionaires trying to become the first trillionaire.
There’s a reason why academic research works the way it does - because we learned the hard way.
The Tuskegee experiment, carried out by the US Public Health Service and later the CDC, was a program in which black American men with syphilis were deliberately left untreated, and actively denied treatment, so that researchers could watch the progression of the disease. The program didn’t officially end until 1972. Just yesterday I read a news story about a doctor successfully being sued for giving prisoners, without their consent, high doses of ivermectin to treat Covid, going off his intuition and the idea that he was qualified to do medical research.
When I was doing this kind of work, I had to go through something called a Human Studies Board. Every university has one. The HSB is a team of senior researchers that reviews your proposal to make sure what you’re asking to do is both justified and does no harm to your subjects. If they say “no,” you’re going back to the drawing board. I have had PhD students whose thesis research had to go through multiple revisions because the HSB felt they weren’t properly controlling for potential harm. But there were no billions of dollars on the line, and I didn’t have billions in personal wealth and the ability to influence the HSB.
Another example: about ten years ago, Facebook decided to run an experiment in which they promoted sad news stories to some people, and happy news stories to others. They then followed up on those individuals’ posts to see if the former group became noticeably depressed. They did. Facebook did this without the users’ consent, and made no provisions for follow-up with any human subjects who did become depressed. Some of their subjects may have committed acts of violence or self-harm because of pre-existing psychological states. They have no idea. They just came up with the hypothesis that sad news might make people sad, and ran with it. It was unethical. I do not believe they faced any consequences other than the researchers and the company being universally berated in the academic community.
We are researching brain implants. That’s already underway in universities around the world. Elon wants to move fast and break things to make it go faster, but in this case the “things” are people.
So you have Elon, who is legendary in the industry for thinking he’s much smarter than he is and for pushing his experts into screwing things up. You have his vast wealth as well as a drive to create more wealth (academic researchers very rarely grow wealthy from their discoveries and rarely have wealth as a driving factor).
We are already doing what you’re asking for. I have a colleague at one of the top US universities who’s specifically researching telomere repair and other aspects of DNA-focused methods to prevent some of the effects of aging, and I’ve personally done modeling on the molecular biology associated with dysregulation and cancer in grad school.
We would be better off if the government would just take Elon’s money and use it to fund actual scientific research.
Miclux@lemmings.world
on 16 Oct 2023 15:54
Regarding the brain damage, it looks like he tested it on himself.
PeleSpirit@lemmy.world
on 16 Oct 2023 16:03
I half expect him to walk out of X with usable wings one day. He’ll then, obviously, fly too close to the sun.
Jaysyn@kbin.social
on 16 Oct 2023 16:06
Elon Musk says a lot of stupid things.
Kolanaki@yiffit.net
on 16 Oct 2023 16:06
Let Elon go first. He can’t possibly damage his brain more.
dontcarebear@lemmy.world
on 16 Oct 2023 16:08
This NEEDS to be open source, or the EULA will look like a war criminal’s defence speech.
Spaghetti_Hitchens@kbin.social
on 16 Oct 2023 16:27
You hereby give Daddy Musk the right to:
power of attorney
collect sensitive information such as PINs and passwords
insert ads into your dreams and waking thoughts
monitor, influence, regulate, and terminate all subconscious and conscious thought as deemed necessary
stream your vision during sexy time
darq@kbin.social
on 16 Oct 2023 16:10
If you are allowing a company that Elon Musk of all people is involved in to operate on your head, maybe the damage has already been done.
I'm all for transhumanism, and I sincerely hope that the people who are hopeful for Neuralink to be therapeutic for their condition find some relief. But nobody should trust anything Elon Musk touches with their brain.
CarlsIII@kbin.social
on 16 Oct 2023 16:11
That sentence makes no sense and I don’t want to find out what it means.
stolid_agnostic@lemmy.ml
on 16 Oct 2023 16:16
Every time that this topic comes up, I realize that a lot of Musk lovers will take themselves out of the gene pool here.
const_void@lemmy.ml
on 16 Oct 2023 16:26
Mine is already damaged from hearing about this fucking moron every day.
The_Picard_Maneuver@startrek.website
on 16 Oct 2023 16:31
We’re all living in the times before consciousness can be preserved, but someone will figure out how to do it eventually.
I don’t think it’s going to be Elon Musk.
NeoNachtwaechter@lemmy.world
on 16 Oct 2023 16:59
We’re all living in the times before consciousness can be preserved
It makes no sense.
Nobody knows if that is going to happen at all. We only know that it has not happened yet.
In the same way you can say that we’re all living in the times before our colony on Alpha Centauri takes us over.
The_Picard_Maneuver@startrek.website
on 16 Oct 2023 21:16
Ok, maybe it’s optimistic to expect that we’ll definitely get there one day, but I hope that we do.
Or you’re living in the times after it already happened, with the tech later being used to replicate the earlier state of the world, and you just aren’t aware of it.
Time isn’t necessarily linear.
Especially when we’re discussing a local future involving merging human consciousness and AI.
The_Picard_Maneuver@startrek.website
on 17 Oct 2023 13:45
AllNewTypeFace@leminal.space
on 16 Oct 2023 16:56
If it’s completely voluntary, and is taken up by a few hundred Musk superfans with dreams of being a new Martian aristocracy or something, that’s basically a billionaire deathsub situation: a little tragic, a little hilarious but ultimately having little impact. The problem is if the candidates aren’t self-selecting muskies but economically coerced precarious workers, having to choose between this and struggling to keep their heads above water.
chemicalwonka@discuss.tchncs.de
on 16 Oct 2023 17:08
c’mon bro, trust this lunatic
Valmond@lemmy.mindoki.com
on 16 Oct 2023 17:39
Elon wants to do this, Elon wants to do that…
Shut up Elon.
RanchOnPancakes@lemmy.world
on 16 Oct 2023 17:44
Sorry, but tech has some privacy and product-abandonment issues to sort out before I want to put it in my body.
Red_October@lemmy.world
on 16 Oct 2023 17:55
Don’t worry, anyone who agrees to try it was obviously brain damaged already, so no real loss.
alienanimals@lemmy.world
on 16 Oct 2023 18:11
Downvote Musk spam.
The billionaire doesn’t need your help ensuring he and his businesses stay in the 24-hour news cycle. Don’t be a useful idiot.
ram@bookwormstory.social
on 16 Oct 2023 21:05
Downvote Elon Musk spam spam.
The billionaire doesn’t need a PR team to downplay his outright foolishness. Don’t be a useful idiot.
jtk@lemmy.sdf.org
on 17 Oct 2023 02:40
I’m all for promoting the shit out of this. The more idiots that kill themselves signing up, the better off we all are.
Just to be clear: you’re saying this article, which suggests Elon Musk will literally, physically destroy people’s actual real brains in search of profit after similarly murdering hundreds of monkeys, is promoting him?
PM_me_your_vagina_thanks@kbin.social
on 16 Oct 2023 18:41
Rich dipshit wants to do a hypothetical thing with something that doesn't exist. What the fuck is this article?
7fb2adfb45bafcc01c80@lemmy.world
on 16 Oct 2023 19:17
It’s already been done, and will soon be revealed…
In the middle of his cage match with Mark Zuckerberg, Musk will say “No, I am your father.” After Zuck yells “Noooo!” he’ll follow up with, “Well, just the AI parts.”
Flabbergassed@artemis.camp
on 16 Oct 2023 19:48
5
GentlemanLoser@ttrpg.network
on 16 Oct 2023 20:29
Heaps
Gerula@lemmy.world
on 16 Oct 2023 21:00
Are there people still naive enough to believe this conman?
Fades@lemmy.world
on 16 Oct 2023 21:53
can literally only end in disaster, just like the monkeys that were tortured to death for Musk’s Neuralink bullshit (he wants to rush to human testing btw)
I read somewhere that these monkeys are bred for testing purposes, just like chickens and cows are bred for human consumption, but I’m not sure how true it is.
foggenbooty@lemmy.world
on 17 Oct 2023 00:35
Just as that doesn’t make it right for cows and chickens, it doesn’t make it right for monkeys. Or would you be OK if I was breeding humans for slavery purposes?
lightnsfw@reddthat.com
on 17 Oct 2023 00:40
I’m all for it. Let any idiot that wants to allow that asshole to put a chip in their head get it. They can all speedrun themselves out of society.
Whirlgirl9@kbin.social
on 16 Oct 2023 22:11
Cyberpsychos imminent... who needs to wait for 2077...
c0mbatbag3l@lemmy.world
on 17 Oct 2023 14:54
Technically that kind of thing existed as far back as today in their universe.
uriel238@lemmy.blahaj.zone
on 16 Oct 2023 22:57
Mad Science Needs Zombies
Stephen King could probably do a good treatment of the concept. Or AI Michael Crichton.
Starkstruck@lemmy.world
on 16 Oct 2023 23:56
The biggest reason to never put anything like this in your brain is the idea that they could put ads in your brain.
crystalmerchant@lemmy.world
on 17 Oct 2023 00:31
Could?? No, not could. Will.
kromem@lemmy.world
on 17 Oct 2023 00:45
I mean, the high likelihood of developing an infection in your brain seems maybe a bit more concerning, but to each their own I guess.
Sami_Uso@lemmy.world
on 17 Oct 2023 02:58
My initial reaction to this was that it would be preferable to ads lmao
Starkstruck@lemmy.world
on 17 Oct 2023 04:36
Of course, I just meant more in the hypothetical ‘this technology is safe for human use and is being widely adopted’ stage.
HipsterTenZero@dormi.zone
on 17 Oct 2023 04:45
Ads are kind of like brain infections if you think about it
c0mbatbag3l@lemmy.world
on 17 Oct 2023 14:52
Even if it were medically safe, that’s still the least of your concerns when it comes to IoT devices and network security.
crystalmerchant@lemmy.world
on 17 Oct 2023 00:31
More or less than the monkeys?
sucricdrawkcab@lemmy.world
on 17 Oct 2023 02:13
This sounds like the beginning of RoboCop 2. Specifically the part when OCP is trying to make a new RoboCop and the test RoboCops flip out. That’s kinda how I see this playing out.
jtk@lemmy.sdf.org
on 17 Oct 2023 02:31
Anyone that would let that idiot near their brains already killed themselves putting light inside the body.
HerrBeter@lemmy.world
on 17 Oct 2023 04:00
Desperate poor people being offered a few dollars.
TwoGems@lemmy.world
on 17 Oct 2023 03:17
Hopefully his own
crossfadedragon@lemmy.world
on 17 Oct 2023 08:02
He wants to merge with AI? He can start by sleeping with Mark Zuckerberg.
HoloTheWolf@lemmy.world
on 17 Oct 2023 08:49
Whenever someone brings up a topic like this, I’m always reminded of a book I read as a kid: M.T. Anderson’s Feed.
FlyingSquid@lemmy.world
on 17 Oct 2023 08:53
I notice Elon has never signed up to go up in one of his rockets. I wonder how quickly he’ll sign up to get one of his brain implants?
Send_me_nude_girls@feddit.de
on 17 Oct 2023 11:37
Imagine delivering your vulnerable brain willingly to a troll like Musk
PsychedSy@sh.itjust.works
on 17 Oct 2023 14:31
Seems like a better use of my brain at least.
Send_me_nude_girls@feddit.de
on 17 Oct 2023 21:32
Even your brain is worth more than that. Give it a hug.
twitter.com/todayininfosec/…/1001834788037128193
advertising is cancer.
(he)adblocker
It’ll just be WALL-E all over again!
They can call it Enmuskification.
He’s a genius, though; he’d obviously go at night when it’s dark and cooler.
They bid adieu to the birds, though. The bird wasn't cool and edgy enough for Elron. He'd prefer a jet pack.
Icky-rus Elon
Unfortunately, the damage has been done to the animals. https://www.wired.com/story/elon-musk-pcrm-neuralink-monkey-deaths/
Funny that the very long Vox article doesn't mention the Wired investigation.
and it worked like a charm…
Wait, am I in the Matrix?
Would you really want to know if you were?
What’s your point?
Woah! You didn’t get the point. Life must be really hard for ya. What did you understand after reading my comment?
It still fails the ethics part. Lab animals are purpose-bred, but the research should still be scrutinised by an ethics board.
Leave the brains alone.