TheFeatureCreature@lemmy.world
on 04 Jun 2024 07:27
On the plus side, the industry is rapidly moving towards locally-run AI models specifically because they don’t want to purchase and run fleets of these absurd things or any other expensive hardware.
FrostyCaveman@lemm.ee
on 04 Jun 2024 08:12
Every cloud has a silver lining it seems (heheh)
lemmylommy@lemmy.world
on 04 Jun 2024 08:18
Yeah, I think the author misses the point in regard to power consumption. Companies will not buy loads of these and use them in addition to existing hardware. They will buy these to get rid of current hardware. It's not clear (yet) whether that will increase, decrease, or leave power consumption unchanged.
themoonisacheese@sh.itjust.works
on 04 Jun 2024 10:16
The lack of last-last gen hardware on the used market suggests this isn’t true. Even if it were available, the buyers will run it and the overall energy consumption will still increase. It’s not like old hardware disappears after it’s replaced with newer models.
Even if companies were replacing existing hardware, the existing hardware uses less power. So whether it is additional hardware or not, there will be an increase in energy demand, which is bad for climate change.
I have personally worked on a project where we replaced several older nodes in datacenters with only one modern one. That used more power than two older nodes combined, but since we were shutting down 15-20, we saved a lot of power. Not every replacement is 1:1, most aren't.
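For a sense of scale, here's a rough back-of-the-envelope version of that consolidation math. The wattages and node count below are made-up illustrative numbers, not figures from the actual project:

```python
# Illustrative consolidation math: one modern node can draw more than
# two legacy nodes combined and still cut total power, as long as it
# replaces many of them. All numbers here are assumptions.
old_node_w = 400      # assumed draw of one legacy node, in watts
new_node_w = 900      # assumed draw of one modern node (> 2x a legacy node)
replaced = 16         # legacy nodes retired per modern node

before = replaced * old_node_w          # total draw before consolidation
after = new_node_w                      # total draw after consolidation
savings_pct = 100 * (before - after) / before

print(f"before: {before} W, after: {after} W, saved: {savings_pct:.0f}%")
```

With those assumed numbers the new node draws more than double a legacy node, yet total draw still falls by roughly 85%.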
The tragic irony of misinformed articles like the one linked here is that the server farms that would be running this stuff are fairly efficient. The water is reused and recycled, and the heat is often used for other applications, because wasting fewer resources is cheaper than wasting more resources.
But all those locally-run models on laptop CPUs and desktop GPUs? That's grid power being turned into heat and vented into a home (probably with air conditioning on).
The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computer devices paid by individual users. And nobody is going to notice or care.
I do hate our media landscape sometimes.
Chee_Koala@lemmy.world
on 04 Jun 2024 08:38
But efficiency is not the only consideration; privacy and self-reliance are important facets as well. Your argument about efficient computing is 100% valid, but there is a lot more to it.
Oh, absolutely. There are plenty of good reasons to run any application locally, and a generative ML model is just another application. Some will make more sense running from server, some from client. That's not the issue.
My frustration is with the fact that a knee-jerk reaction took all the 100% valid concerns about wasteful power consumption on crypto and copy-pasted them to AI because they had so much fun dunking on cryptobros they didn't have time for nuance. So instead of solving the problem they added incentive for the tech companies owning this stuff to pass the hardware and power cost to the consumer (which they were always going to do) and minimize the perception of "costly water-chugging power-hungry server farms".
It's very dumb. The entire conversation around this has been so dumb from every angle, from the idiot techbros announcing the singularity to the straight-faced arguments that machine learning models are copy-pasting things they find on the Internet to the hyperbolic figures on potential energy and water cost. Every single valid concern or morsel of opportunity has been blown way out of reasonable proportion.
It's a model of how our entire way of interacting with each other and with the world has changed online and I hate it with my entire self.
Thanks for the perspective. I despise the way the generative models destroy income for entry level artists, the unhealthy amount it is used to avoid learning and homework in schools, and how none of the productivity gains will be shared with the working class. So my view around it is incredibly biased and when I hear any argument that puts AI into bad light I accept it without enough critical thinking.
From what I've learned over the years: AI isn't likely to destroy income for entry-level artists. It destroys the quagmires those artists got stuck in. The artists this will replace first and foremost are those creating elevator music, unassuming PowerPoint presentation backgrounds, stock photos of coffee mugs. All those things where you really don't need anything specific and don't really want to think about anything.
Now look how much is being paid for those artworks by the customers on Shutterstock and the like. Almost nothing. Now imagine what Shutterstock pays their artists. Fuck all is what. Artists might get a shred of credit here and there, a few pennies, and that's that. The market AI is “disrupting” as they say, is a self-exploitative freelancing hellhole. Most of those artists cannot live off their work, and to be frank: Their work isn't worth enough to most people to pay them the money they'd need to live.
Yet, while they chase the carrot dangling in front of them, dreaming of fame and of collecting enough notoriety through that work to one day do their real art instead of interchangeable throwaway stuff made to fit any situation at once, corporations continue to bleed them dry, not allowing them any progress whatsoever. Or do you know who made the last image of a coffee mug you saw in some advert?
The artists who manage to make a living (digital and analog) are those who manage to cultivate a following, be that through Patreon, art exhibitions, whatever. Those artists will continue to make a living because people want them to do exactly what they do, not an imitation of it. They will continue to get commissioned because people want their specific style and ideas.
So in reality, it doesn't really destroy artists; it replaces one corpo-hellhole (freelance artist) with another (freelance AI trainer/prompter/etc.).
I will keep that perspective in mind, thank you. I am held back by my own resistance and pushback against AI developments, and it is very hard to warm up to something being pushed on us by these huge, malicious corporations and not be worried about how they will use it against us.
It sounds like one of the most impressive things in recent history and something that would fill me with joy and excitement but we’re in such a hostile environment that I am missing out on all that. I haven’t even managed to get myself to warm up to at least trying one out.
It's really not that exciting. Quite the opposite. The rush for AI in everything is absolutely bonkers, since those LLMs are just stupid as fuck and not suited for any sort of productive performance they get hyped up to achieve.
I'm annoyed that we're going crazy because computers manage to spew out bullshit that vaguely sounds like the bullshit humans spew out, yet is somehow even less intelligent. At the same time, people think this empty yapping is more accurate and totally a miracle, while all it really shows is that computers are good at patterns, and language and information follow patterns. Go figure.
I'm annoyed that Silicon Valley tech evangelists get away with breaking every law they fucking want, once again in the creation of those tools.
Yet, I'm neither worried about the ecological impact nor about the impact on the workforce. Yes, jobs will shift, but that was clear as day since I was a kid. I don't even necessarily think “AI” will be the huge game changer it's made up to be.
When they run out of training data (which is fueled by slave labor, because of fucking course it is) or AIs start ingesting too many AI-generated texts, the models we have today just collapse, disintegrating into a blabbering mess.
I think the same; I just grasp every opportunity to be convinced otherwise, because it's such a bummer.
rottingleaf@lemmy.zip
on 04 Jun 2024 09:07
The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether they matched the new challenges or not, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computer devices paid by individual users. And nobody is going to notice or care.
I think the idea was that these things are bad idea locally or otherwise, if you don’t control them.
No it wasn't. Here's how I know: all the valid concerns that came about how additional regulation would disproportionately stifle open source alternatives were immediately ignored by the vocal online critics (and the corporate techbros overhyping sci-fi apocalypses). And then when open alternatives appeared anyway nobody on the critical side considered them appropriate or even a lesser evil. The narrative didn't move one bit.
Because it wasn't about openness or closeness, it was a tribal fight, like all the tribal fights we keep having, stoked by greed on one end and viral outrage on the other. It's excruciating to watch.
RedWeasel@lemmy.world
on 04 Jun 2024 10:31
I wouldn't say bad, but generative AI and LLMs are definitely underbaked, and shoving everything under the sun into them is going to create garbage in, garbage out. And using it for customer support, where it will inevitably offer bad advice or open you up to lawsuits, seems shortsighted to say the least.
They were calling the rest machine learning (ML) a couple of years ago. There are valid uses for ML, though. Image/video upscaling and image search are a couple of examples.
Melvin_Ferd@lemmy.world
on 04 Jun 2024 11:05
Honestly, a lot of the effects people attribute to "AI" as understood by this polemic are ongoing and got ignited by algorithmic searches first and then supercharged by social media. If anything, there are some ways in which the moral AI panic is actually triggering regulation that should have existed for ages.
Melvin_Ferd@lemmy.world
on 04 Jun 2024 12:06
Regulation is only going to prevent regular people from benefiting from AI while keeping it as a tool for the upper crust to continue to benefit. Artists are a Trojan horse on this.
We're thinking about different "regulation", and that's another place where extreme opinions have nuked the ground into glass.
Absolutely yeah, techbros are playing up the risks because they hope regulators looking for a cheap win will suddenly increase the cost for competitors, lock out open alternatives and grandfather them in as the only valid stewards of this supposedly apocalyptic technology. We probably shouldn't allow that.
But "maybe don't make an app that makes porn out of social media pictures of your underage ex girlfriend at the touch of a button" is probably reasonable, AI or no AI.
Software uses need some regulation like everything else does. Doesn't mean we need to sell the regulation to disingenuous corporations.
Melvin_Ferd@lemmy.world
on 04 Jun 2024 14:13
We already have laws that protect people when porn is made of them without consent. AI should be a tool that's as free and open as possible, to be used and built upon. Regulation is only going to turn it into a tool for the haves and restrict the have-nots. Of course you're going to see justifiable reasons, just like protecting children made sense during the satanic panics. Abuse happens in daycares across the country. Satanists do exist. Pen pineapple apple pen.
It's not like you control these things by making arguments that make no sense. They're structured to ensure you agree with them, especially during the early-phase rollout; otherwise it would just become something that, again, never pans out the way we fear. Media is there to generate the fear and the arguments to convince us to hobble ourselves.
No, that's not true at all. That's the exact same argument that the fearmongers are using to claim that traditional copyright already covers the use cases of media as AI training materials and so training is copying.
It's not. We have to acknowledge that there is some novel element to these, and novel elements may require novel frameworks. I think IP and copyright are broken anyway, but if the thing that makes us rethink them is the idea that machines can learn from a piece of media without storing a copy and spit out a similar output... well, we may need to look at that.
And if there is a significant change in how easily accessible, realistic or widespread certain abusive practices are we may need some adjustments there.
But that's not the same as saying that AI is going to get us to Terminator within five years and so Sam Altman is the only savior that can keep the grail of knowledge away from bad actors. Regulation should be effective and enable people to compete in the novel areas where there is opportunity.
Both of those things can be true at the same time. I promise you don't need to take the maximalist approach. You don't even need to take sides at all. That's the frustrating part of this whole thing.
Melvin_Ferd@lemmy.world
on 04 Jun 2024 15:44
I think we should stop applying broken and primitive regulations and laws created before any of this technology and ability was ever even dreamed of. Sorry to say but I don’t want to protect the lowly artist over the ability for people to collaborate and advance our knowledge and understanding forward. I want to see copyright, IP and other laws removed entirely.
We should have moved more towards the open sharing of all information. We have unnecessarily recreated all the problems of the predigital age and made them worse.
If it was up to me I would abolish copyright and IP laws. I would make every corner of the internet a place for sharing data and information and anyone putting their work online would need to accept it will be recreated, shared and improved upon. We all should have moved in a different direction then what we have now.
Oh, man, I do miss being a techno-utopian. It was the nineties, I had just acquired a 28.8k modem in high school, my teachers were warning me about the risks of algorithmically selected, personalized information and I was all "free the information, man" and "people will figure it out" and "the advantages of free information access outweigh the negatives of the technology used to get there".
And then I was so wrong. It's not even funny how wrong I was. Like, sitting on the smoldering corpse of democracy and going "well, that happened" wrong.
But hey, I'm sure we'll mess it up even further so you can get there as well.
For the record, I don't mean to defend the status quo with that. I agree that copyright and intellectual property are broken and should be fundamentally reformulated. Just... not with a libertarian, fully unregulated framework in mind.
ChanSecodina@sh.itjust.works
on 05 Jun 2024 08:52
Hi fellow traveler. I think you and I took a similar path to get here except I started with a 33.6k modem in high school and the catch phrase I remember is “Information wants to be free.” What’s your thought on copyright reform? Somewhere along the lines of 25 years and non-renewable? How you feeling about the concept of software/algorithm patents? Talking about stuff like this is reminding me of /. :)
Well, if this was travel and not a fall down a very long, very dark hole, then one of the stops was learning when to say "I don't know".
I don't have all the answers for copyright. I don't think my problem is primarily with terms. I'm probably closer to thinking perhaps the system should acknowledge where we landed consuetudinarily. Just let people share all materials, acknowledge a right of the original author to be the sole profit holder in for-profit exploitation. That's effectively how most of the Internet works anyway. Even then there's obviously tons of stuff we'd have to sort out. What happens with ownership transfer? What about terms? What about derivative work? Components of larger works? I don't know.
We're talking about reworking some of the biggest markets and industries on the planet from the ground up. It's not a shower thought, it's something a whole bunch of very smart people with different backgrounds should and would have to get together for years to put together. Probably on a global scale.
It's an absurd question to have a locked-down opinion about. The gap between being able to tell "yeah, duh, something's not working" and being able to fix it is enormous here. Figuring out that much is probably as far as my trip is gonna take me at this point. And I know even less about patent law.
Yeeeeah, you're gonna have to break down that math for me.
Because if an output takes some amount of processing to generate and your energy cost per unit of compute is higher we're either missing something in that equation or we're breaking the laws of thermodynamics.
If the argument is that the industry uses more total energy because they keep the training in-house or because they do more of the compute at this point in time, that doesn't change things much, does it? The more of those tasks that get offloaded to the end user the more the balance will shift for generating outputs. As for training, that's a fixed cost. Technically the more you use a model the more the cost spreads out per query, and it's not like distributing the training load itself among user-level hardware would make its energy cost go down.
The reality of it is that the entire thing was more than a bit demagogic. People are mad at the energy cost of chatbot search and image generation, but not at the same cost of image generation for videogame upscaling or frame interpolation, even if they're using the same methods and hardware. Like I said earlier, it's all carryover from the crypto outrage more than it is anything else.
XeroxCool@lemmy.world
on 05 Jun 2024 04:39
If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it’s calling out is the absurdly wasteful nature of why these farms will flourish: to power excessively animated programs to feign intelligence, vainly wasting power for what a simple program was already addressing.
It's the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but they prompted people to multiply the number of lights and the total output by an order of magnitude simply because it's so cheap. This creates a secondary issue of further increasing light pollution and intrusion.
Greater efficiency doesn’t make things right if it comes with an increase in use.
For one thing, it's absolutely not true that what these apps provide is the same as what we had. That's another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.
For another, some of the numbers being thrown around are not realistic or factual, are not presented in context or are part of a power increase trend that was already ongoing with earlier applications. The average high end desktop PC used to run on 250W in the 90s, 500W in the 2000s. Mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs, now it's the equivalent of turning on your microwave oven.
The argument that we are burning more power because we're using more compute for entertainment purposes is not factually incorrect, but it's both hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and not inconsistent with how we use other computer features and have used other computer features for ages.
The only reason you're so mad about me wasting some energy asking an AI to generate a cute picture, but not at me using an AI to generate frames for my videogame, is that one of those is a viral panic that maps nicely onto the viral panic about crypto people already had, while the other is a frog that has been slowly boiling for three decades, so people don't have a reason to have an opinion about it.
Nah, even that won't be. Because most of this workload is going to run on laptops and tablets and phones and it's going to run at lower qualities where the power cost per task is very manageable on hardware accelerated devices that will do it more efficiently.
The heavy load is going to stay on farms because nobody is going to wait half an hour and waste 20% of their battery making a picture of a cute panda eating a sandwich. They'll run heavily quantized language models as interfaces to basic apps and search engines and it'll do basic upscaling for video and other familiar tasks like that.
I'm not trying to be obtusely equidistant, it's just that software developers are neither wizards that will bring about the next industrial revolution because nobody else is smart enough... nor complete morons that can't balance the load of a task across a server and a client.
But it's true that they'll push as much of that compute and energy cost onto the user as possible, as a marketing ploy to sell new devices, if nothing else. And it's true that on the aggregate that will make the tasks less efficient and waste more heat and energy.
Also, I'm not sure how downvoted I am. Interoperable social networks are a great idea in concept, but see above about software developers. I assume the up/downvote comes from rolling a d20 and adding it to whatever the local votes are.
FiniteBanjo@lemmy.today
on 06 Jun 2024 06:11
I guarantee you that much more power will be used as a result of the data centers regardless of how much efficiency they have per output.
Yes, no, you said that. But since that is a meaningless statement I was expecting some clarification.
But nope, apparently we have now established that a device existing uses up more power than that device not existing.
Which is... accurate, I suppose, but also true of everything. Turns out, televisions? Also consume less power if they don't exist. Refrigerators. Washing machines? Lots less power by not existing.
So I suppose you're advocating a return to monke situation, but since I do appreciate having a telephone (which would, in fact, save power by not existing), we're going to have to agree to disagree.
FiniteBanjo@lemmy.today
on 06 Jun 2024 21:49
LLMs' major use is mimicking human beings, at the cost of incredible amounts of electricity. Last I checked we have plenty of human beings, and we will all die if our power consumption keeps going up, so it's absolutely not worth it. Comparing it to literally any useful technology is disingenuous.
And don’t go spouting some bullshit about it getting better over time, because the Datacenters aren’t being built in the hypothetical future when it is better, they’re being built NOW.
Look, I can suggest you start this thread over and read it from the top, because the ways this doesn't make much sense have been thoroughly explained.
Because this is a long one and if you were going to do that you would have already, I'll at least summarize the headlines: LLMs exist whether you like them or not, they can be quantized down to more reasonable power usage, are running well locally on laptops and tablets burning just a few watts for just a few seconds (NOW, as you put it). They are just one application of ML tech, and are not useless at all (fuzzy searches with few specific parameters, accessibility features, context-rich explanations of out of context images or text), even if their valid uses are misrepresented by both advocates and detractors. They are far from the only commonplace computing task that is now using a lot more power than the equivalent a few years ago, which is a larger issue than just the popularity of ML apps. Granting that LLMs will exist in any case, running them on a data center is more efficient, and the issue isn't just "power consumption" but also how the power is generated and what the reclamation of the waste products (in this case excess heat and used water) is on the other end.
I genuinely would not recommend that we engage in a back and forth breaking that down because, again, that's what this very long thread has been about already and a) I have heard every argument the AI moral panic has put forth (and the ones the dumb techbro singularity peddlers have put forth, too), and b) we'd just go down a circular rabbit hole of repeating what we've already established here over and over again and certainly not convince each other of anything (because see point A).
FiniteBanjo@lemmy.today
on 07 Jun 2024 07:23
They exist at the current scale because we’re not regulating them, not whether we like it or not.
Absolutely not true. Regulations are both in place and in development, and none of them seem likely to prevent any of the applications currently on the market. I know the fearmongering side keeps arguing that a copyright case will stop the development of these, but, to be clear, that's not going to happen. All it'll take is an extra line in an EULA to mitigate it, or investing in the dataset of someone who already has that line in their EULA (Twitter and Reddit already, more to come for sure). The industry is actually quite fond of copyright-based training restrictions, as their main effect is most likely to close off open source alternatives and make it so that only Meta, Google, and MS/OpenAI can afford model training.
These are super not going away. Regulation is needed, but it's not restricting or eliminating these applications in any way that would make a dent on the also poorly understood power consumption costs.
FiniteBanjo@lemmy.today
on 07 Jun 2024 07:41
Regulating markets absolutely does prevent practices in those markets. Literally the point.
Yeah, who's saying it doesn't? It prevents the practices it prevents and allows the rest of the practices.
The regulation you're going to see on this does not, in fact, prevent making LLMs or image generators, though. And it does not, in fact, prevent running them and selling them to people.
You guys have gotten it in your head that training data permissions are going to be the roadblock here, and they're absolutely not going to be. There will be common sense options, like opt-outs and opt-out defaults by mandate, just like there are on issues of data privacy under GDPR, but not absolute bans by any means.
So how much did opt-out defaults under GDPR stop social media and advertising companies from running social media and advertising data businesses?
Exactly.
What that will do is make it so you have to own a large set of accessible data, like social media companies do. They are positively salivating at the possibility that AI training will require paying them, since they'll have a user agreement that demands allowing your data to be sold for training. Meanwhile, developers of open alternatives, who currently run on a combination of openly accessible online data and monetized datasets put together specifically for research, will face more cost to develop alternatives. Ideally, the large AI corporations hope, enough cost pressure will bully them out of the market, or at least force them to lag behind in quality by several generations.
That's what's currently happening regarding regulation, along with a bunch of more reasonable guardrails about what you should and should not generate and so on. You'll notice I didn't mention anything about power or specific applications there. LLMs and image generators are not going away and their power consumption is not going to be impacted.
helenslunch@feddit.nl
on 12 Jun 2024 06:58
Doesn't really make any sense. You could have one 4090 running AI for a hundred people rather than a 4060 running 24/7 for each individual person.
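The arithmetic behind that point can be sketched quickly. The board powers and the always-on assumption below are illustrative guesses, not measurements:

```python
# Shared server GPU vs. one GPU per user, both modeled as running 24/7.
# TDP figures are assumed approximations (450 W for a 4090, 115 W for
# a 4060); real draw varies with load.
shared_gpu_w = 450    # one RTX 4090 serving everyone
local_gpu_w = 115     # one RTX 4060 per user
users = 100

shared_kwh_day = shared_gpu_w * 24 / 1000           # single shared card
local_kwh_day = users * local_gpu_w * 24 / 1000     # a hundred local cards

print(f"shared: {shared_kwh_day:.1f} kWh/day for all {users} users")
print(f"local: {local_kwh_day:.1f} kWh/day total")
```

Even with these rough numbers, the fleet of per-user cards draws well over an order of magnitude more energy than the single shared card.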
Dariusmiles2123@sh.itjust.works
on 04 Jun 2024 12:37
The article is really interesting and all your comments too.
For now I have a negative bias towards AI as I only see its downsides, but I can see that not everyone thinks like me and it’s great to share knowledge and understanding.
best_username_ever@sh.itjust.works
on 05 Jun 2024 07:16
According to some people (who have never programmed and don’t know what AI can do), we will all be able to retire with a lot of money and we’ll all write poetry and become painters or make music and have fun. It’s not realistic and it won’t happen.
The only positive thing that AI can do is detect bad stuff in the human body before a surgery as long as it’s validated by a professional. I could throw everything else in the trash as it’s meant to replace humans forever.
mPony@lemmy.world
on 04 Jun 2024 13:17
This article is one of the most down-to-earth, realistic observations on technology I’ve ever read. Utterly striking as well.
Go Read This Article.
TheBest@midwest.social
on 04 Jun 2024 13:25
Agreed, stop scrolling the comments and go read it random reader.
I used to get so excited by tech advances, but now I've gotten to the point where it's still cool and a fascinating application of science... but this stuff is legitimately existential. The author raises great points about it.
red_pigeon@lemm.ee
on 05 Jun 2024 15:21
Come on. Stop reading the comments. Go check the article.
Ashen@sh.itjust.works
on 06 Jun 2024 04:42
This ironically(?) made me go read it. Normally I don’t.
Thank you.
Dkarma@lemmy.world
on 04 Jun 2024 13:44
This article is a regurgitation of every tech article since the microchip.
There is literally nothing new here. Tech makes labor obsolete. Tech never considers the ramifications of tech.
These things have been known since the beginning of tech.
akwd169@sh.itjust.works
on 04 Jun 2024 15:29
What about the climate impact? You didn't even address that. That's the worst part of the AI boom: we're already way in the red on climate change, and this is going to accelerate the problem rather than slow or stop it (let alone reverse it).
Not_mikey@slrpnk.net
on 04 Jun 2024 17:27
That's a very solvable problem, though. AI can easily be run off green energy, and a lot of the new data centers being built are utilizing it; tons are popping up in Seattle with its abundance of hydro energy. Compare that to meat production or transportation via combustion, which face a much harder transition, and this seems way less of an existential problem than the author makes it out to be.
Also, most of the energy needed is for the training, which can be done at any time, so it can be run during off-peak hours. It can also absorb surpluses from solar energy in the middle of the day, which can otherwise put strain on the grid.
This is all assuming it's done right, which it may not be, and it could exacerbate the ditch we're already in, but the technology itself isn't inherently bad.
dustyData@lemmy.world
on 04 Jun 2024 20:12
AI can easily be run off green energy
This is all assuming it’s done right
That right there is the problem. I don’t trust any tech CEO to do the right thing ever, because historically they haven’t. For every single technological advancement since the industrial revolution brought forth by the corporate class, masses of people have had to beat them up and shed blood to get them to stop being assholes for a beat and abuse and murder people a little less.
groet@infosec.pub
on 04 Jun 2024 22:05
It doesn’t matter if AI is run on green energy as long as other things are still running on fossil fuels. There is a limit to how fast renewable energy sources are built, and if the power consumption of AI eats away all of that growth, then the amount of fossil energy doesn’t change.
No increase in energy consumption is green, because it forces something else to run on fossil energy for longer.
frezik@midwest.social
on 05 Jun 2024 18:05
collapse
We need to deploy solar and wind at a breakneck pace to replace the fossil fuel usage we already have. Why compound that with a whole new source?
best_username_ever@sh.itjust.works
on 05 Jun 2024 07:13
collapse
The tech that exists so far hasn’t had the potential to replace every job on earth; that’s the real difference for me.
aesthelete@lemmy.world
on 06 Jun 2024 05:09
collapse
haven’t had the potential to replace every job on earth, that’s the real difference for me.
This really doesn’t either tbh. But that’s certainly what they’re selling.
How do you know what the limits of this technology are? How do you know it couldn’t reach that point in 5, 10, 20, 50, 100, or 1000 years?
Unless you’re thinking of the current iteration of the technology and not its future evolutions.
aesthelete@lemmy.world
on 06 Jun 2024 21:49
collapse
Its future iterations that are definitely not this?
Sure, I don’t know.
I’d wager we’ll probably reach climate collapse / political crises that throw us off course before a “Westworld-esque” thing is ever possible.
People don’t seem to realize that these tech leaders are all just weaponizing your imagination against you (a.k.a. using a sales technique). GPUs and LLMs aren’t skynet no matter how much people want to project that onto them.
Nvidia cares maybe even less about the outcome than I do; they’ll sell you all the pickaxes you want in the AI gold rush.
RagingHungryPanda@lemm.ee
on 04 Jun 2024 13:50
nextcollapse
One million Blackwell GPUs would suck down an astonishing 1.875 gigawatts of power. For context, a typical nuclear power plant only produces 1 gigawatt of power.
Fossil fuel-burning plants, whether that’s natural gas, coal, or oil, produce even less. There’s no way to ramp up nuclear capacity in the time it will take to supply these millions of chips, so much, if not all, of that extra power demand is going to come from carbon-emitting sources.
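For scale, the quoted total is just per-GPU draw times unit count; here is a quick back-of-envelope check (the ~1,875 W per-GPU figure is an assumption inferred from the article's 1.875 GW total, not a published spec):

```python
# Back-of-envelope check of the quoted figure. The per-GPU draw is an
# assumption inferred from the article's totals, not a measured spec.
WATTS_PER_GPU = 1_875
NUM_GPUS = 1_000_000
NUCLEAR_PLANT_GW = 1.0  # "typical" plant output cited above

total_gw = WATTS_PER_GPU * NUM_GPUS / 1e9
print(f"{total_gw} GW")  # 1.875 GW, nearly two typical plants' worth
```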
If you ignore the two fastest growing methods of power generation, which coincidentally are also carbon free, cheap and scalable, the future does indeed look bleak. But solar and wind do exist…
The rest is purely a policy rant. Yes, if productivity increases we need some way of distributing the gains from said productivity increase fairly across the population. But jumping to the conclusion that, since this is a challenge to be solved, the increase in productivity is bad, is just stupid.
Mrkawfee@lemmy.world
on 04 Jun 2024 13:47
nextcollapse
But without even the cool space station to stare at longingly…
demonsword@lemmy.world
on 04 Jun 2024 14:12
nextcollapse
I think the worst part of Huang’s keynote wasn’t that none of this mattered, it’s that I don’t think anyone in Huang’s position is really thinking about any of this at all. I hope they’re not, which at least means it’s possible they can be convinced to change course. The alternative is that they do not care, which is a far darker problem for the world.
well yeah… they just don’t care, after all the climate crisis is somebody else’s problem… and what really matters is that the line goes up next quarter, mankind’s future be damned
treadful@lemmy.zip
on 04 Jun 2024 17:47
nextcollapse
All these issues are valid and need solving but I’m kind of tired of people implying we shouldn’t do certain work because of efficiency.
And tech gets all the scrutiny for some reason (its transparency?). I can’t recall the last time I saw an article on industrial machine efficiency arguing we should just stop producing whatever.
What we really need to do is find ways to improve efficiency on all work while moving towards carbon neutrality. All work is valid.
If I want to compute pi for no reason or drive to the Grand Canyon for lunch, I should be able to do so.
Esqplorer@lemmy.zip
on 04 Jun 2024 19:34
nextcollapse
Anyone with experience in corporate operations will tell you the ROI on process changes is dramatically higher than technology. People invent so many stupid and dangerous ways to “improve” their work area. The worst part is that it just takes a little orchestration to understand their needs and use that creativity to everyone’s benefit.
Telodzrum@lemmy.world
on 05 Jun 2024 00:07
nextcollapse
lol at tech’s transparency. You have an availability heuristic issue with your thought process. Every other industry has similar critiques. Your media diet is leading you to false conclusions.
We’re literally in a technology community followed by tons of industry outsiders, of which there is a similar one on every other similar aggregation site. I don’t see any of that for things like plastics manufacturers, furniture makers, or miners. So yeah, I’d say transparency for the general public tends to be higher in tech than most other industries.
rimu@piefed.social
on 05 Jun 2024 03:35
nextcollapse
Efficiency??
This is about the total amount of emissions, not the emissions-per-unit-of-compute (or whatever).
I just disagree. Computing is expression and in my opinion freedom of travel should be a human right.
Even if you add “leisure” to it to bolster your argument.
kilgore_trout@feddit.it
on 07 Jun 2024 09:00
collapse
Well, you are free to travel on foot, because 50 years from now I am not so sure you’ll be able to do it by plane anymore.
kaffiene@lemmy.world
on 05 Jun 2024 06:38
collapse
Disagree that all work is valid. That only makes sense in a world with no resource constraints
chemicalwonka@discuss.tchncs.de
on 04 Jun 2024 18:03
nextcollapse
Late-stage capitalism news, nothing new under the sun.
drawerair@lemmy.world
on 05 Jun 2024 02:11
nextcollapse
I like that the writer thought about climate change. I think it’s been one of the biggest global issues for a long time. I hope there’ll be increasing use of sustainable energy in the coming years, not just for data centers but for the whole tech world.
I think a digital waiter doesn’t need a rendered human face. We have food-ordering kiosks. Those aren’t AI. I think those suffice. A self-checkout grocery kiosk doesn’t need a face either.
I think “client help” is where AI can at least aid. Imagine a firm that’s been operating for decades and has encountered so many kinds of client complaints. It can feed all that data to a large language model. With that model responding to most of the client complaints, the firm can reduce the number of its client-support people. The model will pass the complaints that are too complex, or that it doesn’t know how to address, to the client-support people. The model will handle the easy and medium complaints; the client-support people will handle the rest.
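A minimal sketch of that triage flow (the `classify` scorer and its threshold are hypothetical stand-ins for an LLM confidence score, not any real API):

```python
# Hypothetical sketch of the complaint-triage flow described above:
# the model answers complaints it is confident about and escalates
# the rest to human support staff. classify() is a toy stand-in for
# an LLM confidence score, not a real library call.

def classify(complaint: str) -> float:
    """Toy stand-in: score how confidently the model could resolve this."""
    easy_topics = ("refund", "password", "shipping")
    return 0.9 if any(t in complaint.lower() for t in easy_topics) else 0.2

def route(complaint: str, threshold: float = 0.7) -> str:
    """Send confident cases to the model, everything else to a human."""
    return "model" if classify(complaint) >= threshold else "human"

print(route("Where is my shipping update?"))          # handled by the model
print(route("Complex multi-account billing dispute")) # escalated to a human
```

The design choice here is just a confidence cutoff: anything the model scores below the threshold goes to a person, so the human team only sees the hard cases.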
Idk whether the government or the public should stop AI from taking human jobs or let it. I’m torn. Optimistically, workers can find new jobs. But we should imagine that at least one person will be fired and won’t find a new job. He’ll be jobless for months. He’ll have an epic headache as he can’t pay next month’s bills.
electric_nan@lemmy.ml
on 05 Jun 2024 03:53
nextcollapse
Boiling the oceans for deepfake porn, scamcoins and worse web search.
Believe it or not, peak humanity.
It’s all downhill from here
aesthelete@lemmy.world
on 06 Jun 2024 05:03
nextcollapse
I think we passed the peak a few years ago. But yeah, peak from here on out.
rottingleaf@lemmy.zip
on 07 Jun 2024 09:12
collapse
The Matrix was such a nice movie. In 2000 they already had Linux, PlayStation, ICQ, file sharing, the old Star Wars (with a good chunk of the classic EU) and even The Phantom Menace (haters gonna hate), and the first three Harry Potter books. And WarCraft II, and X-Wing Alliance, and I’m too lazy to go on with this
You still need a massive fleet of these to train those multi-billion parameter models.
On the invocation side, if you have a cloud SaaS service like ChatGPT, hosted Anthropic, or AWS Bedrock, these could answer questions quickly. But they cost a lot to operate at scale. I have a feeling the bean-counters are going to slow down the crazy overspending.
We’re heading into a world where edge computing is more cost- and energy-efficient to operate. It’s also more privacy-friendly. I’m more enthused about running these models on our phones and in-home devices. There, the race will be TOPS vs. power savings.
sudo42@lemmy.world
on 05 Jun 2024 04:52
nextcollapse
So if each GPU takes 1,800W, isn’t that the equivalent of what a handheld hair dryer consumes?
phoneymouse@lemmy.world
on 05 Jun 2024 05:03
nextcollapse
And that energy doesn’t just go away after computing. You’ll have the equivalent of an average space heater’s worth of heat coming out of your computer. It’d be awesome to put that heat to use when you need heating, but when you need AC it’s going to be a bitch.
Gladaed@feddit.de
on 05 Jun 2024 09:34
nextcollapse
Yes, but they are not gaming devices. They are meant to efficiently compute things. When used for that purpose they use little energy compared to other devices doing the same thing.
frezik@midwest.social
on 05 Jun 2024 18:02
collapse
Yes, and you leave it on all day at full blast. And you have a dedicated building where there are thousands of them doing the same.
chiliedogg@lemmy.world
on 06 Jun 2024 01:22
collapse
All to take away jobs and break the internet.
Teppichbrand@feddit.de
on 05 Jun 2024 05:51
nextcollapse
Innovation is a scam, it breeds endless bullshit we keep buying and talking about like 10-year-olds with their latest gimmick. Look, they replaced this button with A TOUCHSCREEN! Look! This artificial face has PORES NOW! LOOK! This coffee machine costs $2000 now and uses PROPRIETARY SUPEREXPENSIVE CAPSULES!!
We need progress, which is harder to do because it takes a paradigm shift on an Individual and social level. It’s much less gadgety.
kaffiene@lemmy.world
on 05 Jun 2024 06:31
nextcollapse
Tech is neither good nor bad, but control of tech is a major issue.
technocrit@lemmy.dbzer0.com
on 05 Jun 2024 14:53
collapse
The existing capitalist control of tech is bad.
kaffiene@lemmy.world
on 05 Jun 2024 20:55
collapse
Agreed! And that’s where the problem lies. It’s not tech so much as our existing power structures.
Veraxus@lemmy.world
on 05 Jun 2024 16:16
nextcollapse
You’re not wrong. We’ve reached a point, technologically, where there is little-to-no true innovation left… and what I mean by that is that everything is now built on incredible amounts of work by others who came before. “Standing on the shoulders of giants”, as it were. And yet we have a corrupt “patent” system that is exclusively used to steal the work of those giants while at the same time depriving all of humanity of true progress. And why? So that a handful of very rich people can get even more rich.
Teppichbrand@feddit.de
on 05 Jun 2024 17:20
collapse
Exactly, innovations no longer help to satisfy real basic needs, they are used to create new, artificial needs. Always new toys that make us feel like we’re making progress.
rottingleaf@lemmy.zip
on 07 Jun 2024 09:52
collapse
That’s not true, but to have planned “innovation” bring profit you need to impede real progress. Cause real progress disrupts such plans.
UnpluggedFridge@lemmy.world
on 05 Jun 2024 19:01
nextcollapse
I remember hearing this argument before…about the Internet. Glad that fad went away.
As it has always been, these technologies are being used to push us forward by teams of underpaid unnamed researchers with no interest in profit. Meanwhile you focus on the scammers and capitalists and unload your wallets to them, all while complaining about the lack of progress as measured by the products you see in advertisements.
Luckily, when you get that cancer diagnosis or your child is born with some rare disease, that progress will attend to your needs despite your ignorance of it.
Semi_Hemi_Demigod@lemmy.world
on 05 Jun 2024 21:05
nextcollapse
Exactly. OP is mad at alienation, not at progress. In a different, less stupid world these labor saving devices would actually be great, leading to a better quality of life for everyone, and getting a really awesome coffee maker. But the people making the decisions aren’t the consumers or the researchers.
Teppichbrand@feddit.de
on 06 Jun 2024 05:52
collapse
You misunderstood me. I have nothing against progress. Medical progress is great! But what is often sold to us as innovation is not progress but just more nonsense that only pretends to get us further.
shrugs@lemmy.world
on 05 Jun 2024 22:35
nextcollapse
Innovation is good if it results in clean water, meds, housing, safe food, and goods and services.
It’s bad if it means the most profit for useless shit that people only buy because advertisement made them believe they need it.
Capitalism is a tool. Please, let’s grow a pair and stop letting it decide how it will be used. It’s like pulling the trigger on an AK-47 without holding it tight. Do we expect the weapon to know where to shoot?
Capitalism is a tool that wants to maximize its profits. Unfortunately, it discovered that changing politics and laws is an easy way to do that, even if it’s bad for the people.
Capitalism is, per definition, not bound to ethics or morals. We need to set rules, even if big corporations made us believe we shouldn’t.
rottingleaf@lemmy.zip
on 07 Jun 2024 09:50
collapse
We need to set rules, even if big corporations made us believe we shouldn’t.
That’s a strawman, possibly aimed at libertarians. Like everyone else, corps want to set rules which benefit them.
UnderpantsWeevil@lemmy.world
on 05 Jun 2024 23:01
nextcollapse
We need progress, which is harder to do because it takes a paradigm shift on an Individual and social level.
Sometimes it just takes a marginal improvement in the quality of the engineering. But these “what if manual labor, but with a facade of robots” gimmicks aren’t improvements in engineering. They’re an effort to cut corners on quality in pursuit of a higher profit margin.
Wogi@lemmy.world
on 06 Jun 2024 02:42
nextcollapse
Fun fact: the first Mr. Coffee cost $300 in 1971, which would be more than $2000 today.
rottingleaf@lemmy.zip
on 07 Jun 2024 09:48
collapse
Improvement is not a scam.
Innovation is a scam created by representing change as improvement when it isn’t.
And every time “change” gets replaced with “innovation,” it’s connected to totalitarian/fascist tendencies, because it makes it easier to sell societal change which is clearly not improvement.
A person who seriously affected my life recommended “Homo Ludens” by Johan Huizinga; I’m not sure whether it was because of the part about fascism in the ’30s.
iAvicenna@lemmy.world
on 05 Jun 2024 06:57
nextcollapse
nice now combine this with shit for brains coin bros and we can all boil together
AnUnusualRelic@lemmy.world
on 05 Jun 2024 21:01
nextcollapse
I’m starting to wonder if the Butlerian Jihad isn’t a good idea after all.
shrugs@lemmy.world
on 05 Jun 2024 22:23
nextcollapse
Is nobody concerned about this:
Behind the wall, an army of robots, also powered by new Nvidia robotics processors, will assemble your food, no humans needed. We’ve already seen the introduction of these kinds of ‘labor-saving’ technologies in the form of self-checkout counters, food ordering kiosks, and other similar human-replacements in service industries, so there’s no reason to think that this trend won’t continue with AI.
not being seen as paradise? It’s like the Enterprise crew being concerned about replicators because people will lose their jobs.
This is madness, to be honest: this is what humankind ultimately should evolve into. No stupid labour for anyone. But the truth is, capitalism will take care of that; it will make sure that not everyone is free, but that a small percentage is more free and the rest are fucked. There lies the problem, not in being able to make human labour obsolete.
zbyte64@social.rootaccess.org
on 05 Jun 2024 22:31
nextcollapse
@shrugs@rwtwm I think the enterprise crew would be concerned if the Ferengi owned said replicators.
Eranziel@lemmy.world
on 05 Jun 2024 22:48
nextcollapse
The issue with “Human jobs will be replaced” is that society still requires humans to have a paying job to survive.
I would love a world where nobody had to do dumb labour anymore, and everyone’s needs are still met.
sgtgig@lemmy.world
on 05 Jun 2024 22:59
nextcollapse
Yup. The realistic result of things becoming automated is that we have several decades of social strife grappling with the fact that there are too many people for the amount of human labor actually needed, until there’s enough possibly violent unrest for the powers that be to realize “oh, maybe we shouldn’t require people to have jobs that don’t exist.”
Solid agree, but it’s so hard to persuade the brainwashed (let alone their capitalist masters) that the purpose of economic growth should be to generate sufficient leisure time to permit self-actualising activities for those who seek them.
UnderpantsWeevil@lemmy.world
on 05 Jun 2024 22:55
nextcollapse
I’ve been watching people try to deliver the end-to-end Food Making conveyor belt for my entire life. What I’ve consistently seen delivered are novelties, more prone to creating a giant mess in your mechanical kitchen than producing anything both efficient and edible. The closest I’ve seen are those microwaved dinners, and they’re hardly what I’d call an exciting meal.
But they are cheap to churn out. That’s what is ultimately upsetting about this overall trend. Not that we’ll be eliminating a chronic demand on human labor, but that we’ll be excising any amount of artistry or quality from the menu in order to sell people assembly line TV dinners at 100x markups in pursuit of another percentage point of GDP growth.
As more and more of the agricultural sector falls under the domain of business interests fixated on profits ahead of product, we’re going to see the volume and quality of food squeezed down into what a robot can shove through a tube.
captain_aggravated@sh.itjust.works
on 05 Jun 2024 23:29
nextcollapse
The notion that everyone must earn their own living is going to be a problem soon.
anon_8675309@lemmy.world
on 05 Jun 2024 23:40
collapse
The wealthy ruling class have siphoned off nearly all of the productivity gains since the 70s. AI won’t stop that machine. If half of us die of starvation and half the remaining half die from fighting each other for cake, they don’t care.
Welt@lazysoci.al
on 05 Jun 2024 22:58
nextcollapse
My blood runs cold! My dignity has just been sold.
nVidia is the centerfold.
RememberTheApollo_@lemmy.world
on 05 Jun 2024 23:46
nextcollapse
Where my futurists now? Tell me again how a technological advancement will free humans from drudgery to engage in more free and enlightened pursuits?
31337@sh.itjust.works
on 06 Jun 2024 00:36
nextcollapse
A lot of the “elites” (OpenAI board, Thiel, Andreessen, etc) are on the effective-accelerationism grift now. The idea is to disregard all negative effects of pursuing technological “progress,” because techno-capitalism will solve all problems. They support burning fossil fuels as fast as possible because that will enable “progress,” which will solve climate change (through geoengineering, presumably). I’ve seen some accelerationists write that it would be ok if AI destroys humanity, because it would be the next evolution of “intelligence.” I dunno if they’ve fallen for their own grift or not, but it’s obviously a very convenient belief for them.
Effective-accelerationism was first coined by Nick Land, who appears to be some kind of fascist.
rottingleaf@lemmy.zip
on 07 Jun 2024 09:06
collapse
The problem with this approach is that progress here is viewed like a brick wall you build.
You don’t get progress from just burning a lot of wood in 1400s. You can get it if that wood is burnt with the goal of, I dunno, making better metal or bricks for some specific mechanism.
Same with our time: how can they expect solutions to problems to be found when they don’t understand what they are trying to find?
It’s like a cargo cult - “white people had this thing and it could fly and drop cargo, so we must reproduce its shape and we’ll be rich”, only in this case it’s even dumber - nobody has seen the things they are trying to reach anywhere outside of space opera series.
What differentiates IT from most other engineering areas is that most of people doing it solve abstract tasks in abstract environments, defined by social and market demand. They are, sadly, simply a grade below real engineers and scientists for that reason alone.
Moorshou@lemmy.zip
on 06 Jun 2024 03:28
nextcollapse
Holy crap, I thought I hated AI and I was uncertain. Now I’m sure I hate AI
Tyrangle@lemmy.world
on 06 Jun 2024 04:20
collapse
For thousands of years the ruling class has tolerated the rest of us because they needed us for labor and protection. We’re approaching the first time in human history where this may no longer be the case. If any of us are invited to the AI utopia, I suspect it will only be to worship those who control it. I’m not sure what utility we’ll have to offer beyond that. I doubt they’ll keep us around just to collect UBI checks.
You can still upload the results to the cloud
The weird AI panic, driven by an attempt to repurpose the popular anti-crypto arguments whether or not they match the new challenges, is going to PR this tech into wasting way more energy than it would otherwise by distributing it over billions of computer devices paid for by individual users. And nobody is going to notice or care.
I do hate our media landscape sometimes.
But efficiency is not the only consideration; privacy and self-reliance are important facets as well. Your argument about efficient computing is 100% valid, but there is a lot more to it.
Oh, absolutely. There are plenty of good reasons to run any application locally, and a generative ML model is just another application. Some will make more sense running from server, some from client. That's not the issue.
My frustration is with the fact that a knee-jerk reaction took all the 100% valid concerns about wasteful power consumption in crypto and copy-pasted them onto AI, because people had so much fun dunking on cryptobros that they didn't have time for nuance. So instead of solving the problem, they added incentive for the tech companies owning this stuff to pass the hardware and power costs on to the consumer (which they were always going to do) and minimize the perception of "costly, water-chugging, power-hungry server farms".
It's very dumb. The entire conversation around this has been so dumb from every angle, from the idiot techbros announcing the singularity to the straight-faced arguments that machine learning models are copy-pasting things they find on the Internet to the hyperbolic figures on potential energy and water cost. Every single valid concern or morsel of opportunity has been blown way out of reasonable proportion.
It's a model of how our entire way of interacting with each other and with the world has changed online and I hate it with my entire self.
Thanks for the perspective. I despise the way the generative models destroy income for entry level artists, the unhealthy amount it is used to avoid learning and homework in schools, and how none of the productivity gains will be shared with the working class. So my view around it is incredibly biased and when I hear any argument that puts AI into bad light I accept it without enough critical thinking.
From what I've learned over the years: AI isn't likely to destroy income for entry-level artists. It destroys the quagmires those artists got stuck in. The artists this will replace first and foremost are those creating elevator music, unassuming PowerPoint presentation backgrounds, stock photos of coffee mugs. All those things where you really don't need anything specific and don't really want to think about anything.
Now look how much is being paid for those artworks by the customers on Shutterstock and the like. Almost nothing. Now imagine what Shutterstock pays their artists. Fuck all is what. Artists might get a shred of credit here and there, a few pennies, and that's that. The market AI is “disrupting” as they say, is a self-exploitative freelancing hellhole. Most of those artists cannot live off their work, and to be frank: Their work isn't worth enough to most people to pay them the money they'd need to live.
Yet, while they chase the carrot dangling in front of them, dreaming of fame and collecting enough notoriety through that work to one day do their real art, instead of interchangeable throwaway-stuff made to fit into any situation at once, Corporations continue to bleed them dry, not allowing any progress for them whatsoever. Or do you know who made the last image of a coffee mug you saw in some advert?
The artists who manage to make a living (digital and analog) are those who manage to cultivate a following, be that through Patreon, art exhibitions, whatever. Those artists will continue to make a living because people want them to do exactly what they do, not an imitation of it. They will continue to get commissioned because people want their specific style and ideas.
So in reality, it doesn't really destroy artists, it replaces one corpo-hellhole (freelancing artist) with another (freelancing AI trainer/prompter/etc)
I will keep that perspective in mind, thank you. I'm very held back by my own resistance and pushback against AI developments, and it's hard to warm up to something being shoved down our throats by these huge, malicious corporations and not be worried about how they will use it against us.
It sounds like one of the most impressive things in recent history and something that would fill me with joy and excitement but we’re in such a hostile environment that I am missing out on all that. I haven’t even managed to get myself to warm up to at least trying one out.
It's really not that exciting. Quite the opposite. The rush for AI in everything is absolutely bonkers, since those LLMs are just stupid as fuck and not suited for any sort of productive performance they get hyped up to achieve.
Ah, so you were only annoyed that people are against doing the stupid computations in the datacenter, and that there will be a less efficient grid version?
I'm annoyed that we're going crazy because computers manage to spew out bullshit that vaguely sounds like the bullshit humans spew out, yet is somehow even less intelligent. At the same time, people think this empty yapping is more accurate and totally a miracle, while all it really shows is that computers are good at patterns, and language and information follow patterns. Go figure.
I'm annoyed that Silicon Valley tech evangelists get away with breaking every law they fucking want, once again in the creation of those tools.
Yet, I'm neither worried about the ecological impact nor about the impact on the workforce. Yes, jobs will shift, but that was clear as day since I was a kid. I don't even necessarily think “AI” will be the huge game changer it's made up to be.
When they run out of training data (which is fueled by slave labor, because of fucking course it is) or AIs start ingesting too many AI-generated texts, the models we have today just collapse, disintegrating into a blabbering mess.
I think the same; I just grasp at every opportunity to be convinced otherwise, because it's such a bummer.
I think the idea was that these things are bad idea locally or otherwise, if you don’t control them.
No it wasn't. Here's how I know: all the valid concerns about how additional regulation would disproportionately stifle open-source alternatives were immediately ignored by the vocal online critics (and the corporate techbros overhyping sci-fi apocalypses). And then, when open alternatives appeared anyway, nobody on the critical side considered them appropriate or even a lesser evil. The narrative didn't move one bit.
Because it wasn't about openness or closeness, it was a tribal fight, like all the tribal fights we keep having, stoked by greed on one end and viral outrage on the other. It's excruciating to watch.
I wouldn’t say bad, but generative AI and LLMs are definitely underbaked, and shoving everything under the sun into them is going to create garbage in, garbage out. And using them for customer support, where they will inevitably offer bad advice or open you up to lawsuits, seems shortsighted to say the least.
They were calling the rest of this machine learning (ML) a couple of years ago. There are valid uses for ML, though. Image/video upscaling and image search are a couple of examples.
Modern media scares me more than AI
Honestly, a lot of the effects people attribute to "AI" as understood by this polemic are ongoing and got ignited by algorithmic searches first and then supercharged by social media. If anything, there are some ways in which the moral AI panic is actually triggering regulation that should have existed for ages.
Regulation is only going to prevent regular people from benefiting from AI while keeping it as a tool for the upper crust to continue to benefit. Artists are a Trojan horse on this.
We're thinking about different "regulation", and that's another place where extreme opinions have nuked the ground into glass.
Absolutely yeah, techbros are playing up the risks because they hope regulators looking for a cheap win will suddenly increase the cost for competitors, lock out open alternatives and grandfather them in as the only valid stewards of this supposedly apocalyptic technology. We probably shouldn't allow that.
But "maybe don't make an app that makes porn out of social media pictures of your underage ex girlfriend at the touch of a button" is probably reasonable, AI or no AI.
Software uses need some regulation like everything else does. Doesn't mean we need to sell the regulation to disingenuous corporations.
We already have laws that protect people when porn is made of them without consent. AI should be a tool that’s as free and open to be used as possible and built upon. Regulation is only going to turn it into a tool for the haves and restrict the have not’s. Of course you’re going to see justifiable reasons just like protecting children made sense during the satanic panics. Abuse happens in daycares across the countries. Satanists do exist. Pen pineapple apple pen.
It’s not like you control these things by making arguments that make no sense. They’re structured to ensure you agree with them, especially during the early-phase rollout; otherwise it would just become something that, again, never pans out the way we fear. Media is there to generate the fear and arguments to convince us to hobble ourselves.
No, that's not true at all. That's the exact same argument that the fearmongers are using to claim that traditional copyright already covers the use cases of media as AI training materials and so training is copying.
It's not. We have to acknowledge that there is some novel element to these, and novel elements may require novel frameworks. I think IP and copyright are broken anyway, but if the thing that makes us rethink them is the idea that machines can learn from a piece of media without storing a copy and spit out a similar output... well, we may need to look at that.
And if there is a significant change in how easily accessible, realistic or widespread certain abusive practices are we may need some adjustments there.
But that's not the same as saying that AI is going to get us to Terminator within five years and so Sam Altman is the only savior that can keep the grail of knowledge away from bad actors. Regulation should be effective and enable people to compete in the novel areas where there is opportunity.
Both of those things can be true at the same time. I promise you don't need to take the maximalist approach. You don't even need to take sides at all. That's the frustrating part of this whole thing.
I think we should stop applying broken and primitive regulations and laws created before any of this technology and ability was ever even dreamed of. Sorry to say but I don’t want to protect the lowly artist over the ability for people to collaborate and advance our knowledge and understanding forward. I want to see copyright, IP and other laws removed entirely.
We should have moved more towards the open sharing of all information. We have unnecessarily recreated all the problems of the predigital age and made them worse.
If it was up to me I would abolish copyright and IP laws. I would make every corner of the internet a place for sharing data and information, and anyone putting their work online would need to accept that it will be recreated, shared and improved upon. We all should have moved in a different direction than what we have now.
Oh, man, I do miss being a techno-utopian. It was the nineties, I had just acquired a 28.8k modem in high school, my teachers were warning me about the risks of algorithmically selected, personalized information and I was all "free the information, man" and "people will figure it out" and "the advantages of free information access outweigh the negatives of the technology used to get there".
And then I was so wrong. It's not even funny how wrong I was. Like, sitting on the smoldering corpse of democracy and going "well, that happened" wrong.
But hey, I'm sure we'll mess it up even further so you can get there as well.
For the record, I don't mean to defend the status quo with that. I agree that copyright and intellectual property are broken and should be fundamentally reformulated. Just... not with a libertarian, fully unregulated framework in mind.
Hi fellow traveler. I think you and I took a similar path to get here except I started with a 33.6k modem in high school and the catch phrase I remember is “Information wants to be free.” What’s your thought on copyright reform? Somewhere along the lines of 25 years and non-renewable? How you feeling about the concept of software/algorithm patents? Talking about stuff like this is reminding me of /. :)
Well, if this was travel and not a fall down a very long, very dark hole, then one of the stops was learning when to say "I don't know".
I don't have all the answers for copyright. I don't think my problem is primarily with terms. I'm probably closer to thinking perhaps the system should acknowledge where we landed consuetudinarily. Just let people share all materials, acknowledge a right of the original author to be the sole profit holder in for-profit exploitation. That's effectively how most of the Internet works anyway. Even then there's obviously tons of stuff we'd have to sort out. What happens with ownership transfer? What about terms? What about derivative work? Components of larger works? I don't know.
We're talking about reworking some of the biggest markets and industries on the planet from the ground up. It's not a shower thought, it's something a whole bunch of very smart people with different backgrounds should and would have to get together for years to put together. Probably on a global scale.
It's an absurd question to have a locked-down opinion about. The gap between being able to tell "yeah, duh, something's not working" and being able to fix it is enormous here. Figuring out that much is probably as far as my trip is gonna take me at this point. And I know even less about patent law.
Yes, yes, yes!
In fact, I think you could extend this to all art and possibly even to the online identity of all humans.
Efficiency at the consumer level is poor, but industry uses more total energy than consumers.
Yeeeeah, you're gonna have to break down that math for me.
Because if an output takes some amount of processing to generate and your energy cost per unit of compute is higher we're either missing something in that equation or we're breaking the laws of thermodynamics.
If the argument is that the industry uses more total energy because they keep the training in-house or because they do more of the compute at this point in time, that doesn't change things much, does it? The more of those tasks that get offloaded to the end user the more the balance will shift for generating outputs. As for training, that's a fixed cost. Technically the more you use a model the more the cost spreads out per query, and it's not like distributing the training load itself among user-level hardware would make its energy cost go down.
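To put the thermodynamics point in numbers, here's a back-of-envelope sketch. Every wattage and latency figure below is a made-up illustration, not a measurement of any real hardware:

```python
# Back-of-envelope: joules per query = watts * seconds.
# All figures below are illustrative assumptions, not measurements.

def joules_per_query(watts: float, seconds: float) -> float:
    """Energy in joules for one inference at a given draw and latency."""
    return watts * seconds

# Assumed: a datacenter accelerator draws more power but finishes much faster.
server = joules_per_query(watts=700, seconds=1.5)   # 1050 J
laptop = joules_per_query(watts=45, seconds=40.0)   # 1800 J

print(f"server: {server:.0f} J/query, laptop: {laptop:.0f} J/query")
# Under these assumptions, offloading the same query to the laptop costs
# more total energy, even though its instantaneous draw is far lower.
```

The point isn't the specific numbers; it's that "lower wattage" and "lower energy per output" are different claims, and only the second one matters for the total bill.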
The reality of it is that the entire thing was more than a bit demagogic. People are mad at the energy cost of chatbot search and image generation, but not at the same cost of image generation for videogame upscaling or frame interpolation, even if they're using the same methods and hardware. Like I said earlier, it's all carryover from the crypto outrage more than it is anything else.
If I make a gas engine with 100% heat efficiency but only run it in my backyard, do the greenhouse gases not count because it’s so efficient? Of course they do. The high efficiency of a data center is great, but that’s not what the article laments. The problem it’s calling out is the absurdly wasteful nature of why these farms will flourish: to power excessively animated programs to feign intelligence, vainly wasting power for what a simple program was already addressing.
It’s the same story with lighting. LEDs seemed like a savior for energy consumption because they were so efficient. Sure, they save energy overall (for now), but it prompted people to multiply the number of lights and total output by an order of magnitude simply because it’s so cheap. From this stems a secondary issue of further increasing light pollution and intrusion.
Greater efficiency doesn’t make things right if it comes with an increase in use.
For one thing, it's absolutely not true that what these apps provide is the same as what we had. That's another place where the AI grifters and the AI fearmongers are both lying. This is not a 1:1 replacement for older tech. Not on search, where LLM queries are less reliable at finding facts and being accurate but better at matching fuzzy searches without specific parameters. Not with image generation, obviously. Not with tools like upscaling, frame interpolation and so on.
For another, some of the numbers being thrown around are not realistic or factual, are not presented in context or are part of a power increase trend that was already ongoing with earlier applications. The average high end desktop PC used to run on 250W in the 90s, 500W in the 2000s. Mine now runs at 1000W. Playing a videogame used to burn as much power as a couple of lightbulbs, now it's the equivalent of turning on your microwave oven.
The argument that we are burning more power because we're using more compute for entertainment purposes is not factually incorrect, but it's both hyperbolic (some of the cost estimates being shared virally are deliberate overestimates taken out of context) and not inconsistent with how we use other computer features and have used other computer features for ages.
The only reason you're so mad about me wasting some energy asking an AI to generate a cute picture but not at me using an AI to generate frames for my videogame is that one of those is a viral panic that maps nicely into the viral panic about crypto people already had and the other is a frog that has been slow burning for three decades so people don't have a reason to have an opinion about it.
Nah, even that won't be. Because most of this workload is going to run on laptops and tablets and phones and it's going to run at lower qualities where the power cost per task is very manageable on hardware accelerated devices that will do it more efficiently.
The heavy load is going to stay on farms because nobody is going to wait half an hour and waste 20% of their battery making a picture of a cute panda eating a sandwich. They'll run heavily quantized language models as interfaces to basic apps and search engines and it'll do basic upscaling for video and other familiar tasks like that.
I'm not trying to be obtusely equidistant, it's just that software developers are neither wizards that will bring about the next industrial revolution because nobody else is smart enough... nor complete morons that can't balance the load of a task across a server and a client.
But it's true that they'll push as much of that compute and energy cost onto the user as possible, as a marketing ploy to sell new devices, if nothing else. And it's true that on the aggregate that will make the tasks less efficient and waste more heat and energy.
Also, I'm not sure how downvoted I am. Interoperable social networks are a great idea in concept, but see above about software developers. I assume the up/downvote comes from rolling a d20 and adding it to whatever the local votes are.
I guarantee you that much more power will be used as a result of the data centers regardless of how much efficiency they have per output.
Much more power than what? What's your benchmark here?
Is this a joke? I said it. It was a single sentence, you can’t parse that?
is greater than
Even if they’re more efficient, they’re also producing more output and taking more power as a result.
Yes, no, you said that. But since that is a meaningless statement I was expecting some clarification.
But nope, apparently we have now established that a device existing uses up more power than that device not existing.
Which is... accurate, I suppose, but also true of everything. Turns out, televisions? Also consume less power if they don't exist. Refrigerators. Washing machines? Lots less power by not existing.
So I suppose you're advocating a return to monke situation, but since I do appreciate having a telephone (which would, in fact, save power by not existing), we're going to have to agree to disagree.
LLMs major use is mimicking human beings at the cost of incredible amounts of electricity. Last I checked we have plenty of human beings and will all die if our power consumption keeps going up, so it’s absolutely not worth it. Comparing it to literally any useful technology is disingenuous.
And don’t go spouting some bullshit about it getting better over time, because the Datacenters aren’t being built in the hypothetical future when it is better, they’re being built NOW.
Look, I can suggest you start this thread over and read it from the top, because the ways this doesn't make much sense have been thoroughly explained.
Because this is a long one and if you were going to do that you would have already, I'll at least summarize the headlines: LLMs exist whether you like them or not. They can be quantized down to more reasonable power usage and are running well locally on laptops and tablets, burning just a few watts for just a few seconds (NOW, as you put it). They are just one application of ML tech, and are not useless at all (fuzzy searches with few specific parameters, accessibility features, context-rich explanations of out-of-context images or text), even if their valid uses are misrepresented by both advocates and detractors. They are far from the only commonplace computing task that now uses a lot more power than the equivalent did a few years ago, which is a larger issue than just the popularity of ML apps. Granting that LLMs will exist in any case, running them on a data center is more efficient, and the issue isn't just "power consumption" but also how the power is generated and what the reclamation of the waste products (in this case excess heat and used water) is on the other end.
I genuinely would not recommend that we engage in a back and forth breaking that down because, again, that's what this very long thread has been about already and a) I have heard every argument the AI moral panic has put forth (and the ones the dumb techbro singularity peddlers have put forth, too), and b) we'd just go down a circular rabbit hole of repeating what we've already established here over and over again and certainly not convince each other of anything (because see point A).
They exist at the current scale because we’re not regulating them, not whether we like it or not.
Absolutely not true. Regulations are both in place and in development, and none of them seem like they would prevent any of the applications currently in the market. I know the fearmongering side keeps arguing that a copyright case will stop the development of these but, to be clear, that's not going to happen. All it'll take is an extra line in an EULA to mitigate or investing in the dataset of someone who has a line in their EULA (Twitter, Reddit already, more to come for sure). The industry is actually quite fond of copyright-based training restrictions, as their main effect is most likely to be to close off open source alternatives and make it so that only Meta, Google, and MS/OpenAI can afford model training.
These are super not going away. Regulation is needed, but it's not restricting or eliminating these applications in any way that would make a dent on the also poorly understood power consumption costs.
Regulating markets absolutely does prevent practices in those markets. Literally the point.
Yeah, who's saying it doesn't? It prevents the practices it prevents and allows the rest of the practices.
The regulation you're going to see on this does not, in fact, prevent making LLMs or image generators, though. And it does not, in fact prevent running them and selling them to people.
You guys have gotten it in your head that training data permissions are going to be the roadblock here, and they're absolutely not going to be. There will be common sense options, like opt-outs and opt-out defaults by mandate, just like there are on issues of data privacy under GDPR, but not absolute bans by any means.
So how much did opt-out defaults under GDPR stop social media and advertising companies from running social media and advertising data businesses?
Exactly.
What that will do is make it so you have to own a large set of accessible data, like social media companies do. They are positively salivating at the possibility that AI training will require paying them, since they'll have a user agreement that demands allowing your data to be sold for training. Meanwhile, developers of open alternatives, who currently rely on a combination of openly accessible online data and monetized datasets put together specifically for research, will face more cost to develop alternatives. Apply enough cost pressure, the large AI corporations hope, and those developers will be bullied out of the market, or at least forced to lag behind in quality by several generations.
That's what's currently happening regarding regulation, along with a bunch of more reasonable guardrails about what you should and should not generate and so on. You'll notice I didn't mention anything about power or specific applications there. LLMs and image generators are not going away and their power consumption is not going to be impacted.
Doesn’t really make any sense. You could have 1 4090 running AI for a hundred people rather than a 4060 for a single person 24/7.
The article is really interesting and all your comments too.
For now I have a negative bias towards AI as I only see its downsides, but I can see that not everyone thinks like me and it’s great to share knowledge and understanding.
According to some people (who have never programmed and don’t know what AI can do), we will all be able to retire with a lot of money and we’ll all write poetry and become painters or make music and have fun. It’s not realistic and it won’t happen.
The only positive thing that AI can do is detect bad stuff in the human body before a surgery as long as it’s validated by a professional. I could throw everything else in the trash as it’s meant to replace humans forever.
This article is one of the most down-to-earth, realistic observations on technology I’ve ever read. Utterly striking as well.
Go Read This Article.
Agreed, stop scrolling the comments and go read it random reader.
I used to get so excited by tech advances but now I’ve gotten to the point where its still cool and a fascinating application of science… but this stuff is legitimately existential. The author raises great points around it.
Come on. Stop reading the comments. Go check the article.
This ironically(?) made me go read it. Normally I don’t.
Thank you.
This article is a regurgitation of every tech article since the microchip. There is literally nothing new here. Tech makes labor obsolete. Tech never considers the ramifications of tech.
These things have been known since the beginning of tech.
What about the climate impact? You didn’t even address that. That’s the worst part of the AI boom: we’re already way in the red on climate change, and this is going to accelerate the problem rather than slowing or stopping it (let alone reversing it).
That’s a very solvable problem though; AI can easily be run off green energy, and a lot of the new data centers being built are utilizing it. Tons are popping up in Seattle with its abundance of hydro energy. Compare that to meat production or transportation via combustion, which have a much harder transition, and this seems way less of an existential problem than the author makes it out to be.
Also most of the energy needed is for the training which can be done at any time, so it can be run on off peak hours. It can also absorb surpluses from solar energy in the middle of the day which can put strain on the grid.
This is all assuming it’s done right, which it may not be, and could deepen the ditch we’re already in, but the technology itself isn’t inherently bad.
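The off-peak/surplus scheduling idea is simple to express as a gate on grid carbon intensity. Everything here, the threshold and the intensity figures, is a made-up illustration, not a real API or real grid data:

```python
# Hypothetical scheduler gate: only run training batches when the grid's
# carbon intensity is low (e.g. a midday solar surplus, or off-peak hours).
# The threshold and the example intensities are illustrative assumptions.

def should_train(grams_co2_per_kwh: float, threshold: float = 200.0) -> bool:
    """Return True when grid carbon intensity is below the cutoff."""
    return grams_co2_per_kwh < threshold

print(should_train(120.0))  # midday solar surplus -> True
print(should_train(450.0))  # evening peak on gas -> False
```

In practice a real setup would poll a grid-intensity feed and checkpoint/resume the job, but the core decision is just this comparison.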
That right there is the problem. I don’t trust any tech CEO to do the right thing ever, because historically they haven’t. For every single technological advancement since the industrial revolution brought forth by the corporate class, masses of people have had to beat them up and shed blood to get them to stop being assholes for a beat and abuse and murder people a little less.
It doesn’t matter if AI is run on green energy as long as other things are still running on fossil fuels. There is a limit to how fast renewables energy sources are built and if the power consumption of AI eats away all of that growth, then the amount of fossil energy doesn’t change.
All increases in energy consumption are not green because they force something else to run on fossil energy for longer.
We need to deploy solar and wind at a breakneck pace to replace the fossil fuel usage we already have. Why compound that with a whole new source?
The tech that exists so far hasn’t had the potential to replace every job on earth; that’s the real difference for me.
This really doesn’t either tbh. But that’s certainly what they’re selling.
How do you know what the limits of this technology is? How do you know that they couldn’t be able to reach that point in 5-10-20-50-100-1000 years?
Unless you’re thinking of the current iteration of the technology and not its future evolutions.
Its future iterations that are definitely not this?
Sure, I don’t know.
I’d wager we’ll probably reach climate collapse / political crises that throw us off course before a “Westworld-esque” thing is ever possible.
People don’t seem to realize that these tech leaders are all just weaponizing your imagination against you (a.k.a. using a sales technique). GPUs and LLMs aren’t skynet no matter how much people want to project that onto them.
Nvidia cares maybe even less about the outcome than I do, they’ll sell you all the pickaxe you want to buy in the AI gold rush.
Well, ok then haha. You’ve convinced me.
Eh it’s not that great.
If you ignore the two fastest growing methods of power generation, which coincidentally are also carbon free, cheap and scalable, the future does indeed look bleak. But solar and wind do exist…
The rest is purely a policy rant. Yes, if productivity increases we need some way of distributing the gains from said productivity increase fairly across the population. But jumping to the conclusion that, since this is a challenge to be solved, the increase in productivity is bad, is just stupid.
Elysium incoming
But without even just the cool space station to just stare at longingly…
well yeah… they just don’t care, after all the climate crisis is somebody else’s problem… and what really matters is that the line goes up next quarter, mankind’s future be damned
All these issues are valid and need solving but I’m kind of tired of people implying we shouldn’t do certain work because of efficiency.
And tech gets all the scrutiny for some reason (its transparency?). I can’t recall the last time I’ve seen an article on industrial machine efficiency and how we should just stop producing whatever.
What we really need to do is find ways to improve efficiency on all work while moving towards carbon neutrality. All work is valid.
If I want to compute pi for no reason or drive to the Grand Canyon for lunch, I should be able to do so.
Anyone with experience in corporate operations will tell you the ROI on process changes is dramatically higher than technology. People invent so many stupid and dangerous ways to “improve” their work area. The worst part is that it just takes a little orchestration to understand their needs and use that creativity to everyone’s benefit.
lol at tech’s transparency. You have an availability heuristic issue with your thought process. Every other industry has similar critiques. Your media diet is leading you to false conclusions.
We’re literally in a technology community followed by tons of industry outsiders, of which there is a similar one on every other similar aggregation site. I don’t see any of that for things like plastics manufacturers, furniture makers, or miners. So yeah, I’d say transparency for the general public tends to be higher in tech than most other industries.
Efficiency??
This is about the total amount of emissions, not the emissions-per-unit-of-compute (or whatever).
Not sure what you’re getting at. Increased system efficiency lowers total emissions or at least increases work capacity.
Yeah, but no https://en.wikipedia.org/wiki/Jevons_paradox
The climate doesn't give a toss how much value for shareholders is generated, all that matters is the total amount of emissions.
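The Jevons point linked above fits in two lines of arithmetic. The 2x-efficiency and 3x-usage figures are made up purely for illustration:

```python
# Jevons paradox sketch: efficiency doubles, but usage triples, so total
# emissions still rise. total = usage * emissions_per_unit (arbitrary units).
old_usage, old_intensity = 100, 1.0   # baseline
new_usage, new_intensity = 300, 0.5   # 3x usage, 2x efficiency

old_total = old_usage * old_intensity  # 100.0
new_total = new_usage * new_intensity  # 150.0
print(old_total, new_total)            # the efficiency gain is swamped by growth
```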
Unless you’re looking to get rid of half of humanity and go back to living like the Amish I don’t think we can put that genie back in the bottle.
What we can do is work on how energy is generated and increase efficiency. And this has nothing to do with shareholders.
Are you able to explain why?
I’m sure I won’t be very eloquent about it but simply, liberty. Freedom of compute is on par with freedom of thought and expression.
Freedom of travel is something else, but I’m sure most people that don’t like being imprisoned can appreciate.
Work (as in energy expenditure) enables these freedoms and I think it’s important not to stifle that whenever possible.
This would be fine if there were no externalities.
“Hey fucker, your right to swing your fist ends where it collides with someone else’s face”
^ Dont make me tap the sign
Computing and leisure travel aren’t human rights, while freedoms of thought and expression are.
I just disagree. Computing is expression and in my opinion freedom of travel should be a human right.
Even if you add “leisure” to it to bolster your argument.
Well you are free to travel with your feet, because in 50 years from now I am not so sure you’ll be able to do it with a plane anymore.
Disagree that all work is valid. That only makes sense in a world with no resource constraints
Late-stage capitalism news, nothing new under the sun.
I like that the writer thought re climate change. I think it’s been 1 of the biggest global issues for a long time. I hope there’ll be increasing use of sustainable energy for not just data centers but the whole tech world in the coming years.
I think a digital waiter doesn’t need a rendered human face. We have food ordering kiosks. Those aren’t AI. I think those suffice. A self-checkout grocery kiosk doesn’t need a face either.
I think “client help” is where AI can at least aid. Imagine a firm that’s been operating for decades and has encountered so many kinds of client complaints. It can feed all that data to a large language model. With that model responding to most of the client complaints, the firm can reduce the number of its client support people. The model will pass the complaints that are too complex, or that it doesn’t know how to address, to the client support people. The model will handle the easy and medium complaints; the client support people will handle the rest.
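That triage flow could be sketched like this. Everything in it is hypothetical: the confidence scorer is a stand-in for a real model, and the threshold is an arbitrary assumption:

```python
# Hypothetical support triage: the model answers complaints it is confident
# about and escalates the rest to human staff. All names are illustrative.

def classify_difficulty(complaint: str) -> float:
    """Stand-in for a model's confidence score in [0, 1]."""
    return 0.9 if "refund" in complaint.lower() else 0.2

def route(complaint: str, threshold: float = 0.7) -> str:
    """Send easy/medium complaints to the model, the rest to humans."""
    if classify_difficulty(complaint) >= threshold:
        return "model"
    return "human"

print(route("Where is my refund?"))                       # model
print(route("My order arrived damaged and support hung up on me"))  # human
```

The interesting design question is where to set the threshold, since a too-confident model answering hard complaints is exactly the lawsuit scenario mentioned earlier in the thread.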
Idk whether the government or the public should stop ai from taking human jobs or let it. I’m torn. Optimistically, workers can find new jobs. But we should imagine that at least 1 human will be fired and can’t find a new job. He’ll be jobless for months. He’ll have an epic headache as he can’t pay next month’s bills.
Boiling the oceans for deepfake porn, scamcoins and worse web search.
Believe it or not, peak humanity.
It’s all downhill from here
I think we passed the peak a few years ago. But yeah, peak from here on out.
The Matrix was such a nice movie. In 2000 they already had Linux, PlayStation, ICQ, filesharing, old Star Wars (with a good chunk of the classical EU) and even the Phantom Menace (haters gonna hate), and the first 3 Harry Potter books. And WarCraft II, and X-Wing Alliance, and I’m lazy to go on with this
You still need a massive fleet of these to train those multi-billion parameter models.
On the invocation side, if you have a cloud SaaS service like ChatGPT, hosted Anthropic, or AWS Bedrock, these could answer questions quickly. But they cost a lot to operate at scale. I have a feeling the bean-counters are going to slow down the crazy overspending.
We’re heading into a world where edge computing is more cost- and energy-efficient to operate. It’s also more privacy-friendly. I’m more enthused about running these models on our phones and in-home devices. There, the race will be for TOPS vs power savings.
So if each GPU takes 1,800W, isn’t that the equivalent of what a handheld hair dryer consumes?
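For scale, a quick sanity check on that 1,800 W figure. The appliance wattages are typical ranges, not spec-sheet values for any product:

```python
# Compare an assumed 1,800 W per-GPU draw to common household appliances.
# Appliance figures are typical values, not measurements.
gpu_w = 1800
hair_dryer_w = 1500      # typical handheld dryer on its high setting
space_heater_w = 1500    # common portable heater maximum

print(gpu_w >= hair_dryer_w)    # True: comparable to a dryer on high
print(gpu_w > space_heater_w)   # True: above a typical space heater

# Run around the clock: 1800 W * 24 h = 43.2 kWh per GPU per day.
daily_kwh = gpu_w * 24 / 1000
print(daily_kwh)                # 43.2
```

The difference, of course, is that nobody runs a hair dryer 24/7, while these are meant to.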
It’s more than your average space heater.
And that energy doesn’t just go away after computing. You’ll have the equivalent of an average space heater of heat coming out of your computer. It’d be awesome to compute with heating energy when needed, but when you need AC it’s going to be a bitch.
Yes, but they are not gaming devices. They are meant to efficiently compute things. When used for that purpose they use little energy compared to other devices doing the same thing.
Yes, and you leave it on all day at full blast. And you have a dedicated building where there’s thousands of them doing the same.
All to take away jobs and break the internet.
Innovation is a scam; it breeds endless bullshit we keep buying and talking about like 10-year-olds with their latest gimmick.
Look, they replaced this button with A TOUCHSCREEN!
Look! This artificial face has PORES NOW!
LOOK! This coffee machine costs $2000 now and uses PROPRIETARY SUPER-EXPENSIVE CAPSULES!!
We need progress, which is harder to do because it takes a paradigm shift on an Individual and social level. It’s much less gadgety.
Tech is neither good nor bad, but control of tech is a major issue.
The existing capitalist control of tech is bad.
Agreed! And that’s where the problem lies. It’s not tech so much as our existing power structures.
You’re not wrong. We’ve reached a point, technologically, where there is little-to-no true innovation left… and what I mean by that is that everything is now built on incredible amounts of work by others who came before. “Standing on the shoulders of giants”, as it were. And yet we have a corrupt “patent” system that is exclusively used to steal the work of those giants while at the same time depriving all of humanity of true progress. And why? So that a handful of very rich people can get even more rich.
Exactly, innovations no longer help to satisfy real basic needs, they are used to create new, artificial needs. Always new toys that make us feel like we’re making progress.
That’s not true, but to have planned “innovation” bring profit you need to impede real progress. Cause real progress disrupts such plans.
I remember hearing this argument before…about the Internet. Glad that fad went away.
As it has always been, these technologies are being used to push us forward by teams of underpaid unnamed researchers with no interest in profit. Meanwhile you focus on the scammers and capitalists and unload your wallets to them, all while complaining about the lack of progress as measured by the products you see in advertisements.
Luckily, when you get that cancer diagnosis or your child is born with some rare disease, that progress will attend to your needs despite your ignorance of it.
Exactly. OP is mad at alienation, not at progress. In a different, less stupid world these labor saving devices would actually be great, leading to a better quality of life for everyone, and getting a really awesome coffee maker. But the people making the decisions aren’t the consumers or the researchers.
You misunderstood me. I have nothing against progress. Medical progress is great! But what is often sold to us as innovation is not progress but just more nonsense that only pretends to get us further.
Innovation is good if it results in clean water, meds, housing, safe food and goods and services.
It’s bad if it means the most profit for useless shit that people only buy because advertisement made them believe they need it.
Capitalism is a tool. Please, let’s grow a pair and stop letting it decide how it will be used. It’s like pulling the trigger on an AK-47 without holding it tight. Do we expect the weapon to know where to shoot?
Capitalism is a tool that wants to maximize its profits. Unfortunately it discovered that changing the politics and laws is an easy way to do that, even if it’s bad for the people.
Capitalism is by definition not bound to ethics or morals. We need to set rules, even if big corporations made us believe we shouldn’t.
That’s a strawman, possibly aimed at libertarians. Like everyone else, corps want to set rules which benefit them.
Sometimes it just takes a marginal improvement to the quality of the engineering. But these “what if manual labor but fascade of robots!” gimmicks aren’t improvements in engineering. They’re an effort to cut corners on quality in pursuit of a higher profit margin.
Even setting aside the suspicion that these are just a lineup of mechanical turks controlled from a sweatshop in the Philippines, their work product isn’t anything approaching good. It’s just cheap.
Fun fact: the first Mr. Coffee cost 300 dollars in 1971, which would be more than 2000 dollars today.
Improvement is not a scam.
Innovation is a scam created by representing change as improvement when it isn’t.
And every time change gets rebranded as innovation, it’s connected to totalitarian/fascist tendencies, because it makes it easier to sell societal change that is clearly not improvement.
A person who seriously affected my life recommended “Homo Ludens” by Johan Huizinga; I’m not sure whether it was because of the part about fascism in the 30s.
Nice. Now combine this with shit-for-brains coin bros and we can all boil together.
THE MORE YOU BUY
THE MORE YOU SAVE
Well, this is fucking bleak. Everybody, I urge you to read this article.
I did it because you urged me.
what a dumb future
I’m starting to wonder if the Butlerian Jihad isn’t a good idea after all.
Is nobody concerned about this not being seen as the paradise? It’s like the Enterprise crew being concerned about replicators because people will lose their jobs.
This is madness. To be honest, this is what humankind ultimately should evolve into: no stupid labour for anyone. But the truth is, capitalism will take care of that; it will make sure that not everyone is free, but that a small percentage is more free and the rest is fucked. There lies the problem, not in being able to make human labour obsolete.
@shrugs @rwtwm I think the enterprise crew would be concerned if the Ferengi owned said replicators.
The issue with “Human jobs will be replaced” is that society still requires humans to have a paying job to survive.
I would love a world where nobody had to do dumb labour anymore, and everyone’s needs are still met.
Yup. The realistic result of things becoming automated is that we have several decades of social strife grappling with the fact that there are too many people for the amount of human labor actually needed, until there’s enough possibly violent unrest for the powers that be to realize “oh, maybe we shouldn’t require people to have jobs that don’t exist.”
Solid agree, but it’s so hard to persuade the brainwashed (let alone their capitalist masters) that the purpose of economic growth should be to generate sufficient leisure time to permit self-actualising activities for those who seek them.
I’ve been watching people try to deliver the end-to-end Food Making conveyor belt for my entire life. What I’ve consistently seen delivered are novelties, more prone to creating a giant mess in your mechanical kitchen than producing anything both efficient and edible. The closest I’ve seen are those microwaved dinners, and they’re hardly what I’d call an exciting meal.
But they are cheap to churn out. That’s what is ultimately upsetting about this overall trend. Not that we’ll be eliminating a chronic demand on human labor, but that we’ll be excising any amount of artistry or quality from the menu in order to sell people assembly line TV dinners at 100x markups in pursuit of another percentage point of GDP growth.
As more and more of the agricultural sector falls under the domain of business interests fixated on profits ahead of product, we’re going to see the volume and quality of food squeezed down into what a robot can shove through a tube.
The notion that everyone must earn their own living is going to be a problem soon.
The wealthy ruling class have siphoned off nearly all of the productivity gains since the 70s. AI won’t stop that machine. If half of us die of starvation and half the remaining half die from fighting each other for cake, they don’t care.
My blood runs cold! My dignity has just been sold. nVidia is the centerfold.
Where my futurists now? Tell me again how a technological advancement will free humans from drudgery to engage in more free and enlightened pursuits?
A lot of the “elites” (OpenAI board, Thiel, Andreessen, etc) are on the effective-accelerationism grift now. The idea is to disregard all negative effects of pursuing technological “progress,” because techno-capitalism will solve all problems. They support burning fossil fuels as fast as possible because that will enable “progress,” which will solve climate change (through geoengineering, presumably). I’ve seen some accelerationists write that it would be ok if AI destroys humanity, because it would be the next evolution of “intelligence.” I dunno if they’ve fallen for their own grift or not, but it’s obviously a very convenient belief for them.
Effective-accelerationism was first coined by Nick Land, who appears to be some kind of fascist.
The problem with this approach is that progress here is viewed like a brick wall you build.
You don’t get progress from just burning a lot of wood in 1400s. You can get it if that wood is burnt with the goal of, I dunno, making better metal or bricks for some specific mechanism.
Same with our time: how can they expect solutions to problems to be found when they don’t understand what they’re trying to find?
It’s like a cargo cult: “white people had this thing, and it could fly and drop cargo, so we must reproduce its shape and we’ll be rich.” Only in this case it’s even dumber: nobody has seen the things they’re trying to reach anywhere outside of space opera series.
What differentiates IT from most other engineering fields is that most of the people doing it solve abstract tasks in abstract environments, defined by social and market demand. They are, sadly, simply a grade below real engineers and scientists for that reason alone.
Holy crap. I thought I hated AI, but I was uncertain. Now I’m sure I hate AI.
For thousands of years the ruling class has tolerated the rest of us because they needed us for labor and protection. We’re approaching the first time in human history where this may no longer be the case. If any of us are invited to the AI utopia, I suspect it will only be to worship those who control it. I’m not sure what utility we’ll have to offer beyond that. I doubt they’ll keep us around just to collect UBI checks.