“Have any of you realized how much money we spent on this?!”
Basically, yeah.
“But the results are objectively much worse than if I just did it myself, sir!”
You have 10 minutes to clear your desk and get out. Not a team player!
American employers don’t even give you this anymore. You are escorted away by security and someone else empties your shit into a box and hands it to you in the lobby. They are very afraid of sabotage.
Seems like in the USA everyone gets treated badly all of the time, except the very richest.
Either that, or you make yourself indispensable. What the C-Suites do all day, I have no idea. Whatever it is isn’t working though.
When did they ever? I remember when one of my parents got fired in the 90s, they sent the stuff from the desk in a box. Including the company desk phone!
“You’re firing me for using AI to read and respond to your email?”
“No one cares about the quality of your work, only the quantity!”
Someone has to generate the bugs we pay you to fix.
It’s to use the employees to train AI to replace them, and they know it.
AI definitely can’t replace many (if any) Microsoft employees.
I think “shouldn’t” is better to say than “can’t”. They are definitely going to try.
.
Sam Altman makes money from people believing this.
.
Right now at least, AI is more of a headache than anything else in coding. Microsoft itself was responsible for one such gaffe in May, when an actual coder had to tell the AI to fix an error again and again, as each time it’d make a different mistake.
I use ChatGPT to write code fairly often. Because I don’t know how. ChatGPT never gets it right the first time, usually doesn’t get it right by the 10th try, and will never stop going down a robot hole of inaccuracy until I give up. The only success I have had in recent memory was getting some custom commands written in Karabiner for my desktop mice.
.
Lol
Their hope is probably that AI can let current employees bear a greater workload so they can downsize.
Ding! Any gains in productivity will mean more work for fewer people.
Anyone who can’t see this coming - I have several bridges for sale.
Yeah, and what it should mean is the same productivity (or slightly higher) over fewer hours worked. So everyone can get more of their lives back to go be happy and spend time with their friends and families. Or literally whatever else people would rather be doing besides working all the damn time.
This is the material explanation. They expect increased productivity and therefore higher output and therefore higher profits from the same workforce. Not necessarily to downsize. Downsizing or upsizing would be dictated by a combination of the realized productivity gains and the uptake of their products by the market.
Frankly, with the garbage Microsoft is producing these days, and the rate at which the quality, for lack of a better word, is degenerating, I’m starting to wonder whether LLM slop might actually be less worse…
Microsoft support was already mostly useless. So, yeah, a useless AI probably could replace that, but it would also probably be more expensive.
Not even the guys who call me on the phone to tell me that I have a virus on my computer?
suits have been replacing long-term essential employees with outsourced trash even before this, in the name of global redundancy and efficiency. now they’ll just use the AI buzzword to hide behind.
Nah, it’s just part of the MLM scheme that is “AI”. It’s useful because they said it would be useful. It’s worth the investment because it cost a lot of money. Once you realize that all these companies care about is revenue and “growth”, then it all clicks. It doesn’t have to work or be profitable, it just needs to look good to investors.
They will even go as far as firing loads of workers and saying publicly that they “replaced them with AI” while in reality those workers were just doing something that the company was willing to sacrifice. They just replaced something with nothing to make it look like their magic AI can actually do things.
Cory Doctorow put it better than I ever could: pluralistic.net/2025/05/07/rah-rah-rasputin/
The whole post is good but I will just quote this section:
The “boy genius” story is an example of Silicon Valley’s storied “reality distortion field,” pioneered by Steve Jobs. Like Jobs, Zuck is a Texas marksman, who fires a shotgun into the side of a barn and then draws a target around the holes. Jobs is remembered for his successes, and forgiven his (many, many) flops, and so is Zuck. The fact that pivot to video was well understood to have been a catastrophic scam didn’t stop people from believing Zuck when he announced “metaverse.”
Zuck lost more than $70b on metaverse, but, being a boy genius Texas marksman, he is still able to inspire confidence from credulous investors. Zuck’s AI initiatives generated huge interest in Meta’s stock, with investors betting that Zuck would find ways to keep Meta’s growth going, despite the fact that AI has the worst unit economics of any tech venture in living memory. AI is a business that gets more expensive as time goes on, and where the market’s willingness to pay goes down over time. This makes the old dotcom economics of “losing money on every sale, but making it up in volume” look positively rosy.
The AI said that trying to reason with you is a waste of precious tokens.
Malicious compliance and use it solely for internal emails.
Microsoft is in the process of downsizing to the tune of 3% of its global workforce and rising.
Could be they really are unironically cruising towards a CEO overseeing a bunch of spam bot email accounts they’re treating as headcount.
Can I send an AI bot to all my Teams meetings? THAT would actually increase my productivity.
Good luck with that Microsoft
I had an interesting conversation today with an acquaintance. He has sent his resumé to dozens of companies now. Most of them, but not all, corporate blobs.
He wondered for a while just why the hell no one is even reaching out (he’s definitely qualified for most of the positions). He then had the idea to ask a particular commercial Artificial Stupidity software to parse it. Most of those companies use that software, or at least that’s what the vendor says on its website. Turns out, that PoS software gets it all wrong. As in: everything. Positions and companies get mixed up, dates aren’t correctly registered, and the job descriptions it claims to have understood only remotely match what he wrote. Read: things even the most junior programmer with two weeks of experience would get right.
And it is getting used pretty much by every big firm out there.
Oh and BTW: there is ONE correct answer to the phrase ‘using AI is no longer optional’: Fuck you.
I’m gonna be looking for a new job soon and I’ve been reading stuff like this more & more. Makes me really scared. I guess reaching out to recruiters directly via LinkedIn is more important than ever. I also hope the AI software hasn’t made its way down to small/medium-sized companies yet, since those are the ones I’d rather work for anyways
Sadly, those are worse. Since they don’t have the staff or expertise, most of the time they outsource to larger companies… that use AI. I’m almost 99% positive at this point if any of the sites use Workday, it’s getting parsed by an AI because that’s what ours does and it’s a PITA.
Your resume being ATS compatible is a non-negotiable point nowadays.
That’s not AI. That’s just ATS. And it’s been shit for years. Definitely, definitely, make sure your resume is ATS compatible. Use the scanners.
Any scanner recommendations?
Jobscan.co, the free version
Yes, but can it tell the business why it can’t deliver on time when they change the requirements 3 different times?
I can tell
Chowtime boys 🐕
Same at my company. The frustrating part is they want us to use coding assistance, which is fine, but I really don’t code that much. I spend most of my time talking to other teams and vendors, reading docs, filing tickets, and trying to assign tasks to Jr devs. For AI to help me with that I need to either type all of my thoughts into the LLM which isn’t efficient at all or I need it to integrate with systems I’m not allowed to integrate with because there are SLOs that need to be maintained (i.e. can’t hammer the API and make others experience worse).
So it’s pretty much the same as it’s always been. Instead of making a gallon of lemonade out of one lemon I need to use this “new lemonade machine” to start a multinational lemonade business.
The key highlight being: you don’t need more than a gallon of lemonade. I for once wish big corps listened to their engineers and domain experts over Wall Street-loving execs.
Why would they do that? If they’re making better quarterly results by listening to Wall St, that’s what the system tells them to do.
Ditto.
But I manage a team of embedded developers. On a specialised commercially restricted embedded platform.
AI does not know a thing about our tech. The stuff it does know is either a violation of the vendor’s contractual covenants or made-up bullshit. And our vendor’s documentation is supplemented by cumulative decades of knowledge.
Yet still “you gotta use AI”.
At my company too but it’s owned by yet another cancerous private equity firm so it was expected.
At your next job interview ask them if they are results driven or methodology driven. “If I were to take twice as long to do something by using a poorly designed tool will I be rewarded or punished?”
Being judged by a fancy magic 8 ball, the future keeps getting better and better.
Corporate monopoly with overpriced products doing corporate shit
Apparently no longer optional for their customers either, based on how hard they are pushing it in Office 365, sorry Microsoft 365, no sorry Microsoft 365 Copilot.
The latest change of dumping you into a Copilot chat immediately on login and hiding all the actually useful stuff is just desperation incarnate.
The process to log in to the online portal of Outlook is so bad it’s crossed into comical territory. So much friction, only to shunt you to a full screen clippy copilot page.
I’d be curious to know what the usage statistics are for that page. Like, what could a person possibly accomplish there?
It’s called dog-slopping
Dog-fooding, but instead of food, it’s a dog eating its own vomit.
How very corporate of them: people don't want to do something? Screw finding out why, let's make it mandatory and poof, problem solved!
They must really want their workforce to be less efficient while dramatically lowering quality and security across the board.
Hackers are about to have a golden era
Slopsquatting is already taking off
They are banking on the AI eventually being smart enough that it will replace the workers that fed it.
They should have just invested in NFTs instead.
except programmers are gonna continue with what they were already doing, at most putting a script on copilot to get the metrics
don’t forget that if you don’t turn in the project in time you’re fired, the issues always get thrown at the coder, it’s never the company’s fault
Company I’m at also does the forced AI and it’s all but mandatory now. Problem is as code monkeys we’re past the point of heading down to the Winchester for a pint until it blows over. They’re pushing so hard in order to “not fall behind” that you literally can’t escape it. I think even malicious compliance won’t cut it. And when 8/10 companies that dictate the market say that “this is the future”, then this is the future they’ll make whether we like it or not.
Edit: the silver lining is that we’re working with tools that are better than Copilot at menial work like generating boilerplate code, unit tests, release notes, walls of text for app documentation, etc.
release notes and app documentation:
memes aside, are they blaming the coders or the AI for the slower project turnout?
We’re in the honeymoon phase, shit didn’t hit the fan yet. Problem is we devs are fucked either way. If productivity does increase, then workforce demand will go down especially for entry level devs and seniors will be relegated to vibe coding and fixing AI bugs. If it all goes south then layoffs, because line must go up!
I feel like this is going to cause so many problems in the near future. They’re not ready for it and they don’t even know.
Yuuuup, this is my company too. They’re monitoring our GH Copilot / Cursor usage and they’re going to apply it to our performance reviews.
Malicious compliance time: full-on vibe coding, just accept all changes. Who cares about optimisation, readability, or documentation? You’re using AI, anything goes.
In the list of things nobody cares about, you forgot “actually do what’s asked”. Use these tools for a very short while and be amazed at how bad they are at doing things that are extremely well known and documented.
Really fascinating how this is happening in coordination all of a sudden. I’m practically certain that this is all coming from a small group of investors (maybe even just a couple) who are trying to influence companies as hard as they can into making everyone start using it.
Windows is already garbage when humans are coding it.
The only hope for Microsoft is if Xbox takes over all of Microsoft and transforms the whole company
Sorry best we can do is have the marketing department take over XBox and now it’s just a logo they license out.
Fine, do whatever you want to your shit company, but stop forcing me to use Copilot on everything. This is worse than the failed Clippy.
As a heavy AI user on a daily basis…Copilot is hands down one of, if not the worst, in existence.
This will not end well for them.
Copilot is literally ChatGPT
It can’t be literally chatgpt and use different letters. One of them has an O.
because it uses ChatGPT for its backend
I think they’re being sarcastic lol
am being serious
I asked chatgpt (because this shitpost needs more AI), and it said:
So while their names are different, the core AI technology is the same! It’s like having two different brands (say, Ford and Tesla) both using electric motors—the motor is the same technology, just in different vehicles.
But comparing suicidal cars to a Ford is total baloney, so ChatGPT is full of shit.
It’s been an honour to waste your time and 72L of water to get the AI to shitpost.
System prompt and other tooling make some difference.
Yeah and ChatGPT sucks compared to many.
Have found this too.
We were only allowed to use Copilot at my last job, ChatGPT + the others were all blacklisted.
We received SO many tickets from users across the organisation (including IT) requesting access to ChatGPT.
If I ask Copilot to help write a Power Automate script or similar, I guarantee it will not work. It won’t make sense. And if I do it from within Power Automate, it wants to replace what I already have there. It’s a mess.
If I pull up ChatGPT on my phone (because it’s blacklisted at work) I will get very clear, step-by-step instructions that are usually good enough, or occasionally not correct but close enough that I can tweak it back on course.
Copilot suggested edit: This statement contains false information.
-As a heavy AI user on a daily basis…Copilot is hands down one of, if not the worst, in existence.
+As a heavy AI user on a daily basis…Copilot is hands down one of, if not the best, in existence.
while true; do curl 'http://copilot/?query=what+is+the+time'; sleep 10; done
Bet the AI can’t see through this.
Need to throw in a randomizer:
while true; do
  curl "http://copilot/?query=what+is+the+time+in+${RANDOM}+seconds"
  sleep 10
done
Ironically enough, that’s exactly the kind of seemingly “simple” question a 4 year old could answer… but LLMs can’t.
I asked my better half to test DeepSeek locally a few months ago and they, without trying to “trick” it (as I would have), genuinely asked “What time is it in Sri Lanka?”. That made me smile because I was rather sure there was no way the model could answer that. It would need to know the current time in some time zone and then, if that zone isn’t Sri Lanka’s already (which it wasn’t on my local system), convert it. That’s very basic arithmetic (that some 4 year olds could also do), but not “just” spitting back words related to the question.
Guess what… it failed exactly as expected. The model replied with “information” (which is being generous for a string of words arguably related to the topic, which was mostly about Sri Lanka, not time) that was basically irrelevant and thus useless.
So… yes, I’m not actually sure Copilot could even help there unless there is a lot of custom handling of this kind of query upstream!
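For comparison, conventional tooling answers this in one line. A minimal sketch, assuming GNU coreutils date and the standard tz database:
# Ask the OS directly: the tz database already knows Sri Lanka's offset
TZ="Asia/Colombo" date
# Or do the arithmetic the comment describes: current UTC time plus Sri Lanka's +5:30 offset
date -u -d '+5 hours +30 minutes' '+%H:%M'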
sleep 10? why not ddos your own company till copilot shuts you off
This is ridiculous. Have people seen the recent AI code review from Audacity?? This whole AI bubble needs to burst already.
Got me curious, spill the tea sister!
hails.org/@hailey/114752144098708214
To sum up, it’s the tale as old as time (~2023): an LLM being entirely useless for a task that could be done perfectly by other tooling.
Edit: a word
Did AI tell you to use “tail” there?
No, that was good old-fashioned human stupidity. Fixing.
I wonder, did they have to do this when search engines became a thing 🤔
You don’t need to wonder, you can just Bing™ it!
Start using AI to write all your emails and communication with managers. Turn it up to LinkedIn max.
Using AI isn’t optional? How about you review me on the results I produce instead of the tools I use to produce them?
Results are nice, but shortsightedly juicing the appearance of shareholder value for a single quarter is forever… Somehow.
That doesn’t help pump up the AI bubble, unfortunately.
My mistake. Please forgive me. I’ll pray to Supreme Gates and focus on my KPIs.
As if people coding things makes MS any money; it’s pure extraction through Windows and Office, and they need AI to be next.
🚨 New drop: Tesla VIN Registry Leak
Pulled from charging station sync logs
VIN, full name, email – 2025 models
🪙 0.0024 BTC
DM or visit onyx.fwh.is
#tesla #vinleak #cybersecurity
It’s the same in Amazon software development. We have like 3 different AI tools. I enjoy it for unit tests and predicting the next two lines of a simple thing, but it’s not going to refactor our codebase.
this makes me even more excited for my plans to switch to linux. I’m gonna have to go find a good backup method soon!
Microsoft is cooked.
Translation: you will now train your eventual replacement.
Can’t wait for code quality to drop, work to become more inefficient, and Microsoft ditching AI.