henfredemars@infosec.pub
on 14 Mar 2025 21:25
Far more impressive and interesting news.
Butterbee@beehaw.org
on 14 Mar 2025 22:58
Uranus is rising
knighthawk0811@lemmy.ml
on 14 Mar 2025 20:53
hmm, this might actually be good
knighthawk0811@lemmy.ml
on 14 Mar 2025 22:30
nevermind, it’s going to eat up my phone’s resources
a1studmuffin@aussie.zone
on 14 Mar 2025 23:59
If it is, get ready for the subscription plan.
cronenthal@discuss.tchncs.de
on 14 Mar 2025 21:21
Every couple of years I try out the latest “assistant”. I ask very simple queries, stuff that you might expect it to be able to handle. It always disappoints hilariously.
Questions where previous assistants consistently and utterly failed: “What’s the weather going to be like today?” “I want to go to [address], using public transport, can you tell me which routes I have to take?” “What are my appointments today?”
You’d think these are pretty obvious use-cases, but every iteration of assistants was completely incapable of getting even close to a satisfying answer. I don’t expect LLMs to fare better this time around.
user224@lemmy.sdf.org
on 14 Mar 2025 21:30
Every use I had for the assistant boiled down to “What’s this song?”, so I just replaced it with Shazam.
As for Gemini… having tried it, you’ll certainly get an answer, it just might be based on random magic.
entropicdrift@lemmy.sdf.org
on 14 Mar 2025 22:23
I just tried those questions with Gemini on my phone. It got them all right. For the directions one, it gave me a Google Maps link with the route already dialed in, set to public transit mode.
I only use voice to interact with my phone when I’m driving and want to set a destination, or if I’m at home and can’t find my phone.
Somehow I doubt that AI will help tell me my phone fell off the nightstand and is halfway under the bed.
Flax_vert@feddit.uk
on 14 Mar 2025 21:30
I remember asking my phone to navigate for me, as I often do. Far easier than opening up a map, looking for locations, etc. Just what Google Assistant was always supposed to be for. It was cold and I was wearing gloves, too. But this time it was Gemini on my phone. Instead of passing the query to Google Maps, with its live times and delays plugged in, she just started reciting directions at me from outdated bus timetables. Another time I asked it to set a timer and it started giving me instructions on how to set a timer on both Android and iOS. Thankfully I figured out how to switch it back to Assistant. Sure, an LLM is great when you’re having a debate in the car and need to ask it for info, but not for everything. It’s like the whole “ChatGPT vs Stockfish” chess match.
I think it’d be cool if it knew all of the safe commands on your phone and had a prompt to “return a list of commands, in order, from the user’s prompt”. So, say you want to play a playlist, set it to shuffle but only queue five songs: it should be able to do that (roughly the idea in the sketch below). I feel like the people who make this stuff don’t actually use it. Like on Google Maps on Android Auto, some buttons, such as the mute-directions one, are so tiny and could easily be enlarged, even if it’s less aesthetically pleasing.
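Here’s a minimal sketch of what I mean, in Kotlin. Everything in it is made up for illustration: the SafeCommand names, the one-command-per-line reply format, and parseModelReply are assumptions, not any real Assistant or Gemini API. The point is just that the model only ever picks from a fixed menu of safe actions.
```kotlin
// Hypothetical sketch only: none of these names come from a real Assistant or Gemini API.
// The model is prompted to reply with one whitelisted command per line;
// anything outside the whitelist is dropped instead of executed.

enum class SafeCommand { PLAY_PLAYLIST, SET_SHUFFLE, QUEUE_SONGS, SET_TIMER }

data class CommandCall(val command: SafeCommand, val arg: String?)

// Parse the model's reply ("COMMAND argument" per line) against the whitelist.
fun parseModelReply(reply: String): List<CommandCall> =
    reply.lines()
        .map { it.trim() }
        .filter { it.isNotEmpty() }
        .mapNotNull { line ->
            val parts = line.split(" ", limit = 2)
            val cmd = SafeCommand.values().find { it.name == parts[0] } ?: return@mapNotNull null
            CommandCall(cmd, parts.getOrNull(1))
        }

fun main() {
    // Imagined model output for "play my gym playlist on shuffle and queue five songs".
    // FORMAT_STORAGE is not on the whitelist, so it never reaches the phone.
    val reply = """
        PLAY_PLAYLIST gym
        SET_SHUFFLE on
        QUEUE_SONGS 5
        FORMAT_STORAGE now
    """.trimIndent()

    parseModelReply(reply).forEach { println("${it.command} -> ${it.arg ?: ""}") }
}
```
On a real phone each SafeCommand would presumably map to an intent or media-session call, but whatever the model emits outside that list simply never gets executed.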
cy_narrator@discuss.tchncs.de
on 15 Mar 2025 01:26
I’m replacing my arse with Sagittarius.
It’s not that great for info either
Which only makes sense