SnoringEarthworm@sh.itjust.works
on 03 Oct 03:13
Signal CEO Whittaker said that in the worst case scenario, they would work with partners and the community to see if they could find ways to circumvent these rules. Signal also did this when the app was blocked in Russia or Iran. “But ultimately, we would leave the market before we had to comply with dangerous laws like these.”
This is why we need the ability to sideload apps.
NocturnalEngineer@lemmy.world
on 03 Oct 03:42
Most likely the reason, among others, that they’re fighting tooth and nail to remove sideloading too.
lmmarsano@lemmynsfw.com
on 03 Oct 04:50
Are they?
Google will soon stop you sideloading unverified apps – here’s what that means for you
ie, unsigned, so they are not
Sideloading is still available: you can sign it yourself or bypass verification with adb, as they documented. So, cool misinformation.
SnoringEarthworm@sh.itjust.works
on 03 Oct 05:00
Bruh, you’re trying to sanewash this of all things? Right now I can go to any third-party app store and click install on an app without me nor the developer having to kiss the ring of Google or by extension the regulators (EU with Chat Control) that they are beholden to.
After this I’ll have to fucking install Google’s SDK on my computer, manually download application files, and deploy them to my device over USB with CLI commands. I will never ever ever be able to get friends and family access to third-party applications after this change.
And fuck, man, there’s not even a guarantee this solution will last, either. Google promised they would allow on-device sideloading back when they started adding deeper and deeper settings restrictions on enabling sideloaded app support, their word means fuck-all and you know that.
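For reference, the over-USB workflow being described boils down to a couple of platform-tools commands today; a minimal sketch, assuming adb from Google’s SDK platform tools is on the PATH and USB debugging is enabled on the phone, with the APK path made up for illustration.

```python
# Minimal sketch of sideloading an APK over USB with adb; assumes the Android
# platform tools are installed and USB debugging is enabled on the phone.
import subprocess

def sideload(apk_path: str) -> None:
    # List connected devices first, so a missing or unauthorised phone fails loudly.
    subprocess.run(["adb", "devices"], check=True)
    # -r reinstalls/updates the app if it is already present.
    subprocess.run(["adb", "install", "-r", apk_path], check=True)

sideload("downloads/some-third-party-app.apk")  # hypothetical file name
```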
You misidentified your objection.
It isn’t sideloading removal, which isn’t happening.
It’s developer verification, which affects the sideloading that remains available.
Just because you don’t understand the value of verifying signatures doesn’t mean it lacks value.
I recall the same alarm over secureboot: there, too, we can (load our certificates into secureboot and) sign everything ourselves.
This locks down the system from boot-time attacks.
I will never ever ever be able to get friends and family access to third-party applications after this change.
Then sign it: problem solved.
Developer verification should also make it hard enough for them to install trash that fucks their system and steals their information, when that trash is unsigned or signed & suspended.
there’s not even a guarantee this solution will last, either. Google promised they would allow on-device sideloading
Even so, it’s mentioned only in regard to devices certified for and that ship with Play Protect, which I’m pretty sure can be disabled. Promise kept.
their word means fuck-all and you know that
No, I don’t. Developers are always going to need some way to load their unfinished work.
Adb is functionally useless for most people.
That’s twice that you’ve missed the point that everyone else is saying. Read it again:
without me nor the developer having to kiss the ring of Google or by extension the regulators (EU with Chat Control) that they are beholden to
Google is irreversibly designating themselves the sole arbiter of what apps can be freely installed in the formerly-open Android ecosystem. It’s the same as if they just one day decided that Chromium-based browsers would require sites have a signature from Google and Google alone. I honestly don’t give a shit if they did it just on Pixel devices, but they’re doing it to the phones of ALL manufacturers by looping it into Play services.
I just don’t understand: why the fuck are you so pussy-whipped by Google that you’re stanning their blatant power grabs?
Probably works at google or is a fanboy.
They’re being precise about their terms, while everyone else is being sloppy. Not stanning.
I don’t understand why you can’t read: (1) developer verification can be disabled, bypassed, or worked with, (2) you called it sideloading removal, which it isn’t.
You just don’t like the extra steps that limit the ease for ignorant users to install software known to be malicious that could have been blocked.
I don’t like handholding my dumbass folks through preventable IT problems they created.
This does fuck all for “security”. It mainly targets power users and just adds more hoops for developers. This has nothing to do with security (they should purge malware from the Play Store first) and everything to do with consolidating power over users.
It’s a blatant power grab and I’m surprised to see this interpreted as anything else. Arguing about semantics just helps Google fuck everyone over.
So let me buy a goddamn phone that I can install what I want in it. Again, I do not give a shit about any phone manufacturers that want to make a walled garden out of their Android installations. I agree, it’s perfect for the grandmas of the world. But Google is forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast.
The only silver lining is that whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit.
forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast
Seems you don’t care about grandmas & gen z. They can manage.
whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit
So casual users can get wrecked, yet I’m malicious?
Maybe think of users other than yourself, weigh the potential losses to them by successful attacks, and consider whether OS designers have a legitimate claim in preventing exposure of known threats to casual users while still allowing power users to bypass those checks.
You’re assuming I use an Android app (trash) to get on here, and not a proper workstation or web browser.
You’re welcome to this “malicious horseshit” for eternity.
developer verification can be disabled, bypassed, or worked with
In reality this is useless given the technical capabilities (or access to the technology necessary) of nearly every android user. What percentage of them do you think has the capacity and capability to use ADB?
you called it sideloading removal, which it isn’t.
Strictly speaking it ticks the box; effectively, however, it is sideloading removal. Arguing otherwise honestly makes me think you work for them. It’s such obvious marketing bullshit: “Oh, we left this tiny window open to tick the box which people can use, but almost certainly not you, and even if you are capable, it’s a pain in the arse”. There are lots of intelligent people in my house. I’m the only one capable of using ADB without enormous effort, making it a deliberately huge barrier, and even I’m not going to do it to install a trusted open source app.
Let’s be clear; the only reason they left that little window open was to have people like you say “no, sideloading is still possible” to cover their arses legally and also for actual developers, not because they care about an open ecosystem.
What percentage of them do you think has the capacity and capability to use ADB?
All of them: they can follow procedures, plug a cable, and push buttons if they really want to.
Most won’t bother: capacity isn’t willpower.
it’s a pain in the arse
That’s the idea: welcome to an effective deterrent.
even I’m not going to do it to install a trusted open source app
Good, then it’ll deter as designed.
the only reason
Nah, the use cases are legitimate:
It will actually deter installation of malicious software once it’s been identified & flagged that way in their system.
It also verifies that install packages haven’t been tampered with (possibly maliciously) since their original releases.
Malicious software on devices connected to everything including highly sensitive information poses high-cost risks that you & casual users overlook because muh inconvenience 😭.
If casual users can’t bother with a straightforward procedure as you say, then how prepared are they to handle the real challenges of a successful attack?
From a security perspective, it makes sense for OS designers to choose to limit exposure to that threat to power users who can be expected to at least have a better idea of what they’re getting themselves into.
Google employee confirmed. Absolute trash reasoning, verging on trolling it’s so ridiculous. Wild that you’re arguing so vehemently in favour of reduced access to use your hardware the way you want.
All of them
Laughable. You’ve obviously never worked in any kind of customer support role.
Most people are going to melt at the steps necessary to use adb.
capacity isn’t willpower.
By capacity I meant access to hardware. There are so many people in poorer countries out there who have an Android phone but don’t have a laptop, or permission to start using one to install adb on it.
welcome to an effective deterrent.
I don’t want an effective deterrent that effectively kills fdroid and the like. That’s the whole point. I’ve favoured android because it’s more open. The talking points in favour of it pale in comparison to the loss of freedom.
If casual users can’t bother with a straightforward procedure
Honestly just jog on. Please. It is not a straightforward procedure and my threat model shouldn’t need to include the steps you outline. There are already barriers in place that put off casual users.
The fact that you want people to stop installing open source apps that they trust is honestly deranged. Deranged.
Tollana1234567@lemmy.today
on 04 Oct 08:56
Even OPlus is planning to softlock their phones in newer models.
That means nothing when the servers stop taking EU traffic. I get your point, but the real solution here is putting a bullet (double tap) in Chat Control, once and for all.
You can run your own server for signal by the look of it
white_nrdy@programming.dev
on 03 Oct 10:03
Not officially I don’t think. And even if you did, you’d need a customized app to point to said server, and then you wouldn’t be interoperable with the regular signal network
That means nothing when the servers stop taking EU traffic
I don’t use any of these apps, so I’m not quite sure how they work. But couldn’t you just make an app that keeps a local private and public key pair? Then when you send a message (say via regular SMS) it includes, under the hood, your public key. Then when the receiver replies, they use your public key to encrypt the message before sending it to you.
Unless the SMS infrastructure is going to attempt to detect and reject encrypted content, this seems like it can be achieved without relying on a server backend.
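A minimal sketch of that idea, assuming the third-party Python cryptography package; the key size, message, and variable names are illustrative only, and getting the resulting blobs across SMS (and authenticating whose key is whose) is left out.

```python
# Rough sketch of the "local key pair, encrypt replies to it" idea above,
# using the third-party `cryptography` package. Purely illustrative.
import base64
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each user keeps a local key pair; only the public half ever leaves the phone.
private_key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
public_blob = base64.b64encode(
    private_key.public_key().public_bytes(
        encoding=serialization.Encoding.DER,
        format=serialization.PublicFormat.SubjectPublicKeyInfo,
    )
).decode()  # ~392 characters for a 2048-bit key: this is what you'd text around

# A contact who received that blob encrypts their reply to it...
peer_key = serialization.load_der_public_key(base64.b64decode(public_blob))
ciphertext = peer_key.encrypt(b"meet at noon", OAEP)  # 256 bytes for 2048-bit RSA

# ...and only the holder of the private key can read it.
assert private_key.decrypt(ciphertext, OAEP) == b"meet at noon"
```

The hard part a real messenger has to solve is everything around this: verifying that a public key really belongs to your contact, rotating keys, and forward secrecy, which is what the Signal protocol adds on top.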
That makes the assumption you want to use your phone number at all. And I’m sure the overhead of encryption would break SMS due to the limits on character counts.
Can’t use Signal without a phone number.
You CAN use it to interact with people without them knowing your number. The only current requirement is specific to registration.
That is how the signal protocol works, it’s end to end encrypted with the keys only known between the two ends.
The issue is that servers are needed to relay the connections (they only hold public keys) because your phone doesn’t have a static public IP that can reliably be communicated to. The servers are needed to communicate with people as they switch networks constantly throughout the day. And they can block traffic to the relay servers.
white_nrdy@programming.dev
on 03 Oct 10:02
I think they’re suggesting doing it on top of SMS/MMS instead of a different transport protocol, like Signal does, which is IP based
Which is what Textsecure was. The precursor to Signal. Signal did it too, but removed it because it confused stupid people.
conorab@lemmy.conorab.com
on 04 Oct 06:26
Signal does have a censorship circumvention feature in the advanced settings on iOS which may work when this hits provided you already have the app installed. Never had to use it though.
A short message is 140 bytes of GSM 7-bit packed characters (i.e. each character is mapped to a 7-bit alphabet and packed together, so characters don’t align with byte boundaries), which is why we can get about 160 characters per SMS.
According to crypto.stackexchange, a 2048-bit private key corresponds to a base64-encoded public key of 392 characters.
That would mean 3 SMSs per person you send your public key to.
For a 4096-bit private key, this accounts to 5 SMSs.
As key exchange only has to be sent once per contact it sounds totally doable.
After you sent your public key around, you should now be able to receive encrypted short messages from your contacts.
The output length of a ciphertext depends on the key size according to crypto.stackexchange and rfc8017. This means we have 256 bytes of ciphertext for each 2048-bit key encrypted plaintext message, and 512 bytes for 4096-bit keys.
Translated into short messages, it would mean 2 or 4 SMSs for each text message respectively, a 1:2, or 1:4 ratio.
NIST recommends abandoning 2048-bit keys by 2030 and using 3072-bit keys (probably a 1:3 ratio).
The average number of text messages sent per day per subscriber seems to be around 5-6 SMS globally; this excludes WhatsApp and Signal messages, which seem to be more popular than SMS in many parts of the world [citation needed, I just quickly googled it].
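A quick back-of-the-envelope check of those segment counts, taking 160 GSM-7 characters per text segment for the base64 key blob and 140 bytes per segment for raw binary ciphertext, ignoring concatenation headers as the comment does; the ~730-character figure for a 4096-bit public key is an approximation.

```python
# Back-of-the-envelope check of the SMS counts above (concatenation overhead ignored).
from math import ceil

def text_segments(chars: int) -> int:
    return ceil(chars / 160)   # GSM-7 text segments

def binary_segments(nbytes: int) -> int:
    return ceil(nbytes / 140)  # 8-bit binary segments

print(text_segments(392))    # base64 of a 2048-bit public key -> 3 SMS
print(text_segments(730))    # base64 of a 4096-bit public key (approx.) -> 5 SMS
print(binary_segments(256))  # ciphertext under a 2048-bit key -> 2 SMS
print(binary_segments(512))  # ciphertext under a 4096-bit key -> 4 SMS
```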
Signal has never done that. Whilst the app might not be available in some regions they’ve been proud to talk about how people can use it to avoid government barriers.
I have become convinced by Cory Doctorow’s (tech writer and inventor of the term “enshittification”) argument that the fact that we’re even discussing this in terms of “sideloading” is a massive win for tech companies. We used to just call that “installing software” but now for some reason because it’s on a phone it’s something completely weird and different that needs a different term. It’s completely absurd to me that we as a society have become so accustomed to not being able to control our own devices, to the point of even debating whether or not we should be allowed to install our own software on our own computers “for safety.” It should be blatantly obvious that this is all just corporate greed and yet the general public can’t or refuses to see it.
xspurnx@lemmy.dbzer0.com
on 03 Oct 14:20
TBH I was confused when I came across the term “sideloading” for the first few times because I thought it was something new. Part of the plan I guess. Damn.
Some political groups are better than others, but most politicians are clueless.
The key is to get muggles to understand we are living in Technofeudalism and why being digital serfs is bad. The problem is ineffective competition law and that monopolies are bad. That monopolies and standards are not the same thing. I have no idea how. Most people are just naturally compliant and unquestioning of something seemingly so abstract.
In the 80’s (I’m that old), many home computers came with the programming manual, and the impetus was to learn to code and run your programs on your own device. Even with Android it’s not especially hard (with LLM’s even less so than it used to be) to download Android Studio, throw some shit onto the screen, hit build, and run your own helper app or whatever sideloaded installed via usb cable (or wirelessly) on your own device.
In certain cases (cars, health related hw etc.) I get why it’s probably for the best if the user is not supposed to mod their device outside preinstalled sw’s preferences/settings. But when it comes to computers (i.e. smartphones, laptops, tablets, tv boxes etc.) I fully agree with Cory here. Such a shame everything must go to shit.
vacuumflower@lemmy.sdf.org
on 03 Oct 04:04
About freedom, not freedom and various other things - might want to extend the common logic of gun laws to the remaining part of the human societies’ dynamics.
Signal is scary in the sense that it’s a system based on cryptography. Cryptography is a reinforcement, not a basis, if we are not discussing a file encryption tool. And it’s centralized as a service and as a project. It’s not a standard, it’s an application.
It can be compared to a gun - being able to own one is more free, but in the real world that freedom affects different people differently, and makes some freer than the other.
Again, Signal is a system based on cryptography most people don’t understand. Why would there not be a backdoor? Those things that its developers call a threat to rapid reaction to new vulnerabilities and practical threats - these things are to the same extent a threat against monoculture of implementations and algorithms, which allows backdoors in both.
It is a good tool for people whom its owners will never be interested to hurt - by using that backdoor in the open most people are not qualified to find, or by pushing a personalized update with a simpler backdoor, or by blocking their user account at the right moment in time.
It’s a bad tool even for them, if we account for false sense of security of people, who run Signal on their iOS and Android phones, or PCs under popular OSes, and also I distinctly remember how Signal was one of the applications that motivated me to get an Android device. Among weird people who didn’t have one then (around 2014) I might be even weirder, but if not, this seems to be a tool of soft pressure to turn to compromised suppliers.
Signal discourages alternative implementations, Signal doesn’t have a modular standard, and Signal doesn’t want federation. In my personal humble opinion this means that Signal has their own agenda which can only work in monoculture. Fuck that.
Varying9125@lemmy.world
on 03 Oct 04:43
I think you may need some sleep man. wtf are you talking about
vacuumflower@lemmy.sdf.org
on 03 Oct 05:48
Perhaps you need to get some sleep if you don’t understand what I’m talking about.
vacuumflower@lemmy.sdf.org
on 05 Oct 06:25
Unironically yes, communications (information and roads) were historically as important. Lenin’s call to “take post, telegraph, telephone stations, bridges and rail stations” kinda illustrates that.
What I meant is that abstractly having fully private and free communications is just as universally good as everyone having a drone army. In reality both have problems. The problems with weapons are obvious, the problems with communications in my analogy are not symmetric to that, but real still - it’s that people can be deceived and backdoors and traps exist. Signal is one service, application and cryptographic system, it shouldn’t be relied upon this easily.
It’s sometimes hard to express things I only know from someone with good experience telling them to me, which makes this an appeal to anonymous authority, but a person who participated in a project for a state security service once told me that in those services cryptography is never the basis of a system. It can only be a secondary part.
Also, other than backdoors and traps, imbalance exists. Security systems are tools for specific purposes, none are universal. 20 years ago anonymity and resilience and globalism (all those plethora of Kademlia-based and overlay routing applications, most of which are dead now) were more in fashion, and now privacy and political weight against legal bans (non-technical thing, like, say, the title of the article) are. The balance between these in popular systems determines which sides and powers lose and benefit from those being used by many people. In case of Signal the balance is such that we supposedly have absolute privacy and convenience (many devices, history), but anonymity, resilience and globalism are reduced to proverbial red buttons on Meredith Whittaker’s table.
Unfortunately, I don’t get most of your references, but sure, you can find similarities in wildly different things.
Signal being easy to rely on is its biggest benefit. No one will adopt something that’s more complex, but I don’t think extra complexity would offer better security for the average person. More complexity just means more things to go wrong.
People can be deceived anywhere in their life; this isn’t specific to an end-to-end encrypted chat.
Backdoors do exist and they are obviously bad, but Signal choosing to leave the market before implementing one sounds best to me.
state security service once told me that in those services cryptography is never the basis of a system. It can only be a secondary part.
Obviously I’m no smarter than this person, but without cryptography how is any “secure” project actually “secure”. The only thing more important that I can imagine would be the physical location of a server (for example) being highly protected from bad actors.
In the end, I personally think having an easy-to-use platform that is secure gives everyone amazing power to recoup their free speech wherever it is eroded.
vacuumflower@lemmy.sdf.org
on 06 Oct 05:53
Signal being easy to rely on is its biggest benefit. No one will adopt something that’s more complex, but I don’t think extra complexity would offer better security for the average person. More complexity just means more things to go wrong.
My concern here is more that an acceptable share of anything in the internetworked world seems to be far smaller than the usual common-sense percentages. There are political systems with quotas, and there are anti-monopoly regulations, but with computers and the Internet every system is a meta-system, allowing an endless supply of monopolies and monocultures.
Signal is so easy to rely on that if you ask which applications people use that offer zero-knowledge cryptography, reliable group-chat encryption and so on, that work without P2P (with its battery drain and connectivity requirements), and that have voice calls and file transfers, the answer will mostly be Signal.
It doesn’t matter that it’s only one IM application. In its dimension it’s almost a monoculture. One group of developers, one company, one update channel. An update comes with a backdoor and it’s done.
It’s not specifically about Signal. Rather, the amount of effort and publicity that went into a year-2002 schoolgirl’s webpage is as much as any single IM application should get, if we want to avoid dangers of the Internet that don’t exist in other spheres. And they usually get more. The threshold where something becomes too big is much smaller with computers than with, I don’t know, garden owner associations.
If there are backdoors already put in by their developers in a few very “open”, ideologically nice and friendly and “honorable” things like Signal, such backdoors can exist and be used for many years before being found.
I mean, there are precedents IRL, and with computers you are hiding the needle in a much bigger haystack.
Obviously I’m no smarter than this person
I’m bloody certain you are smarter than this person in everything not concerning the things they were directly proficient in. And while being an idiot, they would stick their nose into everything not their concern in very dangerous (for others, not for them) ways.
but without cryptography how is any “secure” project actually “secure”.
There are security schemes, security protocols, security models, and then there is cryptography as one kind of building blocks, with, just like in construction materials, its own traits and behavior.
In the end, I personally think having an easy to use platform that is secure gives everyone amazing power to recoup their free speech wherever is it eroded.
And I think the moment anything specific and controlled by one party becomes popular enough to be a platform, we’re screwed and we’re not secure.
Reminds me of SG-1 and the Goa’uld (not good guys, I know) adjusting their spawn genome for different races.
Perhaps something like that should be made, a common DSL for describing application protocols and maybe even transport protocols, where we’d have many different services and applications, announcing themselves by a message in that DSL describing how to interact with them. (Also inspired by what Telegram creators have done with their MTProto thing, but even more general ; Telegram sometimes seems something that grew out of an attempt to do a very cool thing, I dunno if I was fair saying bad things about Durov on the Internet.)
A bit like in Star Wars Han Solo and Chewbacca speak to each other.
And a common data model, fundamentally extensible, say, posts as data blobs with any amount of tags of any length, it’s up to any particular application to decide on limits. Even which tag is the ID and how it’s connected to the data blob contents and others tags is up to any particular application. What matters is that posts can be indexed by tags and then replicated\shared\transferred\posted by various application protocols.
It should be a data-oriented system, so that one would, except for latency, use it as well by sharing post archives as they would by searching and fetching posts from online services, or even subscribing to posts of specific kind to be notified immediately. One can imagine many kinds of network services for this, relay services (like, say, IRC), notification services (like, say, SIP), FTP-like services, email-like services. The important thing would be that these are all transports, all variable and replaceable, and the data model is constant.
There can also be a DSL that describes some basics on how a certain way of interpreting posts and their tags works and which buttons, levers and text fields it presents, kinda similar to how we use the Web. It should be a layer above the DSL that would describe verifica
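As I read it, the data model being proposed is roughly something like the following; a toy sketch with entirely made-up names, not any existing standard.

```python
# Toy sketch of "posts as opaque blobs with arbitrary tags"; names are made up.
from dataclasses import dataclass, field

@dataclass
class Post:
    blob: bytes                                          # opaque payload
    tags: dict[str, str] = field(default_factory=dict)   # any number of tags, any length

class PostStore:
    """Transport-agnostic store: the same tag index works whether posts arrived
    over a relay, an email-like service, or a post archive handed over on a USB stick."""
    def __init__(self) -> None:
        self._posts: list[Post] = []

    def add(self, post: Post) -> None:
        self._posts.append(post)

    def find(self, **tags: str) -> list[Post]:
        # Which tag (if any) acts as the ID is up to each application.
        return [p for p in self._posts
                if all(p.tags.get(k) == v for k, v in tags.items())]

store = PostStore()
store.add(Post(b"hello world", {"id": "abc123", "type": "note", "author": "alice"}))
print(len(store.find(author="alice")))  # -> 1
```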
I am starting to agree with the new point. I still think everyone should move to Signal for now because it works and works well, but I see your point that one authority can become dangerous if any one malicious party in power tried anything.
There are probably solutions that could exist because it’s open source (e.g. a different trusted entity like F-Droid managing builds from source, so Signal themselves can’t add extra code to their builds, or just a way to verify that no extra code is present in Signal’s build versus any build from source).
In the future, I would prefer we moved to something more decentralised like what the Matrix protocol is trying to achieve. This could come with further issues, but while those are fixed, Signal is my main go to.
With Matrix I believe we would end up with pretty much the common data models as you were mentioning. Anyone can build their own server and or client and interact with others, knowing at least their software is safe.
lmmarsano@lemmynsfw.com
on 03 Oct 04:48
I don’t think you understand anything you wrote about.
Signal is open source, is publicly audited by security researchers, and publishes its protocol, which has multiple implementations in other applications.
Messages are encrypted end-to-end, so the only weaknesses are the endpoints: the sender or recipients.
Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound.
Other than create opportunities for cybercriminals to exploit, they only serve to amplify the powers of the surveillance state to invade the privacy of individuals.
vacuumflower@lemmy.sdf.org
on 03 Oct 05:47
I don’t think you understand anything you wrote about. Signal is open source,
I don’t think you should comment on security if “open source” means anything to you in that regard. For finding backdoors binary disassembly is almost as easy or hard as looking in that “open source”. It’s very different for bugs introduced unintentionally, of course.
Also, why the hell are you even saying this, have you looked at that source for long enough? If not, then what good is it to you? Magic?
I suppose you are an illustration to the joke about Raymond’s “enough eyeballs” quote, the joke is that people talking about “enough eyeballs” are not using their eyeballs for finding bugs\backdoors, they are using them and their hands for typing the “enough eyeballs” bullshit.
“Given enough good people with guns, all streets in a town are safe”. That’s how this reads for a sane person who has at least tried to question that idiotic narrative about “open source” being the magic pill.
Stallman’s ideology was completely different, sort of digital anarchism, and it has some good parts. But the “open source” thing - nah.
is publicly audited by security researchers,
Exactly, and it’s not audited by you, because you for the life of you won’t understand WTF happens there.
Yes, it’s being audited by some security researchers out there, mostly American. If you don’t see the problem you are blind.
and publishes its protocol, which has multiple implementations in other applications.
No, there are no multiple implementations of the same Signal thing. There are implementations of some mechanisms from Signal. Also, have you considered that this is all a fucking circus, a steel gate in a flimsy wooden fence? Or fashion, if that’s easier to swallow.
Can you confidently describe what zero-knowledge means there, how is it achieved, why any specific part in the articles they’ve published matters? If you can’t, what’s the purpose of it being published, it’s like a schoolboy saying “but Linux is open, I can read the code and change it for my needs”, yeah lol.
Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound.
Do security researchers have anything to say about DARPA, which funds many of them? That being an American military agency.
And on how that affects what they say and what they don’t say, what they highlight and what they pretend not to notice.
In particular, with a swarm of drones in the sky at some point, do you need to read someone’s messages, or is it enough to know that said someone connected to Signal servers 3 minutes ago from a very specific location and send one of those drones. Hypothetically.
Other than create opportunities for cybercriminals to exploit, they only serve to amplify the powers of the surveillance state to invade the privacy of individuals.
Oh, the surveillance state will be fine in any case!
And cybercriminals we should all praise for showing us what the surveillance state would want to have hidden, to create the false notion of security and privacy. When cybercriminals didn’t yet lose the war to said surveillance state, every computer user knew not to store things too personal in digital form on a thing connected to the Internet. Now they expose everything, because they think if cybercriminals can no longer abuse them, neither can the surveillance state.
Do you use Facebook, with TLS till its services and nothing at all beyond that? Or Google - the same?
Now Signal gives you a feeling that at least what you say is hidden from the service. But can you verify that? Maybe there’s a scientific result that’s still classified, possibly independently arrived at in a few countries. This is a common thing with cryptography: scientific works on it are often state secrets.
You are also using NIST/NSA-blessed ciphers like AES all the time (and it was the NSA that supplied the DES s-boxes).
I suggest you do some playing with cryptography in practice. Too few people do, while it’s very interesting and enlightening.
lmmarsano@lemmynsfw.com
on 03 Oct 08:33
I don’t think you should comment on security if “open source” means anything to you
Anyone can look at the source, brah, and security auditors do.
For finding backdoors binary disassembly is almost as easy or hard as looking in that “open source”.
Are you in the dark ages?
Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.
Analysts review the design & code, subject it to various security analyzers including those that inspect source code, analyze dependencies, check data flow, test dynamically at runtime.
There are implementations of some mechanisms from Signal.
Right, the protocol.
Can you confidently describe
Stop right there: I don’t need to.
It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings.
That suffices.
Do security researchers have anything to say about DARPA, which funds many of them?
They don’t.
Again, anyone in the public including free agents can & do participate.
The scholarly materials & training on this aren’t exactly secret.
Information security analysts aren’t exceptional people and analyzing that sort of system would be fairly unexceptional to them.
Oh, the surveillance state will be fine in any case!
Even with state-level resources, it’s pretty well understood some mathematical problems underpinning cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time.
That cryptography is straightforward to implement by any competent programmer.
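To make that concrete, here is roughly what “implementing” it looks like in practice with a vetted library (the Python cryptography package): an ephemeral key agreement plus authenticated encryption in a couple of dozen lines. A sketch for illustration only; the sane reading of “implement” is “call a reviewed library”, not “write the math yourself”.

```python
# Sketch: X25519 key agreement + ChaCha20-Poly1305, via the `cryptography` package.
import os
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.ciphers.aead import ChaCha20Poly1305
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each side generates a key pair and exchanges only public keys.
alice, bob = X25519PrivateKey.generate(), X25519PrivateKey.generate()
shared = alice.exchange(bob.public_key())
assert shared == bob.exchange(alice.public_key())  # both derive the same secret

# Derive a symmetric key from the shared secret, then encrypt with an AEAD cipher.
key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None, info=b"demo").derive(shared)
nonce = os.urandom(12)
box = ChaCha20Poly1305(key)
ciphertext = box.encrypt(nonce, b"hello", None)
assert box.decrypt(nonce, ciphertext, None) == b"hello"
```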
Legally obligating backdoors only limits true information security to criminals while compromising the security of everyone else.
I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.
vacuumflower@lemmy.sdf.org
on 03 Oct 09:46
In short - “everyone being able to look at it” is not an argument by itself. The real-world analogies are landmines, drug dealers, and snake oil.
Even with state-level resources, it’s pretty well understood some mathematical problems underpinning cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time.
You are not speaking from your own experience, because which problems are solved and which are not is not solely determined by the hardware you have for brute force. Obviously.
And nation states can and do pay researchers whose work is classified. And agencies like NSA do not, for example, provide reasoning for their recommended s-boxes formation process. For example.
Solving problems is sometimes done analytically, you know. Mostly that’s what’s called solving problems. If that yields some power benefits, that can be classified, you know. And kept as a state secret.
Are you in the dark ages? Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.
People putting those in are also not in the dark ages.
Stop right there: I don’t need to. It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings. That suffices.
There are things which were wide open for review by anyone for thousands of years, yet we only got internal combustion engines less than two centuries ago, and electricity, and so on. And in the case of computers, you can make very sophisticated riddles.
So no, that doesn’t suffice.
They don’t.
Oh, denial.
Again, anyone in the public including free agents can & do participate. The scholarly materials & training on this aren’t exactly secret.
There have been plenty of backdoors found in the open in big open source projects. I don’t see how this is different. I don’t see why you have to argue, is it some religion?
Have you been that free agent? Have you participated? How many people do you think check the things they use? How often and how deeply?
Information security analysts aren’t exceptional people and analyzing that sort of system would be fairly unexceptional to them.
Yes, but you seem to be claiming they have eagle eyes and owl wisdom to see and understand everything. As if all of mathematics were already invented.
Legally obligating backdoors only limits true information security to criminals while compromising the security of everyone else.
It’s not about obligating someone. It’s about people not working for free, and those people working on free (for you) stuff might have put in backdoors which are very hard to find. Backdoors usually don’t have “backdoor” written on them.
I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.
Perhaps the reason they have so many resources is that they don’t miss opportunities, and they don’t miss opportunities because they have the resources.
You sound paranoid but it doesn’t mean you aren’t right, at least to some extent.
So what’s your solution for secure messaging?
vacuumflower@lemmy.sdf.org
on 03 Oct 09:52
Getting rid of monoculture via transports and cryptography being pluggable (meaning that the resulting system would be fit for sneakernet as well as for some kind of federated relays as well as something Kademlia-based, the point is that the common standard would describe the data structure, not transports and verification and protection).
RiverRabbits@lemmy.blahaj.zone
on 03 Oct 06:06
that’s a lot of words to say you generally accuse any program that isn’t federated of having an agenda targeted at its userbase.
And lots of social woo-woo that doesn’t extend much further than “people don’t understand cryptography and think it’s therefore scary”.
A pretty weird post, and one which I don’t support any statement from because I think you’re wrong.
vacuumflower@lemmy.sdf.org
on 03 Oct 06:11
that’s a lot of words to say you generally accuse any program that isn’t federated of having an agenda targeted at its userbase
No, that’s not what I’m saying. I used the word monoculture, it’s pretty good.
And lots of social woo-woo that doesn’t extend much further than “people don’t understand cryptography and think it’s therefore scary”.
Not that. Rather “people don’t understand cryptography, but still rely upon it when they shouldn’t”.
A pretty weird post, and one which I don’t support any statement from because I think you’re wrong.
I mean, you’ve misread those two you thought you understood.
RiverRabbits@lemmy.blahaj.zone
on 03 Oct 06:19
Using monoculture as a word doesn’t change the meaning here. If anything, it’s a pathway for the goal you ascribe.
I do give you credit about the second part - it would be better to have your own private key in chat apps, which isn’t handled by the app itself, at the very least to establish a shared key. I still think the existence of crypto is a massive boon to many, even in a “flawed” implementation with the “control” being on the side of corporations - tho if they are smart, they’d never store the keys themselves, not even hashes. Unless you’re part of the signal project, I doubt you know the exact implementation and storage of data they do.
Still, thanks for summarising your lengthy post, even if I had to bait you into it. Sometimes, brevity is key.
vacuumflower@lemmy.sdf.org
on 03 Oct 06:38
Using monoculture as a word doesn’t change the meaning here. If anything, it’s a pathway for the goal you ascribe.
Of course it does. Federation can be a monoculture too (as it is with plants). A bunch of centralized (technically federated in IRC’s case, but united) services, like with IRC, can be not a monoculture.
Monoculture is important because one virus (of conspiratorial nature, like backdoors and architectures with planned life cycle, like what I suspect of the Internet, or of natural one, like Skype’s downfall due to its P2P model not functioning in the world of mobile devices, or of political and organizational one, like with XMPP’s standards chaos and sabotage by Google) can kill it. In the real world different organisms have sexual procreation, as one variant, recombining their genome parts into new combinations. That existed with e-mail when it worked over a few different networks and situations and protocols, and with Fidonet and Usenet, with gateways between these. That wasn’t a monoculture.
Old Skype unfortunately was a monoculture. Its clients for Linux (QT) and Windows and mobile things were different implementations technically, but with the same creators and one network and set of protocols in practice.
I still think the existence of crypto is a massive boon to many
That’s the problem, it’s not. You should factor psychology in. People write things over encrypted channels that they wouldn’t over plaintext channels. That means it’s not just comparison of encrypted versus plain, other things equal.
even in a “flawed” implementation with the “control” being on the side of corporations - tho if they are smart, they’d never store the keys themselves, not even hashes.
And that’s another problem, no. Crooks only steal your money, and they have adjusted for encryption anyway. They are also warning you of the danger, for that financial incentive. Like wolves killing sick animals. The state and the corporation - they don’t steal your money, they are fine with just collecting everything there is and predicting your every step, and there will be only one moment, with no warning, when you will regret it. That moment will be one and the same for many people.
Unless you’re part of the signal project, I doubt you know the exact implementation and storage of data they do.
What matters is that the core of their system is a complex thing that is magic for most people. You don’t need to look any further.
EDIT:
Still, thanks for summarising your lengthy post, even if I had to bait you into it. Sometimes, brevity is key.
Yeah, I just woke up with a sore throat and a really bad mood (dog bites, especially when the dog was very good, old and dying, hurt both immunity and morale).
XMPP was sabotaged by google (and meta) but is still alive and well.
vacuumflower@lemmy.sdf.org
on 03 Oct 09:47
It was intended as an ICQ replacement, and its advocates even managed to sell it as that to many normies. It became supported, with federation or not, by many email service providers, social networks, and so on. Then that support mostly vanished. Its user percentages are not inspiring.
vacuumflower@lemmy.sdf.org
on 05 Oct 06:29
Both. In my surroundings QIP was popular, a Jabber client with an ICQ gateway added from the start or something like that (maybe it just was a client of both). And the whole “roster with buddies and IM windows” thing was definitely more ICQ than IRC inspired.
TankovayaDiviziya@lemmy.world
on 03 Oct 10:12
Haha! Do it if the EU does not give up on their Orwellian control!
Wait, I’m in the EU and I use Signal!
abbiistabbii@lemmy.blahaj.zone
on 03 Oct 11:12
Basically, but what you forget is that Signal is also the standard for politicians’ group chats because it’s secure, so the idea that they might lose their secure, leak-free* form of communication should worry MEPs and other politicians into taking action. Will it? I don’t know, politicians are very stupid when it comes to tech, it seems.
* Barring screenshots
Corridor8031@lemmy.ml
on 03 Oct 12:46
where are the companies lobbying against this btw?? i mean it is their data that will be leaked as well
abbiistabbii@lemmy.blahaj.zone
on 03 Oct 12:47
That is also a good point. Generally this is dangerous for all and sundry.
sugar_in_your_tea@sh.itjust.works
on 03 Oct 20:56
Why would they care about leaks? I guess that’s some missed profit on selling the data, but that’s only if there’s a breach.
politicians are very stupid when it comes to tech it seems.
They are so so so stupid, about this.
There will be so much blackmail and ruined political careers if these backdoors get installed.
A backdoor is never solely used by the folks one might hope would use it.
abbiistabbii@lemmy.blahaj.zone
on 04 Oct 21:21
I’m sure some poor Civil Servant has had to sit one of them down and explain why it’s a bad idea, only to be told to stfu with the most stupid excuse ever, leading to them putting their head in their hands and sobbing.
Why are so many European countries doing this? Why the sudden push for chat control and internet restriction laws?
TankovayaDiviziya@lemmy.world
on 04 Oct 19:30
It’s understandable from a law enforcement perspective that it’s important to snoop on actual criminal communications. The EU has pretty reasonable measures and is good at cracking down on continent-wide criminal activities. However, can we trust authorities not to overreach with chat control and violate privacy and freedom of speech? Like, come on, nothing good ever came from spying on communications. Catching criminals and/or terrorists is a convenient excuse to spy on dissidents.
We’ve seen it happen in America with the PATRIOT Act. People dismissed the opposition to it with the “nothing to hide” thought-terminating cliche, or accused you of being a pedophile or terrorist for not wanting communications spied on. Then twenty years later, Americans have a fascist government that allowed a corporate asshole to steal information from the federal government. And that information will be used for surveillance capitalism. The same will happen to us in the EU if we don’t push back hard on these Orwellian desires of politicians.
Signal is considered one of the most secure messengers.
I mean lol, they require a phone number to sign up, which you can only get with an ID in many countries. You chat with a gestapo officer and they know where you live.
Signal IS GARBAGE. Fucking garbage article, gaslighting bullshit. Fuck this timeline. Honestly this article is fucking terrorism.
anamethatisnt@sopuli.xyz
on 03 Oct 12:17
Why are you giving gestapo your phone number instead of your username?
I think it’s quite a good question to be honest - you can keep your phone number private from all your signal contacts and have been able to since early 2024. thehackernews.com/…/signal-introduces-usernames-a…
Even if Signal was insecure and had no privacy (which it isn’t; it is secure and private), wouldn’t you still prefer people needed a warrant or some form of document that had to go through the court before your messages could be read?
Obviously not. Think about supply and demand. Because a toxic product is being hailed as secure there isn’t enough demand for an actually anonymous and private messenger. So calling signal “secure” is just helping state security.
If you actually want to message about revolutionary (illegal, “terrorist”) activity and don’t want to be traced immediately by an agent of state security or an informant, Signal offers nothing (unless you use criminal activity like identity theft). In such a case a warrant will obviously be granted and they can immediately find and arrest you.
Can you see the logic how Signal isn’t secure at all for an actual dissident?
Supply and demand: There are seemingly new messenging services that pop up every day, so I’m not sure why you think Signal existing is stopping progress. It isn’t.
Security: For 99.9% of people, the security and privacy granted through using Signal is amazing and it is worthy of being called secure. I mean it’s secure enough for government officials to trust using. With how Signal is currently, an official data request from the government for Signal data returns pretty much nothing except the phone number used (and that they have signed up for signal ofc), which is great.
I think ‘revolutionaries’ (protestors) are already using Signal. I haven’t heard of any cases where something has gone wrong for them, but again, there’s no way for your messages to be read unless they get access to your phone (if you are smart you will make sure your messages auto-delete and that you lock down or shut down your phone in case of arrest).
You are confusing security with privacy. But keep on ranting if you like.
Corridor8031@lemmy.ml
on 03 Oct 12:59
i would even more say with anonymity, considering the chats are still private and the main use case for messenger apps is to communicate with people who know who you are
Do you really think Meta would ignore the opportunity to both be the default option And have justification to read users’ messages?
InnerScientist@lemmy.world
on 03 Oct 17:15
Nah, I don’t. I can hope though, and the backdoor is a threat not just for consumers but also for companies.
DeathByBigSad@sh.itjust.works
on 04 Oct 00:09
A separate air-gapped device running an encryption app. Type text on it and it spits out a ciphertext; then use an internet-connected device to scan the ciphertext (OCR*) and send it to the target recipient. They use the same kind of air-gapped encryption device: they OCR the ciphertext, then decrypt it using their key.
*Instead of OCR, you could also use a QR code to have error correction
Tell me how they can ban this? Anyone using a raspberry pi with a battery and touch display attached into one compact thing, is a criminal?
What if we just start using One Time Pad? Can they ban that?
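For what it’s worth, a one-time pad is nothing more than XOR with a truly random pad as long as the message; a toy sketch below, with the usual caveat that generating, exchanging, and never reusing the pad is the entire difficulty, not the XOR.

```python
# Toy one-time pad: XOR with a random pad of equal length. Information-theoretically
# secure only if the pad is truly random, kept secret, and never reused.
import os

def otp_xor(data: bytes, pad: bytes) -> bytes:
    assert len(pad) >= len(data), "pad must be at least as long as the message"
    return bytes(a ^ b for a, b in zip(data, pad))

message = b"attack at dawn"
pad = os.urandom(len(message))              # exchanged in person or by sneakernet beforehand
ciphertext = otp_xor(message, pad)          # what travels over the monitored channel
assert otp_xor(ciphertext, pad) == message  # the recipient XORs with the same pad
```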
Steganography?
Like seriously, how do you even stop “criminals” using steganography?
So, to Big Gov, here’s my question: Are you gonna ban talking to other people because criminals also talk to other people?
They don’t care about your messages, they don’t care about terrorists or pedophiles.
They do care about the general population, and want to control it. That’s what this is all about. The hard right wants to have effective tools to slam down on dissent when they get in power.
A game as old as humanity.
Shameless plug, because I’m trying to do my part ☺️ : Tenfingers sharing
Not any more legal or anything (if that stupid law becomes reality) I guess, but who cares ☺️.
wurstgulasch3000@feddit.org
on 04 Oct 08:06
I already self-host my own Matrix server. Not everybody can do that, but everybody can use someone’s Matrix server. They can’t shut it down because it’s decentralised and federated. It would theoretically be illegal to use, but I don’t see how they would be able to stop it.
Email with PGP would then also be illegal but impossible to effectively stop. That’s why the whole discussion is so stupid. It only hurts the normies. Criminals and tech savvy people will find a way around it and still use encryption without mandated backdoors.
Have any of those become, like, easy to install and use? To be fair I haven’t checked in some time…
BlackSam@lemmy.dbzer0.com
on 04 Oct 08:50
With DeltaChat you don’t even need an email address anymore, they provide one for you on the fly. They just ask for your name, if you (optionally) want to provide it.
Can’t be simpler than that tbh.
If you want a better looking ui, check ArcaneChat for Android. It’s 100% compatible with DeltaChat protocol
SimpleX is really easy to install and use; unfortunately it’s still kinda buggy, especially with public relays. I personally don’t mind buggy, I’m willing to make sacrifices for the sake of freedom and privacy.
I just keep a second chat app as a fallback so I can send them a message saying “ur simplex broke again, pls restart”.
XMPP has been stable for decades, though I guess OTR/OMEMO is hard for family to set up, and it also doesn’t support e2ee calls (or rather, it does, but it’s complicated). But I haven’t used XMPP in a long time.
jonnylyy@discuss.tchncs.de
on 04 Oct 08:39
I hate talking points like that. Sure, Signal can be critiqued, but it’s still the best “mainstream” solution we have. And a lot of people would just stop using a secure messenger when Signal is gone. Including me, because what are SimpleX or Matrix worth to me when no one I know cares to switch?
My personal experience is that if I can convince someone to install signal I can also convince them to install simplex, the process is the same. If I can’t then they aren’t going to use anything but the popular spyware anyway.
jonnylyy@discuss.tchncs.de
on 04 Oct 08:52
I will absolutely try that, but most of the people switched to Signal or Threema because they had already heard about it and could just use it because they already used it for some other contacts (actually this was the most common: they already had an account and the app and just had to use it more). But I don’t think a lot of people would switch to a messenger they never heard about, just for me.
Well, that is fair, also simplex has some serious bugs which I don’t mind because I value freedom, security and privacy over reliability, but sometimes the app just stops receiving messages until restarted and I need to message them via other means telling them to restart the app.
GreenMartian@lemmy.dbzer0.com
on 04 Oct 08:48
threaded - newest
This is why we need the ability to sideload apps.
Most likely the reason, among others, they’re fighting tooth & nail to remove side loading too.
Are they?
Google will soon stop you sideloading unverified apps – here’s what that means for you
ie, unsigned, so they are not
Sideloading is still available: you can sign it yourself or bypass verification with
adb
as they documented.So, cool misinformation.
Bruh, you’re trying to sanewash this of all things? Right now I can go to any third-party app store and click install on an app without me nor the developer having to kiss the ring of Google or by extension the regulators (EU with Chat Control) that they are beholden to.
After this I’ll have to fucking install Google’s SDK on my computer, manually download application files, and deploy them to my device over USB with CLI commands. I will never ever ever be able to get friends and family access to third-party applications after this change.
And fuck, man, there’s not even a guarantee this solution will last, either. Google promised they would allow on-device sideloading back when they started adding deeper and deeper settings restrictions on enabling sideloaded app support, their word means fuck-all and you know that.
You misidentified your objection. It isn’t sideloading removal, which isn’t happening. It’s developer verification, which affects the sideloading that remains available.
Just because you don’t understand the value of verifying signatures doesn’t mean it lacks value.
I recall the same alarm over secureboot: there, too, we can (load our certificates into secureboot and) sign everything ourselves. This locks down the system from boot-time attacks.
Then sign it: problem solved.
Developer verification should also give them a hard enough time to install trash that fucks their system and steals their information when that trash is unsigned or signed & suspended.
Even so, it’s mentioned only in regard to devices certified for and that ship with Play Protect, which I’m pretty sure can be disabled.
Promise kept.
No, I don’t. Developers are always going to need some way to load their unfinished work.
Adb is functionally useless for most people.
That’s twice that you’ve missed the point that everyone else is saying. Read it again:
Google is irreversibly designating themselves the sole arbiter of what apps can be freely installed in the formerly-open Android ecosystem. It’s the same as if they just one day decided that Chromium-based browsers would require sites have a signature from Google and Google alone. I honestly don’t give a shit if they did it just on Pixel devices, but they’re doing it to the phones of ALL manufacturers by looping it into Play services.
I just don’t understand: why the fuck are you so pussy-whipped by Google that you’re stanning their blatant power grabs?
Probably works at google or is a fanboy.
They’re being precise about their terms, while everyone else is being sloppy. Not stanning
I don’t understand why you can’t read: (1) developer verification can be disabled, bypassed, or worked with, (2) you called it sideloading removal, which it isn’t.
You just don’t like the extra steps that limit the ease for ignorant users to install software known to be malicious that could have been blocked. I don’t like handholding my dumbass folks through preventable IT problems they created.
This does fuck all for “security”. It’s targeting, mainly, power users and puts just more hoops for developers. This has nothing with security (they should purge malware from Play store first) and everything to do with consolidating power over users.
It’s a blatant power grab and I’m surprised to see this interpreted as anything else. Arguing about semantics just helps Google fuck everyone over.
So let me buy a goddamn phone that I can install what I want in it. Again, I do not give a shit about any phone manufacturers that want to make a walled garden out of their Android installations. I agree, it’s perfect for the grandmas of the world. But Google is forcibly doing this to every goddamn phone, phone manufacturer, and Android enthusiast.
The only silver lining is that whenever Google decides that unregulated social media services like Lemmy are not family-safe I won’t have to listen to your malicious horseshit.
Seems you don’t care about grandmas & gen z.
They can manage.
So casual users can get wrecked, yet I’m malicious? Maybe think of users other than yourself, weigh the potential losses to them by successful attacks, and consider whether OS designers have a legitimate claim in preventing exposure of known threats to casual users while still allowing power users to bypass those checks.
You’re assuming I use an Android app (trash) to get on here, and not a proper workstation or web browser. You’re welcome to this “malicious horseshit” for eternity.
In reality this is useless given the technical capabilities (or access to the technology necessary) of nearly every android user. What percentage of them do you think has the capacity and capability to use ADB?
Strictly it ticks the box, however effectively it is sideloading removal. Arguing otherwise honestly makes me think you work for them. It’s such obvious marketing bullshit “Oh, we left this tiny window open to tick the box which people can use, but almost certainly not you and even if you are capable, it’s a pain in the arse”. There are lots of intelligent people in my house. I’m the only one capable of using ADB without enormous effort, making it a deliberately huge barrier and even I’m not going to do it to install a trusted open source app.
Let’s be clear; the only reason they left that little window open was to have people like you say “no, sideloading is still possible” to cover their arses legally and also for actual developers, not because they care about an open ecosystem.
All of them: they can follow procedures, plug a cable, and push buttons if they really want to. Most won’t bother: capacity isn’t willpower.
That’s the idea: welcome to an effective deterrent.
Good, then it’ll deter as designed.
Nah, the use cases are legitimate:
Malicious software on devices connected to everything including highly sensitive information poses high-cost risks that you & casual users overlook because muh inconvenience 😭. If casual users can’t be bothered with a straightforward procedure, as you say, then how prepared are they to handle the real challenges of a successful attack?
From a security perspective, it makes sense for OS designers to choose to limit exposure to that threat to power users who can be expected to at least have a better idea of what they’re getting themselves into.
Google employee confirmed. Absolute trash reasoning verging on trolling, it’s so ridiculous. Wild that you’re arguing so vehemently in favour of reduced access to use your hardware the way you want.
Laughable. You’ve obviously never worked in any kind of customer support role.
Most people are going to melt at the steps necessary to use adb.
By capacity I meant access to hardware. There are so many people in poorer countries who don’t have a laptop, or permission to install ADB on one they could borrow, but do have an Android phone.
I don’t want an effective deterrent that effectively kills fdroid and the like. That’s the whole point. I’ve favoured android because it’s more open. The talking points in favour of it pale in comparison to the loss of freedom.
Honestly just jog on. Please. It is not a straightforward procedure and my threat model shouldn’t need to include the steps you outline. There are already barriers in place that put off casual users.
The fact that you want people to stop installing open source apps that they trust is honestly deranged. Deranged.
Even OPlus is planning to soft-lock their phones in newer models.
That means nothing when the servers stop taking EU traffic. I get your point, but the real solution here is putting a bullet (double tap) in Chat Control, once and for all.
You can run your own server for signal by the look of it
Not officially I don’t think. And even if you did, you’d need a customized app to point to said server, and then you wouldn’t be interoperable with the regular signal network
I don’t use any of these apps, so I’m not quite sure how they work. But couldn’t you just make an app that keeps a local private and public key pair? Then when you send a message (say via regular SMS), it includes your public key under the hood. Then the receiver, when they reply, uses your public key to encrypt the message before sending it to you.
Unless the SMS infrastructure is going to attempt to detect and reject encrypted content, this seems like it could be achieved without relying on a server backend.
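For illustration, a rough sketch of that idea in Python using the `cryptography` package. This is only the happy path with made-up names; it ignores authentication, key distribution pitfalls, and the SMS size limits discussed below.

```python
import base64

from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import padding, rsa

# Alice keeps a local key pair; only the public half ever leaves the device.
alice_private = rsa.generate_private_key(public_exponent=65537, key_size=2048)
alice_public_pem = alice_private.public_key().public_bytes(
    encoding=serialization.Encoding.PEM,
    format=serialization.PublicFormat.SubjectPublicKeyInfo,
)

# Bob receives the public key (e.g. split across a few SMSs) and encrypts his reply with it.
alice_public = serialization.load_pem_public_key(alice_public_pem)
oaep = padding.OAEP(mgf=padding.MGF1(hashes.SHA256()), algorithm=hashes.SHA256(), label=None)
ciphertext = alice_public.encrypt(b"meet at noon", oaep)

# The ciphertext is binary, so it would be base64-encoded to survive a text channel.
sms_payload = base64.b64encode(ciphertext).decode()

# Only Alice's private key can read it.
assert alice_private.decrypt(ciphertext, oaep) == b"meet at noon"
```

Note that RSA-OAEP can only encrypt short messages directly; a real scheme would wrap a symmetric key, and none of this tells you who you’re actually talking to.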
That makes the assumption you want to use your phone number at all. And I’m sure the overhead of encryption would break SMS due to the limits on character counts.
Can’t use Signal without a phone number.
You CAN use it to interact with people without them knowing your number. The only current requirement is specific to registration.
That is how the Signal protocol works: it’s end-to-end encrypted, with the keys only known to the two ends.
The issue is that servers are needed to relay the connections (they only hold public keys) because your phone doesn’t have a static public IP that can reliably be communicated to. The servers are needed to communicate with people as they switch networks constantly throughout the day. And they can block traffic to the relay servers.
I think they’re suggesting doing it on top of SMS/MMS instead of a different transport protocol, like Signal does, which is IP based
Which is what Textsecure was. The precursor to Signal. Signal did it too, but removed it because it confused stupid people.
Signal does have a censorship circumvention feature in the advanced settings on iOS which may work when this hits provided you already have the app installed. Never had to use it though.
[Screenshot of Signal’s censorship circumvention setting: https://lemmy.conorab.com/pictrs/image/04ae97c7-f60a-46f8-b062-ee9dbe590250.png]
I think SimpleX removes the need for static relays.
It was so hard getting people to use Signal, I’m imagining this’ll never catch on.
It is potentially doable:
A short message is 140 bytes of GSM 7-bit packed characters (i.e. each character is encoded in a 7-bit alphabet and packed together, so characters don’t align to byte boundaries), which gives us 160 characters per SMS.
According to crypto.stackexchange, a 2048-bit RSA key pair yields a base64-encoded public key of about 392 characters.
That would mean 3 SMSs per person you send your public key to. For a 4096-bit key, it comes to 5 SMSs.
As the key exchange only has to happen once per contact, it sounds totally doable.
Once you’ve sent your public key around, you should be able to receive encrypted short messages from your contacts.
The length of the ciphertext depends on the key size, according to crypto.stackexchange and RFC 8017: 256 bytes of ciphertext for each message encrypted with a 2048-bit key, and 512 bytes with a 4096-bit key. Translated into short messages (140 bytes of binary payload each), that means 2 or 4 SMSs per text message respectively, a 1:2 or 1:4 ratio.
Hope you have a good SMS plan 😉
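Those numbers check out as a quick back-of-the-envelope calculation, assuming 160 GSM-7 characters per text segment, 140 bytes per binary segment, and the approximate key and ciphertext sizes quoted above:

```python
import math

CHARS_PER_TEXT_SMS = 160    # GSM 7-bit characters per text segment
BYTES_PER_BINARY_SMS = 140  # payload bytes per 8-bit binary segment

# Key exchange: base64-encoded public key sent as text
# (~392 chars for a 2048-bit RSA public key, roughly 736 for 4096-bit).
for bits, b64_chars in [(2048, 392), (4096, 736)]:
    print(f"{bits}-bit key exchange: {math.ceil(b64_chars / CHARS_PER_TEXT_SMS)} SMSs")
# 2048-bit key exchange: 3 SMSs
# 4096-bit key exchange: 5 SMSs

# Each message: RSA ciphertext length equals the key size in bytes.
for bits in (2048, 4096):
    ct_bytes = bits // 8
    print(f"{bits}-bit key, per message: {math.ceil(ct_bytes / BYTES_PER_BINARY_SMS)} SMSs")
# 2048-bit key, per message: 2 SMSs
# 4096-bit key, per message: 4 SMSs
```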
That’s how Signal started way back. It doesn’t work well; SMS is terrible.
Yes, please.
LOL, no. They’ll come back again with some other bullshit to Save the Children!™, it’s a never-ending whack-a-mole.
And they only have to win once; we have to fight and win every time they introduce a new variant. It’s exhausting.
We need to get the right to privacy and control over our own devices enshrined as fundamental rights, like so many other rights the EU protects.
Signal has never done that. Whilst the app might not be available in some regions they’ve been proud to talk about how people can use it to avoid government barriers.
The CEO is saying they are willing to, that should be taken seriously.
I have become convinced by Cory Doctorow’s (tech writer and inventor of the term “enshittification”) argument that the fact that we’re even discussing this in terms of “sideloading” is a massive win for tech companies. We used to just call that “installing software” but now for some reason because it’s on a phone it’s something completely weird and different that needs a different term. It’s completely absurd to me that we as a society have become so accustomed to not being able to control our own devices, to the point of even debating whether or not we should be allowed to install our own software on our own computers “for safety.” It should be blatantly obvious that this is all just corporate greed and yet the general public can’t or refuses to see it.
TBH I was confused when I came across the term “sideloading” for the first few times because I thought it was something new. Part of the plan I guess. Damn.
Most of the general public buries their head in the sand. They are convinced being politically involved is either a waste of time or makes you crazy.
Tbf both are true.
Source: I have gone mad and everything has only become worse.
There are groups to support:
And in the UK:
Some political groups are better than others, but most politicians are clueless.
The key is to get muggles to understand that we are living in technofeudalism and why being digital serfs is bad; that the problem is ineffective competition law; that monopolies are bad; and that monopolies and standards are not the same thing. I have no idea how, though. Most people are just naturally compliant and unquestioning of something seemingly so abstract.
In the ’80s (I’m that old), many home computers came with the programming manual, and the impetus was to learn to code and run your programs on your own device. Even with Android it’s not especially hard (with LLMs, even less so than it used to be) to download Android Studio, throw some shit onto the screen, hit build, and run your own helper app or whatever.
“Sideloaded”, i.e. installed via USB cable (or wirelessly) on your own device. In certain cases (cars, health-related hardware, etc.) I get why it’s probably for the best if the user is not supposed to mod their device outside the preinstalled software’s preferences/settings. But when it comes to computers (i.e. smartphones, laptops, tablets, TV boxes, etc.) I fully agree with Cory here. Such a shame everything must go to shit.
About freedom, non-freedom, and various other things: one might want to extend the common logic of gun laws to the rest of human society’s dynamics.
Signal is scary in the sense that it’s a system based on cryptography. Cryptography is a reinforcement, not a basis, if we are not discussing a file encryption tool. And it’s centralized as a service and as a project. It’s not a standard, it’s an application.
It can be compared to a gun - being able to own one is more free, but in the real world that freedom affects different people differently, and makes some freer than the other.
Again, Signal is a system based on cryptography most people don’t understand. Why would there not be a backdoor? Those things that its developers call a threat to rapid reaction to new vulnerabilities and practical threats - these things are to the same extent a threat against monoculture of implementations and algorithms, which allows backdoors in both.
It is a good tool for people whom its owners will never be interested in hurting - by using that backdoor in the open that most people are not qualified to find, or by pushing a personalized update with a simpler backdoor, or by blocking their user account at the right moment in time.
It’s a bad tool even for them, if we account for the false sense of security of people who run Signal on their iOS and Android phones, or on PCs under popular OSes. I also distinctly remember how Signal was one of the applications that motivated me to get an Android device. Among the weird people who didn’t have one back then (around 2014) I might be even weirder, but if not, this seems to be a tool of soft pressure to turn to compromised suppliers.
Signal discourages alternative implementations, Signal doesn’t have a modular standard, and Signal doesn’t want federation. In my personal humble opinion this means that Signal has their own agenda which can only work in monoculture. Fuck that.
I think you may need some sleep man. wtf are you talking about
Perhaps you need to get some sleep if you don’t understand what I’m talking about.
I get it messenger = gun wow i didnt know!
Holstering my phone now thanks
Unironically yes, communications (information and roads) were historically as important. Lenin’s call to “take post, telegraph, telephone stations, bridges and rail stations” kinda illustrates that.
What I meant is that abstractly having fully private and free communications is just as universally good as everyone having a drone army. In reality both have problems. The problems with weapons are obvious, the problems with communications in my analogy are not symmetric to that, but real still - it’s that people can be deceived and backdoors and traps exist. Signal is one service, application and cryptographic system, it shouldn’t be relied upon this easily.
It’s sometimes hard to express things I only know from someone with good experience telling me about them, which makes this an appeal to anonymous authority, but a person who participated in a project for a state security service once told me that in those services cryptography is never the basis of a system. It can only be a secondary part.
Also, other than backdoors and traps, imbalance exists. Security systems are tools for specific purposes, none are universal. 20 years ago anonymity and resilience and globalism (all those plethora of Kademlia-based and overlay routing applications, most of which are dead now) were more in fashion, and now privacy and political weight against legal bans (non-technical thing, like, say, the title of the article) are. The balance between these in popular systems determines which sides and powers lose and benefit from those being used by many people. In case of Signal the balance is such that we supposedly have absolute privacy and convenience (many devices, history), but anonymity, resilience and globalism are reduced to proverbial red buttons on Meredith Whittaker’s table.
Unfortunately, I don’t get most of your references, but sure, you can find similarities in wildly different things.
Signal being easy to rely on is its biggest benefit. No one will adopt something that’s more complex, but I don’t think extra complexity would offer better security for the average person. More complexity just means more things to go wrong.
People can be deceived anywhere in their lives; this isn’t unique to an end-to-end encrypted chat.
Backdoors do exist and they are obviously bad, but Signal choosing to leave the market before implementing one sounds best to me.
Obviously I’m no smarter than this person, but without cryptography, how is any “secure” project actually “secure”? The only thing more important that I can imagine would be the physical location of a server (for example) being highly protected from bad actors.
In the end, I personally think having an easy-to-use platform that is secure gives everyone amazing power to recoup their free speech wherever it is eroded.
My concern here is more that an acceptable share of anything in the internetworked world seems to be far smaller in percentage terms than the usual common-sense percentages. Like, there are political systems with quotas, and there are anti-monopoly regulations, but with computers and the Internet every system is a meta-system, allowing an endless supply of monopolies and monocultures.
Signal is so easy to rely on that if you ask which applications people use that offer zero-knowledge cryptography, reliable group-chat encryption and so on, that work without P2P (with its battery drain and connectivity requirements), and that have voice calls and file transfers, the answer will mostly be Signal.
It doesn’t matter that it’s only one IM application. In its dimension it’s almost a monoculture. One group of developers, one company, one update channel. An update comes with a backdoor and it’s done.
It’s not specifically about Signal; rather, the amount of effort and publicity that goes into a year-2002 schoolgirl’s webpage is about as much as any single IM application should get, if we want to avoid dangers on the Internet which don’t exist in other spheres. And they usually get more. The threshold where something becomes too big is much smaller with computers than with, I don’t know, garden owners’ associations.
Even if there are already backdoors put in by their developers in a few very “open”, ideologically nice and friendly and “honorable” things like Signal, such backdoors can exist and be used for many years before being found.
I mean, there are precedents IRL, and with computers you are hiding the needle in a much bigger haystack.
I’m bloody certain you are smarter than this person in everything not concerning the things they were directly proficient in. And while being an idiot, they would stick their nose into everything not their concern in very dangerous (for others, not for them) ways.
There are security schemes, security protocols, security models, and then there is cryptography as one kind of building blocks, with, just like in construction materials, its own traits and behavior.
And I think the moment anything specific and controlled by one party becomes popular enough to be a platform, we’re screwed and we’re not secure.
Reminds me of SG-1 and the Goa’uld (not good guys, I know) adjusting their spawn genome for different races.
Perhaps something like that should be made: a common DSL for describing application protocols and maybe even transport protocols, where we’d have many different services and applications announcing themselves by a message in that DSL describing how to interact with them. (Also inspired by what the Telegram creators have done with their MTProto thing, but even more general; Telegram sometimes seems like something that grew out of an attempt to do a very cool thing. I dunno if I was fair saying bad things about Durov on the Internet.)
A bit like how Han Solo and Chewbacca speak to each other in Star Wars.
And a common data model, fundamentally extensible: say, posts as data blobs with any number of tags of any length, with limits left up to any particular application. Even which tag is the ID, and how it’s connected to the data blob contents and the other tags, is up to any particular application. What matters is that posts can be indexed by tags and then replicated/shared/transferred/posted by various application protocols.
It should be a data-oriented system, so that one could, except for latency, use it just as well by sharing post archives as by searching and fetching posts from online services, or even by subscribing to posts of a specific kind to be notified immediately. One can imagine many kinds of network services for this: relay services (like, say, IRC), notification services (like, say, SIP), FTP-like services, email-like services. The important thing would be that these are all transports, all variable and replaceable, and the data model is constant.
There can also be a DSL that describes the basics of how a certain way of interpreting posts and their tags works and which buttons, levers and text fields it presents, kinda similar to how we use the Web. It should be a layer above the DSL that would describe verification.
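A toy sketch of that data model, just to make it concrete (all names invented here): posts are opaque blobs carrying arbitrary tags, and indexing by tag is left to whatever service or archive holds them.

```python
from dataclasses import dataclass, field

@dataclass
class Post:
    """An opaque data blob with any number of (name, value) tags."""
    blob: bytes
    tags: list[tuple[str, str]] = field(default_factory=list)

class TagIndex:
    """Toy in-memory index. A relay, an archive on a USB stick, or a
    notification service could each keep an equivalent and exchange
    posts over whatever transport they like."""
    def __init__(self) -> None:
        self._by_tag: dict[tuple[str, str], list[Post]] = {}

    def add(self, post: Post) -> None:
        for tag in post.tags:
            self._by_tag.setdefault(tag, []).append(post)

    def find(self, name: str, value: str) -> list[Post]:
        return self._by_tag.get((name, value), [])

index = TagIndex()
index.add(Post(b"hello world", tags=[("id", "abc123"), ("topic", "chat")]))
print(len(index.find("topic", "chat")))  # 1
```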
I am starting to agree with the new point. I still think everyone should move to Signal for now because it works and works well, but I see your point that one authority can become dangerous if any one malicious party in power tried anything.
There are probably solutions that could exist because it’s open source (e.g. a different trusted entity like F-Droid managing builds from source, so Signal themselves can’t add extra code to their builds, or just a way to verify that no extra code is present in Signal’s build versus any build from source).
In the future, I would prefer we moved to something more decentralised like what the Matrix protocol is trying to achieve. This could come with further issues, but while those are fixed, Signal is my main go to.
With Matrix I believe we would end up with pretty much the common data model you were mentioning. Anyone can build their own server and/or client and interact with others, knowing at least that their software is safe.
I don’t think you understand anything you wrote about. Signal is open source, is publicly audited by security researchers, and publishes its protocol, which has multiple implementations in other applications. Messages are encrypted end-to-end, so the only weaknesses are the endpoints: the sender or recipients.
Security researchers generally agree that backdoors introduce vulnerabilities that render security protocols unsound. Other than creating opportunities for cybercriminals to exploit, they only serve to amplify the powers of the surveillance state to invade the privacy of individuals.
I don’t think you should comment on security if “open source” means anything to you in that regard. For finding backdoors, binary disassembly is almost as easy (or as hard) as looking at that “open source”. It’s very different for bugs introduced unintentionally, of course.
Also, why the hell are you even saying this? Have you looked at that source for long enough? If not, then what good is it to you? Magic?
I suppose you are an illustration of the joke about Raymond’s “enough eyeballs” quote; the joke being that people talking about “enough eyeballs” are not using their eyeballs to find bugs/backdoors, they are using them and their hands to type the “enough eyeballs” bullshit.
“Given enough good people with guns, all streets in a town are safe”. That’s how this reads for a sane person who has at least tried to question that idiotic narrative about “open source” being the magic pill.
Stallman’s ideology was completely different, sort of digital anarchism, and it has some good parts. But the “open source” thing - nah.
Exactly, and it’s not audited by you, because you for the life of you won’t understand WTF happens there.
Yes, it’s being audited by some security researchers out there, mostly American. If you don’t see the problem you are blind.
No, there are no multiple implementations of the same Signal thing. There are implementations of some mechanisms from Signal. Also, have you considered that this is all a fucking circus, a steel gate in a flimsy wooden fence? Or fashion, if that’s easier to swallow.
Can you confidently describe what zero-knowledge means there, how is it achieved, why any specific part in the articles they’ve published matters? If you can’t, what’s the purpose of it being published, it’s like a schoolboy saying “but Linux is open, I can read the code and change it for my needs”, yeah lol.
Do security researchers have anything to say about DARPA, which funds many of them? That being an American military agency.
And on how that affects what they say and what they don’t say, what they highlight and what they pretend not to notice.
In particular, with a swarm of drones in the sky at some point, do you need to read someone’s messages, or is it enough to know that said someone connected to Signal servers 3 minutes ago from a very specific location and send one of those drones. Hypothetically.
Oh, the surveillance state will be fine in any case!
And cybercriminals we should all praise for showing us what the surveillance state would want to have hidden, to create the false notion of security and privacy. When cybercriminals didn’t yet lose the war to said surveillance state, every computer user knew not to store things too personal in digital form on a thing connected to the Internet. Now they expose everything, because they think if cybercriminals can no longer abuse them, neither can the surveillance state.
Do you use Facebook, with TLS till its services and nothing at all beyond that? Or Google - the same?
Now Signal gives you the feeling that at least what you say is hidden from the service. But can you verify that? Maybe there’s a relevant scientific work that’s still classified, possibly produced independently in a few countries. This is a common thing with cryptography: scientific work on it is often a state secret.
You are also using AES with NSA-provided s-boxes all the time.
I suggest you do some playing with cryptography in practice. Too few people do, while it’s very interesting and enlightening.
Anyone can look at the source, brah, and security auditors do.
Are you in the dark ages? Beyond code review, there are all kinds of automations to catch vulnerabilities early in the development process, and static code analysis is one of the most powerful.
Analysts review the design & code, subject it to various security analyzers including those that inspect source code, analyze dependencies, check data flow, test dynamically at runtime.
Right, the protocol.
Stop right there: I don’t need to. It’s wide open for review by anyone in the public including independent security analysts who’ve reviewed the system & published their findings. That suffices.
They don’t. Again, anyone in the public including free agents can & do participate. The scholarly materials & training on this aren’t exactly secret.
Information security analysts aren’t exceptional people and analyzing that sort of system would be fairly unexceptional to them.
Even with state-level resources, it’s pretty well understood some mathematical problems underpinning cryptography are computationally beyond the reach of current hardware to solve in any reasonable amount of time. That cryptography is straightforward to implement by any competent programmer.
Legally obligating backdoors only limits true information security to criminals while compromising the security of everyone else.
I do agree, though: the surveillance state has so many resources to surveil that it doesn’t need another one.
In short, “everyone being able to look at it” is not an argument. The real-world analogies are landmines, drug dealers, and snake oil.
You are not speaking from your own experience, because which problems are solved and which are not is not determined solely by the hardware you have available to brute-force them. Obviously.
And nation states can and do pay researchers whose work is classified. And agencies like the NSA do not, for example, provide reasoning for their recommended s-box formation process.
Solving problems is sometimes done analytically, you know. Mostly that’s what’s called solving problems. If that yields some power benefits, that can be classified, you know. And kept as a state secret.
People putting those in are also not in the dark ages.
There are things which were wide open for review by anyone for thousands of years, yet we only got internal combustion engines less than two centuries ago, and electricity, and so on. And in the case of computers, you can make very sophisticated riddles.
So no, that doesn’t suffice.
Oh, denial.
There have been plenty of backdoors found in the open in big open source projects. I don’t see how this is different. I don’t see why you have to argue, is it some religion?
Have you been that free agent? Have you participated? How many people do you think check the things they use? How often and how deeply?
Yes, but you seem to be claiming they have eagle eyes and owl wisdom to see and understand everything. As if all of mathematics were already invented.
It’s not about obligating someone. It’s about people not working for free, and those people working on free (for you) stuff might have put in backdoors which are very hard to find. Backdoors usually don’t have “backdoor” written on them.
Perhaps the reason they have so many resources is that they don’t miss opportunities, and they don’t miss opportunities because they have the resources.
You sound paranoid but it doesn’t mean you aren’t right, at least to some extent.
So what’s your solution for secure messaging?
Getting rid of monoculture by making transports and cryptography pluggable (meaning the resulting system would be fit for sneakernet as well as for some kind of federated relays, as well as for something Kademlia-based; the point is that the common standard would describe the data structure, not the transports, verification, or protection).
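To make “pluggable” concrete, a minimal sketch of what such interfaces could look like (purely hypothetical names; only the message format is pinned down, while ciphers and transports are swappable):

```python
from typing import Protocol

class Cipher(Protocol):
    def seal(self, plaintext: bytes) -> bytes: ...
    def open(self, ciphertext: bytes) -> bytes: ...

class Transport(Protocol):
    def send(self, blob: bytes) -> None: ...
    def receive(self) -> bytes: ...

def deliver(message: bytes, cipher: Cipher, transport: Transport) -> None:
    """The part a common standard would define. Whether `transport` is a
    sneakernet file drop, a federated relay, or a Kademlia DHT is invisible
    to this layer, and the cipher can be swapped without touching it."""
    transport.send(cipher.seal(message))

class XorDemoCipher:
    """Stand-in cipher for the demo only; not real cryptography."""
    def __init__(self, key: int) -> None:
        self.key = key
    def seal(self, plaintext: bytes) -> bytes:
        return bytes(b ^ self.key for b in plaintext)
    def open(self, ciphertext: bytes) -> bytes:
        return self.seal(ciphertext)  # XOR is its own inverse

class InMemoryTransport:
    """Stand-in transport: just a local queue."""
    def __init__(self) -> None:
        self.queue: list[bytes] = []
    def send(self, blob: bytes) -> None:
        self.queue.append(blob)
    def receive(self) -> bytes:
        return self.queue.pop(0)

t = InMemoryTransport()
deliver(b"hi", XorDemoCipher(0x5A), t)
print(XorDemoCipher(0x5A).open(t.receive()))  # b'hi'
```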
That’s a lot of words to say you generally accuse any program that isn’t federated of having an agenda targeted at its userbase.
And lots of social woo-woo that doesn’t extend much further than “people don’t understand cryptography and think it’s therefore scary”.
A pretty weird post, and one which I don’t support any statement from because I think you’re wrong.
No, that’s not what I’m saying. I used the word monoculture, it’s pretty good.
Not that. Rather “people don’t understand cryptography, but still rely upon it when they shouldn’t”.
I mean, you’ve misread those two you thought you understood.
Using monoculture as a word doesn’t change the meaning here. If anything, it’s a pathway to the goal you ascribe.
I do give you credit on the second part - it would be better to have your own private key in chat apps, one not handled by the app itself, at the very least to establish a shared key. I still think the existence of crypto is a massive boon to many, even in a “flawed” implementation with the “control” being on the side of corporations - though if they are smart, they’d never store the keys themselves, not even hashes. Unless you’re part of the Signal project, I doubt you know the exact implementation and storage of data they use.
Still, thanks for summarising your lengthy post, even if I had to bait you into it. Sometimes, brevity is key.
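On the “your own key to establish a shared key” point, the building block is standard Diffie-Hellman. A minimal sketch with the Python `cryptography` package, assuming each side keeps its own long-term key pair outside the chat app:

```python
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
from cryptography.hazmat.primitives.kdf.hkdf import HKDF

# Each party generates and holds its own private key; only public keys are exchanged.
alice_priv = X25519PrivateKey.generate()
bob_priv = X25519PrivateKey.generate()

# Both sides derive the same shared secret from their own private key
# and the other side's public key.
alice_secret = alice_priv.exchange(bob_priv.public_key())
bob_secret = bob_priv.exchange(alice_priv.public_key())
assert alice_secret == bob_secret

# The raw secret goes through a KDF to become a symmetric session key.
def session_key(secret: bytes) -> bytes:
    return HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
                info=b"chat-demo").derive(secret)

assert session_key(alice_secret) == session_key(bob_secret)
```

The hard parts a real messenger has to solve sit on top of this: verifying whose public key you actually hold, and rotating keys so one leak doesn’t expose the whole history.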
Of course it does. Federation can be a monoculture too (as it is with plants). A bunch of centralized (technically federated in IRC’s case, but united) services, like with IRC, can avoid being a monoculture.
Monoculture is important because one virus (of conspiratorial nature, like backdoors and architectures with planned life cycle, like what I suspect of the Internet, or of natural one, like Skype’s downfall due to its P2P model not functioning in the world of mobile devices, or of political and organizational one, like with XMPP’s standards chaos and sabotage by Google) can kill it. In the real world different organisms have sexual procreation, as one variant, recombining their genome parts into new combinations. That existed with e-mail when it worked over a few different networks and situations and protocols, and with Fidonet and Usenet, with gateways between these. That wasn’t a monoculture.
Old Skype unfortunately was a monoculture. Its clients for Linux (QT) and Windows and mobile things were different implementations technically, but with the same creators and one network and set of protocols in practice.
That’s the problem: it’s not. You should factor psychology in. People write things over encrypted channels that they wouldn’t over plaintext channels. That means it’s not just a comparison of encrypted versus plain, other things being equal.
And that’s another problem: no. Crooks only steal your money, and they have adjusted for encryption anyway. They are also warning you of the danger, thanks to that financial incentive. Like wolves killing sick animals. The state and the corporation don’t steal your money; they are fine with just collecting everything there is and predicting your every step, and there will be only one moment, with no warning, when you will regret it. That moment will be one and the same for many people.
What matters is that the core of their system is a complex thing that is magic for most people. You don’t need to look any further.
EDIT:
Yeah, I just woke up with sore throat and really bad mood (dog bites, especially when the dog was very good, old and dying, hurt immunity and morale).
XMPP was sabotaged by google (and meta) but is still alive and well.
It was intended as an ICQ replacement, and its advocates even managed to sell it as that to many normies. It became supported, with federation or not, by many email service providers, social networks, and so on. Then that support mostly vanished. Its user percentages are not inspiring.
I think you mean IRC replacement.
Both. In my surroundings QIP was popular, a Jabber client with an ICQ gateway added from the start or something like that (maybe it just was a client of both). And the whole “roster with buddies and IM windows” thing was definitely more ICQ than IRC inspired.
Haha! Do it if the EU does not give up on their Orwellian control!
Wait, I’m in the EU and I use Signal!
Basically, but what you forget is that Signal is also the standard for every Politician for their group chats because it’s secure, so the idea that they might lose their secure, leak-free* form of communication should worry MEPs and other politicians into taking action. Will it? I don’t know, politicians are very stupid when it comes to tech it seems.
* Barring screenshots
Where are the companies lobbying against this, btw? I mean, it’s their data that will be leaked as well.
That is also a good point. Generally this is dangerous for all and sundry.
Why would they care about leaks? I guess that’s some missed profit on selling the data, but that’s only if there’s a breach.
I mean things like corporate espionage and similar threats, rather than leaked user data…
If they can lobby they’ll just lobby to get all their competitors’ data after it passes.
Screenshots, or just adding a journalist to the group chat.
no software can prevent PEBKAC errors. It’s like locking a door and then giving the key to a thief and being shocked when people steal your shit
There’s an explicit clause that exempts politicians from the ban. They get privacy because they need it, but nobody else does.
Ooh how convenient.
That’s completely backwards. To the extent we give people authority, they must also accept monitoring, so we know they aren’t abusing that authority.
They are so so so stupid, about this.
There will be so much blackmail and ruined political careers if these backdoors get installed.
A backdoor is never solely used by the folks one might hope would use it.
I’m sure some poor civil servant has had to sit one of them down and explain why it’s a bad idea, only to be told to stfu with the most stupid excuse ever, leading to them putting their head in their hands and sobbing.
Why are so many European countries doing this? Why the sudden push for chat control and internet restriction laws?
It’s understandable from a law enforcement perspective that it’s important to snoop on actual criminal communications. The EU has pretty reasonable measures and is good at cracking down on continent-wide criminal activity. However, can we trust the authorities not to overreach with chat control and violate privacy and freedom of speech? Like, come on, nothing good ever came from spying on communications. Catching criminals and/or terrorists is a convenient excuse to spy on dissidents.
We’ve seen it happen in America with the PATRIOT Act. People dismissed the opposition to it with the “nothing to hide” thought-terminating cliché, or accused you of being a pedophile or terrorist for not wanting communications spied on. Then, twenty years later, Americans have a fascist government that allowed a corporate asshole to steal information from the federal government. And that information will be used for surveillance capitalism. The same will happen to us in the EU if we don’t push back hard on these Orwellian desires of politicians.
I mean lol, they require a phone number to sign up, which you can only get with an ID in many countries. You chat with a Gestapo officer and they know where you live.
Signal IS GARBAGE. Fucking garbage article, gaslighting bullshit. Fuck this timeline. Honestly this article is fucking terrorism.
Why are you giving gestapo your phone number instead of your username?
Try to think a bit before you post
I think it’s quite a good question, to be honest - you can keep your phone number private from all your Signal contacts, and have been able to since early 2024.
thehackernews.com/…/signal-introduces-usernames-a…
A regular captain of industry for whom the automatic searches of his zero-privacy messenger are one step too far, lol
Even if Signal were insecure and had no privacy (which it isn’t - it is secure and private), wouldn’t you still prefer that people needed a warrant, or some form of document that had to go through a court, before your messages could be read?
Obviously not. Think about supply and demand. Because a toxic product is being hailed as secure there isn’t enough demand for an actually anonymous and private messenger. So calling signal “secure” is just helping state security.
If you actually want to message about revolutionary (illegal, “terrorist”) activity and don’t want to be traced immediately by an agent of state security or an informant, Signal offers nothing (unless you use criminal activity like identity theft). In such a case a warrant will obviously be granted and they can immediately find and arrest you.
Can you see the logic how Signal isn’t secure at all for an actual dissident?
Supply and demand: There are new messaging services that seemingly pop up every day, so I’m not sure why you think Signal existing is stopping progress. It isn’t.
Security: For 99.9% of people, the security and privacy granted through using Signal is amazing and it is worthy of being called secure. I mean it’s secure enough for government officials to trust using. With how Signal is currently, an official data request from the government for Signal data returns pretty much nothing except the phone number used (and that they have signed up for signal ofc), which is great.
I think ‘revolutionaries’ (protestors) are already using Signal. I haven’t heard of any cases where something has gone wrong for them, but again, there’s no way for your messages to be read unless they get access to your phone (if you are smart you will make sure your messages auto-delete and that you lock down or shut down your phone in case of arrest).
I can’t see how Signal isn’t safe for anyone.
You are confusing security with privacy. But keep on ranting if you like.
I would even say with anonymity, considering the chats are still private and the main use case for messenger apps is to communicate with people who know who you are.
There can be no security without privacy, nor with a central server that can be extorted. But keep lying if you like.
Central server only gives you metadata. There are alternative clients if you don’t trust the official one.
Peak rage bait.
You’re confusing privacy with anonymity.
.
Jesus lad relax.
Just let it happen
I hope more follow, would be funny if “all chat apps have to include a back door” leads to “there are no official chat apps”
Do you really think Meta would ignore the opportunity to both be the default option And have justification to read users’ messages?
Nah, I don’t. I can hope though, and the backdoor is a threat not just for consumers but also for companies.
A separate air-gapped device running an encryption app. Type text on it and it spits out a ciphertext; then use an internet-connected device to scan the ciphertext, OCR* it, and send it to the target recipient. They use the same kind of air-gapped encryption device: they OCR the ciphertext, then decrypt it using their key.
*Instead of OCR, you could also use a QR code to have error correction
Tell me how they can ban this? Anyone using a Raspberry Pi with a battery and touch display attached into one compact thing is a criminal?
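The offline half of that is genuinely trivial to build. A rough sketch assuming the `cryptography` and `qrcode` Python packages (the scan/OCR path on the online device is left out):

```python
import qrcode
from cryptography.fernet import Fernet

# Symmetric key shared with the recipient out-of-band (e.g. in person).
key = Fernet.generate_key()

# On the air-gapped device: encrypt, then render the ciphertext as a QR code
# for the online device's camera to pick up.
token = Fernet(key).encrypt(b"meet at noon")
qrcode.make(token.decode()).save("ciphertext.png")  # token is already URL-safe base64

# On the recipient's air-gapped device, after scanning/decoding the QR:
print(Fernet(key).decrypt(token))  # b'meet at noon'
```

Banning that amounts to banning arithmetic plus a camera.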
What if we just start using One Time Pad? Can they ban that?
Steganography?
Like seriously, how do you even stop “criminals” using steganography?
So, to Big Gov, here’s my question: Are you gonna ban talking to other people because criminals also talk to other people?
They don’t care about your messages, they don’t care about terrorists or pedophiles.
They do care about the general population, and want to control it. That’s what this is all about. The hard right wants effective tools to clamp down on dissent when they get into power.
A game as old as humanity.
Shameless plug, because I’m trying to do my part ☺️ : Tenfingers sharing
If the law is implemented, I would selfhost my own chat server. I don’t see this as Signal fault.
But not everybody can self-host. That is a problem I am struggling with.
I am not sure what I would do about email; I assume it is affected as well?
If the law is implemented I will self host my own signal proxy and distribute patched apps to those in need
I looked into Signal servers some years ago and found nothing. Do you mean tunneling things to another country?
Yes, just to bypass any blocking
That’s actually a smart idea!
Not any more legal or anything (if that stupid law becomes reality), I guess, but who cares ☺️.
I already self host my own matrix server. Everybody can’t do that, but everybody can use someone’s matrix server. They can’t shut it down because it’s decentralised and federated. It would theoretically be illegal to use but I don’t see how they would be able to stop it.
Email with PGP would then also be illegal but impossible to effectively stop. That’s why the whole discussion is so stupid. It only hurts the normies. Criminals and tech savvy people will find a way around it and still use encryption without mandated backdoors.
Where would be the loss with that?
We wouldn’t have a simple and secure way of communicating?
The apple/Facebook alternatives are not good at all.
SimpleX, XMPP, Delta Chat, Briar, Matrix, even Session.
Anything is better than Signal, which relies on a centralised proprietary server and requires a phone number.
Sure, but tell my family that…
Have any of those become, like, easy to install and use? To be fair I haven’t checked in some time…
With Delta Chat you don’t even need an email address anymore; they provide one for you on the fly. They just ask for your name, if you (optionally) want to give it.
Can’t be simpler than that tbh.
If you want a better-looking UI, check out ArcaneChat for Android. It’s 100% compatible with the Delta Chat protocol.
SimpleX is really easy to install and use; unfortunately it’s still kinda buggy, especially with public relays. I personally don’t mind buggy; I’m willing to make sacrifices for the sake of freedom and privacy.
I just keep a second chat app as a fallback so I can send them a message saying “ur SimpleX broke again, pls restart”.
XMPP has been stable for decades, though I guess OTR/OMEMO is hard for family to install, and it doesn’t support E2EE calls (or rather, it does, but it’s complicated). But I haven’t used XMPP in a long time.
I hate talking points like that. Sure, Signal can be critiqued, but it’s still the best “mainstream” solution we have. And a lot of people would just stop using a secure messenger when Signal is gone. Including me, because what is SimpleX or Matrix worth to me when no one I know cares to switch?
My personal experience is that if I can convince someone to install signal I can also convince them to install simplex, the process is the same. If I can’t then they aren’t going to use anything but the popular spyware anyway.
I will absolutely try that, but most of the people I know switched to Signal or Threema because they had already heard about it and could just use it, since they were already using it for some other contacts (actually this was the most common case: they already had an account and the app and just had to use it more). But I don’t think a lot of people would switch to a messenger they’ve never heard of, just for me.
Well, that is fair. Also, SimpleX has some serious bugs, which I don’t mind because I value freedom, security and privacy over reliability, but sometimes the app just stops receiving messages until restarted and I need to message them via other means telling them to restart the app.
That hasn’t been true for a while now.
The world does not end after apple or Facebook.
I wish my instance supported down votes, just so that I could down vote this comment.
Maybe because of this headline some more politicians will change their minds.
So now where will Signal go?
I hate this framing. They don’t “THREATEN” to leave europe.
Europe is about to change its laws in a way that makes their product illegal.