Another article, much better, presents in more detail that Olvid was audited on an older version and was chosen because it is French and because it applied (in French): https://www.numerama.com/tech/1575168-pourquoi-les-ministres-vont-devoir-renoncer-a-whatsapp-signal-et-telegram.html

Google Translate link to the original post: https://www-lepoint-fr.translate.goog/high-tech-internet/les-ministres-francais-invites-a-desinstaller-whatsapp-signal-et-telegram-29-11-2023-2545099_47.php?_x_tr_sl=fr&_x_tr_tl=en&_x_tr_hl=fr&_x_tr_pto=wapp

The translation has some mistakes but is good enough to understand the context.

Here is a short summary:

Olvid passed a 35-day intrusion test, including source-code examination, conducted by experts from ANSSI (the French state cybersecurity agency) or experts it designated, without any security breach being found. That is not the case for the other three messaging apps (either because they never underwent such a test, or because they didn’t pass).

This makes WhatsApp, Signal and Telegram unreliable for state security.

And so government members and ministerial offices will have to use Olvid or Tchap (the French state’s in-house messaging app).

More detail in the article.

  • spiderkle@lemmy.ca

    Well that was the dumbest explanation ever; that’s basically just a political pretext to give a government contract to some French company. Potentially there has been some lobbying going on.

    Signal doesn’t store its encryption/decryption keys in the cloud, so you would need the devices, and even then you would still have to decrypt the content if the user doesn’t give you access manually.

    To brute-force a 128-bit AES key would take on the order of a billion billion years with a current supercomputer. For a 256-bit AES key, searching half the keyspace (2^255 keys) at 2,117.8 trillion keys per second would still take around 10^54 years on average.
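    For scale, here is a back-of-envelope sketch of those brute-force timescales. The 10^18 keys-per-second rate is an illustrative assumption, not a real machine, and the function name is made up for this example:

```python
# Back-of-envelope brute-force estimate for AES key sizes.
# The tested rate (1e18 keys/second) is a hypothetical assumption.

SECONDS_PER_YEAR = 60 * 60 * 24 * 365

def years_to_search(key_bits: int, keys_per_second: float) -> float:
    """Average years to find a key by exhaustive search
    (an attacker expects to try half the keyspace, 2**(bits-1))."""
    return (2 ** (key_bits - 1)) / keys_per_second / SECONDS_PER_YEAR

print(f"AES-128: {years_to_search(128, 1e18):.2e} years")  # trillions of years
print(f"AES-256: {years_to_search(256, 1e18):.2e} years")  # astronomically more
```

    Even at this wildly optimistic rate, AES-128 takes trillions of years, and AES-256 is about 2^128 times longer again, which is the point being made.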

    So until some amazing quantum computer comes along, this is pretty safe. Fuck Olvid.

    • jet@hackertalks.com

      Signal does store the decryption keys in the cloud, using their SGX enclaves, which have their own issues. Signal SVR (Secure Value Recovery), I believe they call it.

      You can turn off Signal PINs, which still stores the decryption keys in the cloud, but then they’re protected with a very long, randomly generated PIN, which is good enough.

      From a government perspective, Signal’s a no-go: the SGX enclaves are completely exploitable at the state-actor level. You just have to look at all the security vulnerabilities to date for SGX enclaves.

      • stimut@aussie.zone

        Do you have a reference for Signal using SGX for keys?

        Everything I could find was about metadata and private data, e.g. contact lists (which is what the SVR thing that you mention is), but nothing about keys.

        • jet@hackertalks.com

          https://signal.miraheze.org/wiki/Secure_Value_Recovery

          https://github.com/signalapp/SecureValueRecovery2

          If you want to do an empirical test, create a Signal account and set a PIN. Send a message to someone. Then delete Signal. Recreate the account using the same phone number, recover using the PIN, and send a message. The receiver of that message will not get a warning that the signing key has changed.

          The only way that’s possible is if the key, or a derived key, is recoverable from the network. That is de facto proof that the keys, or a key-generation mechanism, are in the cloud. Which is probably fine for personal communication.

          But if I’m a nation state, this represents a significant security risk, especially when you’re worried about another nation-state peeking at your communications. E.g., France is buddy-buddy with the US, but they probably don’t want the US to read all of their internal communications.

          SGX https://en.m.wikipedia.org/wiki/Software_Guard_Extensions

          https://sslab-gatech.github.io/sgx101/pages/attestation.html

          SGX is an on-chip secure enclave created by Intel, a company headquartered in the United States, that relies on key management and signing keys from Intel. Intel could be compelled by domestic intelligence to provide its keys, or to sign certain enclave software. I’m not saying it has happened, but I am saying this is part of the risk assessment a nation state would use to evaluate a messaging platform.

          So a nation-state attack might look something like this: Intel is compelled to release its signing keys, the Signal user enclave is copied, and using the combination of the two, a special SGX environment is set up to brute-force the PINs with the rate limit removed. Brute-forcing a six-digit PIN is trivial if you’re not rate limited. This is just one possibility; SGX has at least nine known side-channel attacks at the moment, and I’m sure there are more that haven’t been published.
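          To illustrate how trivial an un-throttled six-digit PIN search is, here is a toy sketch. The salt, target PIN, and plain SHA-256 derivation are all hypothetical stand-ins (the real system derives keys inside the enclave with a hardened KDF); the only point is the size of the search space:

```python
import hashlib

# Toy model: a 6-digit PIN space is only 10**6 = 1,000,000 candidates.
# SHA-256 here is a stand-in for the real key-derivation function.

def derive_key(pin: str, salt: bytes) -> bytes:
    """Hypothetical PIN-to-key derivation (illustrative only)."""
    return hashlib.sha256(salt + pin.encode()).digest()

salt = b"example-salt"               # hypothetical per-user salt
target = derive_key("493027", salt)  # pretend this is the stolen record

# With no rate limit, just enumerate every candidate PIN.
found = next(
    pin for pin in (f"{i:06d}" for i in range(10**6))
    if derive_key(pin, salt) == target
)
print(f"PIN recovered: {found}")  # finishes in seconds on a laptop
```

          Even with a much slower, memory-hard KDF, a million candidates is nothing once the attacker controls the environment and the guess limit is gone.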

          • stimut@aussie.zone

            Interesting, thanks for that.

            The first link you posted states that the master key is stored. It also states that the information on the page doesn’t match the official blog from Signal, but that they’ve gathered their information from the source code, so I assume it’s correct. It does make me wonder why Signal doesn’t say that they store the master key though 🤔

            • jet@hackertalks.com

              You don’t have to trust blogs; do the experiment yourself: make a new Signal account, send a message, set a PIN, delete the app, reinstall, recover from the PIN, and send a message again… the signing key doesn’t change. That is proof the key is in the cloud.

              Signal DOES say it’s in the cloud, but they use the corporate partial truth… SVR is for “personal data”… which the key is. They don’t emphasize it, because it’s such a bad idea; when they implemented this there was a big online outrage among security folks… which seems to have died down.

              Signal is a good enough protocol for daily use, but not good enough for nation states, or the truly security conscious. Signal is a step in the path to federated democratic private communication but not the destination.

  • ∟⊔⊤∦∣≶@lemmy.nz

    I don’t know much but what I do know is when a government endorses a secure messaging service, it’s definitely not secure.

    • bamboo@lemm.ee

      They’re using it themselves, not forcing citizens to use it. It’s when they force citizens to use an app they claim is secure that I am distrustful. I would assume their intentions are more pure when it’s their own state security rather than their citizens’ privacy.

    • lemmyvore@feddit.nl

      Are they? If you want to know if something is secure enough to use then not being able to examine the code should obviously disqualify it.

      • sudoshakes@reddthat.com

        Sure it does, but that doesn’t make it bad.

        Open source code is not the only solution to secure communication.

        You can be extremely secure on closed source tools as well.

        If they had found specific issues with Signal aside from not being allowed to freely inspect its code base, I suspect we would be hearing about it. Instead I don’t see specific security failings, just that it didn’t make the measure for their security software audit.

        As an example of something that is closed source and trusted:

        The software used to load data and debug the F-35 fighter jet.

        Pretty big problem for 16 countries if that isn’t secure… and it’s closed source. So much so that you can’t even run live tests against the device used for loading data onto the jet. It’s a problem to sort out, but it’s an example of highly important communication protocols that are not open source yet are trusted by the governments of many countries.

        If their particular standard here were open source, OK, but they didn’t do anything to ensure that the version they inspected would be the only version used. In fact, every release from that basement pair of programmers could inadvertently have a flaw in it, which this committee would not be reviewing in the code base used by its members of parliament.

        • lemmyvore@feddit.nl

          Lol at military stuff being secure. Most often it’s not; it’s just hidden. There was an Ars Technica article about the “secure” devices used at military bases being full of holes, for example: https://arstechnica.com/security/2023/08/next-gen-osdp-was-supposed-to-make-it-harder-to-break-in-to-secure-facilities-it-failed/

          When code is hidden all you know for sure is that you don’t know anything about it. You certainly can’t say it’s secure.

          If a piece of code or a system is really secure, then it does not matter if the code is open, because the security lies in the quality of its algorithms and the strength of the keys.

        • Tibert@jlai.luOP

          Well, let’s give some counter-examples among the software I mentioned:

          • WhatsApp (closed): owned by Facebook. Facebook has had multiple data leaks and privacy violations, and nothing substantial was done about it. Definitely not trustworthy (also, zero-days for hacking WhatsApp are being sold on the black market: https://techcrunch.com/2023/10/05/zero-days-for-hacking-whatsapp-are-now-worth-millions-of-dollars/).

          • Telegram (closed): not end-to-end encrypted by default. Russian app. Not trustworthy.

          • Signal (open): well, this one is end-to-end encrypted. Open source, so maybe it could be trusted. It seems to have passed some security audits (https://community.signalusers.org/t/overview-of-third-party-security-audits/13243), though it’s based in the US and uses US servers; maybe the US has supercomputers capable of decrypting such communications. However, if Signal has switched its encryption to quantum-computer resistance, it may be too hard even for a state actor. They also “debunked”/ignored zero-day reports that were not submitted through their own tool, by asking the US for confirmation, and I am not sure the US can be trusted to confirm the existence or not of vulnerabilities it is very likely to use itself (https://thehackernews.com/2023/10/signal-debunks-zero-day-vulnerability.html?m=1).

          • Olvid (open, closed servers): French, end-to-end encrypted, and backed by a PhD in cryptography. And why not use a local messaging app which is also very secure and open source?

          Notice how closed source is untrusted here. The economic activity behind the tool changes how trustworthy it is. Military equipment has a huge and strict budget; it has to be secure.

          Communication apps are consumer products first. So they do what they can get away with, and that is very true for Facebook.

  • Kalistia@sh.itjust.works

    They had Tchap, which may not be perfect but is open source (based on Matrix/Element), hosted in France, and already used by 400,000 people in the public services… Why pay for a new app? Don’t get it…

    • jet@hackertalks.com

      Honestly at the security level, critical infrastructure, which messaging is, is something every country should have independently. So it makes complete sense for the French government to set up their critical messaging infrastructure inside of France with a French company who cannot be compelled by external intelligence agencies.

  • merde alors@sh.itjust.works

    On one hand they are trying to outlaw encrypted messaging, and on the other they’re saying it’s not secure? 😅

      • suckmyspez@lemmy.world

        Yup, I did see they were on GitHub, but when I looked, the iOS repository was months (and several releases) out of date.

        I’d expect an open source project to be working in public… not in private, updating their public repositories later down the line.

        • hedgehog@ttrpg.network

          Signal isn’t much better in this regard. They certainly don’t work directly in the public repos; they have internal repos that they work from, and they push updates from those to the public repos after the fact.

          I’m not sure about the current state, but when I looked into it a couple of years ago, their client-side repos were around a year behind. I recall reading some issues stating that the client was so far behind that the server was refusing to communicate with builds of it.

          • bamboo@lemm.ee

            Signal’s official policy is that third-party clients aren’t permitted, and it lacks reproducible builds for its Android client. Even if the open source code were up to date, using it without patching it to use a custom server would be a TOS violation.

            • hedgehog@ttrpg.network

              One of the ways Signal doesn’t really feel FOSS, which I read about, was related to third-party clients and the official server. Projects wanted to use forks of the client with the official servers. In one case this was just so they could remove nonfree software. In another they were adding minor features (which Signal would have been free to take back into the main build, since they were under the same license). But Moxie said they couldn’t use the servers, period.

        • satan@r.nf

          sorry sir, we didn’t realize the world revolves around you. we’ll change it to your liking at once. We’ll run the code by you before we even think of it first

          • JTskulk@lemmy.world

            It’s not him, it’s the public you dingus. Yes, the world actually does revolve around society and rightfully so.

    • Rokk@feddit.uk

      I feel like for internal government communications you might not want it to be open source.

      Doesn’t mean everyone else should want to use it.