I, along with a lot of other people, have spent some time trying to figure out what is going on with OMEMO compatibility across multiple clients. I think I know at least one reason why clients on different platforms can't talk to each other. In an earlier blog post I mentioned fixing an issue with a wrong-sized initialization vector (IV) in my AES-GCM code: I was using 16 bytes instead of the standard 12. I used Conversations to test compatibility, and this is an issue/bug in Conversations. Conversations can read either 16- or 12-byte IVs but sends the incorrect 16-byte ones. This is apparently because ChatSecure had/has a bug where it uses 16 bytes. This results in the following issues:

  1. If someone implements OMEMO and doesn't know to replicate this bug, it will not interoperate. The bug is not documented or written down anywhere.
  2. If you are using a library that provides GCM but does not support 16-byte IVs (for example, Apple's CryptoKit in iOS 13), OMEMO will not work with Conversations or ChatSecure.

At the moment, Monal for iOS and Mac will be updated to send 12-byte IVs. It will still be able to receive 16-byte IVs, which means Monal will be able to send to and receive from Conversations. It may not be able to communicate with ChatSecure or other clients that have replicated the bug, depending on how they implemented it. Monal for Catalyst uses Apple's CryptoKit and will be able to send to Conversations but not receive. I have also observed that Beagle sends 16-byte IVs but, like Conversations, will accept 12-byte ones.
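The asymmetry above comes down to which IV lengths a client's crypto library will accept. As a sketch, here is a minimal Java example: Java's built-in AES/GCM accepts IVs of any length (deriving the internal counter per NIST SP 800-38D), so it can round-trip both the 12-byte and the buggy 16-byte case, whereas CryptoKit's nonce type only takes 12 bytes. The class and method names here are mine for illustration, not from any client's code:

```java
import java.nio.charset.StandardCharsets;
import java.security.SecureRandom;
import java.util.Arrays;
import javax.crypto.Cipher;
import javax.crypto.KeyGenerator;
import javax.crypto.SecretKey;
import javax.crypto.spec.GCMParameterSpec;

public class IvDemo {
    // Encrypt and then decrypt with the same key and IV, returning the recovered plaintext.
    static byte[] roundTrip(SecretKey key, byte[] iv, byte[] plaintext) throws Exception {
        Cipher enc = Cipher.getInstance("AES/GCM/NoPadding");
        enc.init(Cipher.ENCRYPT_MODE, key, new GCMParameterSpec(128, iv)); // 128-bit auth tag
        byte[] ciphertext = enc.doFinal(plaintext);

        Cipher dec = Cipher.getInstance("AES/GCM/NoPadding");
        dec.init(Cipher.DECRYPT_MODE, key, new GCMParameterSpec(128, iv));
        return dec.doFinal(ciphertext);
    }

    public static void main(String[] args) throws Exception {
        KeyGenerator kg = KeyGenerator.getInstance("AES");
        kg.init(128);
        SecretKey key = kg.generateKey();
        byte[] msg = "hello omemo".getBytes(StandardCharsets.UTF_8);

        byte[] iv12 = new byte[12]; // the standard, recommended IV length
        byte[] iv16 = new byte[16]; // the length the buggy clients send
        SecureRandom rng = new SecureRandom();
        rng.nextBytes(iv12);
        rng.nextBytes(iv16);

        // Java's JCE handles both lengths; a 12-byte-only library would reject iv16.
        System.out.println(Arrays.equals(roundTrip(key, iv12, msg), msg));
        System.out.println(Arrays.equals(roundTrip(key, iv16, msg), msg));
    }
}
```

A receiver restricted to 12-byte nonces, like CryptoKit, simply has no way to construct the second decryption context, which is why it can send to Conversations but not receive from it.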

In a nutshell: if you are testing OMEMO compatibility across the ecosystem with different clients and platforms, don't. It's a waste of time until every client resolves this.

6 thoughts on “OMEMO is broken in general across the ecosystem :(”

  1. Anu, I can’t thank you enough for not giving up on debugging this and digging into the problem. It is too easy to put it aside and continue working on other things. This issue has been going on forever. Now all the clients need to fix this problem, and maybe then OMEMO can finally be used reliably.

    This should help the entire XMPP ecosystem. What a great finding. Please give this man a prize.

  2. > I was using 16 instead of the standard 12 bytes.
    What’s the “standard” you’re talking about?

  3. Hello,

    I’ve just read the NIST publication of GCM: https://dx.doi.org/10.6028/NIST.SP.800-38D

    Indeed, for simplicity of implementation, they recommend using an initialisation vector with a length of 96 bits (12 bytes, as you noticed). But the length can be anything up to 2^64 − 1 bits (page 8).

    The GCM algorithm is defined on page 15.

    In the 2nd step, you have to transform the initialization vector into a 128-bit variable named J_0. When the initialisation vector is 12 bytes long, this transformation is a simple concatenation of the IV with 31 zero bits and a single one bit. Otherwise you have to use a hash function (GHASH) to produce the 128 bits of J_0.

    The algorithm then always works with increments of this 128-bit value, J_n.

    So I think it’s absolutely right that GCM implementations accept 16-byte input, even if you use a 12-byte initialisation vector at first.
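    The 12-byte special case described in this comment can be sketched as follows. (For IVs of any other length, NIST SP 800-38D derives J_0 with GHASH instead, which this sketch does not implement; the class and method names are illustrative only.)

```java
import java.util.Arrays;

public class J0Demo {
    // For a 96-bit (12-byte) IV, NIST SP 800-38D defines J_0 = IV || 0^31 || 1,
    // i.e. the IV followed by the 32-bit big-endian counter value 1.
    static byte[] j0From96BitIv(byte[] iv) {
        if (iv.length != 12) {
            throw new IllegalArgumentException("this shortcut only applies to 12-byte IVs");
        }
        byte[] j0 = Arrays.copyOf(iv, 16); // right-pads with four zero bytes
        j0[15] = 1;                        // set the big-endian counter to 0x00000001
        return j0;
    }

    public static void main(String[] args) {
        byte[] iv = new byte[12]; // an all-zero IV, purely for illustration
        byte[] j0 = j0From96BitIv(iv);
        System.out.println(j0.length);
        System.out.println(j0[15]);
    }
}
```

    So regardless of the IV length on the wire, the counter the cipher actually iterates is always this 16-byte J_0; the disagreement between clients is only about what gets transmitted.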


  4. Hello,

    I’ve checked the GCM paper and it seems that it’s not a bug:
    even if the initialization vector is 12 bytes long (which is not required), the GCM algorithm normalizes it to a 16-byte counter in its first steps. It then always works with a 16-byte counter created from the previous one.

    I took this information from the NIST paper: https://nvlpubs.nist.gov/nistpubs/Legacy/SP/nistspecialpublication800-38d.pdf
    The IV requirements are defined on page 8.
    The GCM algorithm is defined on page 15.

  5. > For IVs, it is recommended that implementations restrict support to the length of 96 bits, to promote interoperability, efficiency, and simplicity of design.


    I don’t think it’s a strict requirement there, but it is generally accepted. See, for example, the TLS RFC:

    > The “nonce” SHALL be 12 bytes long consisting of two parts as follows: (this is an example of a “partially explicit” nonce;


Comments are closed.