
to allow for multibase support in the VCDI spec #51

Closed
index.html (10 changes: 8 additions & 2 deletions)
@@ -229,7 +229,11 @@ <h3>
href="https://w3c.github.io/vc-data-model/#json-web-token">VC-JWT</a>, the <a
href="https://w3c-ccg.github.io/lds-ed25519-2020/">Ed25519 Cryptosuite</a>, and
the <a href="https://w3c-ccg.github.io/lds-ecdsa-secp256k1-2019/">Secp256k1
Cryptosuite</a>. Concrete serializations based on the <a
Cryptosuite</a>. Concrete serializations based on the <a
href=https://datatracker.ietf.org/doc/html/draft-multiformats-multibase-04>Multibase
Data Format</a> and the <a href=https://ns.did.ai/suites/multikey-2021/v1/>Multikey-2021
suite</a> might be provided if the latter is specified and published in time. Concrete
serializations based on the <a
href="https://w3c-ccg.github.io/ldp-bbs2020/">BBS+ Cryptosuite</a> might be provided, if an <a
href="https://identity.foundation/bbs-signature/">IETF BBS+ RFC</a> is published
in time. Concrete serializations for VC-JWP might be provided, if an <a
@@ -245,7 +249,9 @@ <h3>
<a href="https://w3c.github.io/vc-data-model/#json-web-token">VC-JWT</a>,
<a href="https://w3c-ccg.github.io/lds-ed25519-2020/">Ed25519 Cryptosuite</a>,
<a href="https://w3c-ccg.github.io/lds-ecdsa-secp256k1-2019/">Secp256k1 Cryptosuite</a>,
<a href="https://w3c-ccg.github.io/ldp-bbs2020/">BBS+ Cryptosuite</a>
<a href="https://w3c-ccg.github.io/ldp-bbs2020/">BBS+ Cryptosuite</a>,
<a href="https://ns.did.ai/suites/multikey-2021/">Multikey-2021</a>
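For context on the Multibase Data Format referenced in the change above: a multibase string is the encoded payload prefixed with a single character that identifies the base, making the encoding self-describing. A minimal sketch in Python covering two of the registered prefixes (the function names are ours, not from the draft or any library):

```python
import base64

# Prefixes from the multibase registry:
#   'f' = base16 (lowercase hex), 'u' = base64url without padding
def multibase_encode(base: str, data: bytes) -> str:
    if base == "f":
        return "f" + data.hex()
    if base == "u":
        return "u" + base64.urlsafe_b64encode(data).rstrip(b"=").decode()
    raise ValueError(f"unsupported base: {base}")

def multibase_decode(text: str) -> bytes:
    prefix, body = text[0], text[1:]
    if prefix == "f":
        return bytes.fromhex(body)
    if prefix == "u":
        # restore the padding that base64url-no-pad strips
        return base64.urlsafe_b64decode(body + "=" * (-len(body) % 4))
    raise ValueError(f"unsupported base: {prefix}")
```

The registry also assigns 'z' to base58btc, which is the prefix used by the Multikey suite mentioned alongside the Multibase draft.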
@kdenhartog (Member), Feb 24, 2022

-1 for me on Multikey format

I don't buy into the idea that the WG should be defining key formats in the VCWG; key formats should remain abstracted separately from the signature formats.

Contributor

Because of the choices made by the CCG, the key and signature formats are defined in the same specs. So if you want to include an input document, you are forced to refer to both.

There is also a community perspective that each key representation should be used for 1 signature representation.

This aligns with how P-384 is used with ES384 in JOSE.

I am inclined to agree that it would be better to just leverage JOSE / COSE key and signature types... this would lead to a larger supply of software providers and better adoption by legacy systems.

Member

There is also a community perspective that each key representation should be used for 1 signature representation.

I think this is likely to be overridden in the VC2WG... but again, that's a discussion that we have to have in that group, and in order for us to have that discussion, all perspectives must be inputs into the VC2WG. So, @kdenhartog -- we need to clearly signal that a subset of the group does intend to define the Multikey format. I don't think we need to refer to Orie's Multikey-2021 JSON-LD Context for that, since we already call it out clearly in one of the input documents:

https://w3c-ccg.github.io/data-integrity-spec/#multikey

... and in other cryptosuite specs that are input documents:

https://w3c-ccg.github.io/di-ecdsa-secpr1-2019/#example-a-multibase-encoded-multicodec-curve-p-384-public-key

https://w3c-ccg.github.io/di-ecdsa-secpr1-2019/#example-a-dataintegritysignature-example-using-a-nistecdsa2019-cryptosuite

The text clearly still needs work, but the expectation written in these input documents is that we plan to separate key format from cryptosuite in the general case, while still providing the option to specify key format in cryptosuites to support things like EthereumEip712Signature2020 (which doesn't have a JWK/JWS mapping, nor a Multikey mapping, and probably won't, because it's more proof than signature). PGP falls into the same category of not having mappings in either JWK/JWS or Multikey. PEM, while having a JWK mapping (kinda sorta, but not really, because many implementations do totally different things there), also falls into this category.

So, in other words @kdenhartog -- we are going to have to define some key formats, because JWK doesn't work for everything, and Multikey has clear benefits over JWK (like the ability to express a public key in a very simple and compact manner -- did:key identifiers are public Multikeys).
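The "did:key identifiers are public Multikeys" point can be made concrete. Per the Multikey and did:key drafts, an Ed25519 public Multikey is the two-byte multicodec header 0xED 0x01 followed by the 32 raw key bytes, base58btc-encoded with a leading 'z'. A sketch, assuming a 32-byte Ed25519 public key (the base58 routine here is illustrative, not a vetted library):

```python
B58_ALPHABET = "123456789ABCDEFGHJKLMNPQRSTUVWXYZabcdefghijkmnopqrstuvwxyz"

def base58btc_encode(data: bytes) -> str:
    num = int.from_bytes(data, "big")
    out = ""
    while num:
        num, rem = divmod(num, 58)
        out = B58_ALPHABET[rem] + out
    # leading zero bytes encode as '1'
    pad = len(data) - len(data.lstrip(b"\x00"))
    return "1" * pad + out

def ed25519_multikey(raw_public_key: bytes) -> str:
    if len(raw_public_key) != 32:
        raise ValueError("Ed25519 public keys are 32 bytes")
    # 0xED 0x01 is the multicodec varint for ed25519-pub
    return "z" + base58btc_encode(b"\xed\x01" + raw_public_key)
```

Because the two header bytes dominate the high-order digits, every Ed25519 key encoded this way starts with "z6Mk", which is exactly the recognizable prefix of did:key identifiers.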

Member

that's a discussion that we have to have in that group, and in order for us to have that discussion, all perspectives must be inputs into the VC2WG.

This is the strongest argument for why I think this should be in here, but accepting the work as a possible path in the charter doesn't mean it will be a required path to take once we get into the work.

Generally speaking, I've found that by separating the key format from the proof format it becomes easier to rely on a variety of different forms of PKI infrastructure (PGP servers, DIDs, x509 certs, .well-known JWK sets, etc.) in a more interoperable way. As long as the public key contains the proper associated metadata (curve/algo/key parameters/usage constraints/etc.), the way in which the key material and metadata is formatted is largely irrelevant in my opinion, since that becomes an implementation concern. This is because it's relatively easy to convert from one format to another during verification, and that's why I'm arguing that key formats should be left out of scope. In this body of work, it's my belief that we should be focusing on the proof formats and leave the key formats to the already defined formats from elsewhere, including DIDs, x509, JWK sets, and PEM key formats.
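To make the "relatively easy to convert" claim concrete in its easiest case: an Ed25519 public key moving from a JWK to raw bytes to a PEM-wrapped SubjectPublicKeyInfo. The fixed DER prefix is the one specified in RFC 8410; everything else about this sketch (function names, error handling) is illustrative, and generalizing it safely across key types is exactly the part being debated in this thread:

```python
import base64
import textwrap

# Fixed DER prefix for an Ed25519 SubjectPublicKeyInfo (RFC 8410):
# SEQUENCE { AlgorithmIdentifier { OID 1.3.101.112 }, BIT STRING }
ED25519_SPKI_PREFIX = bytes.fromhex("302a300506032b6570032100")

def jwk_to_raw(jwk: dict) -> bytes:
    if jwk.get("kty") != "OKP" or jwk.get("crv") != "Ed25519":
        raise ValueError("only OKP/Ed25519 is handled in this sketch")
    x = jwk["x"]
    # JWK 'x' is base64url without padding; restore padding before decoding
    return base64.urlsafe_b64decode(x + "=" * (-len(x) % 4))

def raw_to_pem(raw: bytes) -> str:
    der = ED25519_SPKI_PREFIX + raw
    body = "\n".join(textwrap.wrap(base64.b64encode(der).decode(), 64))
    return f"-----BEGIN PUBLIC KEY-----\n{body}\n-----END PUBLIC KEY-----\n"
```

Note that this happy path covers exactly one curve in one direction; RSA, secp256k1, and the legacy PEM variants each bring their own encoding quirks.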

@msporny (Member), Feb 25, 2022

it's relatively easy to convert from one format to another during verification

It might be easy for you personally (@kdenhartog) to convert from every key format to every other key format, but it is not easy for the average developer. I certainly wouldn't trust myself to write that code without some really great test suites, and even then I'd need lots and lots of code review on the library to even remotely feel safe pushing it out to production.

That is, it seems simple enough to slap together some code for a particular use case, but it's no simple feat to take a JWK encoded RSA public key and transform it, properly, into something that a FIPS certified OpenSSL library will use... much less /every key format/ in significant use.

I mean, can you point me to the PEM -> JWK transformation conformance test suite? What about any of the JOSE conformance test suites, can you point me to those? The x509 PEM test suite? I really want you to try to find these things. You might discover that they don't exist... and that all the care that you think has been put into the safety of the cryptography libraries that you trust just might not actually be there.

If we are going to suggest that people do arbitrary key transformations, and that it should be out of scope for the WG... we should at least make sure that someone else has the ball on that (I don't think anyone has the ball there).

The last I checked, you could shove a number of incorrect binary strings at OpenSSL as the "public key" and it would happily accept the input and keep chugging along. My suggestion is that key formats should be in scope so we can write some test suites for them. If we are going to recommend that people do arbitrary key transformations, we should create some really solid test suites so that people can know whether they're doing those transformations correctly.

it's my belief that we should be focusing on the proof formats and leave the key formats to the already defined formats

What you're saying above seems to be: "I don't think that multikey should exist." ... and if you believe that, then that raises the question of whether or not you feel like did:key should exist. :)

Digital Bazaar finds multikey, and the multiformats specifications in general, incredibly useful. If we don't standardize it, it doesn't look like there is anyone else out there that will do so on a reasonable timeframe.

So, I'm getting increasingly confused about your mental model wrt. key transformation and multikey. Please help me understand your mental model.
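The kind of conformance suite argued for above is, at its core, a table of known-answer vectors that every implementation must reproduce. A minimal sketch for one narrow conversion, the base64url 'x' member of an OKP JWK to raw bytes (the helper and vector here are synthetic, not from any published suite):

```python
import base64

def jwk_okp_x_to_raw(x: str) -> bytes:
    # Decode the base64url (unpadded) 'x' member of an OKP JWK.
    return base64.urlsafe_b64decode(x + "=" * (-len(x) % 4))

# Known-answer vectors: the input 'x' plus the exact raw bytes it must yield.
# (Synthetic vector: 32 zero bytes encode as 43 'A' characters.)
VECTORS = [
    {"x": "A" * 43, "raw_hex": "00" * 32},
]

def run_suite(convert) -> bool:
    """Return True only if the converter reproduces every vector exactly."""
    return all(convert(v["x"]).hex() == v["raw_hex"] for v in VECTORS)
```

A real suite would need many vectors per key type, negative vectors (malformed inputs that must be rejected), and coverage for every format pair implementations are expected to translate between.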

@kdenhartog (Member), Mar 2, 2022

I believe you're making the assumption that this is going to be normatively defined hence the need for the testing. My assumption is that this is an implementation detail depending on the PKI system that's chosen to be supported. This aligns with the thinking used by Web Crypto where the internal data model for storing keys is different from the formats that are supported in the importKey/exportKey APIs. Essentially this conversion would be a way to fetch the key from a PKI system that stores it and then normalize it into an internal data model of their choosing (could be PEM/JWK/Multikey/CryptoKey from Web Crypto/bespoke made up one) so that they can perform their algorithms as needed.
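The Web Crypto analogy sketched here, many import formats normalized at the boundary into one internal representation, might look like this (the format names and the choice of raw bytes as the internal model are illustrative assumptions, not Web Crypto's actual internals):

```python
import base64

def import_public_key(key, fmt: str) -> bytes:
    """Normalize an externally formatted Ed25519 public key to raw bytes,
    in the spirit of Web Crypto's importKey (dispatch here is illustrative)."""
    if fmt == "raw":
        data = bytes(key)
    elif fmt == "jwk":
        # JWK 'x' member is base64url without padding
        x = key["x"]
        data = base64.urlsafe_b64decode(x + "=" * (-len(x) % 4))
    elif fmt == "hex":
        data = bytes.fromhex(key)
    else:
        raise ValueError(f"unsupported format: {fmt}")
    if len(data) != 32:
        raise ValueError("Ed25519 public keys are 32 bytes")
    return data
```

Everything past this boundary works on the internal representation only, which is the abstraction model the comment is advocating.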

Member

I believe you're making the assumption that this is going to be normatively defined hence the need for the testing.

Yes, I'm saying that if it's not normatively defined, there will be no testable interoperability, and therefore, we can't depend on it happening in any sort of safe manner. These are security systems, they need to be thoroughly tested.

My assumption is that this is an implementation detail depending on the PKI system that's chosen to be supported.

Never leave critical security infrastructure to "an implementation detail". I know that tends to be a standard practice in today's security industry, but that's wrong, and we should start calling it out. The fact that there is no JOSE JWK/JWS interoperability test suite is a "Really Bad Thing(tm)". We must not keep repeating those mistakes. It's sloppy security hygiene.

Essentially this conversion would be a way to fetch the key from a PKI system that stores it and then normalize it into an internal data model of their choosing

This process needs to be standardized and painstakingly tested if we expect the world to depend on it. The NHTSA (https://www.nhtsa.gov/ratings) exists for a reason, to ensure safety for the general population. The people working on Internet/Web Security need to and can do better.

@kdenhartog (Member), Mar 3, 2022

Yes, I'm saying that if it's not normatively defined, there will be no testable interoperability, and therefore, we can't depend on it happening in any sort of safe manner. These are security systems, they need to be thoroughly tested.

Never leave critical security infrastructure to "an implementation detail". I know that tends to be a standard practice in today's security industry, but that's wrong, and we should start calling it out.

This is literally what the code I referenced, which you and @dmitrizagidulin wrote, is doing though. How are you arguing for this while implementing what I'm arguing for, and then suggesting that what I'm proposing is wrong? It seems outright contradictory to me.

Member

This is literally what the code I referenced, which you and @dmitrizagidulin wrote, is doing though. How are you arguing for this while implementing what I'm arguing for, and then suggesting that what I'm proposing is wrong? It seems outright contradictory to me.

Because what we (myself and @dmitrizagidulin) are doing, right now, is completely and utterly wrong. Perhaps that's the miscommunication. I think what everyone is doing right now wrt. hand-waving over key translation being an implementation detail, including us, is wrong.

If we are to tell the world that they should depend on key translation, we need a test suite that ensures everyone is doing this correctly. That needs to be fixed, for everyone in the ecosystem.

It's not contradictory: I'm saying that all of us, everyone, including myself and @dmitrizagidulin, cannot keep doing what we're doing now, which is doing key translation without a community-approved test suite to ensure that what we're doing is correct. As an industry, we have to do better. :)

Does that help?

@kdenhartog (Member), Mar 4, 2022

I appreciate you clarifying that, because it wasn't clear to me that your position was that the code you wrote was incorrect. Ultimately, I think we just see things fundamentally differently, because I don't believe it's wrong; in fact, I think it's the better abstraction model.

With that, I think we're going to need to figure out what a good middle ground is, because I'm adamant that standardizing key representations is unnecessary, the authorization model is not understood to be an authorization system (there are probably 10 or 20 of us total who understand that verification relationships are an authz system), and the things you're advocating for will be ignored, as they already have been and as I've watched numerous new developers do when watching how they use DIDs and VCs.

Furthermore, the advocacy for a test suite in order to do this is a movement of the goalposts that I've only heard you advocate for. Is this an opinion that's shared by the working group, or by experts in SecDir or the CFRG? Everything I've seen indicates that many developers (not just myself) are quite comfortable doing this and aren't producing vulnerabilities by doing so. So, to paraphrase what you've said to @OR13 when he raised security issues before: can you explain what legitimate security issue is happening so we can address it, rather than just calling it a security issue and expecting everyone not to do it?

</p>
<p class="milestone"><b>Expected completion:</b> <span class=todo>WG-START+24 months</span></p>
</dd>
</dl>