
Verifiable Credentials Data Model 1.0 #343

Closed
1 of 5 tasks
burnburn opened this issue Feb 11, 2019 · 20 comments
Comments

@burnburn

Hello TAG!

I'm requesting a TAG review of:

Further details (optional):

  • Relevant time constraints or deadlines: Group charter ends 31 March 2019, aiming to go to CR by early March.
  • I have read and filled out the Self-Review Questionnaire on Security and Privacy. We completed the earlier PING version of this questionnaire and are almost done updating it to align with your new questionnaire. We will update this issue with the pointer when that is complete.
  • I have reviewed the TAG's API Design Principles

You should also know that...

This specification (and the working group) was specifically restricted, in the charter, to a data model and one or more syntactic representations of that data model. We are not allowed to provide any normative answers to questions regarding protocols or APIs that may use this data model, although we are happy to informally convey expectations held by the majority of the group.

We'd prefer the TAG provide feedback as (please select one):

  • open issues in our Github repo for each point of feedback
  • open a single issue in our Github repo for the entire review
  • leave review feedback as a comment in this issue and @-notify [github usernames]
@burnburn
Author

Hi. Chaals Nevile suggested that @hadleybeeman might be a good reviewer for this. PLH also suggested that it might be helpful if we were to get on a call and give an overview and answer questions. We would be happy to do so - just let us know.

@plinss added this to the 2019-03-12-telcon milestone Feb 26, 2019
@burnburn
Author

Our original answers to the security and privacy questionnaire last fall (before support for ZKPs and JWTs) are at: https://docs.google.com/document/d/1cU2WHRWZ3rM4pzff87HMHIPlZIA_NZ0EyFR9YVuwOvo/edit

Our answers to the new questionnaire from about a month ago (including all features in the spec) are at: https://docs.google.com/document/d/1k1Y3XLRJ25iTNC_U5rOywxd4di4ytm1y-ARUaGZIjH4/edit

David Chadwick, PING member, helped us write these.

@lknik
Member

lknik commented Mar 12, 2019

+1 for using the new security/privacy questionnaire :)

@lknik
Member

lknik commented Mar 12, 2019

The privacy considerations are quite verbose, I'd say. Long, too. If you want to simplify, this might be the place:

For example, remove the preface: "There are mechanisms external to verifiable credentials that are used to track and correlate individuals on the Internet and the Web. Some of these mechanisms include Internet protocol (IP) address tracking, web browser fingerprinting, evercookies, advertising network trackers, mobile network position information, and in-application Global Positioning System (GPS) APIs"
(remove?)

Then there is the later passage: "It is recommended that privacy-respecting systems prevent the use of these other tracking technologies when verifiable credentials are being used. In some cases, tracking technologies might need to be disabled on devices that transmit verifiable credentials on behalf of a holder."

It's not clear how or when "tracking technologies" should be disabled, or why this would be preferable. Maybe something like "Systems deploying ... should consider disabling tracking technologies..." would be better?

"For example, this document uses the ageOver property instead of a specific birthdate, which constitutes much stronger personally identifiable information"

Good thinking and illustrative.

@burnburn
Author

Thanks, @lknik. PING was quite extensive in their requests for considerations outside the scope of our document model and charter. We will look at your suggestions and see how much we can apply without encouraging a request for more detail again :)

@plinss changed the title from "TAG review request: Verifiable Credentials Data Model 1.0" to "Verifiable Credentials Data Model 1.0" Mar 26, 2019
@travisleithead
Contributor

Hi TAG!
If I may, I'd like to assert that a few browser implementers are quite concerned about the effect that this specification might have in the industry. Speaking just for Microsoft: we have been working in the Distributed Identity space for a while and depend on existing, widely deployed tech (JWT/CWT) that is oddly put at risk in this specification. Our overarching concern is that if this specification becomes a standard, it will be held up by governments and lawmakers as the new requirement for a DID data model, but will be impossible to get interop in practice (because of the out-of-scope bits that @burnburn mentioned in the OP). To actually make this work interoperably, the impact of canonicalization, signing, and transport protocols for the data model must be considered as well.

I just wanted to make you aware of these concerns, and also that we've recently opened a number of tactical issues against this spec that the TAG might want to look at as well -- primarily these focus on places where the spec assumes JSON-LD and doesn't leave room for JWT/CWT. (For example, see w3c/vc-data-model#491)

@hadleybeeman
Member

Hi all, and thanks for bringing this to us. We have a couple of questions here, though we'd like to look into the data model further for next week and get back to you on that:

  1. This is exciting work, and we can see the kind of ecosystem you're trying to build – but we are struggling to understand its relationship to the web. Your charter suggests you'll be creating some vocabularies – presumably those will operate in line with the existing web technologies? What are you expecting those to be?

  2. Similarly, the Verifiable Claims use cases doc (good, clear document, by the way!) implies a number of implementors will be needed to make the use cases work. Can you give us a sense of what products/services you would expect those to be, to illustrate this? It looks like you're not looking at browser vendors -- which is fine, to be clear! -- but it might help us to understand what existing web technologies/architecture you're planning to work with.

  3. It would be helpful for us to see the resolution of those issues that @nadalin (and/with? @travisleithead) have filed.

Thanks again!

@msporny

msporny commented Apr 3, 2019

  1. This is exciting work, and we can see the kind of ecosystem you're trying to build – but we are struggling to understand its relationship to the web. Your charter suggests you'll be creating some vocabularies – presumably those will operate in line with the existing web technologies? What are you expecting those to be?

Yes, the core vocabulary will be the Verifiable Credentials Data Model v1.0 vocabulary, a working version of which can be found here:

https://www.w3.org/2018/credentials/

Note that this is a base-layer vocabulary that is expected to be extended by particular market verticals; those market verticals (specifically education, government, and supply chain) are participating in the VCWG. More on that below...

Did the answer above address your question, @hadleybeeman?

  2. Similarly, the Verifiable Claims use cases doc (good, clear document, by the way!) implies a number of implementors will be needed to make the use cases work. Can you give us a sense of what products/services you would expect those to be, to illustrate this? It looks like you're not looking at browser vendors -- which is fine, to be clear! -- but it might help us to understand what existing web technologies/architecture you're planning to work with.

To provide a concrete example of a real-world deployment... a number of US supply chain companies (importers, manufacturers, and retailers), along with the US Federal Government, are utilizing the technology to modernize old paper-based processes using Verifiable Credentials. See the US Capitol Hill testimony for direct references to support of Verifiable Credentials at W3C (UPS was also a part of the testimony):

https://youtu.be/J7aCUM2RfJA?t=35m25s

An article outlining how Verifiable Credentials are being used can be found here:

https://www.americanshipper.com/news/cbp-planning-blockchain-tests-for-trade-compliance

We also shared who some of these organizations are under W3C Member Confidentiality at last year's W3C TPAC (see slide 9 -- yes, it's for the DID WG, but the reason they want the DID WG is that almost all of them are using Verifiable Credentials and they want to tie them to Decentralized Identifiers):

https://lists.w3.org/Archives/Member/w3c-ac-forum/2018OctDec/att-0007/W3C_DID_WG_Proposal_-_W3C_Member_Confidential.pdf

Another concrete example (that's public, there are many that are wrapped under NDA right now) is the Canadian Province of British Columbia's Verified Organization Network:

https://vonx.io/

Note that education organizations such as Credly and BrightLink are also involved (to issue VCs for learning use cases).

The thread that ties all of these use cases and organizations together is the Web. They use websites to issue, store, and present Verifiable Credentials. Many are defining Web-based/HTTP APIs for the protocols (work continues in the W3C Credentials Community Group on that part, since larger organizations at W3C were successful in constraining the charter to cover only the data model and not protocol or signature mechanisms).

The products that are being created today are: Verifiable Credential issuing platforms (Credly, BrightLink, Veres Issuer, etc.), Verifiable Credential Repositories (Digital Wallets - Attest Wallet, Veres Wallet, Connect.me, Verify.me, etc.), Verifiable Credential Verification libraries/products (Attest Enterprise, Veres Verifier, Connect.me, Verify.me, etc.), and fit-for-purpose blockchains (Hyperledger Fabric, Veres Delta, Corda, etc.) that utilize Verifiable Credentials (off-ledger) to provide provable/auditable statements from governments/industry.

Does that provide enough background, @hadleybeeman?

  3. It would be helpful for us to see the resolution of those issues that @nadalin (and/with? @travisleithead) have filed.

The VCWG is currently working through the large number of issues filed by Microsoft and will be doing so for the next 3-4 weeks.

@hadleybeeman there is much more I could say, but I am limited by time and the GitHub comment format. Happy to have a recorded/scribed conversation to elaborate on any further questions to help with the W3C TAG review. There is a lot going on with respect to Verifiable Credentials deployment and usage around the world.

@hadleybeeman
Member

Hi all. We've revisited this in our face-to-face in Reykjavik. And thanks for your extensive comments. From us:

  1. A lot of what you've described here is out of our purview (such as business take-up and market verticals).

    What is most relevant to us, @msporny, is where you write:

    The thread that ties all of these use cases and organizations together is the Web. They use websites to issue, store, and present Verifiable Credentials. Many are defining Web-based/HTTP APIs for the protocols.

    Would you like us to review those? It doesn't matter that they are in a community group; we look at web technologies from a number of forums.

  2. It's hard for us to look at the data model in isolation. It may be fine, but it may have dependencies on the underlying web technologies or interact with other architectural issues. We'd like to understand the rest of the work before commenting further.

@stonematt

Hi @hadleybeeman and TAG

I'm one of the co-chairs of the VCWG. Here's where our schedule sits. We are closing the CR period and preparing to transition to PR. Implementors are active and implementation reports are coming in with positive coverage. Since we have limited time in our charter, we're very motivated to help bring the TAG up to speed on these related topics.

We'd like to set up an informational meeting at your earliest convenience. I'm sure we can get aligned on the narrow Data Model scope and how it fits in the web ecosystem. We hope there will be future work that extends this concept, but it is outside the purview of our charter.

Is there time on the schedule before July 5 for a sit-down? We'd like to close this issue by mid-July so we can complete our transition to PR.

@nadalin

nadalin commented Jun 27, 2019 via email

@msporny

msporny commented Jun 27, 2019

@nadalin wrote:

It seems that there have been enough normative changes to have another CR

There have been ZERO substantive/normative changes to the specification. In fact, the Working Group has been very careful and methodical NOT to do that. It is not appropriate for you to make these sorts of assertions about a Working Group of which you are not a member.

I defer to the W3C VCWG Chairs to provide an official update to the W3C TAG.

@nadalin

nadalin commented Jun 27, 2019 via email

@msporny

msporny commented Jun 28, 2019

@nadalin wrote:

I would say that that is not true at all, as there have been normative reference changes and changes in normative language. I'm making that statement because I submitted these issues.

As one of the Editors of the specification, and one of the people who either authored or reviewed every single one of the changes you are citing, I can say that your assertions are baseless. The WG has a resolution for every single one of the changes being non-normative and/or non-substantive, and is prepared to defend every single one of them on a transition call if that's what it comes down to.

I suggest we let the VCWG Chairs take it from here, @nadalin, as you are not a member of the VCWG and thus it is inappropriate for you to make these sorts of statements as to the status of the work.

@slightlyoff
Member

Hey all!

The proposal for a potential PR transition came across our transom, and in quickly reviewing the spec I had a few questions:

  • Has any thought been given to a serialisation format that is less error-prone than JWT? For instance, CBOR?
  • On that front, perhaps signing could re-use an existing container format, e.g. SXG: https://developers.google.com/web/updates/2018/11/signed-exchanges
  • Has thought been given to providing a JSON-LD+JWT (or other format) decoding/validation Web API? It seems as though we'll take on quite a lot of polyfill debt (and potential security problems) without ways of pulling the crypto bits out of userland.

Regards

@nadalin

nadalin commented Aug 14, 2019

JWT serialization is pretty straightforward. Compact Serialization is the most common serialization format that I have seen for JWTs: just base64url-encoded segments separated by dots.

What I have not figured out yet is the RDF Dataset Normalization that is required for the JSON-LD Signature specification.
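
For illustration only (not part of the original comment): a minimal Node.js sketch of pulling a compact-serialized JWT apart. The `token` value is a hypothetical placeholder, and the `base64url` Buffer encoding assumes Node.js 16 or later.

```js
// Minimal sketch: split a compact-serialized JWT into its three dot-separated,
// base64url-encoded segments and decode the header and payload.
const token = 'eyJhbGciOiJFZERTQSJ9.eyJoZWxsbyI6IndvcmxkIn0.c2lnbmF0dXJl'; // placeholder

const [encodedHeader, encodedPayload, encodedSignature] = token.split('.');

const header = JSON.parse(Buffer.from(encodedHeader, 'base64url').toString('utf8'));
const payload = JSON.parse(Buffer.from(encodedPayload, 'base64url').toString('utf8'));

// The third segment is the raw signature over `${encodedHeader}.${encodedPayload}`;
// verifying it requires the issuer's key and the algorithm named in the header.
const signature = Buffer.from(encodedSignature, 'base64url');

console.log(header, payload, signature.length);
```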

@msporny

msporny commented Aug 14, 2019

NOTE: This is not an official response from the WG, just one of the Editors of the spec (and a corporate implementer) responding.

Has any thought been given to a serialisation formation that is less error-prone than JWT? For instance, CBOR?

Great question!

Yes, we considered CBOR, and yes, we're aware of JWT's shortcomings :) ...

https://w3c.github.io/vc-imp-guide/#benefits-of-json-ld-and-ld-proofs

Note, though, that JWT is not the serialization format. JSON and JSON-LD are the data model serialization formats. JWTs and LD Proofs are the proof formats (i.e., digital signatures).
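
To make that split concrete, here is a minimal, illustrative sketch of the shape, loosely modeled on the examples in the spec; the field values and the embedded proof are placeholders, not normative.

```js
// Illustrative only: the credential itself is plain JSON/JSON-LD (the data
// model serialization); the `proof` member carries the proof format. With a
// JWT proof, the same claims would instead be carried inside a signed JWT.
const credential = {
  "@context": [
    "https://www.w3.org/2018/credentials/v1",
    "https://www.w3.org/2018/credentials/examples/v1"
  ],
  "type": ["VerifiableCredential", "AlumniCredential"],
  "issuer": "https://example.edu/issuers/565049",
  "issuanceDate": "2019-01-01T00:00:00Z",
  "credentialSubject": {
    "id": "did:example:ebfeb1f712ebc6f1c276e12ec21",
    "alumniOf": "Example University"
  },
  "proof": {                                  // LD Proof (placeholder values)
    "type": "Ed25519Signature2018",
    "created": "2019-01-01T00:00:00Z",
    "verificationMethod": "https://example.edu/issuers/565049#keys-1",
    "proofPurpose": "assertionMethod",
    "jws": "eyJhbGciOiJFZERTQSJ9..placeholder"
  }
};
```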

We would have preferred to use COSE for the digital signature expression format, but the work and library development weren't at the point where the group could rely on them. Even to this day there isn't a decent CBOR implementation in isomorphic JavaScript (they use node.js Buffers everywhere for no reason), and no decent JavaScript COSE implementation exists, even though COSE implementations exist for a variety of other languages (and attempts to compile those to wasm have been deemed too bleeding edge for the companies deploying Verifiable Credentials today).

In the future, it should not be difficult to express the data model in CBOR (we have internal experiments that already do this), and then digitally sign the data using LD Proofs (with a COSE signature... which is where some of us want to go), or CWTs... but for that future to unfold, a good chunk of the developer community is going to have to get the underlying CBOR/COSE library support there.
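
As a rough sketch of what that first step could look like (purely hypothetical on my part: the npm `cbor` package and the tiny credential object below are my own assumptions, not the internal experiments mentioned above):

```js
// Hypothetical sketch: round-tripping a placeholder credential object through
// CBOR using the npm `cbor` package. Signing would still happen over the data
// model (e.g., an LD Proof with a COSE signature), which is not shown here.
const cbor = require('cbor');

const credential = {
  "@context": ["https://www.w3.org/2018/credentials/v1"],
  "type": ["VerifiableCredential"],
  "issuer": "https://example.edu/issuers/565049",
  "credentialSubject": { "id": "did:example:ebfeb1f712ebc6f1c276e12ec21" }
};

const cborBytes = cbor.encode(credential);                       // Buffer of CBOR bytes
const jsonBytes = Buffer.from(JSON.stringify(credential), 'utf8');
console.log(`CBOR: ${cborBytes.length} bytes, JSON: ${jsonBytes.length} bytes`);

const roundTripped = cbor.decodeFirstSync(cborBytes);            // back to a plain object
```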

On that front, perhaps signing could re-use an existing container format, e.g. SXG: https://developers.google.com/web/updates/2018/11/signed-exchanges

We are aware of the Signed HTTP Exchanges work (I'm the editor for https://tools.ietf.org/html/draft-cavage-http-signatures-11).

Since the data model is designed to be separate from the proof format, someone could "sign" a Verifiable Credential using SXG (in theory). No one in the group felt that SXG was ready to be used in a W3C REC-track spec. I just took a quick look again to see if the SXG format was published anywhere and had a hard time finding a spec for it. I also looked at some of the example files in the Chromium source repo in a hex editor and couldn't make heads or tails of the format.

Do you have a link to the SXG file format? It would help us determine how easy or difficult it would be to express a Verifiable Credential using an SXG as the proof format.

Has thought been given to providing a JSON-LD+JWT (or other format) decoding/validation Web API? It seems as though we'll take on quite a lot of polyfill debt (and potential security problems) without ways of pulling the crypto bits out of userland.

Yes, thought has been put into that but browser vendors seemed hostile to the idea of anything JSON-LD related going into the browser (this was 4+ years ago). Things may have changed now that Google is including jsonld.js in the browser as a part of their Lighthouse tooling. If the browser vendors want to explore putting JSON-LD + LD Proofs (or JWTs) into the browser, we'd be happy to discuss.... but

Almost all of the digital signature and verification for Verifiable Credentials happens on the server or in an app, not in the browser. We do have isomorphic JavaScript libraries that can use the WebCrypto APIs if they're available, but there has been little desire to work on that as most (maybe all?) of the production implementations sign things server side. The WebCrypto APIs also don't expose the digital signature schemes that are most heavily used for Verifiable Credentials (Ed25519 and secp256k1).
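
As a point of reference, here is a minimal sketch of the kind of server-side signing being described, using Node's built-in crypto module (Ed25519 key generation and one-shot sign/verify). The payload is a placeholder, and this is not how the production libraries mentioned above are implemented.

```js
const crypto = require('crypto');

// Hypothetical sketch: sign and verify a payload server side with Ed25519.
const { publicKey, privateKey } = crypto.generateKeyPairSync('ed25519');

const payload = Buffer.from(JSON.stringify({ example: 'credential payload' }), 'utf8');

// For Ed25519 keys, the digest algorithm argument to sign/verify is null.
const signature = crypto.sign(null, payload, privateKey);
const valid = crypto.verify(null, payload, publicKey, signature);

console.log('signature valid?', valid);
```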

Does that answer all of your questions, @slightlyoff?

@hadleybeeman
Member

Hi all. We are returning to this at our F2F in Tokyo. We'd like to resolve it ASAP, so it would be good to sort these questions:

  1. @burnburn, have you managed to resolve @lknik's concerns?

  2. What part of this ecosystem is outside of your charter and specified elsewhere? Could we please see those specs/documents, to understand how all the pieces fit together? Specifically, we are concerned by this part of @travisleithead's comment:

Our overarching concern is that if this specification becomes a standard, it will be held up by governments and lawmakers as the new requirement for a DID data model, but will be impossible to get interop in practice (because of the out-of-scope bits that @burnburn mentioned in the OP)

What important parts of this are currently out of scope?

We are still unclear how this spec fits into the ecosystem you imagine, and how that works with the web architecture. If you can help us understand this, we can close it in the next day or two. If not, we're not sure we can add much value here.

@burnburn
Author

Apologies. I thought I had sent a message months ago (either as email or here) making clear that we are not specifically looking for any particular feedback from TAG. We had offered to schedule a call with TAG members at any point to cover the ecosystem, etc., but did not receive a response to that offer.
In the meantime, the Working Group continued its work, published a second CR and a PR, and had its final teleconference two days ago. The Working Group's charter ends at the end of this month, although it could possibly be extended purely for maintenance work on the Recommendation when it comes out.

Hi all. We are returning to this at our F2F in Tokyo. We'd like to resolve it ASAP, so it would be good to sort these questions:

  1. @burnburn, have you managed to resolve @lknik's concerns?

The working group continued to affirm its interest in providing as much in the way of non-normative recommendations as it could. No changes to that section have been made. If absolutely necessary, changes such as those suggested by @lknik could be added post-Recommendation in the Credentials Community Group, which is the group the WG explicitly named to handle maintenance.

  2. What part of this ecosystem is outside of your charter and specified elsewhere? Could we please see those specs/documents, to understand how all the pieces fit together? Specifically, we are concerned by this part of @travisleithead's comment:

Our overarching concern is that if this specification becomes a standard, it will be held up by governments and lawmakers as the new requirement for a DID data model, but will be impossible to get interop in practice (because of the out-of-scope bits that @burnburn mentioned in the OP)

What important parts of this are currently out of scope?

As I mentioned above, we offered to meet with the group and cover your questions live, as the quantity of work outside the charter but relevant here is quite substantial, more than can be addressed by a few document links. Most relevant to the question by @travisleithead, however, is that production, transport, storage, use, etc. of the data model document was specifically chosen to be out of scope at the explicit request of several companies, including, I believe, Microsoft. The charter was thus approved with the narrow scope we have been operating under for the (now complete) lifetime of the group. Feel free to ask the W3C team why the objectors to a broader charter did not wish a more complete set of work to be done at W3C; since some of the objections were private, I am not in a position to speak to that myself. Within the limitations of our charter, the working group members are satisfied as to the interoperability of the data model.

We are still unclear how this spec fits into the ecosystem you imagine, and how that works with the web architecture. If you can help us understand this, we can close it in the next day or two. If not, we're not sure we can add much value here.

We do not seek particular feedback from the TAG at this point. Many of us, myself included, will be at TPAC next week and would be happy to help you or others understand the broader (and growing) ecosystem in which this now-completed work plays a part.

I will be co-chairing the new Decentralized Identifier Working Group on Monday and Tuesday. Please don't hesitate to come find me or @msporny.

@hadleybeeman
Member

Thanks, @burnburn. We appreciate your quick reply.

We had this issue open because you requested a TAG review. Apologies if we missed a subsequent "Thanks but never mind" comment!

We will close this now.

I'd also like to add, because I'm concerned we may have generated some confusion here: the TAG's remit isn't confined to work within W3C working groups. Our charter says "The TAG will coordinate its work with other groups within and outside of W3C whose technologies have an impact on Web architecture." In practice, that means that we offer advice on work from community groups, TC39, the IETF, WHATWG, and others -- as well as work from W3C working groups.

Our interest in your complete ecosystem is where we may be able to add value to your work, given that it sounds like that is where verifiable credentials may have architectural implications for the web. This is why we've been asking for information beyond the scope of your W3C working group's charter.

Also, one of our tenets for the web is that all of the documents and specs that form the web technologies are fully open and visible -- which is why we haven't taken up your kind offer for a call. We are keen to see this fully documented in the open, ideally in a standards venue in due course -- though initially, we'd be happy to see it in a GitHub repo or otherwise published on the web.

Hope that helps, and we do wish you luck with this important work.

@hadleybeeman added the "Resolution: unsatisfied" label (The TAG does not feel the design meets required quality standards) and removed the "Progress: in progress" label Sep 16, 2019