3rd Annual PKI R&D Workshop Summary
Ben Chinowsky, Internet2

The workshop announcement listed the goals of this gathering as:

  1. Explore the current state of public key technology in different domains, including web services, grid technologies, and authentication systems, in academia and research, government, and industry.
  2. Share and discuss lessons learned and scenarios from vendors and practitioners on current deployments.
  3. Provide a forum for leading security researchers to explore the issues relevant to the PKI space in areas of security management, identity, trust, policy, authentication and authorization.

This summary groups workshop sessions according to which of these goals was their primary concern, although many sessions addressed more than one.

Surveying Deployments

Dr. Susan Zevin, Director of the Information Technology Laboratory at NIST, opened the meeting by noting some Federal PKI highlights. The Federal Bridge Certification Authority now connects six Federal agencies. The Department of Defense now requires contractors to obtain PKI credentials for email and authentication to DoD web sites. Several countries and international associations, including Canada, Australia, and NATO, are negotiating to connect to the Federal PKI. NIST is a global leader in smartcards and biometrics and their integration with PKI.

A panel of Peter Alterman, Deb Blanchard, Russ Weiser, and Scott Rea discussed the NIH-EDUCAUSE PKI Interoperability Project: Phase Three. The project has been largely driven by the Government Paperwork Elimination Act; if virtual paperwork is not to become just as much of a hassle as the physical paperwork it replaces, reducing the number of certificates each person needs ("reusability") is essential. While the project is still at the technology-demonstration stage (a production implementation has additional, expensive datacenter requirements), various agencies, including GSA and HHS, are adopting elements of it for production use. Such uptake in the federal environment is what this seed project is all about. The panelists' project report describes progress to date in great detail.

The use of Document Signatures in land-ownership transactions in Canada was also the subject of a panel discussion. Attorney and former crypto engineer Randy Sabett compared physical and digital signatures, exploring in particular the tendency for digital signatures to be held to higher standards than physical ones. John Landwehr, from Adobe, described how Acrobat supports signatures from both the author and the user of a form, to guard against spoofing and data modification respectively; there has been strong customer demand for this.

The centerpiece of this panel was Ron Usher's description of applying the tools described by Landwehr to a real-world situation that raised many of the legal issues described by Sabett: moving the documentation of Canadian land-ownership transactions to an electronic format. Forgery of paper documents has been a big problem in the Canadian land-tenure system; this and the need for greater efficiency were the principal drivers of the move to secure electronic documentation. Usher described his philosophy as PKE, with the E standing for Enough: "usually what we really need is public-key cryptography," with infrastructure to be added only as needed. Usher's company, Juricert, was launched by the Law Society of British Columbia to implement this approach. Lawyers, not landowners, sign the documents in the Canadian system, so only they need certificates. On the other hand, Usher observed that lawyers tend to be very conservative about process. One key to user acceptance of the switch to electronic transactions is to make the electronic form look as much like the paper version as possible; this was a main reason for choosing Acrobat (though a TIFF image is the permanent legal record). The new system provides an "astounding improvement" in transaction time. The government had been re-keying information already keyed and printed by lawyers; the new system eliminates that duplicated keystroking, a big win for the cash-strapped government. The benefits have prevailed over the lawyers' conservatism: the new system has handled $400 million (Canadian) in offers and ownership transfers in the few weeks it has been in operation.

Rich Guida offered an overview of Johnson & Johnson's Use of Public Key Technology. The J&J PKI is enterprise-directory-centric: a certificate subscriber must be in the enterprise directory (an internally LDAP-accessible AD forest). Guida stressed the importance of properly training helpdesk personnel and of providing enough helpdesk resources. J&J produced a one-page document distilling what users need to know to use the PKI (what tokens are for, where you use them, what to do when asked for a passphrase, and so on) and found that users often wouldn't read even this, but would instead call the helpdesk with even the most basic questions. On the other hand, J&J was able to do most configuration and credential preparation independently of the users. Guida also noted that while it has taken significant effort to get users to treat their USB tokens as a must-have item like their car keys or J&J badge, "the beauty of using the token is starting to catch on." Users particularly appreciate having a single passphrase that doesn't have to be complex or be changed every 90 days. USB tokens were chosen over smartcards only because of the ubiquity of USB ports; Guida expects a move to multifunction smartcards (used for building access as well, for example) over time. Standardization on 2048-bit keys will help drive the transition.

David Chadwick related Experiences of establishing trust in a distributed system operated by mutually distrusting parties. The mutually distrusting parties in question are national governments involved in a worldwide effort to monitor production of environmental contaminants capable of doing harm across international borders. Currently about 100 of 300 monitoring sites are sending signed messages to a data collection center. Every message must be signed by a threshold number of the mutually distrusting parties; this m-out-of-n principle is used wherever possible. Chadwick noted that human factors have been a major focus in both deployment and operation.
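
Getting an m-out-of-n rule right in code takes a little care: each party must be counted at most once, and signatures from unknown parties must be ignored. The following is a minimal sketch of such an acceptance check, assuming Python and a stand-in verify_sig() helper (the real system's signature mechanics are not described in the talk summary):

    import hashlib
    import hmac

    def verify_sig(key: bytes, message: bytes, sig: bytes) -> bool:
        # Stand-in for real public-key signature verification, using an
        # HMAC tag purely so this sketch runs end to end.
        expected = hmac.new(key, message, hashlib.sha256).digest()
        return hmac.compare_digest(expected, sig)

    def accept_message(message, signatures, trusted_keys, m):
        # Accept only if at least m distinct trusted parties signed.
        signers = set()
        for party_id, sig in signatures:
            key = trusted_keys.get(party_id)
            if key is None or party_id in signers:
                continue  # unknown party, or already counted once
            if verify_sig(key, message, sig):
                signers.add(party_id)
        return len(signers) >= m

The same check applies wherever the m-out-of-n principle is used, for example requiring several mutually distrusting administrators to approve a configuration change.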

There were also two presentations relating experiences using PKI for the specific tasks of delegation and archiving.

Von Welch reviewed the use of X.509 Proxy Certificates for Dynamic Delegation. Proxy certificates were first prototyped in 1998 and were standardized in PKIX earlier this year; an RFC is imminent. Proxy certificates are part of the Globus toolkit and are now widely used in scientific testbeds in many countries. There are three authorization models: identity-based authorization (i.e., impersonation), restricted delegation of rights, and attribute assertions without delegation; most implementation experience has been with the first of these. The users seem pleased; their main complaint is that certificates exist as files on the local machine.
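
In the identity-based (impersonation) model, delegation amounts to an entity signing a short-lived certificate for a fresh key pair; a relying party then treats the whole chain as acting with the end entity's identity. The sketch below shows that chain walk in outline, assuming simplified certificate records; it omits signature verification and is not the Globus implementation:

    from datetime import datetime, timezone

    class Cert:
        # Toy certificate record; real proxy certificates are X.509
        # certificates with a ProxyCertInfo extension (RFC 3820).
        def __init__(self, subject, issuer, not_after, is_proxy):
            self.subject = subject
            self.issuer = issuer
            self.not_after = not_after
            self.is_proxy = is_proxy

    def effective_identity(chain):
        # chain[0] is the end-entity cert; later entries are proxies,
        # each issued (signed) by the one before it. A real validator
        # also verifies each signature against the issuer's key.
        now = datetime.now(timezone.utc)
        for prev, cert in zip(chain, chain[1:]):
            if not cert.is_proxy:
                raise ValueError("only proxy certs may extend the chain")
            if cert.issuer != prev.subject:
                raise ValueError("broken issuance chain")
            if cert.not_after < now:
                raise ValueError("expired proxy")
            if not cert.subject.startswith(prev.subject):
                raise ValueError("proxy subject must extend issuer name")
        return chain[0].subject  # the whole chain acts as this identity

Because proxies are short-lived and signed by the user's own key, delegation requires no CA involvement, which is what makes it dynamic.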

In the Trusted Archiving session, Santosh Chokhani and Carl Wallace described a proof-of-concept trusted archive that they built for the US Marine Corps. The approach taken was refreshed timestamps, with RFC 3161 rules used to verify that the correct data was stored. Chokhani called the group's attention to LTANS, an IETF working group formed for trusted archive standards.
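
The core of the refreshed-timestamp approach is that each new timestamp covers both the archived object and all earlier timestamps, so the evidence survives the weakening of older algorithms and keys. A minimal sketch of that data flow, with request_timestamp() standing in for an RFC 3161 request/response exchange with a time-stamping authority (the function and record layout here are assumptions):

    import hashlib

    def request_timestamp(digest: bytes) -> bytes:
        # Stand-in for an RFC 3161 round trip: a real client sends the
        # digest to a TSA and archives the signed token it gets back.
        return b"TST(" + digest + b")"

    def initial_archive(document: bytes) -> list:
        token = request_timestamp(hashlib.sha256(document).digest())
        return [token]  # chain of timestamp tokens, oldest first

    def refresh(document: bytes, chain: list) -> list:
        # The new timestamp covers the document plus every prior token.
        h = hashlib.sha256(document)
        for token in chain:
            h.update(token)
        return chain + [request_timestamp(h.digest())]

Verification replays the chain: check the oldest token against the document, then each newer token against the document plus its predecessors, applying RFC 3161 rules at each step.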

Drawing Lessons

Two sessions were devoted primarily to this goal.

Peter Gutmann keynoted on How to build a PKI that works. After presenting an entertaining catalogue of PKI disasters, Gutmann offered a list of six "Grand Challenges" for PKI, along with proposed approaches to meeting those challenges.

  1. Challenge: key lookup. Response: "the Web is the Public File." In its simplest form, this would mean putting a certificate on your home page and letting people find it with Google; while he's not promoting this, Gutmann noted it would still be better than anything currently available. His more serious proposal is "http glue + anything you want"; pretty much any database now supports the Web, many with surprisingly little effort. See http://www.ietf.cnri.reston.va.us/internet-drafts/draft-ietf-pkix-certstore-http-07.txt.
  2. Challenge: enrollment. Response: make it transparent. Gutmann quoted Bo Leuf: "the vast majority of users detest anything they must configure and tweak." The norm when trying to get a certificate issued is to be subjected to pages and pages of hassle; there is a persistent myth that this is inherent in the process of certificate issuance. By contrast Gutmann cited the ISP model: you call the ISP with a credit card, they give you a username and password, you use them, DHCP does the rest. We need to remember that our PKI-enabled applications only have to be as secure as the best non-PKI alternative. More on this "plug-and-play" approach to PKI is in http://www.cs.auckland.ac.nz/~pgut001/pubs/usenix03.pdf.
  3. Challenge: validity checking. Response: Gutmann outlined an approach based on querying hashes submitted by users; this puts the work on the client.
  4. Challenge: user identification. Response: Distinguished Names "provide the illusion of order" but create chaos. Gutmann used a variety of examples of this to argue for treating Distinguished Names as meaningless bit strings and using binary comparison for name matching (see the sketch after this list).
  5. Challenge: no quality control. "Some of the stuff out there is truly shocking." Again Gutmann provided a rich variety of examples. Response: Create "brands" and test procedures to become brand-certified (e.g., S/MIME testing under RSADSI); against these brands, test the basics only (lookup, verification and basicConstraints/keyUsage enforcement); make sure that users know that while software certified to the brand will work, software not so certified could do anything.
  6. Challenge: implementer/user apathy. E.g., never updating CRLs, but checking against them anyway in order to formally meet requirements. Response: "Make the right way the only way to do it."
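
Binary comparison means matching the encoded names byte for byte, with no case folding, whitespace normalization, or X.500 matching rules. A minimal sketch, with toy string-encoded names assumed for illustration (not Gutmann's code):

    def names_match(dn_a: bytes, dn_b: bytes) -> bool:
        # Treat encoded DNs as opaque bit strings: equal bytes, same
        # name; anything else, different name.
        return dn_a == dn_b

    # Under X.500 matching rules these could count as "the same" name;
    # under binary compare they are simply different.
    dn1 = b"CN=Example CA,O=Example  Corp,C=US"
    dn2 = b"CN=example ca,O=Example Corp,C=US"
    assert not names_match(dn1, dn2)
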
Gutmann's slides for the workshop (124 of them) develop his proposed approach in detail; he also provides crypto libraries to support it (see http://www.cs.auckland.ac.nz/~pgut001/cryptlib/).

The other big "lessons learned" compilation was Carlisle Adams' presentation on PKI: Ten Years Later. Adams dates PKI from 1993 and the beginnings of X.509 dissemination. Three big lessons from those ten years are:

There were three particularly interesting exchanges in the Q&A portion of Adams' session:

Identifying Tasks

The bulk of the sessions at PKI04 were devoted to identifying and prioritizing tasks needed to move PKI forward. The two main themes that emerged from the previously described sessions — 1) human factors and 2) letting practical needs drive technology choices rather than vice versa — were dominant here as well.

Six sessions addressed directions for specific technical areas.

In the Controlled and Dynamic Delegation of Rights panel, participants put forward various tools for addressing this problem. Moderator Frank Siebenlist presented on Grid needs for delegation of rights; he believes that industry will face similar requirements in two or three years. Carl Ellison argued that when rights are delegated, it is vital that the act of delegation be performed by the authority over the rights in question, rather than by whichever party happens to control some authorization database. More generally, Ellison stressed that the user is part of the security protocol; his work on procedures designed to take proper account of this fact ("ceremonies") is documented in http://www.upnp.org/download/standardizeddcps/UPnPSecurityCeremonies_1_0secure.pdf. Ravi Pandya presented XrML 2.x as a general-purpose policy language, not the narrow DRM language it's often seen as (XrML 1.2 was much more limited). Kent Seamons presented TrustBuilder (http://isrl.cs.byu.edu/projects.html), an architecture for automated trust negotiation based on gradual disclosure of credentials. Seamons noted that this is a growing field; William Winsborough is another key person working in this area.

In the discussion, Steve Hanna asked why there had been no presentation on XACML; Frank Siebenlist, who's on the XACML TC, noted that XACML has no delegation capability, though there are plans to add one. Carl Ellison related his experiences with SPKI/SDSI to his current involvement with XrML: lack of industrial-strength toolkits and of marketing are the main reasons SPKI hasn't seen deployment, and this in turn is because SPKI's lack of CAs precludes anyone from making money from it. XrML, by contrast, has all the power of SPKI/SDSI and more, and it's backed by Microsoft. Pandya added that the core of XrML is essentially final, and that toolkits are in the works; Microsoft is committed to getting organizations like Globus to take XrML up and use it to its full potential. Information on the IPR status of XrML is at http://www.xrml.org/faq.asp.

Ken Stillson of Mitretek presented a "Dynamic Bridge" Concept Paper. Stillson observed that the path-discovery process scales poorly and is brittle: path discovery has no sense of direction, and taking a wrong turn can lead to a wild goose chase. "Namespaces aren't organized in a way that facilitates a routing protocol." The Dynamic Bridge provides a means of consolidating paths so that you no longer have to guess at intermediate nodes. There is substantial overlap between these ideas and work on shortening certificate chains done by Radia Perlman at Sun. Mahantesh Halappanavar noted that he and his co-authors have also published work along similar lines. Mitretek owns the patents on the Dynamic Bridge concept, but has no intention of asserting patent protection. They are looking to start a discussion on possibilities for implementation; contact stillson@mitretek.org if you are interested.

Stillson's talk was followed by a panel discussion on Approaches to Certificate Path Discovery. Peter Hesse reviewed the basic PKI structures that path discovery must deal with, describing them as all meshes, just of different shapes. Path building has not yet been addressed by IETF standards, but an informational Internet-Draft (I-D) is in the works. Steve Hanna explored analogies for path building. Is it graph theory? Only if you download all the certificates in the world. Is it navigation? Sort of. Really it's like building a deck: going out and getting things, then repeatedly running back for things you forgot, is most of the work. So, work with what you've got, keep scraps, collect tools ahead of time, and work carefully. The perennial theological question of which direction to build paths in should be answered accordingly: "it depends." Meeting in the middle is also an option, particularly appropriate for bridged topologies. Hanna suggested that more research is needed: test different path-discovery modules against different topologies, and try to find the best algorithms for particular sets of circumstances. This would make a great master's thesis and could generate dozens of papers. Matt Cooper summarized the approaches he's taken in writing three path-building modules, and shared test results quantifying the usefulness of various simplifications such as pruning and disallowing loops through CAs. He also stressed the importance of doing as much validation as you can in the course of doing discovery. Ken Stillson stressed that in addition to the tasks of path discovery and path validation there is also the task of object location: as there is no global directory, even if you know the Distinguished Name (DN) you don't necessarily know how to get the certificate, so you end up having to implement a number of different access methods.
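
Several of the heuristics the panelists mentioned (searching one issuer at a time, pruning loops through CAs, validating what you can during discovery, and bounding the search time) compose naturally. The sketch below is a composite illustration of such a builder, not any panelist's module; the certificate objects and store-access callable are assumptions:

    import time

    def build_path(target, trust_anchors, certs_issued_to, timeout=5.0):
        # Depth-first search from the target certificate back toward a
        # trust anchor, extending the chain issuer by issuer.
        #   target          -- the end-entity certificate
        #   trust_anchors   -- set of issuer names trusted directly
        #   certs_issued_to -- callable: subject name -> candidate CA
        #                      certs (in practice this may mean LDAP or
        #                      HTTP retrieval, i.e. object location)
        # Returns a list of certs from target to anchor, or None.
        deadline = time.monotonic() + timeout

        def plausible(cert):
            # Stand-in for in-line checks (expiry, basicConstraints,
            # keyUsage); validating during discovery prunes bad
            # branches early.
            return True

        def dfs(cert, visited):
            if time.monotonic() > deadline:
                return None  # users won't wait long; give up cleanly
            if cert.issuer in trust_anchors:
                return [cert]
            if cert.issuer in visited:
                return None  # disallow loops through CAs
            for ca_cert in certs_issued_to(cert.issuer):
                if not plausible(ca_cert):
                    continue
                rest = dfs(ca_cert, visited | {cert.issuer})
                if rest is not None:
                    return [cert] + rest
            return None

        return dfs(target, frozenset())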

Hesse then moderated a discussion:

What is the goal when discovering paths? The consensus here was that (as Hanna put it) "any path is a good path." Cooper observed that the first path found is likely to be the intended path even if it isn't valid, so that path should be reported to the user. It's also important to be able to specify a timeout: users want email, for example, to take only a few seconds, and a search that takes more than five minutes is very unlikely to succeed.

Is path discovery best done on the client or on the server? There appears to be a consensus that the answer here is the same as the answer to the forward vs. backward issue — "it depends" — though Stillson pointed out that audit requirements may dictate doing path discovery on the server.

What are your recommendations for PKI architects?

Who has the obligation to do path discovery? The only consensus on this appears to be that it is an important unresolved question. Stillson noted a related question: Who's liable if a valid path tells me to do something I shouldn't?

What can be learned from PGP? Hesse observed that PGP doesn't really have a discovery mechanism; users need to know somebody they trust, then build a local copy of the part of the PKI they care about. On the other hand, Stillson cited PGP's trust scoring system as relevant. Neal McBurnett pointed the group to statistics on the PGP web of trust and links to path-building services at http://bcn.boulder.co.us/~neal/pgpstat/.

Steve Hanna wrapped up the path-discovery session by asking all with sample PKI topologies to send them to him (shanna@funk.com) for testing. Anyone interested in further research on path discovery and validation should also contact him.

Nicholas Goffee presented Greenpass: Decentralized, PKI-based Authorization for Wireless LANs. This project is driven by guests wanting access to Dartmouth's wireless network. Greenpass uses a SPKI/SDSI authorization certificate to bind authorizations to a public key; the delegation process makes use of a "visual fingerprint" assigned to a guest and verified by the delegator before signing the certificate. The certificate chain gets stored as a cookie on the guest's machine so the guest can reauthorize without repeating the introduction process. A pilot deployment is in the works.

Xunhua Wang presented a method for Role Sharing in Password-Enabled PKI. Roles are preferred to individuals as the subject of security because they are more permanent and because security policies are concerned with roles, not individuals. The principal advantage of the proposed approach is that it is lightweight: users need only passwords, not smartcards or stored segments of the private key.
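
One well-known way to make a password the user's only credential is to split the role's private exponent between the user and a server at enrollment, so that neither can sign alone; the user's share is recomputed from the password at signing time. The toy sketch below (textbook RSA, tiny primes, no padding) illustrates that generic split-key construction; it is not necessarily the exact scheme Wang presented:

    import hashlib

    # Toy textbook-RSA parameters; for illustration only.
    p, q = 61, 53
    n = p * q            # 3233
    e, d = 17, 2753      # e * d = 1 (mod phi(n))

    # At enrollment: derive the user's share from the password alone;
    # the server keeps the complement, and the full d is discarded.
    password = b"correct horse"
    d_user = int.from_bytes(hashlib.sha256(password).digest(), "big") % d
    d_server = d - d_user

    # To sign, each side exponentiates with its own share only.
    msg = b"approve purchase order 42"
    h = int.from_bytes(hashlib.sha256(msg).digest(), "big") % n
    partial_user = pow(h, d_user, n)      # user side, from password
    partial_server = pow(h, d_server, n)  # role server side
    signature = (partial_user * partial_server) % n

    # Anyone can verify against the role's ordinary public key (n, e),
    # and the result equals a single-key signature.
    assert pow(signature, e, n) == h
    assert signature == pow(h, d, n)

Because the exponent shares add up to d, h^d_user * h^d_server = h^d (mod n), so relying parties see a perfectly ordinary signature from the role's key.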

Hiroaki Kikuchi outlined a Private Revocation Test using Oblivious Membership Evaluation Protocol. In the course of certificate status checking, OCSP servers learn the relationship between the certificate holder and certificate checker. There is a privacy issue here; the proposal outlines an "oblivious membership test" to address this.
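
Kikuchi's protocol achieves this obliviousness cryptographically. A much weaker cousin, hash-prefix bucketing, conveys the goal in a few lines: the checker reveals only a short prefix of the certificate's hash, so its query hides among all certificates sharing that prefix. The sketch below illustrates only that bucketing idea (all names here are assumptions), not the oblivious membership evaluation protocol itself:

    import hashlib

    PREFIX_LEN = 2  # bytes of hash revealed; shorter = more private

    def query_prefix(cert_serial: bytes) -> bytes:
        # The checker reveals only a short prefix of the cert's hash.
        return hashlib.sha256(cert_serial).digest()[:PREFIX_LEN]

    def bucket_response(revoked_serials, prefix: bytes):
        # The responder returns every revoked-cert hash sharing that
        # prefix; the checker's real target hides within the bucket.
        return [h for h in
                (hashlib.sha256(s).digest() for s in revoked_serials)
                if h[:PREFIX_LEN] == prefix]

    def is_revoked(cert_serial: bytes, response) -> bool:
        return hashlib.sha256(cert_serial).digest() in response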

Another six sessions were specifically devoted to identifying key issues and next steps for PKI as a whole.

Stefan Brands outlined a comprehensive heterodox approach to making use of public-key cryptography: Non-Intrusive Cross-Domain Identity Management. In Brands' view, the Achilles heel of X.509 is its fundamental incompatibility with privacy: public keys are "strongly authenticated 'super-SSNs'". Brands pointed out the shortcomings of various proposed solutions to the privacy problem within the X.509 framework: pseudonyms and roles, attribute certificates, per-domain CAs and certificates, and federated identity management. Instead, "new authN primitives" are required. Brands' alternative, called Digital Credentials, is based on twenty years of research by dozens of academics, starting with David Chaum's work in the 1980s. The features of Digital Credentials include "sliders" for privacy and security, selective disclosure/hiding of attributes, unlinkability, and a variety of choices along the identity-pseudonymity-anonymity spectrum. Digital Credentials are patent-protected, but Brands stressed that this is only so that he can secure the investments needed to drive real-world deployments. Brands is willing to make the technology available where doing so does not conflict with this goal; contact him if you have ideas for collaboration. Brands' ideas are developed at length in his book, Rethinking Public Key Infrastructures: Building in Privacy.

John Linn of RSA offered An Examination of Asserted PKI Issues and Proposed Alternatives. Linn's proposed alternatives are more along the lines of new ways of using X.509: Identity-Based Encryption and related approaches; online trusted third parties; distributed computation; alternative approaches to validation (hash trees in particular); key servers; and privacy protection via pseudonyms and attribute certs. Linn also noted that "you can't have full success until you've had partial success first," and that choices such as hierarchical vs. nonhierarchical PKIs — once matters of ideological controversy — are now matters of pragmatic adaptation to circumstances.

In a panel discussion on the question Which PKI Approach for Which Application Domain?, Peter Alterman, Carl Ellison, and Russ Weiser explored some of the specifics of this latter point. The theme of PKI not being a one-size-fits-all technology, but rather one that needs to be custom-tailored to a huge variety of real-world situations, has become steadily more prominent over the last couple of years, and the contrast between this session and the "Dueling Theologies" session at PKI02 (http://www.cs.dartmouth.edu/~pki02/theologies.shtml) illustrates this nicely. Ellison stated his continuing belief in the importance of local naming, not so much to avoid name collisions, which can be addressed by domain component (dc) naming, but to provide a means of "the relying party knowing who this is." The relying party needs to be able to use a name it assigns, a name it can remember, for a trusted entity. Ellison claimed that SPKI/SDSI and XrML can do everything needed here; X.509 might work if the environment is constrained accordingly. Rich Guida (the other dueling theologian from PKI02) observed that there's increasing recognition that if you want to join a namespace, you have to choose between changing your naming and not joining; conflicts should be resolved at join-time. The problem is that you still have to have a way of knowing who others really are and what they call themselves; relying on entities to attest to the identity of others is inescapable.

Guida suggested that doctors, for instance, would never bother to assign a local name to every patient with whom they needed to exchange information securely. This led into a discussion of PKI in medical scenarios more generally. Peter Gutmann observed that doctors confronted with PKI usually just sign in at the start of the day and let everyone else use the machine. Doctors rightly don't want anything in the way of their work; you have to design around the fact that they see any kind of security as an impediment. PDAs that transmit certificates to the network, and short-range RFIDs, were suggested as approaches to security in emergency rooms and similar settings. Guida suggested that PKI will be used much more in the context of medical research and clinical trials, where there isn't the "get the certificate vs. restart the patient's heart" problem, but where there is a strong need to ensure data authenticity, integrity, and confidentiality. Another possible application is determining whether a patient suspected of seeking drugs of abuse has visited other clinics. Ellison pointed out that this use case requires an aggregator but, contrary to common perception, doesn't require X.509 or any other particular variety of PKI. No global name for the patient is needed; what matters is that the aggregator have one, and only one, key per patient.

Ken Klingenstein keynoted on A New and Improved Unified Field Theory of Trust. Klingenstein identified three spheres in which individuals require trust: personal, work, and transactions where extremely high assurance is needed (often transactions involving the government). For each of these there is a type of trust relationship which is usually appropriate: peer-to-peer, federations, and hierarchical PKI, respectively. Virtual organizations cut across these boundaries and thereby represent an additional challenge. Klingenstein described P2P trust as "a bedrock of human existence"; expressing it in electronic form is therefore necessary. It's also hard, although PGP, webs of trust, and X.509 proxy certificates have made some progress. Federations are getting deployed; Merck has a large and noteworthy deployment. Klingenstein noted that the federation structure for InCommon will be per-nation, as attitudes and practices for security are nation- and culture-specific. InCommon is hoping to set up a virtuous circle between the use of trust and the strengthening of trust. Klingenstein also offered an overview of recent developments and ongoing projects such as Stanford's Signet, Penn State's LionShare, and Internet2's own Shibboleth, setting them in the context of his unified field theory, and noted four looming issues he expects to be prominent in his talk next year: inter-federation issues, virtual organizations over P2P trust, federated and progressive (growing trust levels) PKI, and middleware diagnostics.

Jean Pawluk, representing the OASIS PKI Technical Committee and PKI Action Plan coauthor Steve Hanna, presented on Identifying and Overcoming Obstacles to PKI Deployment and Usage. While the Technical Committee's research identified numerous obstacles to PKI deployment, the top four (Software Applications Don't Support It; Costs Too High; PKI Poorly Understood; Too Much Focus on Technology, Not Enough On Need) accounted for half the total points survey respondents assigned to indicate relative importance. The PKI Action Plan's five action items are:

As in other sessions, prominent themes of the discussion were that technology is a much smaller part of the problem than understanding the business needs of PKI implementers and selecting tools accordingly, and that when this is done, PKI can thrive. Bill Burr observed that the math in PKI is so cool that we try to bring everything up to its standard; instead we need to figure out how people can use PKI without understanding any of the esoteric details. Rich Guida noted that he sometimes feels like he and all the people who talk about the death of PKI dwell on "different planets"; in the pharmaceutical sector in particular, the use of PKI is "blossoming." Pawluk encouraged the group to get involved in the work of implementing the PKI Action Plan, and noted that the OASIS PKI Technical Committee that's driving it (http://www.oasis-open.org/committees/tc_home.php?wg_abbrev=PKI) usually meets via telephone.

This session was followed by a panel discussion focused on the theme: The PKI Action Plan: Will it make a difference? The consensus appears to be "yes, if...", with the ifs being a little different for each presenter. Sean Smith put forward his own top-three list of PKI commandments: 3) follow real-world trust flows, 2) pay proper attention to human factors, and 1) keep the goals of using PKI in mind. John Linn observed that a key question is whether deployment barriers are in PKI itself or in understanding what it can do. Most documentation is little-used and needs to be radically simplified. Linn also stressed the importance of building in reusability across applications. Lieutenant Commander Thomas Winnenberg, chief engineer for the DoD PKI, observed that the DoD PKI has been groundbreaking in that there was no ROI concern, allowing the project to be driven by an understanding of the need for vendor-neutrality and versatility in security functions. Their big problems have been around certificate issuance, but the DoD PKI now has about four million certificates in circulation. Winnenberg stressed that the focus has to be on infrastructure — relying parties are looking to PKI for a wide variety of needs, so implementations must abstract from applications. This makes "Ask Application Vendors What They Need" a key element of the PKI Action Plan. Tim Polk stressed the importance of an iterative process of application and revision of the Action Plan. Coordination will be key (in particular liaison work with groups that don't join OASIS), as will expansion of the action items into specific, concrete tasks.

Panelist Steve Hanna asked the group for further thoughts on coordination mechanisms. Tim Polk suggested making sure that IETF meeting minutes make it to the right groups and, more generally, pushing the minutes of the various groups involved out to each other, rather than relying on everyone to check up on everyone else's work. Hanna suggested that liaisons also be set up between OASIS and similar groups elsewhere in the world. Hanna also asked for thoughts on how to achieve the universally cited goal of keeping deployment efforts focused on needs rather than technology, and therefore as simple ("brass-plated," as Polk put it) as possible. Focusing on business needs and ROI, reusability of infrastructure across applications, and applications that make it hard for the user to do the wrong thing were all suggested here. Russ Weiser noted that applications often amount to something like "I have to sign something once a year"; he suggested implementing things like this in parallel with things that have to be done often but where security need not be as stringent, like submitting a timesheet. The idea is to pick the low-hanging fruit to further usability, without worrying too much about security. With respect to reusability, Polk noted that he's become a fan of the badge/token/certificate combo: if users can't get into the building without it, they'll have it with them, and then they can use it for other things. Polk also noted that NIST has been working on a PKIX test suite and client requirements for path validation; watch http://csrc.nist.gov.

Conclusions

Clearly, PKI is not dead. Throughout the workshop, presenters noted the contrast between the prevailing gloomy mood and the success of their own projects. The two overarching conclusions appear to be:

1) Understanding and educating users is centrally important. In particular, it is crucial a) to identify the smallest possible set of things that users need to know — the things that are inherent in the nature of PKI, b) to build systems that don't require users to know more than those things, and c) to find effective ways to teach them those things.

2) The specifics of any particular PKI deployment should be driven by real needs, and should be only as heavyweight as necessary. The Juricert deployment is exemplary here: it was driven by the need to stop paper forgeries, avoid re-keying, and improve transaction time, and was informed by a philosophy of "PKE" — Public Key Enough.

It was in the light of this consensus that the group met to consider the future of the PKI R&D Workshop itself.

Whither PKI0x?

There was broad agreement on keeping the current R&D focus of the workshop, with particular emphases following from the conclusions above: more on human factors, and more on using the many tools available to support a wide variety of needs and architectures. With respect to the latter, attendees would like to have more of a vendor presence at the meeting — application vendors in particular. The idea would be for the PKI0x community to get a better idea of what it can do to help vendors implement the perspective developed in the course of the workshops; ideally this would become a year-round dialogue. The group would also like to hear more about international experiences and concerns, e.g. a European report on deploying a national ID card. Finally, there was agreement that the execution of the workshop needs to be tightened up: getting proceedings out more quickly and making them more visible, and publicizing the workshop more widely and further in advance.