
Part 2 - Why you Care

In OAuth 2.0, OpenID Connect and JWT – What are they and why do you care? - Pt 1, we described what OAuth and OpenID Connect are and how they work.

 

So all that is great, but why do we care when we already have SAML 2.0 assertions and existing cookie-based session tokens like SMSESSION?

We care because OAuth, OpenID Connect, and JWT provide an enhanced ability to tie the new breed of applications together in a more logical and flexible manner. The rest of this blog post will walk through those scenarios, describe the current limitations, and show how OAuth and OpenID Connect can solve each use case in an easier and more natural manner.

 

Once again, please note that none of this should be taken as official product commitment. For that please talk to Product Management.

 

Mobile Applications

Mobile applications fit into two basic categories. The first is a fully native application. This application’s user interface is written solely for the mobile device. It communicates with the outside world via APIs. While it may have an embedded HTTP client for making REST calls, there is no direct user interaction with any web resources.

This presents two issues for mobile applications.

 

The first is authentication. The mobile application will want to gather user credentials through its own UI. It will either store them for re-use, or will prompt for them each time it needs to authenticate the user with the external system.

The second is redirects. Because the application submits credentials inline, it does not want to be bounced through browser redirects. While it is certainly possible to write WAM authentication schemes that accept inline credentials, OAuth has simplified and standardized this: RFC 7523 defines an OAuth 2.0 profile in which a trusted client can POST a signed JWT to the Authorization Server and directly receive back an access token.

 

This profile is similar in many ways to SAML 2.0 in that the client is registered with the Authorization Server and signs the JWT it presents. As such, the Authorization Server “trusts” the identity in the JWT and does not require actual user credentials such as a password.

 

So our mobile-friendly flow looks like this:

[Diagram: mobile-friendly flow, the client POSTs a signed JWT to the Authorization Server and receives back an access token]
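
To make the flow concrete, here is a minimal Python sketch of an RFC 7523 style request. The token endpoint, client ID, subject, scope, and key file are all hypothetical placeholders, and the snippet uses the third-party PyJWT and requests packages (RS256 signing also needs the cryptography package installed).

```python
# Minimal sketch of the RFC 7523 JWT bearer grant. Endpoint, client id,
# subject, and key file are assumptions for illustration only.
import time
import uuid
import jwt       # PyJWT
import requests

TOKEN_ENDPOINT = "https://as.example.com/oauth2/token"   # assumed endpoint
CLIENT_ID = "native-mobile-app"                           # assumed client id

with open("client_private_key.pem", "rb") as f:           # key registered with the AS
    private_key = f.read()

now = int(time.time())
assertion = jwt.encode(
    {
        "iss": CLIENT_ID,            # issuer: the registered, trusted client
        "sub": "jdoe",               # subject: the user the client vouches for
        "aud": TOKEN_ENDPOINT,       # audience: the Authorization Server
        "iat": now,
        "exp": now + 300,            # short-lived assertion
        "jti": str(uuid.uuid4()),    # unique id to prevent replay
    },
    private_key,
    algorithm="RS256",
)

# POST the signed JWT directly to the token endpoint; no browser redirects.
resp = requests.post(
    TOKEN_ENDPOINT,
    data={
        "grant_type": "urn:ietf:params:oauth:grant-type:jwt-bearer",
        "assertion": assertion,
        "scope": "openid profile",
    },
    timeout=10,
)
resp.raise_for_status()
access_token = resp.json()["access_token"]
```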

 

Single Page Apps / Hybrid Mobile Apps

To the Resource Servers and WAM systems protecting them, Single Page Apps and Hybrid mobile applications look very similar from an authentication and authorization standpoint.

Both cases typically mix browser-based authentication and web-based content access with REST calls. They are both effectively mashups that want to operate with a consistent identity and authorization context.

 

The external authentication problem is minimized by virtue of being able to leverage a browser. However, trying to maintain a consistent session across a multi-domain set of content providers and REST services becomes an enormous problem with cookie-based session tokens, which are tied to web domains and are very difficult for third parties to validate without WAM-specific code.

 

So if the client application wants to present a consistent session token to multiple web-based and REST-based Resource Servers, what is the solution?

The solution is to leverage JWT and make the WAM systems a bit smarter. A legacy security token can be carried as a claim inside a JWT-based access token. WAM policy enforcement points can first look for a JWT access token in the Authorization header, then retrieve the legacy security token from the JWT and use it, as they always have, for authorization decisions.

This allows existing WAM products to become much friendlier to SPAs and hybrid mobile apps.

 

The following diagram illustrates the flow:

[Diagram: the WAM enforcement point extracts the legacy session token from the JWT access token in the Authorization header and uses it for authorization]
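
Here is a rough sketch of what the enforcement-point side of that flow could look like. The claim name "legacy_session", the audience value, and the key handling are illustrative assumptions, not any product's actual API; the snippet uses the third-party PyJWT package.

```python
# Minimal sketch of a policy enforcement point that accepts a JWT access token
# in the Authorization header and pulls an embedded legacy session token out
# of it. The "legacy_session" claim name and audience are assumptions.
import jwt

def extract_legacy_token(authorization_header: str, as_public_key: bytes) -> str:
    """Return the embedded legacy WAM session token, or raise if invalid."""
    scheme, _, token = authorization_header.partition(" ")
    if scheme.lower() != "bearer" or not token:
        raise ValueError("expected 'Authorization: Bearer <JWT>'")

    # Verify the Authorization Server's signature before trusting any claim.
    claims = jwt.decode(
        token,
        as_public_key,
        algorithms=["RS256"],
        audience="https://api.example.com",   # assumed audience value
    )
    # Hand the legacy token to the existing WAM authorization logic unchanged.
    return claims["legacy_session"]
```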

 

Consuming External Identities

The beauty of what we have been describing is that all of the token exchanges and token formats are standardized. These are really lightweight federated trust relationships. Applications and Resource Servers should be usable in multiple environments, getting identities from different OAuth Authorization Servers.

So what happens if a client application wants to use both external and internally generated identities to access the same set of Resource Servers?

 

The first option is for the Resource Servers to get smarter. If information about the issuer is present in the access token, the Resource Server can keep a table of signing keys, UserInfo endpoints, and token validation endpoints associated with each issuer. This is similar to the way SAML SPs typically accept identities from multiple SAML IdPs.
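
As a sketch of that "smarter" Resource Server, the snippet below keeps a per-issuer table of JWKS endpoints and audiences and validates a token against whichever trusted issuer minted it. The issuer names and URLs are placeholders; it uses the third-party PyJWT package.

```python
# Minimal sketch of a Resource Server with a per-issuer trust table,
# analogous to a SAML SP trusting several IdPs. All URLs are placeholders.
import jwt

TRUSTED_ISSUERS = {
    "https://as.corp.example.com": {
        "jwks_uri": "https://as.corp.example.com/jwks",
        "audience": "https://api.example.com",
    },
    "https://login.partner.example.net": {
        "jwks_uri": "https://login.partner.example.net/jwks",
        "audience": "https://api.example.com",
    },
}

def validate(token: str) -> dict:
    """Validate a JWT access token against whichever trusted issuer minted it."""
    # Peek at the issuer claim without trusting anything yet.
    issuer = jwt.decode(token, options={"verify_signature": False})["iss"]
    cfg = TRUSTED_ISSUERS[issuer]          # KeyError means unknown issuer: reject
    signing_key = jwt.PyJWKClient(cfg["jwks_uri"]).get_signing_key_from_jwt(token)
    # Full verification: signature, audience, issuer, and expiry.
    return jwt.decode(
        token,
        signing_key.key,
        algorithms=["RS256"],
        audience=cfg["audience"],
        issuer=issuer,
    )
```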

However, we want this to be easier for implementers of clients and Resource Servers.

 

The solution here appears to be a draft IETF proposal for a REST-based OAuth 2.0 token exchange, “OAuth 2.0 Token Exchange: An STS for the REST of Us” - https://tools.ietf.org/html/draft-ietf-oauth-token-exchange-03

With this capability, a client can obtain access and ID tokens from any external identity provider supported by the organization and exchange them via REST for token(s) issued by the corporate Authorization Server. This means the Resource Servers only need to trust, and be configured for, a single Authorization Server.

 

Here is a diagram that shows that exchange:

[Diagram: the client exchanges tokens from an external identity provider for tokens issued by the corporate Authorization Server]
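
A minimal sketch of that exchange from the client's side is shown below, using the parameter names from the draft. The endpoint, client credentials, and audience are placeholders, and the snippet uses the third-party requests package.

```python
# Minimal sketch of the draft OAuth 2.0 token exchange grant: swap a token
# from an external identity provider for one minted by the corporate
# Authorization Server. Endpoint, credentials, and audience are placeholders.
import requests

def exchange_for_corporate_token(external_access_token: str) -> str:
    resp = requests.post(
        "https://as.corp.example.com/oauth2/token",        # assumed endpoint
        auth=("spa-client", "client-secret"),               # assumed client creds
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
            "subject_token": external_access_token,          # token from the external IdP
            "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
            "requested_token_type": "urn:ietf:params:oauth:token-type:access_token",
            "audience": "https://api.corp.example.com",      # the Resource Server to call
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```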

The protocol is also useful when you have a chain of resource servers. A resource server can also be a client of the Authorization Server, present its access token, and request an access token for the next resource server in the chain. This provides an authorization check at each point in the chain and prevents the issuance of overly broad access tokens.

 

Here is a diagram that shows that exchange:

[Diagram: each resource server in the chain exchanges the token it received for a new access token scoped to the next resource server]
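
For the chained case, the same grant can be used by a resource server acting as a client, as sketched below; again the endpoint, credentials, and downstream audience are placeholders.

```python
# Minimal sketch of the chained case: a resource server presents the access
# token it received as the subject_token and asks for a token narrowly
# scoped to the next hop. All names and URLs are placeholders.
import requests

def token_for_next_hop(incoming_access_token: str) -> str:
    resp = requests.post(
        "https://as.corp.example.com/oauth2/token",        # assumed endpoint
        auth=("orders-service", "client-secret"),            # this RS's own client creds
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
            "subject_token": incoming_access_token,
            "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
            # Scope the new token to the next resource server only, so no
            # overly broad token travels down the chain.
            "audience": "https://inventory.corp.example.com",
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```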

 

Wrap up and Emerging Directions

So what you have seen is how OAuth and OpenID Connect provide a flexible way for mobile and Single Page Applications to easily authenticate a user and get standards-based access and identity tokens.

These tokens can then be used to access both traditional web resources and REST services. The standardized nature of the tokens, including their issuance and validation, means that applications and systems are not required to use proprietary APIs. This open approach is compelling to both new and existing customers.

 

So with all of these improvements, what’s left? Well a couple of things.

The first is more closely tying the Resource Server and the Authorization Server together. Different APIs and different resources logically require different sets of claims for access. Currently there is no prescribed mechanism for this linkage, other than the client's knowledge of the needs of each Resource Server.

 

User-Managed Access (UMA) is addressing this need. UMA is a profile on top of OAuth that introduces new endpoints that allow a Resource Server to register resource sets and associated scopes with the Authorization Server. The Authorization Server can then use those scopes to make authorization decisions before granting an access token. The Resource Server may also dynamically request additional scopes from the Authorization Server in order to perform a step-up authentication or privilege escalation.
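
As an illustration of what that registration could look like, here is a rough sketch in the spirit of the UMA resource registration API. The endpoint URL, the protection API token (PAT), and the exact field names should be checked against the Authorization Server's documentation; the snippet uses the third-party requests package.

```python
# Rough sketch of a Resource Server registering a resource set and its scopes
# with the Authorization Server, UMA-style. Endpoint and field names are
# illustrative assumptions, not a specific product's API.
import requests

def register_resource(pat: str) -> str:
    resp = requests.post(
        "https://as.corp.example.com/uma/resource_set",   # assumed endpoint
        headers={"Authorization": f"Bearer {pat}"},         # protection API token
        json={
            "name": "Payroll API",
            "resource_scopes": ["payroll:read", "payroll:approve"],
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["_id"]    # identifier the AS assigns to the resource set
```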

 

The last items to cover are impersonation and delegation. Bearer tokens are useful, but many people want better control over delegated tokens, including who can request them, where they can be used, time-of-use restrictions, and so on.

The draft OAuth token exchange spec mentioned above (https://tools.ietf.org/html/draft-ietf-oauth-token-exchange-03) adds additional grant types to the OAuth token endpoint that let a requester obtain a token with a third-party subject for impersonation. For delegation, what is known as a composite token can be requested. A composite token includes information about both the subject delegating authority and the subject requesting the delegation token. This will feel familiar to anyone who has been around long enough to remember the WS-Trust ActAs and OnBehalfOf request elements.
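
Here is a hedged sketch of requesting such a delegation token with the draft token exchange grant, where the subject_token identifies the delegating user and the actor_token identifies the party acting on their behalf. The endpoint and client credentials are placeholders; the snippet uses the third-party requests package.

```python
# Minimal sketch of requesting a delegation ("composite") token. The resulting
# token describes both the delegating subject and the acting party.
import requests

def request_delegation_token(user_token: str, service_token: str) -> str:
    resp = requests.post(
        "https://as.corp.example.com/oauth2/token",       # assumed endpoint
        auth=("backend-service", "client-secret"),          # assumed client creds
        data={
            "grant_type": "urn:ietf:params:oauth:grant-type:token-exchange",
            "subject_token": user_token,                     # who delegates authority
            "subject_token_type": "urn:ietf:params:oauth:token-type:access_token",
            "actor_token": service_token,                    # who acts on their behalf
            "actor_token_type": "urn:ietf:params:oauth:token-type:access_token",
        },
        timeout=10,
    )
    resp.raise_for_status()
    # The issued JWT carries an "act" claim describing the acting party.
    return resp.json()["access_token"]
```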

 

I hope this overview has been helpful in showing the promise of these technologies. We still feel we have lots to learn and want to hear from customers. If you have thoughts or comments, please feel free to leave them here.

Part 1 - What are they?

A lot of people are at least somewhat familiar with OAuth 2.0 and possibly OpenID Connect, an alternative to SAML for communicating identities and information about a user between identity providers and service or resource providers.

Many people are also familiar with JSON Web Tokens (JWT), a standard way of building a signed and, if desired, encrypted token that can contain arbitrary key-value pairs of information.

It’s the combination of these three technologies that is truly powerful, and I believe it will become the dominant way that Web Access Management (WAM) products produce, consume, and transform identities and the claims around those identities.

The rest of this blog will focus on a vision of how those technologies fit together and solve a wide variety of current and upcoming use cases. I’ve tried to err on the side of understandability and conciseness, so many of the underlying protocol details have been skipped.

 

Note that this is one architect's vision - none of what is described here should be construed as official commitment for new product development.

 

OpenID Connect Tokens and Endpoints

Below is a diagram that shows the components, endpoints, and typical flow for OpenID Connect. In OAuth 2.0 and OpenID Connect there are three parties to the interaction (excluding the actual user): the client, the authorization server, and the resource server. This is different from SAML, which has only two actors, the Identity Provider and the Service Provider.

 

 

[Diagram: OAuth 2.0 / OpenID Connect components, endpoints, and typical flow]
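
As a concrete reference point, here is a minimal sketch of the client's side of the typical OpenID Connect authorization code flow shown above. All URLs, client credentials, and scopes are placeholders, and the snippet uses the third-party requests package.

```python
# Minimal sketch of the authorization code flow from the client's side:
# build the authorization request the browser is sent to, then exchange the
# returned code at the token endpoint. All values are placeholders.
import urllib.parse
import requests

AUTHORIZE_ENDPOINT = "https://as.example.com/oauth2/authorize"   # assumed
TOKEN_ENDPOINT = "https://as.example.com/oauth2/token"           # assumed
CLIENT_ID, CLIENT_SECRET = "my-web-app", "client-secret"          # assumed
REDIRECT_URI = "https://app.example.com/callback"                 # assumed

def build_authorization_url(state: str, nonce: str) -> str:
    """URL the user's browser is redirected to for authentication and consent."""
    params = {
        "response_type": "code",
        "client_id": CLIENT_ID,
        "redirect_uri": REDIRECT_URI,
        "scope": "openid profile email",
        "state": state,          # CSRF protection, echoed back on the redirect
        "nonce": nonce,          # bound into the ID token to prevent replay
    }
    return f"{AUTHORIZE_ENDPOINT}?{urllib.parse.urlencode(params)}"

def redeem_code(code: str) -> dict:
    """Exchange the authorization code for access, refresh, and ID tokens."""
    resp = requests.post(
        TOKEN_ENDPOINT,
        auth=(CLIENT_ID, CLIENT_SECRET),
        data={
            "grant_type": "authorization_code",
            "code": code,
            "redirect_uri": REDIRECT_URI,
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()   # contains access_token, id_token, and often refresh_token
```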

 

OAuth and OpenID Tokens

The formats of the OAuth 2.0 access and refresh tokens are officially opaque to the client and to the resource server.

The access token is the token that the client uses to request resources from the resource server on behalf of the user. It is expected to be short lived (minutes or hours).

The refresh token is the token that the client uses to get a new access token from the authorization server when the current access token expires, without the user being re-challenged. Its lifetime is typically hours or days.
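
For illustration, a minimal sketch of the refresh token grant is shown below; the token endpoint and client credentials are placeholders, and the snippet uses the third-party requests package.

```python
# Minimal sketch of refreshing an expired access token with the refresh token
# grant; no user interaction is involved. All values are placeholders.
import requests

def refresh_access_token(refresh_token: str) -> dict:
    resp = requests.post(
        "https://as.example.com/oauth2/token",     # assumed token endpoint
        auth=("my-web-app", "client-secret"),       # assumed client credentials
        data={
            "grant_type": "refresh_token",
            "refresh_token": refresh_token,
        },
        timeout=10,
    )
    resp.raise_for_status()
    # Typically returns a new access_token and sometimes a rotated refresh_token.
    return resp.json()
```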

 

One of the weaknesses of OAuth 2.0 is that there is no prescribed way for the resource server to validate the access token or to use the access token to establish the identity of the user.

This was not a problem when the resource server and the authorization server were closely linked. The access token could just be a lookup key that the resource server checked via an internal API. What should we do if the authorization server and resource server are separate entities?

 

Google and other OAuth 2.0 providers solved this problem by providing a TokenInfo endpoint. Resource servers could pass the access token to this endpoint and get back a JSON object indicating whether or not the token is valid, along with the user identity, token scope, and expiration time.

The introspection or TokenInfo endpoint has now been formalized by RFC 7662 and is implemented by most OAuth 2.0 and OpenID Connect providers. It provides a secure way for resource servers to get metadata about access and refresh tokens from the authorization server.
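
A minimal sketch of a resource server calling an RFC 7662 introspection endpoint might look like the following; the endpoint URL and the resource server's credentials are placeholders, and the snippet uses the third-party requests package.

```python
# Minimal sketch of RFC 7662 token introspection from a resource server.
# Endpoint and credentials are placeholders.
import requests

def introspect(token: str) -> dict:
    resp = requests.post(
        "https://as.example.com/oauth2/introspect",   # assumed endpoint
        auth=("resource-server-1", "rs-secret"),        # the RS authenticates itself
        data={"token": token},
        timeout=10,
    )
    resp.raise_for_status()
    info = resp.json()
    if not info.get("active"):
        raise PermissionError("token is expired, revoked, or unknown")
    # info typically also carries "sub", "scope", "exp", "client_id", and more.
    return info
```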

 

Although access and refresh tokens are defined to be opaque values, some OAuth 2.0 implementations allow the tokens to be signed JWTs, which lets the client and resource server extract information directly from the token without making additional calls to the OpenID Provider's UserInfo endpoint.

 

This concept was expanded in OpenID Connect with the introduction of the ID token. The ID token is a signed and potentially encrypted JWT that contains the user’s identity and any claims that were in scope.

This gives client applications and resource servers the ability to securely get information about the user directly without having to contact the authorization server every time.

There is still debate within the OpenID Connect implementer and user community as to whether the ID token is for use solely by the client, or can also be sent to the resource server.
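
To make this concrete, here is a minimal sketch of a client validating an ID token against the provider's published signing keys. The JWKS URI, issuer, and client ID are placeholders, and the snippet uses the third-party PyJWT package.

```python
# Minimal sketch of ID token validation: verify the signature against the
# OpenID Provider's JWKS and check issuer, audience, and expiry. All values
# are placeholders.
import jwt

JWKS_URI = "https://as.example.com/jwks"         # assumed, from OP discovery
EXPECTED_ISSUER = "https://as.example.com"        # assumed issuer
CLIENT_ID = "my-web-app"                           # the ID token's audience

def validate_id_token(id_token: str) -> dict:
    signing_key = jwt.PyJWKClient(JWKS_URI).get_signing_key_from_jwt(id_token)
    # Raises if the signature is bad or iss/aud/exp do not match expectations.
    return jwt.decode(
        id_token,
        signing_key.key,
        algorithms=["RS256"],
        audience=CLIENT_ID,
        issuer=EXPECTED_ISSUER,
    )
```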

 

Next Post

In the next post I will describe why all of this is important and how it can be combined and used to solve real-world application issues.