Exploration of Four51 OrderCloud, its architecture, and Headstart setup

This is an exciting time to be in software development. Things we had been hearing about for years, Micro Service Based Architecture, Cloud Native Applications, API First Headless Architecture, are finally shaping up nicely. Sitecore’s recent acquisition of the cloud-first commerce platform Four51 and the Customer Data Platform (CDP) Boxever confirms that the trend is to integrate with specialty platforms rather than have one company build everything. Sitecore’s core platform is a Content Management System that enables us to create content and deliver it to the target audience. But just delivering content without intelligence is not good enough. That’s why Sitecore has become a Digital Experience Platform. With the acquisition of Four51 and Boxever, Sitecore is bringing the full digital experience to online commerce. In this article, I will discuss the Four51 OrderCloud platform and show you how to set up the OrderCloud Headstart application, which is an application like Sitecore’s Habitat demo. Let’s dive in.

tl;dr: If you are only interested in the Headstart setup, go to the Setting Up Headstart section.

OrderCloud

OrderCloud is an API First, Headless, truly Cloud Native B2B eCommerce platform designed by Four51. The OrderCloud architecture is MACH (Microservices, API-First, Cloud & Headless) certified. The MACH certification tells a lot about a platform: a commerce platform that follows MACH architecture is modular and truly open for integration with other systems via microservices and APIs. A typical MACH-certified commerce platform architecture looks like the diagram below, and it looks very close to what we envision the OrderCloud architecture will be once it is fully integrated with Sitecore.

Source: machalliance.org

Functional Architecture

Speaking of architecture, let’s talk about the Functional Architecture of OrderCloud. It will help us understand what OrderCloud offers as a product and what kind of B2B Commerce solutions we can create from it. At the core of OrderCloud, the following important entities exist.

Seller: The Seller is the orchestrator of the business. The Seller defines how the business will be done. If the Seller is a manufacturer, they might be selling products only to Buyers; but if the Seller is a distributor, in addition to selling products to Buyers, they might be connecting Suppliers with Buyers. Seller users are the admin users with the highest privileges to the OrderCloud APIs.

Buyer: A Buyer is a Customer or an organization with an account with the Seller so that they can purchase products. A buyer has one or more users. A user authenticates to the Storefront and orders products for the buyer. Buyer users can be put into groups for managing access levels and personalizing the buying experience.

Supplier: A Supplier is an organization in OrderCloud that fulfills orders placed by buyers of their products. Supplier is an optional construct in OrderCloud. A seller can be the only supplier in the system. Supplier users have restricted access to the Seller admin site where they can manage their products, orders, supplier information, and users. The Supplier can assign different roles to their users. Some can be responsible for managing products, some can be responsible for managing orders, etc.

User: A User is someone who authenticates to the Storefront (Buyer User) or the Seller Admin site (Seller User) to use the site. Users can be assigned to User Groups to manage their access to the system as well as to provide a personalized shopping experience to buyers.

User Group: User Groups act as roles when it comes to administration of the application, but they also contain information that drives personalized shopping from the buyer’s perspective. For example, information like Catalogs and Locations is assigned to User Groups. This drives the configuration of customer-specific products in specific buyer locations. Roles in a User Group determine what users belonging to that User Group can manage as far as administering the application goes. For example, if a Buyer User Group has the AddressAdmin role, a user in that group can add/modify/delete buyer addresses.

Address: A buyer or a supplier can have multiple addresses. From the buyer’s perspective, an address is where the ordered items will be shipped. Orders in OrderCloud can be shipped to multiple places because addresses can be attached at the line level. From the supplier’s perspective, addresses are locations and can be the addresses of warehouses. The Seller’s addresses are locations for the seller. When a seller is the sole supplier, these addresses can be warehouse addresses for fulfillment.

Catalog: A Catalog defines a set of products. It drives what products a buyer can or cannot see in the storefront. A product can be assigned to more than one catalog. Products are organized into different categories. A product in a Seller organization can be added by the seller or by the supplier selling the product. Although Suppliers can add products to the system, they cannot assign products to catalogs or organize them into categories; only sellers can do that.

Order: An Order in OrderCloud represents both the cart and the order submitted to the system. An order goes through different states. If the status is ‘Unsubmitted’, it is a cart. If the status is ‘Open’, it has been submitted to the system as an order. An order has directions in OrderCloud. An order from the buyer’s perspective is an outgoing order, but from the seller or supplier perspective, it is an incoming order. The supplier doesn’t see the order until it is submitted, but the seller can see the order (cart) as an incoming order before submission. After submission, from the seller’s perspective the order is both incoming (from the buyer) and outgoing (to the supplier). The same order from the supplier’s perspective is an incoming order. This concept applies when accessing orders from the system using the OrderCloud API: you need to use the proper direction based on the API credentials. If you are using the buyer’s credentials, you have to use ‘outgoing’ as the direction, but if you are using the supplier’s credentials, you need to use the ‘incoming’ direction. The below image on the OrderCloud website describes this relationship.

Order Directions, Source: OrderCloud.io
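To make the directions concrete in code, here is a minimal sketch that lists orders for a given perspective over the REST API, assuming an access token has already been obtained. The base URL and the /v1/orders/{direction} route follow the public OrderCloud API reference as I understand it; treat them as assumptions to verify against the API Console.

using System;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

class OrderDirectionExample
{
    // direction is "Outgoing" for a buyer token (orders the buyer placed)
    // and "Incoming" for a seller/supplier token (orders placed with them).
    static async Task<string> ListOrdersAsync(string accessToken, string direction)
    {
        using var http = new HttpClient { BaseAddress = new Uri("https://api.ordercloud.io/") };
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        // Same orders, different perspective: the buyer's outgoing order is
        // the seller's (or supplier's) incoming order.
        return await http.GetStringAsync($"v1/orders/{direction}");
    }
}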

I haven’t seen any mention of invoices in OrderCloud. In many eCommerce platforms, the order represents the invoice after the order is completely fulfilled; in some platforms, invoices are maintained separately. Invoices enable the buyer to make payments after the invoices are generated by the seller. Often an eCommerce platform integrates with a third-party invoice payment system like BillTrust, which makes it easy for the buyer to manage their invoices from different purchases in one place.

Storefront: A seller may host more than one eCommerce website to sell products. A very common scenario is separating a retail (B2C) website from a B2B website. Some sellers choose to separate B2C sites from B2B sites because the buyer’s experience on a B2C site is quite different from a B2B site. This is not the only reason to host more than one website: a seller may be in the business of more than one brand, and that may require a separate website too. A Storefront in OrderCloud represents a website. You can have one organization in OrderCloud with multiple storefronts. Each storefront can have multiple suppliers and buyers associated with it.

In addition to the above entities, the usual eCommerce entities exist in the system, like Promotion, Price, Shipping, etc., which drive eCommerce functions, but the entities above provide the functional structure of OrderCloud. You can use these constructs to define a business. Different business models built from these constructs are described in the Commerce Strategy section of the OrderCloud documentation.

In the diagram below I tried to capture the high-level functional architecture of OrderCloud. I may not be 100% correct about the Storefront part because there is not much documentation about Storefronts in OrderCloud, but it is reasonable to think that Storefronts will be related to Catalogs as Buyers are.


OrderCloud High Level Functional Architecture

Technical Architecture

OrderCloud is a Cloud Native, API First, Headless eCommerce platform constructed on a microservice-based architecture. Below I describe the different parts of the OrderCloud platform architecture.

OrderCloud Portal: The OrderCloud Portal is where you define your Seller Organization. Once you sign up and define your Seller Organization, that becomes your eCommerce system, which you will use to provide services to your buyers and, optionally, to your suppliers. Everything is done using RESTful APIs. The OrderCloud Portal provides a nice API Console that you can use to query your eCommerce data as well as modify it if needed. The user has to authenticate to use the APIs. OrderCloud uses OpenID Connect and OAuth2 for securing APIs. Every client that connects to OrderCloud for API access has to have a Client ID and Client Secret. Below is a screenshot of the API Console.


OrderCloud API Console
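Outside the console, the same OAuth2 flow can be exercised from code. The sketch below requests an access token with the client_credentials grant; the auth URL, grant type, and form fields follow OrderCloud’s documented OAuth2 workflow as I understand it, and the client ID, secret, and scope are placeholders you would replace with your own.

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

class OrderCloudAuthExample
{
    // Requests a token for a machine client (no user context).
    static async Task<string> GetTokenJsonAsync()
    {
        using var http = new HttpClient();
        var form = new FormUrlEncodedContent(new Dictionary<string, string>
        {
            ["grant_type"] = "client_credentials",
            ["client_id"] = "YOUR-CLIENT-ID",          // placeholder
            ["client_secret"] = "YOUR-CLIENT-SECRET",  // placeholder
            ["scope"] = "FullAccess"                   // roles the token should carry
        });

        var response = await http.PostAsync("https://auth.ordercloud.io/oauth/token", form);
        response.EnsureSuccessStatusCode();

        // The JSON response contains the "access_token" used as a Bearer token on API calls.
        return await response.Content.ReadAsStringAsync();
    }
}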

Middleware: In the OrderCloud architecture, middleware is where integration with third-party services and cloud services happens. For example, if you want to integrate OrderCloud with an ERP, you will implement that in the middleware. Also, if you want to integrate cloud services like App Configuration, Blob Storage, etc., you will implement that here. Middleware can also be used to integrate with OrderCloud Webhooks via API endpoints in the middleware. All services in the middleware should be exposed as APIs in a headless manner. OrderCloud has provided a starter middleware project, Catalyst, on GitHub.

Buyer UI: The Buyer UI is the eCommerce storefront that end users use to browse and purchase products. A storefront’s functionality is implemented by integrating with the middleware and the OrderCloud APIs. Since OrderCloud is headless, the Buyer UI can be implemented in any language and platform, using either client-side or server-side technology. OrderCloud provides both a .NET SDK and a JavaScript SDK for this purpose.

Seller / Supplier Admin: The Seller / Supplier Admin is the admin portal used to manage the eCommerce backend. The Seller Admin portal provides restricted access to Supplier users so that they can manage the products they are selling, the orders placed with them, warehouse inventory, and their users, whereas Seller Admin users have full access so that they can define and manage the business. The Seller Admin connects to the eCommerce backend via the OrderCloud APIs. It can have its own middleware if it needs to connect to any system other than OrderCloud. For example, the seller may want to manage payments from the Seller Admin; in that case, the Seller Admin has to be integrated with the payment system, and that will require middleware to be created. Unlike the Buyer UI, I imagine Seller Admin functionality will not change much from client to client. Also, the Seller Admin is a bit more tightly coupled with the OrderCloud architecture. For this reason, it would make sense for Sitecore to provide a fully functional but extensible Seller Admin. This would reduce implementation cost as well as enable partners to extend the Seller Admin.

Webhook: A Webhook is the way OrderCloud lets an integrated system know that some event occurred in OrderCloud. For example, if you want to send an email notification when order status changes, you can create a Webhook on OrderCloud’s order API, and it will call the send-order-status API associated with the Webhook. There are pre-hooks and post-hooks. Sending an email when the order status changes is a post-hook because the webhook gets called after the order status changes. You can add the Webhook configuration, like the type of Webhook, payload URL, OrderCloud API endpoints and API method, etc., in the OrderCloud API Console. Typically you will host the payload API in the middleware. This document describes how to create a Webhook in the OrderCloud platform.
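To make the middleware side of a post-hook concrete, below is a minimal sketch of an ASP.NET Core payload endpoint that OrderCloud could be configured to call when order status changes. The route, controller name, and payload handling are made up for illustration; this is not the actual Headstart or Catalyst code.

using Microsoft.AspNetCore.Mvc;
using Microsoft.Extensions.Logging;
using System.Text.Json;

namespace Middleware.Controllers
{
    [ApiController]
    [Route("api/webhooks")]
    public class OrderWebhookController : ControllerBase
    {
        private readonly ILogger<OrderWebhookController> _logger;

        public OrderWebhookController(ILogger<OrderWebhookController> logger) => _logger = logger;

        // Hypothetical payload URL registered against the order endpoints in the API Console.
        [HttpPost("order-status-changed")]
        public IActionResult OrderStatusChanged([FromBody] JsonElement payload)
        {
            // The payload describes the OrderCloud API call that fired the hook;
            // here we only log it, but this is where an email notification would be sent.
            _logger.LogInformation("Order status webhook received: {Payload}", payload.ToString());
            return Ok();
        }
    }
}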

Below is the high-level architecture of the Headstart application. It is important to note that Headstart is a sample application; in an actual implementation, the architecture may change quite a bit.


High Level Architecture of Headstart

Extending OrderCloud

Whatever eCommerce platform you choose to implement an eCommerce site for your client, there will always be requirements that force you to customize the platform. The OrderCloud platform architecture supports the Open-Closed design principle: the platform is open for extension but closed for modification. You will not be able to modify the core platform, and that makes the platform easier to upgrade. Since it is open for extension, you can easily add custom features on top of the core platform.

There are generally three ways to extend the OrderCloud platform.

  • For outside integration, Middleware Services should be used.
  • For injecting your own code into OrderCloud operations, you should use Webhooks. You can implement your Webhook APIs in the middleware or in a separate service. We discussed middleware and Webhooks in the previous section.
  • If you want to extend the OrderCloud schema to store additional data, you need to use Extended Properties (XP). OrderCloud stores XP as JSON, and you can have an elaborately constructed JSON. There are two things to remember: 1) the entire XP object cannot be more than 8000 bytes, and 2) XP should be consistent within an object (if you create XP for Order, the structure of that XP should always be the same). Data included in XP can be searched, filtered, and sorted. Typically, XP should be used when a small amount of additional data needs to be added (see the sketch after this list). If the need is to add a new object to the implementation, middleware is the way to go. This article nicely describes how XP works in OrderCloud.
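As a small illustration of XP, the sketch below PATCHes a custom field onto a buyer’s order over the REST API. The /v1/orders/{direction}/{orderID} route follows the OrderCloud convention as I understand it, while the xp property name is made up for the example.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

class ExtendedPropertyExample
{
    // Adds a purchase order number to an order's xp. Only the fields being
    // changed are sent; the xp structure should stay consistent across orders.
    static async Task AddPoNumberAsync(string accessToken, string orderId)
    {
        using var http = new HttpClient();
        http.DefaultRequestHeaders.Authorization = new AuthenticationHeaderValue("Bearer", accessToken);

        var request = new HttpRequestMessage(new HttpMethod("PATCH"),
            $"https://api.ordercloud.io/v1/orders/Outgoing/{orderId}")
        {
            Content = new StringContent("{\"xp\": {\"PurchaseOrderNumber\": \"PO-12345\"}}",
                Encoding.UTF8, "application/json")
        };

        var response = await http.SendAsync(request);
        response.EnsureSuccessStatusCode();
    }
}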

Setting up Headstart

You will find Headstart on GitHub. The ReadMe instructions are quite good, but I did face some issues. Watch the video below to see how I set up Headstart on my local machine and then used Azure DevOps to deploy the applications. To set up Headstart you need an Azure account; you can create a free Azure account here. The issues and their solutions are described after the video.

Setup Issues

Registering with Third Party Services: My goal was not to make Headstart fully functional. For this reason, I haven’t configured all the third-party services. This caused some issues, especially with Avalara. It can take a long time to configure Avalara to return proper tax, so I faked the Avalara calls in the middleware by changing the code in the AvalaraCommand class. To see the difference, visit my forked repository on GitHub. I configured SmartyStreets, which is required for address validation and easy to configure. I also configured SendGrid for sending emails using the templates provided in the Headstart solution.
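The change itself just short-circuits the external tax call. Below is a simplified sketch of the idea with made-up types; it is not the actual Headstart AvalaraCommand code (see the forked repository for the real change).

using System.Threading.Tasks;

// Illustrative stand-in: skip the Avalara API entirely and return a fixed
// zero-tax estimate so checkout can proceed in a local/dev environment.
public class FakeTaxCommand
{
    public Task<decimal> CalculateEstimateAsync(string orderId)
    {
        return Task.FromResult(0m); // no external call; tax is always zero
    }
}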

Seller UI Build Issue: I had a problem building the Seller app. The issue was that the build could not find Python version 2 on my machine. I resolved that by installing Windows Build Tools using npm (npm install --global windows-build-tools). Before installing, I removed node_modules from the Seller app.

Issue with OrderCloud CMS API: After building the Seller app, I was getting a 400 Bad Request error from ordercloud-cms-test.azurewebsites.net. This issue has been fixed in the Headstart repository; I resolved it by merging the original repository into my forked repository. This Stack Overflow thread helped me understand how to sync a forked repository on GitHub.

Azure DevOps Deployment Issue: The ReadMe in the Headstart repository does not fully explain how the Azure DevOps deployment works. I resolved the issues below.

  • The azure-pipelines.yml was not generating a zipped artifact for the Middleware project. In the Middleware publish task, I had to change zipAfterPublish to true.
  • For Build Once, Deploy Many, you need to run “node inject-css defaultbuyer-test && node inject-appconfig defaultbuyer-test” in the Buyer release pipeline and “node inject-appconfig defaultadmin-test” in the Seller release pipeline. For this, you need to add a ‘bash’ task to your release pipelines for Buyer and Seller. I showed this in the video.
  • The idea of Build Once, Deploy Many applies when you have multiple environments to deploy your application to. This requires you to create Slots in Azure App Service and configure the release pipelines against those Slots. Each Slot in an App Service is used for one environment. I created Test Slots for the Middleware, Buyer, and Seller apps and deployed code there. If I need to deploy code to UAT, I have to create a UAT Slot in each App Service and create a release pipeline to deploy code to the UAT Slots.

Conclusion

I enjoyed setting up Headstart. It helped me understand the OrderCloud architecture. I like that OrderCloud is a highly extensible, Cloud Native platform with a microservice-based architecture, and it does not limit me to any particular technology for using the platform. I hope my exploration of the platform and this article help others onboard to the OrderCloud platform quickly.

Acknowledgment

I would like to thank my colleague Daniel Govier for helping me with the Azure DevOps configuration. Without his help, it wouldn’t have been possible for me to configure the Azure DevOps deployment.

I would also like to thank Crhistian Ramirez for patiently answering all my questions in OrderCloud Community Slack.

References


Address and Email domain validation in Sitecore

This is going to be a quick blog article. The only reason to write this blog is to share some code we developed for address and email domain name validation using the address recommendation service Loqate. For many of Nish Tech’s clients (especially eCommerce implementations), we have implemented address and email validation to reduce the number of mistakes in account creation. We thought this could be helpful for others in the Sitecore community looking for a similar solution. Thanks to my colleague Santhosh (twitter: @Santhosh4184), who worked on most of the coding.

I have shared only the Feature projects that contain the code for address and email validation in this GitHub repo. You still need to add the projects to your Helix-based Sitecore solution to make it work. We developed this using SXA, but the same concept can be used for Sitecore Forms.

Here is an animation that shows how the module works.

Address and Email validation form

The way address validation through the Loqate service works is that you create an account in Loqate and set up your account for the solution. After this, you will be provided an API key and base JavaScript code that you need to add to your pages. For code examples, look at ValidateAddress.cshtml and ValidateEmail.cshtml.

We hope this helps.

Links

– Code in Github: https://github.com/himadric/AddressValidation
– Loqate: https://www.loqate.com/
– Follow Santhosh: https://twitter.com/Santhosh4184


Serilog Appender for Sitecore Logging

In my last blog post I discussed the concepts of Structured Logging as well as the benefits it can bring to Sitecore if we use it for troubleshooting. It is clear that we cannot include Structured Logging in Sitecore from the ground up unless we rebuild Sitecore Diagnostics using a Structured Logging framework, which is a huge undertaking for Sitecore. Honestly, things like logging, although very important, don’t get enough priority when a new release is planned. We thought that Sitecore would move to a SaaS model and rearchitect the product in .NET Core, but the recent release of Sitecore 10 suggests that they are sticking with the current platform architecture. So it looks like we will be staying with Log4Net logging for some time. Besides that, many clients will stay on older Sitecore versions, and logging for them will not change.

So, what are the options? Do we have to wait for Sitecore to rearchitect the product? Is there an option to use structured logging without Sitecore moving to Structured Logging? It seems we can use Structured Logging and not wait for Sitecore. The result is not as good as including Structured Logging in the core product, but it is a significant improvement over Log4Net text-based logging.

SerilogAppender

To convert Sitecore logging to Structured Logging, I will use a Log4Net appender I created that forwards the Sitecore log to Serilog. It is not difficult to create a Log4Net appender: all we need to do is derive a class from BufferingAppenderSkeleton and override the SendBuffer method. Below is the code for that. SendBuffer receives the array of events; it initializes Serilog with enrichers and loops through the events, logging each one to the Serilog sink Seq.

        protected override void SendBuffer(LoggingEvent[] events)
        {
            using (var log = new LoggerConfiguration()
                .MinimumLevel.ControlledBy(new LoggingLevelSwitch(GetLogEventLevel()))
                .Enrich.FromLogContext()
                .Enrich.WithMachineName()
                .Enrich.WithEnvironmentUserName()
                .Enrich.WithProcessId()
                .Enrich.WithProcessName()
                .Enrich.WithProperty("ThreadId", SystemInfo.CurrentThreadId)
                .Enrich.WithMemoryUsage()
                .WriteTo.Seq(_seqHost, apiKey: _apiKey)
                .CreateLogger())
            {
                foreach (var thisEvent in events)
                {
                    LogEvent(log, thisEvent);
                }
            }

        }
        private void LogEvent(Logger log, LoggingEvent loggingEvent)
        {
            try
            {
                if (loggingEvent.Level == Level.DEBUG)
                {
                    log.Debug(loggingEvent.RenderedMessage);
                }
                if (loggingEvent.Level == Level.INFO)
                {
                    log.Information(loggingEvent.RenderedMessage);
                }
                if (loggingEvent.Level == Level.WARN)
                {
                    log.Warning(loggingEvent.RenderedMessage);
                }
                if (loggingEvent.Level == Level.ERROR)
                {
                    log.Error(loggingEvent.RenderedMessage);
                }
                if (loggingEvent.Level == Level.FATAL)
                {
                    log.Fatal(loggingEvent.RenderedMessage);
                }
            }
            catch (Exception ex)
            {
                this.ErrorHandler.Error("Error occurred while logging the event.", ex);
            }
        }

For the full source code, visit the GitHub repo https://github.com/himadric/structured-logging-for-sitecore

To include the SerilogAppender in Sitecore logging, we need a patch config to add the appender to the configuration. Below is the config patch (also available in the GitHub repo).

<?xml version="1.0" encoding="utf-8" ?>
<configuration xmlns:patch="http://www.sitecore.net/xmlconfig/" xmlns:role="http://www.sitecore.net/xmlconfig/role/" xmlns:security="http://www.sitecore.net/xmlconfig/security/">
    <sitecore role:require="Standalone or ContentDelivery or ContentManagement">
        <log4net>
            <appender name="SerilogAppender" type="log4net.Appender.SerilogAppender, Foundation.SerilogAppender" patch:after = "appender[@name='LogFileAppender']">
                <minimumlevel value="DEBUG" />
                <apikey value="fz0IdNDO6IfPCY9ct9o5" />
                <seqhost value="http://localhost:5341" />
                <layout type="log4net.Layout.PatternLayout" />
                <encoding value="utf-8" />
            </appender>
            <root>
                <appender-ref ref="SerilogAppender" patch:instead = "*[@ref='LogFileAppender']"/>
            </root>
        </log4net>
  </sitecore>
</configuration>

Once the SerilogAppender is deployed along with the config patch file, open up Seq in the browser and you will see log messages showing up.

If you would like to see a demo of this SerilogAppender, you may watch my session on Structured Logging at the Cincinnati Sitecore User Group.

Useful Links


Structured Logging in Sitecore

At Sitecore Symposium 2019, Sitecore announced the company’s plan to move the Sitecore platform to a SaaS-based model. If you want to know more about it, you can read this FAQ. As Sitecore moves to SaaS, which will require completely revamping the architecture, they will be building on ASP.NET Core. Among the many things that will change in this transition, Sitecore will definitely look at the logging strategy in the platform. With near certainty they will move to Structured Logging, because ASP.NET Core itself adopted Structured Logging as its logging strategy. In fact, Sitecore introduced Structured Logging using Serilog in its newest modules like xConnect and Sitecore Host.

In this article, we will discuss Structured Logging to understand what to expect in the future SaaS version of Sitecore. We will look into Sitecore’s current text-based logging strategy based on Log4net and discuss an approach to convert the logging of existing Sitecore applications to Structured Logging.

What is Structured Logging

The idea of logging information in computer programs has existed from the very beginning. Every computer language has some form of printf statement. In the very beginning, printing statements to the console was the way to debug applications. Later on, debugging issues in complex applications became difficult. Especially elusive issues that happen in the production environment required us to persist the logging statements somewhere, in files, databases, etc. This required the industry to come up with logging frameworks that enable us to treat logging as an aspect (Aspect Oriented Programming, AOP) and capture information in different storage of our choice. Log4j/Log4net is a good example of this and has wide acceptance in the industry. Log4net is an interface-based framework for implementing AOP-based logging in applications. Although it comes with several implementations of Appenders (Log4net’s term for the different ways to persist information), it doesn’t restrict anyone from implementing new Appenders or changing existing ones. Sitecore went one step further: they took the Log4net source code and implemented the entire logging stack in the Sitecore.Logging module. This gives them better control over Log4net versioning and implementation. Also, it helps separate implementation-specific logging from Sitecore’s internal logging. If you look at a Sitecore implementation’s bin folder, you will not find any log4net.dll, because everything related to logging is in Sitecore.Logging.dll.

Is Log4net good enough for what we need? Let’s look at how log4net is described on the Apache log4net About page.
The Apache log4net library is a tool to help the programmer output log statements to a variety of output targets. log4net is a port of the excellent Apache log4j™ framework to the Microsoft® .NET runtime.
We can see that log4net doesn’t say what to send in the output log statement, and that creates a huge problem in consuming logs. In other words, there is no structure in what we send to the output log; there is no rule. Log4net doesn’t stop us from creating structure in the log statement; it’s just that the framework was built around Text Logging. In Text Logging, there is no separation between variables and values; everything is included in the message. A text log entry looks like the lines below (taken from a real Sitecore log). When we log this way, we lose information because we cannot, for example, easily find all values of ‘interval’.

35364 12:21:02 INFO  Starting periodic task "ExpiredMessagesCleanup" with interval 00:01:00
35364 12:21:02 INFO  Starting periodic task "CleanupTrackedErrors" with interval 00:01:00

The same logging in Structured Logging looks like below.

thread=35364, time=12:21:02, level=INFO, task="ExpiredMessagesCleanup", interval=00:01:00
thread=35364, time=12:21:02, level=INFO, task="CleanupTrackedErrors", interval=00:01:00

What Structured Logging provides is key/value pairs, and that makes parsing logs much easier. But what matters most is the programming mindset, because if we concatenate everything into a message variable, it becomes nothing but Text Logging. When we adopt Structured Logging, we need to think about what to capture so that the resulting log is easily navigable when troubleshooting difficult application issues. To help with this, Structured Logging standardizes the process, which we will discuss next. A logging framework such as Serilog is built on this Structured Logging concept and provides great tooling and programming extensions for implementing Structured Logging in applications.

Message Template

One thing should be mentioned: Structured Logging is much harder for humans to read than Text Logging because, habitually, it is easier for us to read sentences than key/value pairs. It is a problem that needs to be addressed, because most of the time when we are troubleshooting issues, after querying or navigating through the logs, we read the log entries. This has been addressed using Message Templates. For example, if we want to capture the above-mentioned log entries using a Message Template, we log like below.

log.Information("Starting periodic task {taskname} with interval {interval}", taskname, interval);

The above will produce structured log entries like below.

{
    "thread":"35364",
    "time":"12:21:02",
    "level":"INFO",
    "template":"Starting periodic task {taskname} with interval {interval}",
    "properties":{
        "taskname":"ExpiredMessagesCleanup",
        "interval":"00:01:00"
    }
}
{
    "thread":"35364",
    "time":"12:21:02",
    "level":"INFO",
    "template":"Starting periodic task {taskname} with interval {interval}",
    "properties":{
        "taskname":"CleanupTrackedErrors",
        "interval":"00:01:00"
    }
}

Since the log entries are captured in Message Template format, it is possible to render the entries in a human-readable format by replacing the properties in the template. At the same time, since the properties are captured separately, the entries are also machine readable. Another benefit of using templates is that we can generate a unique hash from the Message Template and use that to group messages. To learn more about Message Templates, visit https://messagetemplates.org/
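As a rough illustration of that idea, a stable identifier can be derived by hashing the template string itself, so every event created from the same template shares the same id regardless of its property values (a minimal sketch; Serilog and Seq compute their own event ids internally).

using System;
using System.Security.Cryptography;
using System.Text;

static class EventTypeHash
{
    // Hash the raw message template (not the rendered message) so that
    // "Starting periodic task {taskname} with interval {interval}" always
    // maps to the same identifier, whatever the property values are.
    public static string FromTemplate(string messageTemplate)
    {
        using (var sha = SHA256.Create())
        {
            byte[] hash = sha.ComputeHash(Encoding.UTF8.GetBytes(messageTemplate));
            return BitConverter.ToString(hash, 0, 4).Replace("-", ""); // short prefix is enough for grouping
        }
    }
}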

Events and Event Types

Every entry captured in the log is due to an event that occurred; event has no different meaning in Structured Logging. Similar to Text Logging, Structured Logging has event levels like debug, information, warning, etc.
In logging, the level is used to reduce the number of logging events captured, to minimize the impact on performance as well as to leave fewer events to parse when troubleshooting. Despite this ability, our experience of troubleshooting issues by reading log entries is, most of the time, overwhelming. Even if we set the minimum logging level to info, there are so many unrelated log entries that finding the entries related to the issue becomes very difficult. If the issues could be categorized into different event types, we could exclude the events we are not interested in and narrow our search, something like facet filtering in search. In Text Logging this is done using an Event Type, but it requires significant effort because we need to plan ahead to create the Event Types and then follow through with that plan while building the application. In Structured Logging we can use the Message Template itself as the Event Type. If we are not interested in a certain type of message, we can exclude those messages based on the Message Template. It is even easier if the Message Template is converted into a hash, because each Message Template’s hash will be unique. For example, if we want to exclude all messages of the kind below, we can exclude all messages created from the Message Template included in the message.

{
    "thread":"35364",
    "time":"12:21:02",
    "level":"INFO",
    "template":"Starting periodic task {taskname} with interval {interval}",
    "properties":{
        "taskname":"ExpiredMessagesCleanup",
        "interval":"00:01:00"
    }
}

Enrichment

Enrichment in Structured Logging is decorating logs with information to build a context so that we can correlate log entries. For example, in a Sitecore application, suppose someone says a component on a page sometimes shows an error for some users connected to a specific CD server. Diagnosing this kind of problem requires building context in the log to narrow down the search. If we can add information like the Sitecore ItemId, UserId, Rendering ItemId, Data Source Id, Machine Name (which CD), etc., we may find that a personalization rule for a group of users is failing because certain items were not published properly to the problematic CD server. In Structured Logging, Enrichment allows us to add information like Machine Name, ThreadId, and even custom properties to the log to build context. Correlating log entries is especially difficult in asynchronous programming, where you cannot really use Time or ThreadId to correlate log messages; in that case, events can be wrapped with a MessageId to identify that they belong to one group. We will discuss Enrichment with more examples in the second part of this blog about Serilog.
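For example, with Serilog (and the Serilog.Sinks.Seq package) the ambient LogContext can push correlation properties onto every event written inside a scope. This is a minimal sketch; the property names such as ItemId and RenderingId are just examples of the kind of context described above.

using System;
using Serilog;
using Serilog.Context;

class EnrichmentExample
{
    static void Main()
    {
        Log.Logger = new LoggerConfiguration()
            .Enrich.FromLogContext()                 // required for LogContext properties to flow into events
            .Enrich.WithProperty("MachineName", Environment.MachineName)
            .WriteTo.Seq("http://localhost:5341")
            .CreateLogger();

        var itemId = Guid.NewGuid();
        var renderingId = Guid.NewGuid();

        // Every event written inside this scope carries ItemId and RenderingId,
        // so related entries can be correlated later in Seq.
        using (LogContext.PushProperty("ItemId", itemId))
        using (LogContext.PushProperty("RenderingId", renderingId))
        {
            Log.Information("Rendering {RenderingId} failed for item {ItemId}", renderingId, itemId);
        }

        Log.CloseAndFlush();
    }
}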

Log Parsing Tools

Tools don’t really have anything to do with Structured Logging concepts, but Structured Logging makes it easier to build tools for slicing and dicing logs. In text logging, since there is a lack of structure, after capturing logs we use generic tools like LogParser or grep to parse them. With logs captured using the Structured Logging concept, it is easier to build tools to parse logs, create reports and charts, and send alerts. Seq is one such tool for parsing logs captured using Serilog. We will look into Seq in the second part of this blog series.

Final Words

In this blog article I discussed the concepts of Structured Logging and tried to associate those concepts with Sitecore logging. Understanding these concepts will help us understand logging in the future SaaS version of Sitecore. Also, my goal is to see if I can convert the logging in current versions of Sitecore to Structured Logging to some extent, so that I can take advantage of the fantastic tooling and extensions available in Serilog. That’s coming in future blog posts. Stay tuned.

References


Sitecore Identity Part 3: Connecting to External Identity Provider

Introduction

The Sitecore Identity Provider was implemented on the IdentityServer4 framework. IdentityServer4 doesn’t dictate how authentication is done or which applications can use the identity provider; it’s up to the implementer to decide that. In the previous blog article, we discussed how a third-party application can authenticate using the Sitecore Identity Provider. In this blog we will look at the other side of Sitecore Identity. We know that Sitecore Identity authenticates users using the membership provider, but Sitecore Identity can delegate authentication to another identity provider too. In fact, Sitecore Identity comes with an inbuilt AzureAd subprovider; if you enable it, you should be able to authenticate against Azure Active Directory. I added the following updated diagram to show how a subprovider fits into the architecture.

Sitecore Identity with Subproviders

Configuring Azure Ad Subprovider

Sitecore provides some documentation about how to configure the out-of-the-box Azure AD subprovider. It’s not very detailed, and it takes some effort to configure. I was going to write about it, but I had a problem setting it up end to end. So, I posted this question on Stack Exchange, and through that I found this excellent blog post which explains everything in detail, so I am skipping that part. I captured the following animation to show how authentication happens once you set up the Azure AD subprovider.

If you set the loginPage attribute of the shell site in Sitecore.Owin.Authentication.IdentityServer.config to
$(loginPath)shell/SitecoreIdentityServer/IdS4-AzureAd, Sitecore will skip the Sitecore Identity Provider login page and use the Azure AD provider directly to authenticate.

Custom External Subprovider

Let’s write some code to implement a custom subprovider. I will be utilizing the IdentityServer demo site https://demo.identityserver.io/ to create our subprovider. The reason to use the IdentityServer demo is that it is simple and no configuration is needed on the IdentityServer side. If you go to that site, you will find several grant types. The one we will be using is the implicit grant type, and the ClientId for that is ‘implicit’. Our subprovider will use this ClientId to get access to the IdentityServer identity provider.
Sitecore Identity is a Sitecore Host application. Sitecore Host is a new framework introduced in Sitecore 9.1 that you can use to create Sitecore services. The benefit of creating services using Sitecore Host is that you get all the common features of Sitecore Host, like logging, dependency injection, etc., right out of the box. The subprovider we will be creating is a Sitecore Host plugin, and since Sitecore Host supports dynamic plugin loading, our plugin will be loaded as soon as we drop it into the ‘sitecoreruntime’ folder under the Sitecore Identity root folder.
You can find the plugin project in the GitHub repo. Its name is Sitecore.Plugin.IdentityProvider.Ids4Demo. Here are the important code snippets.


The Sitecore.Plugin.IdentityProvider.Ids4Demo.xml file contains the configuration of the subprovider. The <Enabled> property should be true to make the subprovider available to Sitecore Identity. <AuthenticationScheme> identifies the subprovider and is used along with the IdentityProvider name in Sitecore to configure which subprovider should be used for each site. <ClientId> contains the Id used to connect to the IdentityServer demo provider. <DisplayName> is the button caption that will be shown on the Sitecore Identity login page for this subprovider. <ClaimsTransformations> are used to translate claims from the subprovider to Sitecore Identity claims.

<?xml version="1.0" encoding="utf-8"?>
<Settings>
  <Sitecore>
    <ExternalIdentityProviders>
      <IdentityProviders>
        <Ids4Demo type="Sitecore.Plugin.IdentityProviders.IdentityProvider, Sitecore.Plugin.IdentityProviders">
          <AuthenticationScheme>IdS4-Ids4Demo</AuthenticationScheme>
          <DisplayName>IdentityServer Demo Identity Provider</DisplayName>
          <Enabled>true</Enabled>
          <ClientId>implicit</ClientId>
          <MetadataAddress></MetadataAddress>
          <ClaimsTransformations>
            <!--Place transformation rules here. -->
            <ClaimsTransformation1 type="Sitecore.Plugin.IdentityProviders.DefaultClaimsTransformation, Sitecore.Plugin.IdentityProviders">
              <SourceClaims>
                <Claim1 type="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/upn" />
              </SourceClaims>
              <NewClaims>
                <Claim1 type="email" />
              </NewClaims>
            </ClaimsTransformation1>
            <ClaimsTransformation2 type="Sitecore.Plugin.IdentityProviders.DefaultClaimsTransformation, Sitecore.Plugin.IdentityProviders">
              <SourceClaims>
                <Claim1 type="http://schemas.xmlsoap.org/ws/2005/05/identity/claims/emailaddress" />
              </SourceClaims>
              <NewClaims>
                <Claim1 type="email" />
              </NewClaims>
            </ClaimsTransformation2>
          </ClaimsTransformations>
        </Ids4Demo>
      </IdentityProviders>
    </ExternalIdentityProviders>
  </Sitecore>
</Settings>

The ConfigureSitecore class adds the subprovider to the services collection with the appropriate options. In our case, the code sets the Authority to the IdentityServer demo provider and also indicates that it is an external identity server to be used for sign-in.

using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.OpenIdConnect;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Logging;
using Sitecore.Framework.Runtime.Configuration;
using Sitecore.Plugin.IdentityProvider.Ids4Demo.Configuration;
using System;
using System.Security.Claims;
using System.Threading.Tasks;

namespace Sitecore.Plugin.IdentityProvider.Ids4Demo
{
    public class ConfigureSitecore
    {
        private readonly ILogger<ConfigureSitecore> _logger;
        private readonly AppSettings _appSettings;

        public ConfigureSitecore(ISitecoreConfiguration scConfig, ILogger<ConfigureSitecore> logger)
        {
            this._logger = logger;
            this._appSettings = new AppSettings();
            scConfig.GetSection(AppSettings.SectionName);
            scConfig.GetSection(AppSettings.SectionName).Bind((object)this._appSettings.Ids4DemoIdentityProvider);
        }

        public void ConfigureServices(IServiceCollection services)
        {
            Ids4DemoIdentityProvider ids4DemoProvider = this._appSettings.Ids4DemoIdentityProvider;
            if (!ids4DemoProvider.Enabled)
                return;
            this._logger.LogDebug("Configure '" + ids4DemoProvider.DisplayName + "'. AuthenticationScheme = " + ids4DemoProvider.AuthenticationScheme + ", ClientId = " + ids4DemoProvider.ClientId, Array.Empty<object>());
            new AuthenticationBuilder(services).AddOpenIdConnect(ids4DemoProvider.AuthenticationScheme, ids4DemoProvider.DisplayName, (Action<OpenIdConnectOptions>)(options =>
            {
                options.SignInScheme = "idsrv.external";
                options.ClientId = ids4DemoProvider.ClientId;
                options.Authority = "https://demo.identityserver.io/";
                options.MetadataAddress = ids4DemoProvider.MetadataAddress;
                options.CallbackPath = "/signin-idsrv";
                options.Events.OnRedirectToIdentityProvider += (Func<RedirectContext, Task>)(context =>
                {
                    Claim first = context.HttpContext.User.FindFirst("idp");
                    if (string.Equals(first != null ? first.Value : (string)null, ids4DemoProvider.AuthenticationScheme, StringComparison.Ordinal))
                        context.ProtocolMessage.Prompt = "select_account";
                    return Task.CompletedTask;
                });
            }));
        }
    }
}

Copy the compiled plugin and its configuration into the following directory structure under the Sitecore Identity root folder.

sitecoreruntime
│ license.xml

└───production
│ Sitecore.Plugin.IdentityProvider.Ids4Demo.dll
│ Sitecore.Plugin.IdentityProvider.Ids4Demo.xml

└───sitecore
└───Sitecore.Plugin.IdentityProvider.Ids4Demo
│ Sitecore.Plugin.manifest

└───Config
Sitecore.Plugin.IdentityProvider.Ids4Demo.xml

Launch the Sitecore shell site. As you are redirected to the Sitecore Identity site, you should see a new login button for the new subprovider, as shown in the picture below.

Click the new subprovider button; it should take you to the IdentityServer demo provider for authentication. Enter username bob and password bob, and you will be signed in and redirected to the Sitecore Identity provider. It works like below.


Unfortunately, at this point, when the website is redirected to Sitecore Identity, it throws a server error. The log shows the following error: ‘The payload was invalid’.

System.Security.Cryptography.CryptographicException: The payload was invalid.
   at Microsoft.AspNetCore.DataProtection.Cng.CbcAuthenticatedEncryptor.DecryptImpl(Byte* pbCiphertext, UInt32 cbCiphertext, Byte* pbAdditionalAuthenticatedData, UInt32 cbAdditionalAuthenticatedData)
   at Microsoft.AspNetCore.DataProtection.Cng.Internal.CngAuthenticatedEncryptorBase.Decrypt(ArraySegment`1 ciphertext, ArraySegment`1 additionalAuthenticatedData)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.UnprotectCore(Byte[] protectedData, Boolean allowOperationsOnRevokedKeys, UnprotectStatus& status)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.DangerousUnprotect(Byte[] protectedData, Boolean ignoreRevocationErrors, Boolean& requiresMigration, Boolean& wasRevoked)
   at Microsoft.AspNetCore.DataProtection.KeyManagement.KeyRingBasedDataProtector.Unprotect(Byte[] protectedData)
   at Microsoft.AspNetCore.DataProtection.DataProtectionCommonExtensions.Unprotect(IDataProtector protector, String protectedData)
   at IdentityServer4.Infrastructure.DistributedCacheStateDataFormatter.Unprotect(String protectedText, String purpose)
   at Microsoft.AspNetCore.Authentication.OpenIdConnect.OpenIdConnectHandler.ReadPropertiesAndClearState(OpenIdConnectMessage message)
   at Microsoft.AspNetCore.Authentication.OpenIdConnect.OpenIdConnectHandler.HandleRemoteAuthenticateAsync()

I couldn’t get to the bottom of this problem. I checked the id_token, and the payload looks valid. So, I opened a support ticket with Sitecore. I have to wait until Sitecore provides me an explanation of why the error happens for other identity providers but not for Azure AD. The same problem happens with Okta too.

I decided to publish this blog before the issue is solved because I don’t see any problem with the approach. As soon as I get a resolution, I will update the blog. Fingers crossed.

References

Update

Sitecore answered my question about the issue I mentioned above, and it resolved the problem. In the ConfigureSitecore class’s ConfigureServices method, the CallbackPath was missing. After I added options.CallbackPath = “/signin-idsrv”;, my custom identity provider started working without any problem. I also updated the code in GitHub.


Sitecore Identity Part 2: Identity Client and Resource Authorization

Introduction

In the previous blog we discussed the fundamentals of OAuth 2.0 and OpenID Connect. We discussed the implementation of these two protocols in Sitecore Identity Server using the IdentityServer4 framework. We also discussed how Sitecore Identity will play a role in Sitecore’s Micro Service Architecture.
In this blog, we will show how we can create a Sitecore Identity Client that authenticates against Sitecore Identity Server and how this client can be authorized via Sitecore Identity Server to use Sitecore services. For the purpose of demonstration, I have created an ASP.NET Core MVC client that uses the hybrid grant type to authenticate and authorize against Sitecore Identity Server. I also created a sample API service that returns data using the Sitecore Layout Service and is scoped using the sitecore.profile.api scope. This sample service can be seen as a Sitecore resource.
The following diagram shows how the different parts of this example fit into the architecture.

The following animation shows my Sitecore Identity Client in action. It shows:

  • Client is authenticated by Sitecore Identity Server and displays the identity information.
  • Renewing token from the Identity Server.
  • Calling Sitecore Service to demonstrate authorizing Sitecore Resource via Sitecore Identity.
  • Logging out.

Sitecore Identity Client in Action

Sitecore Identity Configuration

For a Sitecore Identity Client to be recognized by Sitecore Identity Server, we have to provide some information to it. There are two ways to do this: we can use dependency injection, or we can register the client in IdentityServer.xml located in <website root>\sitecore\Sitecore.Plugin.IdentityServer\Config. There is a Clients section in IdentityServer.xml; we can either use the DefaultClient, which is used by Sitecore (Sitecore itself is a Sitecore Identity Client), or we can add a new client. I chose to add a new client. Let’s go through that and understand the important client configuration. The following fields are the most important.

  • ClientId: Uniquely identifies a client’s behavior. When a client uses this Id, Sitecore Identity Server uses it to find the configuration to be used for the client. The same ClientId can be used by more than one client. For example, instead of creating a new client in the config for my Identity Server client, I could have used DefaultClient, which is used by Sitecore.
  • ClientSecret: Client Secret is used to authenticate client against Sitecore Identity Server. This field is optional.
  • AllowOfflineAccess: Sitecore Identity supports the offline_access scope, which is used to renew the access token. This field determines whether the offline_access scope is supported for the client. If not supported, the client will not be able to renew the token and will have to authenticate again to get a new token after the token expires.
  • RequireConsent: This field determines whether the Identity Server needs to get consent from the resource owner before the client is authorized to access the resources. Typically, this is an intermediate page implemented in the Identity Server to show the user the list of scopes she is granting to the client. Sitecore Identity Server hasn’t implemented this page yet, so if you set this field to ‘true’, you will get a page-not-found error.
  • AllowedGrantTypes: This field lists the OAuth grant types allowed for the client. Supported grant types for Sitecore Identity can be found in the previous blog.
  • RedirectUris: This field contains the list of URIs where the Identity Server will redirect the user after login. The OpenID Connect authentication middleware handles the /signin-oidc route and populates the user information before redirecting the user to the return URL. The {AllowedCorsOrigin} token is replaced with the AllowedCorsOrigins field mentioned below. Sitecore itself uses /identity/signin for RedirectUris, not the standard OIDC route.
  • PostLogoutRedirectUris: This field is similar to RedirectUris, except that this is the URL where the Identity Server redirects the client after logging out. The standard OIDC route is /signout-callback-oidc.
  • AllowedCorsOrigins: This field lists the domain names allowed to use Identity Server for authentication and authorization. In my case I am running the client website at http://localhost:5002.
<ExternalClient>
  <ClientId>sitecoremvc</ClientId>
  <ClientName>sitecoremvcclient</ClientName>
  <ClientSecrets>
    <ClientSecret1>abracadabra</ClientSecret1>
  </ClientSecrets>
  <AccessTokenType>0</AccessTokenType>
  <AllowOfflineAccess>true</AllowOfflineAccess>
  <AlwaysIncludeUserClaimsInIdToken>false</AlwaysIncludeUserClaimsInIdToken>
  <AccessTokenLifetimeInSeconds>3600</AccessTokenLifetimeInSeconds>
  <IdentityTokenLifetimeInSeconds>3600</IdentityTokenLifetimeInSeconds>
  <AllowAccessTokensViaBrowser>true</AllowAccessTokensViaBrowser>
  <RequireConsent>false</RequireConsent>
  <RequireClientSecret>true</RequireClientSecret>
  <AllowedGrantTypes>
    <AllowedGrantType1>hybrid</AllowedGrantType1>
    <AllowedGrantType2>client_credentials</AllowedGrantType2>
  </AllowedGrantTypes>
  <RedirectUris>
    <RedirectUri1>{AllowedCorsOrigin}/signin-oidc</RedirectUri1>
  </RedirectUris>
  <PostLogoutRedirectUris>
    <PostLogoutRedirectUri1>{AllowedCorsOrigin}/signout-callback-oidc</PostLogoutRedirectUri1>
  </PostLogoutRedirectUris>
  <AllowedCorsOrigins>
    <AllowedCorsOrigin1>http://localhost:5002</AllowedCorsOrigin1>
  </AllowedCorsOrigins>
  <AllowedScopes>
    <AllowedScope1>openid</AllowedScope1>
    <AllowedScope2>sitecore.profile</AllowedScope2>
    <AllowedScope3>sitecore.profile.api</AllowedScope3>
  </AllowedScopes>
  <UpdateAccessTokenClaimsOnRefresh>true</UpdateAccessTokenClaimsOnRefresh>
</ExternalClient>

Sitecore Identity Client

Now we have to create an Identity Client. For this I haven’t reinvented the wheel: I took the same approach as the IdentityServer samples and added a new sample client for Sitecore Identity Server. My samples can be found here. This is a fork of the original IdentityServer4 samples with a Sitecore Client and a sample Sitecore API (discussed in the next section).
I added the following code in Startup.cs, which configures the client. The lines we need to pay attention to are Authority, ClientId, ClientSecret, and the scopes. Authority is the Sitecore Identity Server URL. ClientId and ClientSecret are the same as in the Sitecore Identity Configuration above. Also, as declared in the scopes, this client wants to authenticate using OpenID Connect, access the Sitecore profile and profile API, and renew the expired access token without logging in again.

using System;
using System.IdentityModel.Tokens.Jwt;
using System.Net.Http;
using Clients;
using IdentityModel;
using IdentityModel.Client;
using Microsoft.AspNetCore.Authentication;
using Microsoft.AspNetCore.Authentication.Cookies;
using Microsoft.AspNetCore.Builder;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.IdentityModel.Tokens;

namespace SitecoreMvcClient
{
    public class Startup
    {
        public Startup(IConfiguration configuration)
        {
            JwtSecurityTokenHandler.DefaultInboundClaimTypeMap.Clear();
        }

        // This method gets called by the runtime. Use this method to add services to the container.
        public void ConfigureServices(IServiceCollection services)
        {
            services.AddMvc();
            services.AddHttpClient();
            services.AddSingleton<IDiscoveryCache>(r =>
            {
                var factory = r.GetRequiredService<IHttpClientFactory>();
                return new DiscoveryCache(Constants.SitecoreAuthority, () => factory.CreateClient());
            });
            services.AddAuthentication(options =>
            {
                options.DefaultScheme = CookieAuthenticationDefaults.AuthenticationScheme;
                options.DefaultChallengeScheme = "oidc";
            })
            .AddCookie(options =>
            {
                options.ExpireTimeSpan = TimeSpan.FromMinutes(60);
                options.Cookie.Name = "sitecoremvcclient";
            })
            .AddOpenIdConnect("oidc", options =>
            {
                options.Authority = Constants.SitecoreAuthority;
                options.RequireHttpsMetadata = false;
                options.ClientSecret = "abracadabra";
                options.ClientId = "sitecoremvc";
                options.ResponseType = "code id_token";
                options.Scope.Clear();
                options.Scope.Add("openid");
                options.Scope.Add("sitecore.profile");
                options.Scope.Add("sitecore.profile.api");
                options.Scope.Add("offline_access");
                options.ClaimActions.MapAllExcept("iss", "nbf", "exp", "aud", "nonce", "iat", "c_hash");
                options.GetClaimsFromUserInfoEndpoint = true;
                options.SaveTokens = true;
                options.TokenValidationParameters = new TokenValidationParameters
                {
                    NameClaimType = JwtClaimTypes.Name,
                    RoleClaimType = JwtClaimTypes.Role,
                };
            });
        }

        // This method gets called by the runtime. Use this method to configure the HTTP request pipeline.
        public void Configure(IApplicationBuilder app, IHostingEnvironment env)
        {
            app.UseDeveloperExceptionPage();
            app.UseAuthentication();
            app.UseStaticFiles();
            app.UseMvcWithDefaultRoute();
        }
    }
}

When the following method in the HomeController executes, the authentication middleware redirects the user to the Authority URL set in Startup.cs, in this case the Sitecore Identity Server. Once the user authenticates, the middleware uses the RedirectUri to collect user information and redirects the user to the secure page, where the user, claims, and token information are displayed.

[Authorize]
public IActionResult Secure()
{
    ViewData["Message"] = "Secure page.";
    return View();
}

There are methods in the HomeController to handle logout and renewal of the token. You can download the code or go to GitHub and explore; I have added the GitHub link at the end of this blog.
The following method makes a call to an API that retrieves data using the Sitecore Layout Service API, and this API resource is protected by Sitecore Identity Server. The API is like a microservice that the Identity Client accesses using the access token the Identity Server generated on the resource owner’s behalf.

[Authorize]
public async Task<IActionResult> CallSitecoreApi()
{
    var token = await HttpContext.GetTokenAsync("access_token");
    var client = _httpClientFactory.CreateClient();
    client.SetBearerToken(token);
    var response = await client.GetStringAsync(Constants.SitecoreApi + "sitecorelayout");
    var obj = JsonConvert.DeserializeObject(response);
    ViewBag.Json = JsonConvert.SerializeObject(obj, Formatting.Indented);
    return View();
}

Sitecore Service API

I created a sample API project that uses the Layout Service API to retrieve data for the root item from my Sitecore 9.1 instance. For this I had to install the Sitecore JSS Server package. I provided the prerequisites for this API project in the Readme of the GitHub repository.
If you look at the Startup code for this project, the APIs from this project can be accessed using a Bearer token generated by the Sitecore Identity Server, and the APIs are available only to clients that have been granted the sitecore.profile.api scope. This API project only allows clients coming from http://localhost:5002.

using Clients;
using Microsoft.AspNetCore.Builder;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.AspNetCore.Authentication.JwtBearer;
using System.Threading.Tasks;
using Microsoft.Extensions.Logging;

namespace SitecoreApi
{
    public class Startup
    {
        private readonly ILogger<Startup> _logger;

        public Startup(ILogger<Startup> logger)
        {
            _logger = logger;
        }

        public void ConfigureServices(IServiceCollection services)
        {
            services
                .AddMvcCore()
                .AddJsonFormatters()
                .AddAuthorization();
            services.AddCors();
            services.AddDistributedMemoryCache();
            services.AddAuthentication("Bearer")
                .AddJwtBearer("Bearer", options =>
                {
                    options.Authority = Constants.SitecoreAuthority;
                    options.RequireHttpsMetadata = false;
                    options.Audience = "sitecore.profile.api";
                });
        }

        public void Configure(IApplicationBuilder app)
        {
            app.UseCors(policy =>
            {
                policy.WithOrigins("http://localhost:5002");
                policy.AllowAnyHeader();
                policy.AllowAnyMethod();
                policy.WithExposedHeaders("WWW-Authenticate");
            });
            app.UseAuthentication();
            app.UseMvc();
        }
    }
}

There is only one API controller, which simply returns layout data using the Sitecore Layout Service, as shown below.

using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.AspNetCore.Authorization;
using Microsoft.AspNetCore.Mvc;
using Newtonsoft.Json;
namespace SitecoreApi
{
[Route("sitecorelayout")]
[Authorize]
public class SitecoreLayoutController : ControllerBase
{
public async Task<IActionResult> Get()
{
var client = new HttpClient();
var content = await client.GetStringAsync("http://sitecore910.sitecore/sitecore/api/layout/render/jss?item=/&sc_apikey={9B8A0FCF-BA5A-483E-9AB0-E263866B9EAF}");
return new JsonResult(JsonConvert.DeserializeObject(content));
}
}
}

Conclusion

In this blog we discussed how to register a client for Sitecore Identity Server, how to create a Sitecore Identity Client, and how to create a resource that can be authorized by Sitecore Identity Server. All the code created for this blog can be downloaded from the GitHub link in the reference section below.

References


Sitecore Identity Part 1

Introduction

There was a time when applications were simple and so was application security. Things have changed with the rise of the internet. Software applications are no longer confined to a desktop, a server, or an organization. Today when we think about applications we often think about services. This service-based architecture has made application security a big challenge. An application built today probably has to use multiple services, developed and hosted by different organizations or different teams within the same organization. How can an application use services securely, so that proper authentication and authorization can be applied? How does security flow from the application to the services?

The questions I posed above are not new challenges brought on solely by the rise of the internet or service-based architecture. We looked for answers to these questions when we started building browser-based applications. In the simplest form, we authenticated users using form-based authentication, initially custom and later on using something like the Microsoft Membership Provider or Windows Authentication. But the application itself often uses other resources. A common example is using a database. In such a case, the popular approach is to use a database connection string and provide authentication information via the connection string. What if we want to authenticate to the database using the user’s original credentials (delegated authorization)? That requires the application to impersonate the user, and it is a much harder problem to solve. On the Windows platform it was solved by using the Kerberos authentication protocol. It is much harder to use Kerberos in service-based, distributed internet applications across domains, where applications and services use a variety of platforms. The only thing such applications have in common is the HTTP protocol, so we needed a security protocol that uses HTTP as the communication protocol. Enter OAuth, an open protocol for authentication and authorization.

Why am I not talking about Sitecore Identity Server yet? Because I do not want to explain things superficially. Sitecore Identity Server is built on IdentityServer4, which is a framework for building an Identity Provider based on OAuth 2.0 and OpenID Connect. If we do not understand the problem at hand, we will not be able to understand why Sitecore had to create the Identity Server, or what the architectural need is for the future Sitecore platform to support authentication and authorization this way.

Identity Problem

In the following table I have tried to capture the identity problems of modern web applications and the technologies used to address them.

Form Authentication

The simplest way to authenticate and authorize a user is to use form-based authentication, where we take the user’s credentials and validate them against information stored in a database or another system like Active Directory. It is not that Form Authentication cannot be made secure enough; it is just that there are a lot of limitations when the application needs to integrate with other systems securely. Sitecore, before 9.1, used Form Authentication via the Microsoft Membership Provider. In Sitecore 9.1, Sitecore removed security from the core application and created a separate Identity Provider. This lets Sitecore use the same Identity Service for securing microservices, which is where Sitecore is heading architecturally. You can still use Form Authentication in Sitecore 9.1 if you need to. You need to enable the config Sitecore.Owin.Authentication.IdentityServer.Disabler.config.example, which disables Sitecore Identity Server based authentication and resets the login page to the old login page.

Security Assertion Markup Language (SAML)

One of the most important requirements in web-based applications is Single Sign-On (SSO). With SSO you can sign in once and authenticate to one or many applications without logging in again. Although it is possible to use Form Authentication with cookies to authenticate to an application once and keep using it without logging in again as long as the cookie doesn’t expire, this only works within the same domain. You cannot use a cookie across domains because of the browser’s Same Origin Policy (SOP). So, you cannot SSO to more than one application with different domain names.
Security Assertion Markup Language (SAML) is an open standard based on an XML markup language for security assertions. SAML addresses cross-domain SSO by separating the Service Provider from the Identity Provider.
The following diagram shows the SAML SSO flow. It shows how SAML authenticates the user using the Identity Provider and uses the same authentication information to log in to applications (Service Providers) across different domains.

SAML SSO Flow

OAuth

SAML addressed the SSO issue by separating the Identity Provider from the Service Provider. Now we can use one system to authenticate to all applications. But, in the above diagram, consider this scenario: what if SP1 wants to use a service from SP2? How can SP1 authenticate to SP2 on behalf of the user without giving SP2 the user’s password? This is the Delegated Authorization problem, and OAuth addresses it.

Let’s look at the following diagram to understand how OAuth addresses the Delegated Authorization problem. You can log in to StackExchange using your Google account. If you choose to do that, StackExchange will connect to Google to access your Google profile and email. Although I said log in, which is authentication, in this case the authorization to access the user’s Google profile indicates that the user has authenticated to Google successfully. The diagram shows how that communication works.

We will discover more about OAuth 2.0 communication when we discuss the OAuth 2.0 code flow.

OpenID Connect

OAuth 2.0 seems to address all the problems that we wanted to solve. Then why do we need OpenID Connect? To answer this, we need to understand that OAuth 2.0 is an authorization framework; it is not an authentication protocol. People started using OAuth 2.0 for authentication as well, and as a result each solution became one of a kind. We needed a standard way to do authentication on an OAuth 2.0 authorization server. OpenID Connect is a protocol on top of OAuth 2.0 to verify the identity of the end user on an authorization server.

Proof Key for Code Exchange (PKCE)

PKCE, pronounced “pixy”, is a way to authorize securely with the authorization server without using a Client Secret. What is a Client Secret? In the above diagram, when StackExchange connects to Google and asks for the Request Token, it first needs to identify itself to Google. It provides a Client Id and Client Secret, which it already provided to Google when it registered the application with Google. The Client Secret should be protected by StackExchange; otherwise, any application could claim that it is StackExchange. Because of the sensitive nature of the Client Secret, it is only used to authorize web server applications. You wouldn’t use it in a JavaScript-based application or a mobile application, because there it can be seen or intercepted. The PKCE protocol helps a device or mobile app address this problem. Since understanding PKCE is not needed to understand OAuth, we will not explore it much further beyond the short sketch below.
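As a minimal C# sketch (my own illustration, not Sitecore code), this is how a public client could generate the code_verifier and code_challenge pair defined by RFC 7636. The discovery document shown later in this post lists plain and S256 as the supported code challenge methods.

using System;
using System.Security.Cryptography;
using System.Text;

public static class PkceSketch
{
    // Generates a high-entropy code_verifier (base64url, 43+ characters per RFC 7636).
    public static string CreateCodeVerifier()
    {
        var bytes = new byte[32];
        using (var rng = RandomNumberGenerator.Create())
        {
            rng.GetBytes(bytes);
        }
        return Base64UrlEncode(bytes);
    }

    // code_challenge = BASE64URL(SHA256(code_verifier)) for the S256 method.
    public static string CreateCodeChallenge(string codeVerifier)
    {
        using (var sha256 = SHA256.Create())
        {
            return Base64UrlEncode(sha256.ComputeHash(Encoding.ASCII.GetBytes(codeVerifier)));
        }
    }

    private static string Base64UrlEncode(byte[] bytes) =>
        Convert.ToBase64String(bytes).TrimEnd('=').Replace('+', '-').Replace('/', '_');
}

The client sends the code_challenge and code_challenge_method=S256 on the authorize request, and later proves possession by sending the original code_verifier to the token endpoint.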

OAuth 2.0 Terminology

Let’s talk about some OAuth 2.0 terminology, because we need to know it to understand the OAuth Code Flow (Code Flow is a term too).

Resource Owner

The Resource Owner is really the user. In OAuth terms it is called the Resource Owner because OAuth provides access to some resources to another application on behalf of the owner of those resources. For example, StackExchange is asking for access to my Google profile, which I own. In this case, I am the Resource Owner.

Client

The Client is the application that is asking for access to the resources. In the StackExchange example, StackExchange is the Client.

Authorization Server

The Authorization Server is the system that authorizes the user: the server the Client connects to and asks for access to some resources. In the StackExchange example, Google is the Authorization Server.

Resource Server

The Resource Server is the one that contains the resources. In our StackExchange example, the Client is asking to access the user’s profile, and in this case the profile is also stored on a Google server. A resource can be something other than a profile; for example, it could be documents in a folder in Google Drive. In that case Google Drive is the Resource Server.

Authorization Grant

The Authorization Grant is the Resource Owner’s (user’s) permission for the Client to access the Resource. In the StackExchange example, when the user says “Yes” to Google to give access to StackExchange, the user grants the permission and Google provides an authorization code to StackExchange, so that StackExchange can get an Access Token, which it will use to access the user’s profile. There are different mechanisms to communicate an Authorization Grant. We will discuss the different grant types later.

Access Token

The Client’s ultimate goal is to get the Access Token. Once the client has the Access Token, it can get to the resources on the Resource Server anytime, until the Access Token expires.
In OAuth 2.0 an Access Token is just a string. The Access Token is validated before a resource is delivered. One way to validate an Access Token is to store it on the server; every time the client sends the Access Token, the resource server checks it against the stored token. This is why such an Access Token is sometimes called a Reference Token. If the Access Token is a JSON Web Token (JWT), it can be validated in both a stateless and a stateful way. A JWT Access Token is signed, so the recipient can validate it using the signature alone; this is stateless validation, because the token doesn’t need to be stored on the authorization server. A JWT Access Token can also be validated against the server using the OpenID Connect introspection endpoint, which is the stateful approach. We will discuss more about the OpenID Connect endpoints later.

Scope

A Scope is a string value, already defined on the authorization server, that describes the granularity of resource access. The Client sends the Scope to the authorization server to say what level of resource access it needs. In the StackExchange example, it needs access to the user’s profile, so the Scope can be the string value “profile”.

Consent

The flip side of Scope is Consent. The user gets to see the scope of access that the client is asking for and grants access to specific scopes. In the StackExchange example, Google will present the profile scopes to the user for consent, and the user has to give consent.

Revisit the Authorization

Now that we understand the OAuth 2.0 terminology, let’s revisit the StackExchange example one more time. There are a few things to notice in the following diagram, and a small code sketch of the token exchange (steps 6 and 7) follows the list.

1. StackExchange provides the client_id so that Google can identify the client.
2. StackExchange tells Google what grant type (querystring response_type) to use, which is code. This means Google will exchange a code to tell StackExchange that the user agreed to give access to the resources.
3. StackExchange tells Google what URL (querystring redirect_uri) to use to send the authorization code by providing a callback URI.
4. StackExchange tells Google what resources it needs to access using the scope (querystring scope).
5. Once the user gives consent, Google sends the authorization code to StackExchange.
6. StackExchange sends the authorization code and the client secret to Google and asks for an access token.
7. Google verifies the client secret to verify the client, verifies the authorization code, and sends an access token to StackExchange with the access scope included in the token.
8. StackExchange uses the access token to access resources. If StackExchange tries to access anything outside the scope included in the access token, Google denies access.
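To make steps 6 and 7 concrete, here is a minimal C# sketch of the back channel code-for-token exchange. The endpoint URL, client id, and client secret are placeholders for illustration; Google and other providers publish their own token endpoints and may require additional parameters.

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

public static class TokenExchangeSketch
{
    // Steps 6 and 7: the client posts the authorization code and its client secret
    // to the token endpoint over the back channel and receives an access token.
    public static async Task<string> ExchangeCodeAsync(string authorizationCode)
    {
        using (var client = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "authorization_code",
                ["code"] = authorizationCode,
                ["redirect_uri"] = "https://client.example.com/callback", // must match the authorize request
                ["client_id"] = "my-client-id",         // placeholder
                ["client_secret"] = "my-client-secret"  // placeholder, never exposed to the browser
            });

            var response = await client.PostAsync("https://authorization-server.example.com/connect/token", body);
            response.EnsureSuccessStatusCode();

            // JSON containing access_token, token_type, expires_in (and id_token for OpenID Connect).
            return await response.Content.ReadAsStringAsync();
        }
    }
}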

Back Channel and Front Channel

We will talk about the different grant types later, but we already discussed one grant type above: the Authorization Code grant type. The channel is the way a grant type communicates with the authorization server. If it is server-side communication, like the StackExchange web server talking to the Google authorization server, it is Back Channel communication. In the above example, when StackExchange asks for the access token by sending the authorization code and client secret, that is Back Channel communication. If the communication happens only through the browser, it is Front Channel communication. In the above example, exchanging the authorization code via the StackExchange callback URI is Front Channel communication. The authorization code gets transferred via a query string or a form post (querystring response_mode). Back Channel communication is more secure than Front Channel communication, as it happens server to server and the chance of the communication being intercepted is a lot less.

Authentication using OpenID Connect

We briefly discussed OpenID Connect. Let’s take a look at the same StackExchange workflow if the authentication is done using OpenID Connect. As discussed before, OpenID Connect is a thin layer on top of OAuth 2.0 that standardizes authentication using an OAuth authorization server. To use OpenID Connect, both the authorization server and the client have to implement the OpenID Connect protocol.
The following diagram shows the Code Flow when the OpenID Connect protocol is used. The flow is exactly the same as the one we described in the Revisit the Authorization section, except that the scope includes openid and the client gets an id_token back. The id_token includes the user’s information (we will discuss the content of the id_token later). Also, the authorization server implements a standard API endpoint, /userinfo, which can be accessed using the access token to retrieve more user information if needed.

In the above discussion, I used StackExchange as an example to describe the OAuth authorization flow; in real life, StackExchange authentication and authorization using a Google account involves a lot more than what I described. The concept is not different, though. Let’s look at the requests below. I used a bold font so that you can recognize the parameters from our earlier discussion.

Request to Google:
https://accounts.google.com/o/oauth2/auth?client_id=717762328687-p17pldm5fteklla3nplbss3ai9slta0a.apps.googleusercontent.com&scope=profile+email&redirect_uri=https://stackauth.com/auth/oauth2/google&state={"sid":4,"cdl":"https://stackexchange.com/users/login-or-signup/delegated?returnurl=https%3a%2f%2fstackexchange.com%3f_%3d247192781","st":"b332ab522f5a916ec48228026c76ff3f6f249c0392328cab4563a433d0dc405b","ses":"018274bb2cd74c12a71f175403243ff8"}&response_type=code HTTP/1.1
Callback from Google:
https://stackauth.com/auth/oauth2/google?state={"sid":4,"cdl":"https://stackexchange.com/users/login-or-signup/delegated?returnurl=https%3a%2f%2fstackexchange.com%3f_%3d247192781","st":"b332ab522f5a916ec48228026c76ff3f6f249c0392328cab4563a433d0dc405b","ses":"018274bb2cd74c12a71f175403243ff8"}&code=4/rgBAjiN72soIG0mXXOLC5tN8OOPOaNVGNeJ_M1ajoZ9GvwxxxxxQBUQtd4LP9C3v0DIZQ2-ZhMycPFOLhudH-7A&scope=email+profile+https://www.googleapis.com/auth/userinfo.email+https://www.googleapis.com/auth/userinfo.profile HTTP/1.1
StackExchange Redirects to:
https://meta.stackexchange.com/users/oauth/google?code=
4/rgBAjiN72soIG0mXXOLC5tN8OOPOaNVGNeJ_M1ajoZ9GvwxxxxxQBUQtd4LP9C3v0DIZQ2-ZhMycPFOLhudH-7A&state={"sid":4,"cdl":"https://stackexchange.com/users/login-or-signup/delegated?returnurl=https%3a%2f%2fstackexchange.com%3f_%3d247192781","st":"b332ab522f5a916ec48228026c76ff3f6f249c0392328cab4563a433d0dc405b","ses":"018274bb2cd74c12a71f175403243ff8"}&s=018274bb2cd74c12a71f175403243ff8 HTTP/1.1

Things to notice: in the second call, Google changed the scope parameter to include the profile and email APIs. The third call, which is a redirect to another StackExchange URL, shows the authorization code. It doesn’t look like StackExchange is using OpenID Connect, as I don’t see openid in the scope.

OpenID Connect Terminology

OpenID Connect is built on OAuth 2.0, so a lot of the terminology discussed for OAuth 2.0 is used in OpenID Connect; but OpenID Connect was created specifically for authenticating the user on the authorization server.

id_token

The id_token in OpenID Connect returns the user’s information. The difference between the access token and the id_token is that if you want to get information about the user using the access token, you have to make another call to some API endpoint; the id_token saves that round trip to the server.
The id_token is a JSON Web Token (JWT): base64url-encoded JSON. There are three parts in an id_token, separated by a period (‘.’). The first part is the header, and it contains information such as which signing algorithm is used. The second part is the payload, which contains information like the user id, token expiration time, etc. The third part of the token is the signature, used to validate that the token was not tampered with. The client has the public key for the private key that generated the token, and using that, the client can validate the token.
The easiest way to decode an id_token is to use the decoder at jwt.io. If you have a Sitecore 9.1 instance, go to the Sitecore login screen; it will redirect you to the identity server URL. Copy that URL and decode it using an online URL decoder. Below is the one from my local instance. I decoded it to make it readable and bolded the parts that we already discussed and that might interest you.

https://xp910.identityserver/account/login?returnUrl=/connect/authorize/callback?client_id=Sitecore&response_mode=form_post&response_type=code id_token token&scope=openid sitecore.profile&state=OpenIdConnect.AuthenticationProperties=NIRQvgTO6YlOMQdZkYlKizcVzNV1Felc1SpP4XRYoGtM54aQ3TLipFLsPtnoMjKhbuuNYqHUgefEy4BZkuIjY43NbTPE8NWWrpTTxpQ8P8LMg2o7ZHaSCO8uXRK8A31vO1EoXz1O0RnBuha7GlnN2jjPCfYNuVNl2S4fTiNiMuVBhGPMS7FMPdSMFj0XabYWgCrTCGnxoILALWWksa5cvw&nonce=636797438206701381.MWE4OWJlNzMtZTFhYS00OWViLWE1ZWMtMGFmNzNhOWMzZWE0Mjk2NmRhZDctZDg4Yy00ZDEwLWI2MzgtY2VlMDU0YWQ1NDQx&redirect_uri=http://xp910.sitecore/identity/signin&sc_account_prefix=sitecore\&x-client-SKU=ID_NET451&x-client-ver=5.2.2.0

Now, start Fiddler to capture HTTP requests and see what response comes back from Sitecore Identity Server once you authorize. Mine is shown below. Since response_mode is form_post, the identity server posts a form to the URL that was included in the URL above. You can take the id_token and copy it into the jwt.io decoder to see the header and payload information. Also, notice that the access token is a stateless JWT access token.

<form method='post' action='http://xp910.sitecore/identity/signin'><input type='hidden' name='code' value='1b6f995dc665faaffaef7db59365b8bae680a800bf7630cf612033d6fdacb512' />
<input type='hidden' name='id_token' value='eyJhbGciOiJSUzI1NiIsImtpZCI6IkE2ODVCMkFBRkU1NkFEMzQ5MDA4Nzg2NzI3NTEzMkM1QUE3ODdFNDkiLCJ0eXAiOiJKV1QiLCJ4NXQiOiJwb1d5cXY1V3JUU1FDSGhuSjFFeXhhcDRma2sifQ.eyJuYmYiOjE1NDQxNDkwNzMsImV4cCI6MTU0NDE1MjY3MywiaXNzIjoiaHR0cHM6Ly94cDkxMC5pZGVudGl0eXNlcnZlciIsImF1ZCI6IlNpdGVjb3JlIiwibm9uY2UiOiI2MzY3OTc0MzgyMDY3MDEzODEuTVdFNE9XSmxOek10WlRGaFlTMDBPV1ZpTFdFMVpXTXRNR0ZtTnpOaE9XTXpaV0UwTWprMk5tUmhaRGN0WkRnNFl5MDBaREV3TFdJMk16Z3RZMlZsTURVMFlXUTFORFF4IiwiaWF0IjoxNTQ0MTQ5MDczLCJhdF9oYXNoIjoiMHlIMW5EWENucS1udE0zUm9WemNzUSIsImNfaGFzaCI6IjZTRy10Zy1WUmJqak5rNlk3aER4TEEiLCJzaWQiOiJiNGRjNmM0ZGY1ZDI4NDQwMjhjYzNhMTdiNTUxMzc3MSIsInN1YiI6IjcwZDhkYWJiMTcwZjQ2MzBhZDRhMzQ1Nzc3Y2M0NGQ4IiwiYXV0aF90aW1lIjoxNTQ0MTQ5MDczLCJpZHAiOiJsb2NhbCIsImFtciI6WyJwd2QiXX0.hb4TgdGIeiqOqgAinEOj7Ih5fVUhb6cqgQN_7gPptvuA3hHwDCavymYvoGM3GAqxAoyNjauOZ0hTa-yrmLB-VpE2AQP9VbEqvGZTk6Y_I5lau08zijQPsrjTgsXyrfbjkzcsqDzfEyQ3igiRbYAkwwd5RxkogyxS-KQehrqBVcKkMZ1gXDqGyeJVmPmTQITN-8QgqgTneFbzjfxEFVEkCxGN6kkUki507FYwRBIFr17XMNMw44ODnBjHogydwlQMvmUcUsa_W8PcgnsHKOmtcgmNESOpJbl-_TmuP0_m3JGqehSSTQD9UU7tKFnAZJo6xJHJ81VKpFdJ2ADMat0zeA' />
<input type='hidden' name='access_token' value='eyJhbGciOiJSUzI1NiIsImtpZCI6IkE2ODVCMkFBRkU1NkFEMzQ5MDA4Nzg2NzI3NTEzMkM1QUE3ODdFNDkiLCJ0eXAiOiJKV1QiLCJ4NXQiOiJwb1d5cXY1V3JUU1FDSGhuSjFFeXhhcDRma2sifQ.eyJuYmYiOjE1NDQxNDkwNzMsImV4cCI6MTU0NDE1MjY3MywiaXNzIjoiaHR0cHM6Ly94cDkxMC5pZGVudGl0eXNlcnZlciIsImF1ZCI6Imh0dHBzOi8veHA5MTAuaWRlbnRpdHlzZXJ2ZXIvcmVzb3VyY2VzIiwiY2xpZW50X2lkIjoiU2l0ZWNvcmUiLCJzdWIiOiI3MGQ4ZGFiYjE3MGY0NjMwYWQ0YTM0NTc3N2NjNDRkOCIsImF1dGhfdGltZSI6MTU0NDE0OTA3MywiaWRwIjoibG9jYWwiLCJzY29wZSI6WyJvcGVuaWQiLCJzaXRlY29yZS5wcm9maWxlIl0sImFtciI6WyJwd2QiXX0.AVcpOYq0OflaPwHKfmgi67Y1ReK9k2kybpyzQxd8J7fIC97Zk0SNUKEEjL5OVQjt9NhkzL7jR37V9TrNAW4X3hnpTn0qTP8VMK8phUdLNt8nEe2YTg0g-Wqq2xbRSENT5VRQdt5L7HaYQVmtOP8A-UbrZ3VUm2jAdxS1gLqJiB3N4WJHvUet3I4gXrNEdmaqQmFwdoe5KpxgDeFRt6nfyRYPAPJMiKbS23G6d1fDXhrBrIRlmYb4JzepNxpvWMf2HQbNrZ3m-oQ7qwbUDNoOpYfvt_Rf7n3EOvSSUyA8z7n9TEyG4mB4rse-91nfyUhbNR46wRThuyOGpj8Ah6YO0g' />
<input type='hidden' name='token_type' value='Bearer' />
<input type='hidden' name='expires_in' value='3600' />
<input type='hidden' name='scope' value='openid sitecore.profile' />
<input type='hidden' name='state' value='OpenIdConnect.AuthenticationProperties=NIRQvgTO6YlOMQdZkYlKizcVzNV1Felc1SpP4XRYoGtM54aQ3TLipFLsPtnoMjKhbuuNYqHUgefEy4BZkuIjY43NbTPE8NWWrpTTxpQ8P8LMg2o7ZHaSCO8uXRK8A31vO1EoXz1O0RnBuha7GlnN2jjPCfYNuVNl2S4fTiNiMuVBhGPMS7FMPdSMFj0XabYWgCrTCGnxoILALWWksa5cvw' />
<input type='hidden' name='session_state' value='MOXMSSjr5T1vZfYiECHizTqOiS_hmqvDuJ-tyDo1C6s.4b49bd60437e81b5ccaf94d2d9acb3ba' />
</form>

Here is the decoded id_token from jwt.io
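If you would rather inspect the token in code instead of pasting it into jwt.io, the following minimal C# sketch splits a JWT into its three parts and base64url-decodes the header and payload. It only reads the token; it does not verify the signature (we come back to signature validation later). Newtonsoft.Json is used here because the rest of the sample code in this blog already uses it.

using System;
using System.Text;
using Newtonsoft.Json.Linq;

public static class JwtDecodeSketch
{
    // Prints the header and payload of a JWT. The third part (signature) is not verified here.
    public static void Print(string jwt)
    {
        var parts = jwt.Split('.');
        Console.WriteLine("Header:  " + JObject.Parse(Base64UrlDecode(parts[0])));
        Console.WriteLine("Payload: " + JObject.Parse(Base64UrlDecode(parts[1])));
    }

    private static string Base64UrlDecode(string input)
    {
        var s = input.Replace('-', '+').Replace('_', '/');
        switch (s.Length % 4) // restore the padding that base64url strips
        {
            case 2: s += "=="; break;
            case 3: s += "="; break;
        }
        return Encoding.UTF8.GetString(Convert.FromBase64String(s));
    }
}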

Endpoints

OpenID Connect standardized the REST endpoints exposed by the authorization server. Each endpoint has a purpose in accomplishing authentication. The following are the different endpoints:

Discovery Endpoint
The purpose of the Discovery Endpoint is to provide the authorization server metadata. Using the Discovery Endpoint we can find the other endpoints. The endpoint URL is https://xp910.identityserver/.well-known/openid-configuration, where https://xp910.identityserver is my Sitecore Identity Server root. You should check the Discovery Endpoint to understand which configurations are supported by the Sitecore authorization server; in fact, which other endpoints are available can be found in the metadata returned by the Discovery Endpoint. Below is the response from the Discovery Endpoint of my Sitecore Identity Server, followed by a small sketch of reading it programmatically.

{
  "issuer": "https://xp910.identityserver",
  "jwks_uri": "https://xp910.identityserver/.well-known/openid-configuration/jwks",
  "authorization_endpoint": "https://xp910.identityserver/connect/authorize",
  "token_endpoint": "https://xp910.identityserver/connect/token",
  "userinfo_endpoint": "https://xp910.identityserver/connect/userinfo",
  "end_session_endpoint": "https://xp910.identityserver/connect/endsession",
  "check_session_iframe": "https://xp910.identityserver/connect/checksession",
  "revocation_endpoint": "https://xp910.identityserver/connect/revocation",
  "introspection_endpoint": "https://xp910.identityserver/connect/introspect",
  "frontchannel_logout_supported": true,
  "frontchannel_logout_session_supported": true,
  "backchannel_logout_supported": true,
  "backchannel_logout_session_supported": true,
  "scopes_supported": [
    "openid",
    "profile",
    "email",
    "sitecore.profile",
    "sitecore.profile.api",
    "offline_access"
  ],
  "claims_supported": [
    "sub",
    "name",
    "family_name",
    "given_name",
    "middle_name",
    "nickname",
    "preferred_username",
    "profile",
    "picture",
    "website",
    "gender",
    "birthdate",
    "zoneinfo",
    "locale",
    "updated_at",
    "email",
    "email_verified",
    "role",
    "http://www.sitecore.net/identity/claims/isAdmin",
    "http://www.sitecore.net/identity/claims/originalIssuer"
  ],
  "grant_types_supported": [
    "authorization_code",
    "client_credentials",
    "refresh_token",
    "implicit",
    "password"
  ],
  "response_types_supported": [
    "code",
    "token",
    "id_token",
    "id_token token",
    "code id_token",
    "code token",
    "code id_token token"
  ],
  "response_modes_supported": [
    "form_post",
    "query",
    "fragment"
  ],
  "token_endpoint_auth_methods_supported": [
    "client_secret_basic",
    "client_secret_post",
    "private_key_jwt"
  ],
  "subject_types_supported": [
    "public"
  ],
  "id_token_signing_alg_values_supported": [
    "RS256"
  ],
  "code_challenge_methods_supported": [
    "plain",
    "S256"
  ]
}
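Instead of opening the URL in a browser, the discovery document can also be read programmatically. Below is a minimal C# sketch (my own illustration) that downloads the metadata and picks out a few endpoint URLs; the host name is my local Sitecore Identity Server.

using System;
using System.Net.Http;
using System.Threading.Tasks;
using Newtonsoft.Json.Linq;

public static class DiscoverySketch
{
    // Downloads the discovery document and reads a few endpoint URLs from the metadata.
    public static async Task PrintEndpointsAsync()
    {
        using (var client = new HttpClient())
        {
            var json = await client.GetStringAsync(
                "https://xp910.identityserver/.well-known/openid-configuration");
            var metadata = JObject.Parse(json);
            Console.WriteLine("authorize: " + metadata["authorization_endpoint"]);
            Console.WriteLine("token:     " + metadata["token_endpoint"]);
            Console.WriteLine("userinfo:  " + metadata["userinfo_endpoint"]);
        }
    }
}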

Authorize Endpoint
The Authorize Endpoint is used specifically to authenticate the user in OpenID Connect. It can return an authorization code as well as tokens. Since this endpoint can also return tokens, it is used in the hybrid grant type (discussed later), and an additional round trip to the server is not needed to get the token separately. The Authorize Endpoint URL is
https://xp910.identityserver/connect/authorize.

Token Endpoint
The Token Endpoint is used to get an access token or id_token from the authorization server. You have to provide some credentials, such as an Authorization Code or Client Credentials, to get the token. The Token Endpoint URL is
https://xp910.identityserver/connect/token

Userinfo Endpoint
The purpose of the Userinfo Endpoint is to return additional information about the logged-in user. You have to pass an Access Token to call the Userinfo Endpoint. The Userinfo Endpoint URL is https://xp910.identityserver/connect/userinfo. A small sketch of calling it is shown below.
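As a minimal sketch (not Sitecore-specific code), calling the Userinfo Endpoint is just an HTTP GET with the access token sent as a Bearer credential; in the client project earlier in this blog, IdentityModel’s SetBearerToken does the same thing.

using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static class UserinfoSketch
{
    // Sends the access token as a Bearer credential and returns the JSON document of user claims.
    public static async Task<string> GetUserInfoAsync(string accessToken)
    {
        using (var client = new HttpClient())
        {
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Bearer", accessToken);
            return await client.GetStringAsync("https://xp910.identityserver/connect/userinfo");
        }
    }
}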

Endsession Endpoint
The purpose of the Endsession Endpoint is to end the user’s session for selected applications. For example, say there are multiple clients that the user has authorized against the Sitecore Identity Server; using the Endsession Endpoint, the user’s browser session can be ended for all those clients. The Endsession Endpoint URL is
https://xp910.identityserver/connect/endsession.

Check Session Iframe Endpoint
The purpose of this endpoint is to check the user’s session status using cross-origin communication via the HTML5 postMessage API. The identity server accepts the postMessage and returns the login status of the user’s session.
The Check Session Iframe Endpoint url is
https://xp910.identityserver/connect/checksession.

Revocation Endpoint
This endpoint allows revoking access tokens and refresh tokens on the Identity Server. The Revocation Endpoint URL is
https://xp910.identityserver/connect/revocation.

Introspection Endpoint
This endpoint is used to validate reference tokens on the server. As discussed before, tokens can be stateful or stateless. JWT tokens are stateless and can be validated using the signature, but stateful reference tokens, which don’t have signatures, can be checked only by making an API call to the Introspection Endpoint. The Introspection Endpoint URL is https://xp910.identityserver/connect/introspect. A small sketch of calling it follows.
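Here is a minimal sketch of an introspection call. In IdentityServer4 the caller authenticates as an API resource; the resource name and secret used below are placeholders and would have to be configured on the server, so treat this purely as an illustration of the request shape.

using System;
using System.Collections.Generic;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Text;
using System.Threading.Tasks;

public static class IntrospectionSketch
{
    // Asks the introspection endpoint whether a token is still active.
    public static async Task<string> IntrospectAsync(string token)
    {
        using (var client = new HttpClient())
        {
            // Basic authentication with an API resource name and secret (placeholders).
            var credentials = Convert.ToBase64String(
                Encoding.UTF8.GetBytes("sitecore.profile.api:api-secret"));
            client.DefaultRequestHeaders.Authorization =
                new AuthenticationHeaderValue("Basic", credentials);

            var body = new FormUrlEncodedContent(new Dictionary<string, string> { ["token"] = token });
            var response = await client.PostAsync("https://xp910.identityserver/connect/introspect", body);
            return await response.Content.ReadAsStringAsync(); // e.g. { "active": true, ... }
        }
    }
}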

Grant Types

Grant types are different ways to authorize a user or client. Sitecore Identity Server supports the following OAuth 2.0 grant types plus the Hybrid grant type.

Sitecore Identity Server Grant Types

Authorization Code
We discussed the Authorization Code flow before. This is a grant type to get tokens using the Back Channel. The client gets the Authorization Code and exchanges it for an id_token or access token. When this grant type is used, all tokens come from the token endpoint.

Client Credentials
This grant type is used for authenticating a client in server-to-server communication. The client provides its Client Id and Client Secret to authenticate via the token endpoint, as in the sketch below.
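A minimal sketch of the Client Credentials flow is shown below. The client id, secret, and scope are placeholders that would have to be registered with the Identity Server; the point is simply that the client posts its own credentials to the token endpoint and no user is involved.

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

public static class ClientCredentialsSketch
{
    // Server-to-server: the client authenticates with its own id and secret
    // and receives an access token directly from the token endpoint.
    public static async Task<string> RequestTokenAsync()
    {
        using (var client = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "client_credentials",
                ["client_id"] = "machine-client",      // placeholder, must be registered
                ["client_secret"] = "machine-secret",  // placeholder
                ["scope"] = "sitecore.profile.api"
            });
            var response = await client.PostAsync("https://xp910.identityserver/connect/token", body);
            return await response.Content.ReadAsStringAsync();
        }
    }
}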

Password
This grant type uses the resource owner’s username and password for authentication via the token endpoint. It is mostly used in legacy applications and is not recommended.

Implicit
The Implicit grant type is mainly designed for use from browser-based applications using a scripting language like JavaScript. The implicit flow doesn’t use the token endpoint; all tokens are returned from the authorize endpoint. In this flow no client authentication is done by the authorization server. The authorization server authenticates the end user (resource owner), gets the consent, and redirects the end user back to the client with the id token and access token. The client validates the tokens using the signature to make sure they were not altered. Since all communication happens through the front channel, this grant type doesn’t allow getting tokens using a refresh token, due to the security concern.

Hybrid
Hybrid authorization is a combination of the implicit flow and the authorization code flow. When this flow is used, the authorization server uses the authorize endpoint via the front channel and redirects the end user to the client with an authorization code and id token. The client validates the id token using the signature and requests the access token from the token endpoint via the back channel. Since the access token retrieval is done via the back channel, the authorization server can also provide a refresh token to be used later to get an access token in offline mode.

Refresh Token
The Refresh Token flow was designed to get an access token non-interactively when the access token has expired. This helps us avoid asking for the user’s consent every time the access token expires. It is also helpful when the client is running a background process on the end user’s behalf and there is no way to interact with the end user when the access token expires. To enable the refresh token flow the client needs to include the offline_access scope, as in the sketch below.
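A minimal sketch of the refresh token exchange is shown below, reusing the sitecoremvc client and secret from the sample earlier in this blog (that client requests the offline_access scope, which is what makes the refresh token available).

using System.Collections.Generic;
using System.Net.Http;
using System.Threading.Tasks;

public static class RefreshTokenSketch
{
    // Exchanges a previously issued refresh token for a new access token,
    // without any end user interaction.
    public static async Task<string> RefreshAsync(string refreshToken)
    {
        using (var client = new HttpClient())
        {
            var body = new FormUrlEncodedContent(new Dictionary<string, string>
            {
                ["grant_type"] = "refresh_token",
                ["refresh_token"] = refreshToken,
                ["client_id"] = "sitecoremvc",
                ["client_secret"] = "abracadabra"
            });
            var response = await client.PostAsync("https://xp910.identityserver/connect/token", body);
            return await response.Content.ReadAsStringAsync();
        }
    }
}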

Claims

Claims are a way to define what end user information will be communicated by the OpenID Provider (OP). Claims are communicated via the id token or via the userinfo endpoint. There are standard Claims defined in the OpenID Connect protocol, but additional Claims can be added. Below is the list of Claims available in Sitecore Identity Server.

Sitecore Identity Server Claims

Scopes

OpenID Connect uses OAuth 2.0 Scopes and has standardized the Scopes for the protocol. Scopes must include “openid”. Additionally, some optional Scopes can be included, for example “profile”, “email”, etc. For OpenID Connect, scopes can be used to request that specific sets of information be made available as Claim Values. Also, if you want to get a Refresh Token, the “offline_access” scope has to be included. Below are the Scopes supported in Sitecore Identity Server.

Sitecore Identity Server Scopes

Response Types

Response types define what response is to be sent back from the OpenID Connect Provider (OP). If the request URL contains response_type=code, the OP will only return an authorization code. In our example above, we found that the Sitecore Identity Server URL contains response_type=code id_token token. When Sitecore Identity Server gets the user’s consent, it posts all of them to the client redirect URL. The following are the different Response Types supported by Sitecore Identity Server.

Sitecore Identity Server Response Types

Response Modes

Response Modes define how the responses should be sent to the client. A response can be sent as a form post, a query string, or a fragment in the redirect URI. If query is used as the mode, care should be taken to make sure the length doesn’t exceed the URL length limit. All Response Modes are supported in Sitecore Identity Server.

Sitecore Identity Server Response Modes

Token Validation

The tokens returned by Sitecore Identity Server are JWT tokens. Their signature can be verified using the public key of the private key that signed the tokens. How do you validate the signature?
The discovery endpoint provides the JSON Web Key Set (JWKS) URI endpoint https://xp910.identityserver/.well-known/openid-configuration/jwks. Using the jwt.io libraries, the public key can be extracted to verify the signature.
You can go to https://8gwifi.org/jwkconvertfunctions.jsp to convert the JWK to PEM; you only need the kty, n, and e values from the JWK. Insert the resulting public key in jwt.io to verify the signature.
I described the above method of validating the token signature to show how the OpenID Connect protocol standardized signature validation. In Sitecore Identity Server, the IdentityServer4 framework validates the signature when the token is returned to the client. A programmatic sketch of the same validation is shown below.
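For completeness, here is a minimal C# sketch of doing the same validation in code with the Microsoft.IdentityModel libraries (the System.IdentityModel.Tokens.Jwt NuGet package), which is roughly what the JwtBearer middleware in the API project above does for us automatically. Treat it as an illustration, not production code.

using System.IdentityModel.Tokens.Jwt;
using System.Net.Http;
using System.Threading.Tasks;
using Microsoft.IdentityModel.Tokens;

public static class SignatureValidationSketch
{
    // Downloads the JSON Web Key Set and validates a JWT's signature and issuer against it.
    public static async Task<bool> IsValidAsync(string jwt)
    {
        using (var client = new HttpClient())
        {
            var jwksJson = await client.GetStringAsync(
                "https://xp910.identityserver/.well-known/openid-configuration/jwks");
            var keys = new JsonWebKeySet(jwksJson).GetSigningKeys();

            var parameters = new TokenValidationParameters
            {
                ValidIssuer = "https://xp910.identityserver",
                IssuerSigningKeys = keys,
                ValidateAudience = false // the audience varies per token; check it separately if needed
            };

            try
            {
                new JwtSecurityTokenHandler().ValidateToken(jwt, parameters, out _);
                return true;
            }
            catch (SecurityTokenException)
            {
                return false;
            }
        }
    }
}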

Role of Sitecore Identity Server in Sitecore Micro Service Architecture

So far we have discussed the important OAuth 2.0 and OpenID Connect concepts used in Sitecore Identity Server. Let’s discuss why Sitecore created the Sitecore Identity Server and how it fits into Sitecore’s future microservice-based architecture. In the following diagram, I tried to show how the Sitecore Identity Provider fits into the future Sitecore microservices architecture. The microservices shown in Sitecore XP are my guess, and in reality they will probably be vastly different. The part I am sure about is the layers above the microservices: every service will be authorized via the Sitecore Identity Provider, and it will serve both front channel and back channel communication. Currently, Sitecore Commerce has its own Identity Provider; I am guessing Sitecore Identity will eventually serve both Sitecore and Sitecore Commerce.

End of Part 1

This blog post is already too long, so I think it is the right time to stop. In this post, we discussed OAuth 2.0 and OpenID Connect concepts and how those concepts are used in Sitecore Identity Server. We also looked at the need for Sitecore Identity Server from the overall Sitecore microservice architecture point of view.
In the next part we will explore how we can use Sitecore Identity Server to authenticate an application that intends to use Sitecore services. We will also explore how Sitecore Identity Server can be connected to an external Identity Provider.

References


Compilation issues with Sitecore Commerce 9 Business Tools SDK, a.k.a Sitecore BizFx SDK

Like the Sitecore Commerce 9 Engine SDK, which is used for creating custom plugins, the Sitecore Commerce 9 installation package also comes with the Sitecore Commerce Business Tools SDK. This SDK is used to modify the Business Tools. For example, if you want to add a new commerce entity and you want to manage it from the Business Tools, you have to use the BizFx SDK.

There are multiple issues to work through to get the BizFx SDK working in Sitecore Commerce 9 update-1. Here is the list.

  • You may not have the @speak and @sitecore registries in your .npmrc file. Run the following commands to add them to the .npmrc file:

    npm config set @speak:registry=https://sitecore.myget.org/F/sc-npm-packages/npm/

    npm config set @sitecore:registry=https://sitecore.myget.org/F/sc-npm-packages/npm/

     

  • “@sitecore/bizfx”: “^1.1.9” is missing from package.json. Add it.
  • “@speak/icon-fonts”: “~1.0.2” is missing from package.json. Add it.
  • “@speak/ng-bcl”: “~0.8.0” and “@speak/styling”: “0.9.0-r00078” are missing from the Sitecore npm repository:
    For some reason Sitecore has not included these versions of the Speak npm packages in the npm repository. Instead, they are shipped with the Sitecore Commerce 9 installation package in the root directory, so you need to install them from the file system. Modify “@speak/ng-bcl”: “~0.8.0” and “@speak/styling”: “0.9.0-r00078” as follows in package.json.

    “@speak/ng-bcl”: “file:../speak-ng-bcl-0.8.0.tgz”,
    “@speak/styling”: “file:../speak-styling-0.9.0-r00078.tgz”,

Once the above changes are done, save package.json and run the command ‘npm install‘. This will install all the required packages for the solution. If the BizFx SDK compiles without any errors, run the command ‘ng serve‘. This will host the Business Tools in the webpack dev server. If you go to http://localhost:4200, the Business Tools will be launched.

Webpack will throw a ‘port not available’ error if the SitecoreBizFx site that is installed with Sitecore Commerce 9 is running in IIS. Stop this site and run the command ‘ng serve‘ again.

 


Yet another Sitecore Commerce 9 installation blog post

Update: The issues discussed in this article are only applicable to the Initial Release of Sitecore Commerce 9. None of these issues occurred in Sitecore Commerce 9 Update 1.

If you are here reading this blog, you probably have some problem installing Sitecore Experience Commerce 9 on your computer, and you have that problem because your machine configuration is not exactly what the Sitecore Commerce 9 installation script expects it to be. You may have one or more of the following differences.

  • You have more than one instance of SQL Server installed on your machine and none of them are on localhost.
  • You have more than one disk drive and you want to install the commerce websites in a folder that is not under C:\inetpub\wwwroot.
  • You had Sitecore Commerce 8.2.1 installed on your machine previously.

In a perfect world, we would have a clean machine with SQL Server 2016 SP1 installed as the default localhost instance and one C drive with the default website folder location. In that case, I wouldn’t be writing this blog :). But the reality is that in most cases we are not that lucky, and the Sitecore Commerce 9 installation document is not that helpful in that situation. In my case, all of the points above are true. So, what issues did I face and how did I get through them with the help of good folks from the Sitecore community?

        1. The Sitecore Commerce 9 installation guide has a section for ‘Host environment requirements’. There is a list of software here, but most of it will already be on your machine from installing Sitecore 9 update-1. What you need to install is the following:
          • Visual Studio 2017
          • .NET Core 2.0.0 Visual Studio 2017 Tooling
          • .NET Core Windows Server Hosting 2.0.0 (if you have an x64 machine, use a command line installation to opt out of the x86 .NET Runtime: DotNetCore.2.0.5-WindowsHosting.exe OPT_NO_X86=1)
        2. In most cases you will be using an administrator account for the installation and running the script from a PowerShell console in administrator mode. Make sure your Windows logged-in account is in the SQL Server ‘sysadmin’ role.
        3. If you want to install Sitecore Commerce 9 in a location other than C:\inetpub\wwwroot, you should be able to change the drive, but changing the folder name will fail the installation with this error for the package Sitecore Commerce Connect Core 11.0.192.zip:
          One or more exceptions occurred while processing the subscribers to the ‘item:saved’ event.
          At this point, if you launch Sitecore and open the indexing manager, you will see all indexes are gone. There is no point of return now; you have to re-install everything from scratch, starting from XP9.
          If you want to change the drive, you have to make changes in the following files. Replace environment(‘SystemDrive’) with the drive (e.g. ‘D:’).

          Sitecore.Commerce.2018.01-2.0.254\SIF.Sitecore.Commerce.1.0.1748\Configuration\Commerce\CEConnect\InitializeCommerce.json (4 hits)
          Line 32:     "CommerceOpsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceOps')))]",
          Line 33:     "CommerceShopsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceShops')))]",
          Line 34:     "CommerceAuthoringPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceAuthoring')))]",
          Line 35:     "CommerceMinionsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceMinions')))]"
          Sitecore.Commerce.2018.01-2.0.254\SIF.Sitecore.Commerce.1.0.1748\Configuration\Commerce\CommerceEngine\CommerceEngine.Deploy.json (4 hits)
          Line 39:     "CommerceOpsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceOps')))]",
          Line 40:     "CommerceShopsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceShops')))]",
          Line 41:     "CommerceAuthoringPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceAuthoring')))]",
          Line 42:     "CommerceMinionsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceMinions')))]",
          Sitecore.Commerce.2018.01-2.0.254\SIF.Sitecore.Commerce.1.0.1748\Configuration\Commerce\CommerceEngine\CommerceEngine.Initialize.json (4 hits)
          Line 38:     "CommerceOpsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceOps')))]",
          Line 39:     "CommerceShopsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceShops')))]",
          Line 40:     "CommerceAuthoringPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceAuthoring')))]",
          Line 41:     "CommerceMinionsPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', variable('CommerceMinions')))]",
          Sitecore.Commerce.2018.01-2.0.254\SIF.Sitecore.Commerce.1.0.1748\Configuration\Commerce\SitecoreBizFx\SitecoreBizFx.json (1 hit)
          Line 33: 		"SitecoreBizFxPhysicalPath": "[concat(environment('SystemDrive'), '\\inetpub\\wwwroot\\SitecoreBizFx')]",
          D:\Deploy\Sitecore.Commerce.2018.01-2.0.254\SIF.Sitecore.Commerce.1.0.1748\Configuration\Commerce\SitecoreIdentityServer\SitecoreIdentityServer.json (1 hit)
          Line 41:     "SitecoreIdentityServerPhysicalPath": "[concat(environment('SystemDrive'), concat('\\inetpub\\wwwroot\\', parameter('SitecoreIdentityServerName')))]",
          
        4. If your SQL Server instance is not a default instance on localhost, the installation will fail with the following error:
          Get Token From Sitecore.IdentityServer
          Install-SitecoreConfiguration : The remote server returned an error: (500) Internal Server Error.
          

          If this is the case, you need to change the connection string in the JSON files mentioned in Naveed’s blog post. You might get away with just making the change in \SitecoreIdentityServer\wwwroot\appsettings.json, but it is better to take care of all of them now rather than later. Sitecore uses trusted authentication, so don’t change the connection string to use a SQL account; just replace ‘localhost’ with your SQL Server instance (machine-name\\instance-name). After making the correction, you can comment out all the tasks before InitializeCommerceEngine in Master_SingleServer.json and run the installation script again.

        5. If you are seeing the following error after running the installation, you might be using one ‘\’ instead of two ‘\\’ in the connection string, like I did, and spend hours finding out what is going on:
          HTTP Error 502.5 – Process Failure

          Thanks to Kautilya Prasad for helping me with this and actually recreating the issue on his machine.
          This error is a little different from a wrong connection string. It is a parsing issue in the JSON config, and the installation script does not handle the error gracefully. As a result, I was seeing the following error in the event log and thought the issue had something to do with the state of .NET Core on my machine. It was a bit confusing. So, if you see this error in the event log, the issue is most probably an unhandled error rather than anything to do with the .NET Core installation.

          Application 'MACHINE/WEBROOT/APPHOST/SITECOREIDENTITYSERVER' with physical root 'C:\inetpub\wwwroot\SitecoreIdentityServer\' failed to start process with commandline 'C:\inetpub\wwwroot\SitecoreIdentityServer\Sitecore.IdentityServer.exe ', ErrorCode = '0x80004005 : 0.
          
        6. If you had installed Sitecore Commerce before on your machine and the account CSFndRuntimeUser existed before the installation, it might have a different password than the one used by the installation script to create the SitecoreIdentityServer website. In this case you might see the following error, because the App Pool for SitecoreIdentityServer will be stopped due to the incorrect password.

          Service Unavailable


          HTTP Error 503. The service is unavailable.

        7. If you are installing Sitecore Commerce 9 in the C:\inetpub\wwwroot folder, the script might fail to grant the CSFndRuntimeUser account permission on the folder. If that happens, you can resolve the issue by manually granting permission and re-running the script.
        8. If everything goes well with the installation but the Storefront fails with a Sitecore item not found error, publishing the site will probably fix the issue.
        9. Finally, if you are not lucky, you might have to remove Sitecore 9 and Sitecore Commerce 9 once or more. Get the removal scripts from the Sitecore SIF-less blog and Sitecore Commerce 9 SIFLess Uninstall and configure them per your needs.
          You may use my script for removing XP9, but it doesn’t use SIF.

Good luck!


A Powershell script for removing Sitecore 9 instance from local machine

The Sitecore 9 installation process is quite different from previous versions. The new installation process uses the Sitecore Installation Framework (SIF). You can find the installation documents on the Sitecore 9 release page. Installing earlier versions of Sitecore was easy, and it was not hard to remove a Sitecore instance from the computer; if you used SIM, it was even easier, just clicking a button would do it.

After I installed my first Sitecore 9 instance, I realized that it installed the websites in C:\inetpub\wwwroot and created all the databases on the C drive. My C drive is a smaller SSD drive and I don’t install Sitecore instances on it; it is reserved for the OS only. I thought it would be useful to create a script to remove the instance and share it with others. Here is the script.

############################################################################################
#This script is created to remove a Sitecore 9 instance from the local machine.
#This will not uninstall Solr, but it will remove the Solr collections.
#The script looks at the Prefix parameter, finds all matching instances, and removes them.
#Change the default values before using the script, or pass the values as parameters.
#Usage: .\Remove-Sitecore9-Instance.ps1 -Prefix "xp0"
#Disclaimer: This is a destructive script, meaning it deletes files, databases, etc.
#Use it at your own risk.
#License: MIT
############################################################################################
param(
#Website related parameters
[string]$Prefix = 'xp0',
[string]$WebsitePhysicalRootPath = 'C:\inetpub\wwwroot\',
#Database related parameters
[string]$SQLInstanceName = 'SQL2016',
[string]$SQLUsername = 'sa',
[string]$SQLPassword = '********',
#Certificate related parameters
[string]$CertificateRootStore = 'Cert:\Localmachine\Root',
[string]$CertificatePersonalStore = 'Cert:\Localmachine\My',
[string]$XConnectCertName = "$Prefix.xconnect",
[string]$XConnectClientCertName = "$Prefix.xconnect_client",
[string]$SitecoreRootCertName = 'DO_NOT_TRUST_SitecoreRootCert',
[string]$SitecoreFundamentalsRootCertName = 'DO_NOT_TRUST_SitecoreFundamentalsRoot',
[string]$CertPath = 'C:\Certificates',
#Solr related parameters
[string]$SolrPath = '[path]\solr-6.6.1'
)
$XConnectWebsiteName = "$Prefix.xconnect"
$SitecoreWebsiteName = "$Prefix.sc"
$XConnectWebsitePhysicalPath = "$WebsitePhysicalRootPath$Prefix.xconnect"
$SitecoreWebsitePhysicalPath = "$WebsitePhysicalRootPath$Prefix.sc"
$HostFileLocation = "c:\windows\system32\drivers\etc\hosts"
$MarketingAutomationService = "$Prefix.xconnect-MarketingAutomationService"
$IndexWorker = "$Prefix.xconnect-IndexWorker"
Write-Host -foregroundcolor Green "Starting Sitecore 9 instance removal..."
#Remove Sitecore website
if([bool](Get-Website $SitecoreWebsiteName)) {
Write-host -foregroundcolor Green "Deleting Website $SitecoreWebsiteName"
Remove-WebSite -Name $SitecoreWebsiteName
Write-host -foregroundcolor Green "Deleting App Pool $SitecoreWebsiteName"
Remove-WebAppPool $SitecoreWebsiteName
}
else {
Write-host -foregroundcolor Red "Website $SitecoreWebsiteName does not exists."
}
#Remove XConnect website
if([bool](Get-Website $XConnectWebsiteName)) {
Write-host -foregroundcolor Green "Deleting Website $XConnectWebsiteName"
Remove-WebSite -Name $XConnectWebsiteName
Write-host -foregroundcolor Green "Deleting App Pool $XConnectWebsiteName"
Remove-WebAppPool $XConnectWebsiteName
}
else {
Write-host -foregroundcolor Red "Website $XConnectWebsiteName does not exists."
}
#Remove hosts entries
if([bool]((get-content $HostFileLocation) -match $Prefix)) {
Write-Host -foregroundcolor Green "Deleting hosts entires."
(get-content $HostFileLocation) -notmatch $Prefix | Out-File $HostFileLocation
}
else {
Write-Host -foregroundcolor Red "No hosts entires found."
}
#Stop and remove the maengine (Marketing Automation) service
$Service = Get-WmiObject -Class Win32_Service -Filter "Name='$MarketingAutomationService'"
if($Service) {
Get-Process -Name "maengine" | Stop-Process -Force
Write-Host -foregroundcolor Green "Deleting " $MarketingAutomationService
$Service.StopService()
$Service.delete()
}
else {
Write-Host -foregroundcolor Red $MarketingAutomationService " service does not exist."
}
$Service = Get-WmiObject -Class Win32_Service -Filter "Name='$IndexWorker'"
if($Service) {
Write-Host -foregroundcolor Green "Deleting " $IndexWorker
$Service.StopService()
$Service.delete()
}
else {
Write-Host -foregroundcolor Red $IndexWorker " service does not exist."
}
#Remove Sitecore Files
if (Test-Path $SitecoreWebsitePhysicalPath) {
Remove-Item -path $SitecoreWebsitePhysicalPath\* -recurse
Remove-Item -path $SitecoreWebsitePhysicalPath
Write-host -foregroundcolor Green $SitecoreWebsitePhysicalPath " Deleted"
[System.Threading.Thread]::Sleep(1500)
} else {
Write-host -foregroundcolor Red $SitecoreWebsitePhysicalPath " Does not exist"
}
#Remove XConnect files
if (Test-Path $XConnectWebsitePhysicalPath) {
Remove-Item -path $XConnectWebsitePhysicalPath\* -recurse -Force -ErrorAction SilentlyContinue
Remove-Item -path $XConnectWebsitePhysicalPath -Force
Write-host -foregroundcolor Green $XConnectWebsitePhysicalPath " Deleted"
[System.Threading.Thread]::Sleep(1500)
} else {
Write-host -foregroundcolor Red $XConnectWebsitePhysicalPath " Does not exist"
}
#Remove SQL Databases
import-module sqlps
$DBListQuery = "select * from sys.databases where Name like '" + $Prefix + "_%';"
$DBList = invoke-sqlcmd -ServerInstance ".\$SQLInstanceName" -U "$SQLUsername" -P "$SQLPassword" -Query $DBListQuery
ForEach($DB in $DBList) {
Write-host -foregroundcolor Green "Deleting Database " $DB.Name
$AlterQuery = "ALTER DATABASE [" + $DB.Name + "] SET SINGLE_USER WITH ROLLBACK IMMEDIATE;"
$DropQuery = "DROP DATABASE [" + $DB.Name + "];"
invoke-sqlcmd -ServerInstance ".\$SQLInstanceName" -U "$SQLUsername" -P "$SQLPassword" -Query $AlterQuery
invoke-sqlcmd -ServerInstance ".\$SQLInstanceName" -U "$SQLUsername" -P "$SQLPassword" -Query $DropQuery
}
#Remove Certificates
if([bool](Get-ChildItem -Path $CertificateRootStore -dnsname $SitecoreRootCertName)) {
Write-host -foregroundcolor Green "Deleting certificate " $SitecoreRootCertName
Get-ChildItem -Path $CertificateRootStore -dnsname $SitecoreRootCertName | Remove-Item
}
else {
Write-host -foregroundcolor Red "Certificate " $SitecoreRootCertName " does not exists."
}
if([bool](Get-ChildItem -Path $CertificateRootStore -dnsname $SitecoreFundamentalsRootCertName)) {
Write-host -foregroundcolor Green "Deleting certificate " $SitecoreFundamentalsRootCertName
Get-ChildItem -Path $CertificateRootStore -dnsname $SitecoreFundamentalsRootCertName | Remove-Item
}
else {
Write-host -foregroundcolor Red "Certificate " $SitecoreFundamentalsRootCertName " does not exists."
}
if([bool](Get-ChildItem -Path $CertificatePersonalStore -dnsname $XConnectCertName)) {
Write-host -foregroundcolor Green "Deleting certificate " $XConnectCertName
Get-ChildItem -Path $CertificatePersonalStore -dnsname $XConnectCertName | Remove-Item
}
else {
Write-host -foregroundcolor Red "Certificate " $XConnectCertName " does not exists."
}
if([bool](Get-ChildItem -Path $CertificatePersonalStore -dnsname $XConnectClientCertName)) {
Write-host -foregroundcolor Green "Deleting certificate " $XConnectClientCertName
Get-ChildItem -Path $CertificatePersonalStore -dnsname $XConnectClientCertName | Remove-Item
}
else {
Write-host -foregroundcolor Red "Certificate " $XConnectClientCertName " does not exists."
}
if (Test-Path $CertPath) {
Remove-Item -path $CertPath\* -recurse
Remove-Item -path $CertPath
Write-host -foregroundcolor Green $CertPath " Deleted"
[System.Threading.Thread]::Sleep(1500)
} else {
Write-host -foregroundcolor Red $CertPath " Does not exist"
}
# Remove Solr Cores
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_core_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_fxm_master_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_fxm_web_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_marketing_asset_index_master")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_marketing_asset_index_web")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_marketingdefinitions_master")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_marketingdefinitions_web")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_suggested_test_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_testing_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_web_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_master_index")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_xdb")
& "$SolrPath\bin\solr.cmd" delete -c ($Prefix + "_xdb_rebuild")
Write-Host -foregroundcolor Green "Finished Sitecore 9 instance removal..."

There seem to be some issues with gist rendering in the blog. If the code is not displayed above, this link will take you directly to the GitHub gist.

Final Thoughts: SIF is extensible, and it is probably possible to extend it to add instance removal functionality. I will look into that as time permits. Feel free to change the script for your own use, but make sure you run it carefully. Once the instance is deleted, nothing can be recovered.
