Continuous Integration with Gulp and Sitecore Powershell Extensions Part 2 – TeamCity Integration

This article is a continuation of my earlier post Continuous Integration with Gulp and Sitecore Powershell Extensions Part 1. In this post, I will show how we can use TeamCity to build and deploy the code and Sitecore items to a Sitecore instance that resides on a different physical server than the build server where TeamCity runs.

QA Server Setup

I am going to deploy the code and Sitecore items to my QA environment for the Cobbler application. Just as I set up my local environment with Sitecore and Cobbler, I did the same for the QA environment: I installed Sitecore 8.1 with Powershell using SIM and then installed the initial Cobbler package (which can be found in Github). I also created an empty website in IIS for ElfWebApi. On the QA server, the Cobbler website lives in C:\Websites\Cobbler and ElfWebApi lives in C:\Websites\Elf. Since these two folders are where the code will be deployed by TeamCity from a different physical build server, I had to share the folders over the network. I created two shares called ‘Cobbler’ and ‘Elf’ on the QA server. If my QA server name is QAServer, the network paths to those two shares are \\QAServer\Cobbler and \\QAServer\Elf. I needed to add these two paths to the gulp-config.js file as follows:

pic1
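For reference, creating those two shares with PowerShell could look something like the sketch below. The share names and local paths are the ones from this post; the account being granted access is an assumption, so grant whatever account runs the TeamCity Build Agent (build.server in my case).

# Run on the QA server (New-SmbShare is available on Windows Server 2012 and later)
# "QADOMAIN\build.server" is a placeholder; use the account that runs the TeamCity Build Agent
New-SmbShare -Name "Cobbler" -Path "C:\Websites\Cobbler" -FullAccess "QADOMAIN\build.server"
New-SmbShare -Name "Elf" -Path "C:\Websites\Elf" -FullAccess "QADOMAIN\build.server"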

That’s all for the QA Server setup.

Script Changes

Not many changes were required in the scripts, but I needed to create a new set of scripts for the QA deployment from the original ones, because the publish folders change as highlighted in the picture above. I also had to create a new script to publish the serialized Sitecore items to the Cobbler data folder on the QA server; the deserialization script looks in this folder to update Sitecore items. A new deserialization script was also created because the URL for the QA server is different from the local environment (a sketch of that change follows the list below). Following are the new scripts:

Publish-Cobbler-QA –> Publishes code to the Cobbler website in the QA Server

Publish-Elf-QA –> Publishes code to the Elf webapi site in the QA Server

Publish-Cobbler-Design-QA –> Publishes design files to Cobbler website in the QA Server

Publish-Cobbler-Items-QA -> Publishes Serialized Sitecore Items to Cobbler data folder in the QA Server. This is the new script.

Deserialize-Cobbler-Items-QA –> Runs DeSerializeItems-qa.ps1

DeployAll-QA –> Runs multiple scripts to deploy files and Sitecore Items to the website in the QA server.

Deploy-Projects-QA –> Publishes only the deployment files. No Sitecore item gets deployed.
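DeSerializeItems-qa.ps1 is essentially the DeserializeItems.ps1 script shown in Part 1; the only meaningful difference is the remoting session, which now points at the QA Sitecore instance. A minimal sketch of that change (the QA host name below is an assumption):

# DeSerializeItems-qa.ps1 - same logic as the local DeserializeItems.ps1 from Part 1
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
# The session now targets the QA Sitecore instance instead of cobbler.local
$session = New-ScriptSession -Username "admin" -Password "b" -ConnectionUri "http://cobbler-qa"
# ...the rest of the script is unchanged from Part 1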

An additional change in the Cobbler.Web project was to add a SlowCheetah transformation of Sitecore.config for the QA build configuration, because the SerializationFolder setting for the QA server is different from the local environment.

<setting name="SerializationFolder" value="$(datafolder)/serialization" />

This enables us to deploy the right Sitecore.config to the QA environment.

Build Server Setup

The build server had to be set up to run the gulp scripts, so I had to install node.js and gulp on it. I downloaded node.js from here. Once node was installed on the build server, I installed gulp using the following command in a command window.

npm install gulp -g --prefix="C:\Program Files (x86)\nodejs"

Make sure to open the command prompt in administrator mode. The above command installs gulp in the same directory where node is installed. Since the node path is on the PATH environment variable, gulp is also available from any location on the server.

Since the Powershell remoting scripts will run from the build server, I installed the SPE Remoting package on it. I copied the SPE files into the following folder, as build.server is the account used to run the TeamCity Build Agent:

C:\Users\build.server\Documents\WindowsPowershell\Modules\SPE
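A quick way to confirm the module is usable by that account is to open a PowerShell prompt as build.server on the build server and run a trivial remote script against the QA site. This is just a hedged smoke test; the QA host name below is an assumption:

# Load the SPE Remoting module copied into the build.server profile
Import-Module -Name SPE
# Open a remoting session against the QA Sitecore instance (host name assumed)
$session = New-ScriptSession -Username "admin" -Password "b" -ConnectionUri "http://cobbler-qa"
# Run a trivial script block remotely; any output proves remoting works end to end
Invoke-RemoteScript -Session $session -ScriptBlock { "SPE remoting is working" }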

Build Configuration

In TeamCity I created the following build steps:

pic2

The third build step is the one that runs the gulp scripts. It is a custom command line script that TeamCity runs using the command prompt. The script first runs npm install, which installs the dependencies into the local node_modules folder. Next, it calls gulp to run my DeployAll-QA task, which in turn calls all the other gulp scripts that deploy the code and Sitecore items.

pic3
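Stripped of the TeamCity specifics, the custom script in that build step boils down to the two commands below (the task name is the DeployAll-QA gulp task described above):

npm install
gulp DeployAll-QA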

Final Words

With the power of Sitecore Powershell Remoting and Gulp, I successfully created a Continuous Integration pipeline using TeamCity for my QA environment. Developers adding or modifying Sitecore items can serialize them by running the Serialize-Cobbler-Items-Local gulp script in their environment and check the items into source control. The TeamCity build will check the Sitecore items out of source control and deserialize them on the QA server, or on any remote server the build is configured for.

Posted in CI, Gulp, Powershell, Sitecore

Sitecore Pipelines and Unity

This is a continuation of my earlier blog post Sitecore MVC and Unity. In that post I discussed how, with minimal effort, we can use Unity as the DI framework for a Sitecore MVC application. In this post, I will discuss how we can use Unity to make Sitecore Pipelines DI enabled and thus make the pipeline processors unit testable.

When it comes to DI for the Sitecore Pipelines, there are two approaches we can take.

  1. Use Sitecore’s inbuilt Configuration Factory. This was described in great detail by Mike Reynolds in his blog Leverage the Sitecore Configuration Factory: Inject Dependencies Through Class Constructors.
  2. Create our own Factory class and use DI container to resolve the objects. This was described in the blog post Sitecore pipelines & commands using IoC containers.

I will take the second approach because I can use the same Unity container that I described in the earlier post. To understand how this approach works, let's define the application as follows.

Say I have a website called CookieMaker that makes cookies. In it, I have a pipeline called CookieMaker.MakeCookies with some processors; one of them is GetSomeFlour. Following is the project structure.

Unity3

The pipeline processor GetSomeFlour uses FlourService, and FlourService in turn uses FlourRepository. Here are the classes.

GetSomeFlour

Unity4

FlourService

Unity5

FlourRepository

Unity6
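Since the classes above are only shown as screenshots, here is a minimal C# sketch of what they might look like. The class names come from this post; the interfaces, the args type and the method bodies are assumptions made for illustration, with every dependency taken in through the constructor so that Unity can supply it:

// Interfaces implied by the convention-based registration (names assumed for this sketch)
public interface IFlourService { string GetFlour(); }
public interface IFlourRepository { string GetFlour(); }

// Pipeline args for MakeCookies (sketched; a real one would derive from Sitecore.Pipelines.PipelineArgs)
public class MakeCookiesArgs
{
    public string Flour { get; set; }
}

// GetSomeFlour - pipeline processor with its dependency injected through the constructor
public class GetSomeFlour
{
    private readonly IFlourService _flourService;

    public GetSomeFlour(IFlourService flourService)
    {
        _flourService = flourService;
    }

    // Sitecore invokes Process with the pipeline args
    public void Process(MakeCookiesArgs args)
    {
        args.Flour = _flourService.GetFlour();
    }
}

// FlourService - depends on the repository, again via constructor injection
public class FlourService : IFlourService
{
    private readonly IFlourRepository _flourRepository;

    public FlourService(IFlourRepository flourRepository)
    {
        _flourRepository = flourRepository;
    }

    public string GetFlour()
    {
        return _flourRepository.GetFlour();
    }
}

// FlourRepository - the leaf dependency that would normally fetch the data
public class FlourRepository : IFlourRepository
{
    public string GetFlour()
    {
        return "All-purpose flour";
    }
}

Because each class follows the IName/Name naming convention, the convention-based registration from the earlier post can resolve all of them without any extra configuration.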

All of the above classes will be resolved using the Unity container. For this, I need to create my own configuration factory called UnityContainerFactory, as below.

Unity8

In the above class, the UnityConfig.GetConfiguredContainer() method returns the same container that was mentioned in my earlier post, and since I registered classes by naming convention, the Service and Repository classes are resolved automatically.
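As a rough guide for readers who cannot see the screenshot, the factory described in the referenced post boils down to something like the sketch below. Treat it as an assumption-laden outline: I am assuming the Sitecore.Reflection.IFactory hook (registered under a factories node in config and referenced from the processor node's factory attribute); the class in the screenshot above may differ in detail.

using System;
using Microsoft.Practices.Unity;
using Sitecore.Reflection;

// Custom factory that lets Sitecore delegate object creation to the Unity container
public class UnityContainerFactory : IFactory
{
    public object GetObject(string identifier)
    {
        // identifier is the type reference from the config node,
        // e.g. "CookieMaker.Pipelines.MakeCookies.GetSomeFlour, CookieMaker"
        var type = Type.GetType(identifier, true);

        // Resolve through the same container configured in UnityConfig (see the earlier post);
        // constructor dependencies such as IFlourService come from the convention registrations
        return UnityConfig.GetConfiguredContainer().Resolve(type);
    }
}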

To use UnityContainerFactory, we need to add the following in the config file.

Unity7

That’s all we need to do to make Pipelines DI enabled.

Posted in Framework, MVC, Sitecore

2015 in review

Wish you a very happy and prosperous 2016. The WordPress.com stats helper monkeys prepared a 2015 annual report for this blog.

Here’s an excerpt:

The concert hall at the Sydney Opera House holds 2,700 people. This blog was viewed about 13,000 times in 2015. If it were a concert at Sydney Opera House, it would take about 5 sold-out performances for that many people to see it.

Click here to see the complete report.

Posted in Uncategorized

Continuous Integration with Gulp and Sitecore Powershell Extensions Part 1

Sitecore development and code deployment are quite different from normal ASP.NET web application development, because we need to deal with the versioning of Sitecore items. In this regard, Team Development for Sitecore (TDS) is a huge help. There are two main things TDS does: it publishes the files to the application folder and it deploys Sitecore items to the application. Actually, TDS does a lot more than that, but those two things are the most important to me.

For the last few days I had been contemplating whether I could achieve these two deployment functions using Gulp and Sitecore Powershell Extensions (SPE), because Gulp is great for publishing project files and SPE is great for dealing with Sitecore items. So the journey started, and I am happy to say it works. This is going to be a blog series; this one covers deployment of the Sitecore application in a developer's local environment.

Sitecore Setup

This is easy. I used SIM to install Sitecore 8.1 and version 3.3 of SPE from the Sitecore Marketplace. To work on this, I needed a real Sitecore application; I named it ‘Cobbler’. For the local environment the hostname is cobbler.local. After the setup was done, I created the templates and items for my application. Below is a screenshot from the Content Editor. All the items are available in Github along with the source code.

pic1

The only change I made to the default installation was the serialization folder in Sitecore.config. I did this because I want to version control the serialized Sitecore items. The following line changed:

<setting name="SerializationFolder" value="C:\Github\Cobbler\data\serialization" />

Project Setup

Following is a screenshot of the Cobbler solution in Visual Studio.

pic2

 

By no means is it a fully working solution; after setting everything up and doing the deployment, a nice website will not be launched. The idea is to show how deployment works. That being said, I tried to add different kinds of projects to the solution to represent what a real application solution looks like. We have two website projects: Cobbler.Web is the main web application and Elf.WebApi is a RESTful API project; Cobbler talks to Elf via that API. The Cobbler.Design project contains all the UI design related files (css, fonts etc.). I separated it out to show that files from projects other than Cobbler.Web can be published to the Cobbler website, which in real life is often the case. The Elf website is a separate IIS site, and the deployment process recognizes that and publishes to it.

Gulp Setup

To use Gulp in a Visual Studio solution you first need to install Node and npm. Visit nodejs.org, click on the big green button to download, and then install. Npm comes with Node, so a separate installation is not needed. After installation you can run the following two commands in a command prompt window to see if they were installed properly; each will show you the version.

node -v

npm -v

Once Node is installed, open the Package Manager Console in the Cobbler Visual Studio solution and run the following two commands. Since the solution already has a package.json, all gulp packages used in the gulp scripts will be downloaded.

npm install -g gulp

npm install --save-dev gulp

You will also need to install Task Runner Explorer to run the gulp tasks. You can download it from here. After installing Task Runner Explorer, close Visual Studio and launch it again; I found that the gulp tasks don't show up in Task Runner unless I relaunch. At this point, if you right-click on gulpfile.js and click on Task Runner Explorer, you should see all the gulp tasks on the left.

pic3.png

Sitecore Powershell Extension Setup

Other than installing the SPE package in Sitecore, you need to download the Remoting package for SPE 3.3 from the Marketplace. Copy the files into the C:\Users\<username>\Documents\WindowsPowershell\Modules\SPE folder. The script Invoke-RemoteScript.ps1 in this folder will be used to run scripts remotely against the Cobbler Sitecore application.

Scripts

That should be all for setup. Let’s look at the scripts.

Powershell Scripts

SerializeItems.ps1
This script reads the item paths from the SitecoreItemPath.txt file and serializes the items into the serialization folder C:\Github\Cobbler\data\serialization. If a line in SitecoreItemPath.txt is marked with -DeployOnce, that item will be skipped if it is already serialized.


#Change the Powershell Execution Policy so that script can be run from Visual Studio
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
$session = New-ScriptSession -Username "admin" -Password "b" -ConnectionUri "http://cobbler.local"
$RootFolder="C:\Github\Cobbler\data\serialization\master"
$ItemPaths=Get-Content "C:\Github\Cobbler\SitecoreItemPath.txt"
$ItemExtn=".item"
foreach($ItemPath in $ItemPaths)
{
    $ItemPath=$ItemPath.Trim()
    $ItemPath=$ItemPath -replace "/", "\"
    if($ItemPath.Length -gt 0)
    {
        if(-not (($ItemPath.ToLower().Contains("-deployonce")) -and (Test-Path "$($RootFolder)$($ItemPath.ToLower().TrimEnd("-deployonce").Trim())$($ItemExtn)")))
        {
            if($ItemPath.ToLower().Contains("-deployonce"))
            {
                $ItemPath=$ItemPath.ToLower().TrimEnd("-deployonce").Trim()
            }
            Write-Host "Serializing Item: $ItemPath"
            $script = {
                Get-Item -Path $params.Path | Export-Item -ItemPathsAbsolute -Root $params.RootFolder
            }
            $args = @{
                "Path" = "master:$($ItemPath)"
                "RootFolder" = "$RootFolder"
            }
            Invoke-RemoteScript -ScriptBlock $script -Session $session -ArgumentList $args
        }
    }
}

DeserializeItems.ps1
This script is the opposite of the SerializeItems.ps1 script. It reads the item paths from SitecoreItemPath.txt and creates or updates the items in Sitecore. Similar to the other script, it skips an item if the item is marked -DeployOnce and already exists in Sitecore.


#Change the Powershell Execution Policy so that script can be run from Visual Studio
Set-ExecutionPolicy -Scope Process -ExecutionPolicy Bypass
$session = New-ScriptSession -Username "admin" -Password "b" -ConnectionUri "http://cobbler.local"
$RootFolder="C:\Github\Cobbler\data\serialization\master"
$ItemPaths=Get-Content "C:\Github\Cobbler\SitecoreItemPath.txt"
$ItemExtn=".item"
foreach($ItemPath in $ItemPaths)
{
    $IsDeployOnceItem = $FALSE
    $ItemPath=$ItemPath.Trim()
    #$ItemPath=$ItemPath -replace "/", "\"
    if($ItemPath.Length -gt 0)
    {
        if($ItemPath.ToLower().Contains("-deployonce"))
        {
            $IsDeployOnceItem = $TRUE
            $ItemPath=$ItemPath.ToLower().TrimEnd("-deployonce").Trim()
        }
        Write-Host $ItemPath
        $script = {
            if(Test-Path $params.Path)
            {
                if(-not ($params.DeployOnceFlag))
                {
                    Get-Item -Path $params.Path | Import-Item -ForceUpdate
                }
                else
                {
                    Write-Log "Skipped"
                }
            }
            else
            {
                $Path=$params.Path
                $ParentPath=$Path.Substring(0,$Path.LastIndexOf("/"))
                Get-Item -Path $ParentPath | Import-Item -Recurse
            }
        }
        $args = @{
            "Path" = "master:$($ItemPath)"
            "DeployOnceFlag" = $IsDeployOnceItem
        }
        Invoke-RemoteScript -ScriptBlock $script -Session $session -ArgumentList $args
    }
}

Gulp Scripts

I am not going to paste all of the gulp scripts here, but will just describe their purpose.

Publish-Cobbler-Local –> Publishes code to the Cobbler website

Publish-Elf-Local –> Publishes code to the Elf webapi site

Publish-Cobbler-Design-Local –> Publishes design files to Cobbler website

Serialize-Cobbler-Items-Local –> Runs SerializeItems.ps1 (see the note after this list on running the .ps1 scripts)

Deserialize-Cobbler-Items-Local –> Runs DeSerializeItems.ps1

DeployAll-Local –> Runs multiple scripts to deploy files and Sitecore Items to the website. Full Deployment.

Deploy-Projects-Local –> Publishes only the deployment files. No Sitecore item gets deployed.
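The two serialization tasks above run the corresponding .ps1 files. If you want to run one by hand outside of gulp, the equivalent from a command prompt would be something like this (the script location is assumed to be the solution root):

powershell.exe -ExecutionPolicy Bypass -File C:\Github\Cobbler\SerializeItems.ps1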

The following scripts run when the solution is launched in Visual Studio and watch for changes in the css, javascript and MVC Views. As soon as a file changes, it gets deployed to the website.

Auto-Publish-Css-Local
Auto-Publish-js-Local
Auto-Publish-Views-Local

Final Word

This is working fine for me in most cases. What it does not provide, which I get from TDS, is the Sync functionality and an interactive way to serialize items in Visual Studio. In the next part, I will integrate this deployment process with TeamCity and deploy the application to a remote server.


Posted in Gulp, Powershell, Sitecore

Social Media, Sitecore and B2B Commerce

In my last blog post I discussed the role of Sitecore in B2B Commerce and how big an opportunity there is to use Sitecore to increase business revenue. In this post I will discuss why B2B Commerce needs Sitecore for generating content and distributing it via different channels to increase business prospects.

I would like to start with a particular experience I had working with one of our B2B Commerce clients. It was the beginning of a project and we were talking about requirements. The Product Detail mock-up had some social media buttons for Facebook, Twitter etc. Our client thought none of their users would post the product link on their Facebook or Twitter pages, and I thought the same way. For example, if I am a Customer Representative and I buy products for customers, why would I post a product link on my social media pages? I don't think I would. But that incident got me thinking about the role of social media in B2B Commerce.

Before we discuss how social media can be used in B2B Commerce, let's look at the difference between a B2B and a B2C Commerce user. In B2C Commerce the user is the customer and buys products for personal use. For a B2C user, posting the product link on their own social media page makes sense, because they own the product and probably want to share the purchase with their followers. In B2B, on the other hand, the user often is not the customer, and the lead time for a purchase is usually longer than for a B2C user. B2B users tend to do more research by reading blogs and articles posted either by the manufacturer or by independent bloggers. They need to do more research because in most cases B2B products are complex, and sometimes they are configured products. What is important for the B2B Commerce site is to make these research materials easily available to users through social media channels and forums. This is where Sitecore can help, with content generation and with pushing content using the Experience Platform.

Before we talk about what Sitecore can do, let's look at the interesting statistics below from a survey conducted in 2013 on how B2B decision makers are using social media.

Forrester-B2B-Decision-Maker-Use-Social-Media-July2013

You can find more about this chart in this article. Although the chart is two years old, the situation has not changed a lot. It is clear that B2B businesses can do a lot to distribute content about their products via social media.

Blogging using Sitecore WeBlog module

B2B businesses need to provide information about their products through blogs and articles, either on the eCommerce site itself or via links to independent bloggers. The Sitecore Marketplace has a great blogging module called WeBlog, and it is free; you can find the module here. With the help of WeBlog, blog posts or product articles can be added with minimal effort. WeBlog comes with many features like tagging, RSS feed generation, commenting, spam filtering, CAPTCHA etc. What is sweet, though, is that it comes with ShareThis or AddThis social sharing and other Facebook and Twitter widgets. With Sitecore and the WeBlog module, a B2B business can easily generate blog content and distribute it through different media channels.

Sitecore Media Framework

I am a serious hobbyist photographer. When I look for new photo gear and need to research it, I sometimes prefer to watch videos about the gear instead of reading long articles. It is a great idea to add videos to the B2B site to communicate product information to the user easily, and in this regard the Sitecore Media Framework can be a great help. The Sitecore Media Framework integrates with the Brightcove and Ooyala professional video streaming services, and a B2B site built on Sitecore can take advantage of these media connectors to manage its videos. Some time back, I wrote a media connector for integrating Sitecore with Youtube. This module is available in the Sitecore Marketplace, and I wrote some blog posts about it too; you can find more here: Sitecore Youtube Connector.

Sitecore Experience Platform

Finally, the Sitecore Experience Platform can be of great use for selecting appropriate blogs, articles and videos for the B2B user. Based on the user and customer profile and what kind of products the user is interested in, related blogs, articles, videos and independent blog links can be presented to the user. Each piece of content gives the user the option to share it on social media, and the distribution of content via social media will attract more customers to the B2B site.

Conclusion

B2B businesses have yet to use the full potential of social media. With the power of Sitecore, a B2B eCommerce platform integrated with Sitecore can achieve this goal very quickly and add great value to the business.

Posted in Commercce, Sitecore, Uncategorized

B2B eCommerce, Sitecore and Digital Marketing

As regular online shoppers, when we think about online shopping or eCommerce we mostly picture B2C eCommerce like Amazon. But a huge part of online eCommerce is B2B eCommerce. According to Forrester Research, in 2014 B2B sales in the US were $692 billion, and that was a very conservative number. This year the prediction is that they will reach $780 billion, and sales will pass a trillion dollars by 2020.

pic1

So, what role can a Content Management System like Sitecore play in influencing the sales of a B2B eCommerce company? Both B2B and B2C eCommerce need some sort of CMS for managing the content, media and digital assets of the eCommerce site, and those are mostly similar in functionality. The real difference is how digital marketing, or in Sitecore's case the Experience Platform, will be used to drive B2B sales. But before diving into that discussion, let's look at what B2B businesses think about the importance and effectiveness of a Content Management System. Below is a survey result I picked up from a Salesforce blog.

pic2

As per that survey, Marketing Analytics (Experience Analytics in Sitecore), Content Management (Sitecore), Marketing Automation (Engagement Planning in Sitecore) and Predictive Intelligence (Personalization in Sitecore) are seen as very critical and effective. The good thing is that Sitecore is one of the market leaders in those areas and is capable of putting a B2B business in an advantageous position if used effectively.

For the last few years I have been working mainly on integrating eCommerce with Sitecore, and I have spent a lot of time understanding B2B clients' businesses and requirements. B2B commerce is different, and the most important difference is that a business buys products from a B2B site for reselling or for running its business, whereas in B2C an individual buys products for his or her own use. In both cases a user buys products, but in B2B the user buys them for his or her company. That's why B2B buying is less impulsive, and that makes a huge difference in how the personalization features should be used. For example, based on what a B2B user has bought or looked at in the past, a Quick List can be recommended instead of a product. A Quick List is a list of products that can be added to the cart quickly from the list; it's effective because it speeds up the buying process for the customer.

A lot of the time the B2B user is a Customer Service Representative (CSR) who buys products for the customers he or she is assigned to serve. A CSR not only buys products for the customer; they also educate them, answer their questions and build a long-term relationship with them. The B2B site can and should be personalized for the CSRs. For example, it is helpful for CSRs if they can quickly find the documentation for the products. Based on which customer they are helping at the time and that customer's past buying data, documents, articles etc. can be made available to CSRs in the personalized website. CSRs are knowledgeable and usually use Quick Order tools to buy products, so the Quick Order tool should be made more accessible to CSRs than to regular users. These are just a few examples of how personalization can be achieved on a B2B site.

More and more B2B businesses are entering the digital market, moving from printed catalogs to email marketing. Most of our clients see the power of personalized email marketing and the return on investing in intelligent, targeted marketing campaigns using software like Sitecore. The user base is changing too: more and more people from the younger generation are entering the B2B space. In their personal time they are online shoppers, and they expect a similar experience from B2B businesses. B2B businesses must change to satisfy this generation.

Software like Sitecore will make a huge impact on the B2B market in the coming years because of its capability to help businesses reach more customers using fewer resources and less time. Furthermore, it will make the shopping experience intriguing and satisfying for the user and thus attract more customers. I am looking forward to integrating more and more B2B businesses with Sitecore.

Posted in Commercce

Some Sitecore Habitat Setup Issues

If you haven't already checked out Sitecore's Habitat Framework, you should take a look at the solution. Whether you like the framework architecture or not, it is worth examining. I see it as a Sitecore accelerator solution framework, and it is going to be useful for my future projects. You can start from here in Github. There are fantastic architecture discussions and Getting Started videos from Thomas Eldblom (@TEldblom).

This blog post is not about introducing Habitat. I recently set up the application on my machine and faced a few issues; I will discuss them because they are not mentioned in the Habitat Wiki.

Visual Studio 2015 vs Visual Studio 2013

The Habitat solution requires VS 2015. If you have VS 2015, or you can install it on your machine, just do that before opening the solution. I have several reasons why I cannot install VS 2015, so I did a little hack to make it work with VS 2013.

  • Open the solution file (Habitat.sln) in Notepad++ and replace the following lines
    Microsoft Visual Studio Solution File, Format Version 12.00
    # Visual Studio 14
    VisualStudioVersion = 14.0.23107.0
    MinimumVisualStudioVersion = 10.0.40219.1

    with
    Microsoft Visual Studio Solution File, Format Version 12.00
    # Visual Studio 2013
    VisualStudioVersion = 12.0.31101.0
    MinimumVisualStudioVersion = 10.0.40219.1
  • In notepad++ replace 14.0 with 12.0 in all .csproj files in the SRC folder.
  • Now you can open the solution in VS 2013. Open the gulpfile.js file in the VS 2013 IDE and replace 14.0 with 12.0; there is only one reference. This is required for a valid MSBuild path.

NuGet Package Issue

I had trouble restoring the xUnit.net packages. It seems they were created with a more recent version of NuGet, so I had to get the latest version of the NuGet extension for Visual Studio. After that there was no problem restoring the packages.

Gulp Issues

Habitat uses Gulp to build and publish the files. If you don't already have Gulp installed on your machine, you have to install it first. In Visual Studio, open the NuGet Package Manager Console and run the following two commands.

npm install --global gulp

npm install --save-dev gulp

If you open the Task Runner window, you may still not see the tasks. Click on the refresh button; if that doesn't work, close and reopen the solution in Visual Studio.

Unicorn

I had some trouble serializing items in the Unicorn Control Panel. I must say, I am very new to Unicorn, so it was probably my lack of understanding.

  • The Framework project items have to be serialized first, then the Domain project items and lastly the Website items. Start with Habitat.Framework.Serialization.
  • Do not click Reserialize until you Sync first; for me, it wiped out the .yml files.

Those are all the issues I have faced so far. I hope this helps some of you out there. Happy coding.

Posted in Framework, Sitecore

Sitecore MVC and Unity

Using a Dependency Injection (DI) framework in an ASP.NET MVC project is common practice. I have used DI in almost all my MVC projects, and the one I use most is Unity. I know, Ninject is cool and statistically the most used DI framework :). Unity has come a long way and many improvements have been made recently. When it comes to using Unity in an MVC project, what I like most is its ability to resolve types based on name matching: no need to register the types in code or configuration, just use convention.

This blog post is not about ASP.NET MVC and Unity, though; it is about using Unity as the DI framework for a Sitecore MVC project. We know Sitecore interferes with the MVC framework and handles things differently. If you quickly want to know how Sitecore MVC works, watch Martina Welander's video presentation Sitecore MVC – Getting Started. When it comes to using DI with Sitecore MVC, any DI framework can work, because Sitecore ultimately delegates the resolution of controllers to ASP.NET MVC. Recently, Mickey Rahman wrote a blog post about how Sitecore 8.1 now uses the current DependencyResolver to resolve controllers: Changes to Dependency Injection in Sitecore 8.1. So, as long as I set the DependencyResolver to a UnityDependencyResolver, Sitecore MVC controllers should be resolved properly. In fact, I don't even have to do that myself; it all happens when I install the Unity NuGet package.

Here are the steps.

  • Open up the NuGet Package Manager in your Sitecore MVC solution and search for Unity. Install ‘Unity bootstrapper for ASP.NET MVC’.
    NuGet Package
  • Once this is installed you should see two files added to the App_Start folder, UnityMvcActivator.cs and UnityConfig.cs. If you open the UnityMvcActivator.cs file, you can see how the DependencyResolver is set to a UnityDependencyResolver (see the sketch after this list).
    UnityWebActivator
  • The next step is to open up the UnityConfig.cs file and register the types in the RegisterTypes method. You can register the types one at a time like below:
    container.RegisterType<IProductService, ProductService>();
    But this becomes difficult when your project has many types and chains of dependencies. As I said before, what I like most is the way Unity resolves types by convention. To do that, you need to use the container.RegisterTypes method like below.
    container.RegisterTypes(AllClasses.FromAssemblies(typeof(Services.BaseService).Assembly,
        typeof(ApiClient.BaseClient).Assembly),
        WithMappings.FromMatchingInterface, WithName.Default);

    One thing you have to be careful about: do not use AllClasses.FromLoadedAssemblies(). This will try to register all types in the loaded assemblies, and Sitecore will throw an error. In most cases we know where the classes are that we want DI to resolve, so use the AllClasses.FromAssemblies option and include only the assemblies that contain your types.
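For reference, the part of the generated UnityMvcActivator that matters here boils down to roughly the following (a trimmed sketch; the file generated by the bootstrapper package also swaps MVC's filter attribute provider):

using System.Web.Mvc;
using Microsoft.Practices.Unity.Mvc;

public static class UnityWebActivator
{
    // Called at application start (the bootstrapper wires this up via WebActivatorEx)
    public static void Start()
    {
        var container = UnityConfig.GetConfiguredContainer();

        // Hand controller (and dependency) resolution over to Unity
        DependencyResolver.SetResolver(new UnityDependencyResolver(container));
    }
}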

That’s all for today. Happy coding.

Posted in Uncategorized

Sitecore Powershell Extention : How Find-Item solved the performance issue

In my last blog post I mentioned that after loading the images into Sitecore with the help of Sitecore Powershell Extensions (SPE), I needed to associate the products with the images. I didn't show that code because it's a very specific requirement. At the time, I thought the number of products was close to the number of images, that is, 10,000. When we started loading actual data from the eCommerce system, the client started complaining about the performance of my Powershell script, and I realized the actual number of products was way more than 10K, closer to 70K.

Following is the relevant portion of the code:

		#Associate Product to Image.
		$ImageName=$Image.TrimEnd(".$FileExtension")
		$products=Get-Item master: -Query "/sitecore/content/Product Repository/Products//*[@@templatename='Custom Insite Product' and @Image Name='$ImageName']"
		if($products)
		{
			foreach($product in $products)
			{
				$product.Image=Get-Item $MediaImagePath.TrimEnd(".$FileExtension")
				$productName=$product.DisplayName
				Write-Host "Assigned image $ImageName to '$productName' ..."
			}
		}
		else
		{
			Write-Host "No product found for this image $Image ..."
		}

After some debugging, I found that the Get-Item query above (the line that assigns $products) was taking most of the time. That code gets all the products that are supposed to use the image the script just loaded. Was it an SPE thing or a Sitecore query problem? To find out, I took the query and ran it in the XPath Builder. The result is shown below.

Sitecore Query Result

The query was taking 104 seconds. That’s for one image. For 10K images this script will never finish.

To improve the performance, I started looking at SPE commands to see if I could query the products from the search index, because the products are already indexed by another process. I found Find-Item and replaced the code with the following.

		#Associate Product to Image.
		$ImageName=$Image.TrimEnd(".$FileExtension")
		$products=Find-Item -Index commerce_products_master_index -Criteria @{Filter = "Equals"; Field = "_templatename"; Value = "Custom Insite Product"}, @{Filter = "Equals"; Field = "Image Name"; Value = "$ImageName"} | Select-Object -Property Path | Get-Item

		if($products)
		{
			foreach($product in $products)
			{
				$product.Image=Get-Item $MediaImagePath.TrimEnd(".$FileExtension")
				$productName=$product.DisplayName
				Write-Host "Assigned image $ImageName to '$productName' ..."
			}
		}
		else
		{
			Write-Host "No product found for this image $Image ..."
		}

The result was amazing: now the query returns products immediately. For 10K images and 70K products, it takes a little more than half an hour, which is acceptable performance for this kind of batch process.

 

Posted in Powershell

Bulk Loading Images in Sitecore Media Library Using Sitecore Powershell Extension (SPE)

With blessings from Sitecore Powershell gurus Adam Najmanowicz, Michael West and Mike Reynolds, this is my first blog post on Sitecore Powershell Extensions (SPE). I mean it. When I had a problem to solve and wanted to use SPE for it, I just dropped a question on Twitter and the help was more than I expected. They helped me all the way to solving my problem. These guys are awesome. I love the Sitecore community.

So, what problem was I trying to solve using SPE? We have an application with more than 10 thousand product items, and we have to load the images for those products into the Media Library and associate them. Why use SPE? Because I cannot imagine a content editor loading those images using the Media Library uploader. The Media Library can upload images from a zip file, but in my case the requirement doesn't stop at just loading the images: we need to associate the images with the product items, which I did using my SPE script. Besides this, since this is an SPE script, we can run it via a Sitecore scheduled task. All the user needs to do is drop the images in the designated folder.

For this post, I will only describe how loading images in bulk works. The rest of the program, which associates images with product items, is very specific to our requirements and is better left out to keep the post shorter. While working on this problem I faced a couple of issues, which were solved with the help of the SPE team. I describe the issues at the end of the post, in case someone is looking for answers.

My research started with this article written by Adam, which describes how a remote SPE script can be run to load an image from the file system into the Media Library. I needed to extend this functionality. The first thing I did was create my own Powershell function that loads multiple images from the file system. Here is the script.

#You need to include cmdlets from following location before using this function
#Invoke-Script 'master:/sitecore/system/Modules/PowerShell/Script Library/Platform/Functions/Remoting'
function Upload-MultipleImages {
    [CmdletBinding()]
    param(
        #Remote site host name
        [Parameter(Position=0, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$SiteUrl,
        
        #Remote site username
        [Parameter(Position=1, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$Username,

        #Password for above username
        [Parameter(Position=2, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$Password,

        #Folder location in the File System where images are located
        [Parameter(Position=3, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$SourcePath,

        #Image file extension jpg, png etc.
        [Parameter(Position=4, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [string]$FileExtension,

        #Media Library location
        [Parameter(Position=5, Mandatory=$true)]
        [ValidateNotNullOrEmpty()]
        [String]$MediaPath

    )
    #Connect to Remote Sitecore site
    Set-SitecoreConfiguration $SiteUrl $Username $Password
    #Get all the Image file names
    $images=Get-ChildItem "$SourcePath*.$FileExtension" -name
    foreach($image in $images)
    {
        $MediaImagePath="$MediaPath$image"
        #Check if the image already exists in the Media Library location.
        #If exists, skip it but log in the Sitecore log file and in the console
        #If doesn't exists, upload the image and log the message
        if(Test-Path $MediaImagePath.TrimEnd(".$FileExtension"))
        {
            Write-Log "Image $Image already exists... skipping"
            Write-Host "Image $Image already exists... skipping"
        }
        else
        {
            Write-Log "Uploading Image $Image ..."
            Write-Host "Uploading Image $Image ..."
            Upload-SitecoreFile -remotePath $MediaImagePath -File "$SourcePath$image"
            Write-Log "Uploading Image $Image ... done."
            Write-Host "Uploading Image $Image ... done."
        }
    }
}

Once this function was created, I created the script below to call it.

#Include SPE remoting scripts
Invoke-Script 'master:/sitecore/system/Modules/PowerShell/Script Library/Platform/Functions/Remoting'
#Include my function
Invoke-Script 'master:/sitecore/system/Modules/PowerShell/Script Library/Nishtech/Functions/Upload-MultipleImages'
$StartTime=Get-Date
#Get context item
$item=Get-Item .
#Get full path of the context item (media folder)
$MediaPath=$item.FullPath
Upload-MultipleImages 'http://myapp.local.com' 'admin' 'b' 'C:\Work\Project Docs\myapp\Images\' 'jpg' "$MediaPath/"
$EndTime=Get-Date
#Show total time taken
new-timespan -Start $StartTime -End $EndTime

I added the above script at the location shown in the following image.

 ScriptLocation

The reason for storing the script there was to use the context menu, as shown below; I can right-click on any media folder and run the script from the context menu.

ContextMenu

I also added a rule for the script so that the context menu only shows up for the Media Library folders.

Rule

The context menu will not show up if the module is not activated. Make sure the following option is checked for the module where the script is stored.

ModuleActivation

Now, if I run the script from the context menu, it runs and shows me the following progress box.

ProgressBox

If I click on the ‘View Script Results’ link, I can see the result in the SPE console window, shown in the snapshot below.

Console

I wanted to run the same script from a scheduled task. I had to create a separate script for that because in this case there is no context media folder, so I need to pass the destination folder in from the script itself. It is the same script; the only change is the following line.

Upload-MultipleImages 'http://myapp.local.com' 'admin' 'b' 'C:\Work\Project Docs\Myapp\Images\' 'jpg' 'master:/sitecore/media library/Images/Products'

Here are the scheduled task settings:

ScheduledTask

That’s all I had to do for loading images.

Issues:

As I mentioned at the beginning, I faced a couple of issues working on this. I will discuss them now, since I learned a few things from them.

Routing Issue:

One of my applications included in the Sitecore desktop has a path that contains /Console/. After installing SPE, I found that this application stopped working and I was getting the following error.

RoutingError

I opened up Cognifide.PowerShell.Core.Processors.RewriteUrl in dotPeek and found that the following code in the preprocessRequest pipeline looks for any route that includes "/Console/" and rewrites it to "/sitecore modules/PowerShell/".

        Uri url = arguments.Context.Request.Url;
        string localPath = url.LocalPath;
        if (localPath.StartsWith("/Console/", StringComparison.OrdinalIgnoreCase))
          WebUtil.RewriteUrl(new UrlString()
          {
            Path = localPath.ToLowerInvariant().Replace("/console/", "/sitecore modules/PowerShell/"),
            Query = url.Query
          }.ToString());

The solution was to write my own processor and replace the SPE processor with it. I just changed the code to look for the more specific route "/Console/Services/". Simple fix.

        Uri url = arguments.Context.Request.Url;
        string localPath = url.LocalPath;
        if (localPath.StartsWith("/Console/Services/", StringComparison.OrdinalIgnoreCase))
          WebUtil.RewriteUrl(new UrlString()
          {
            Path = localPath.ToLowerInvariant().Replace("/console/", "/sitecore modules/PowerShell/"),
            Query = url.Query
          }.ToString());

 No Rules Available in the Rules Editor:

When I tried to set the rule for my script, I found that there were no out-of-the-box Sitecore rules in the Rules Editor. Adam pointed out that a patch exists in the Sitecore Marketplace download area for this issue. After I installed that patch, the rules started showing up.

RulesPatch

Final Words:

This is my first encounter with Sitecore Powershell and I am extremely excited after realizing its usefulness in Sitecore projects. I will be using it in the future and will keep blogging about it.


Posted in Powershell, Sitecore