I’ve been working quite a lot on database-intensive applications lately and have been setting up a simple repository pattern with StructureMap that I thought might be of interest.

First things first: I love StructureMap. It's an excellent IoC framework and it makes life so much easier when used to its full potential, not just for dependency injection but for making applications truly loosely coupled. One often overlooked feature is StructureMap's ability to decorate an inner class with a wrapper. Have a look at this, for example:

For<IMovieRepository>().Use<MovieRepository>();
For<IMovieRepository>().DecorateAllWith<CachingMovieRepository>();

What we're doing here is simply registering a concrete implementation for IMovieRepository called MovieRepository and then decorating, or wrapping, it with another concrete implementation called CachingMovieRepository. We could continue this chain with another wrapper, and another, and another. Perhaps something like this:

For<IMovieRepository>().Use<MovieRepository>();
For<IMovieRepository>().DecorateAllWith<CachingMovieRepository>();
For<IMovieRepository>().DecorateAllWith<AuditingMovieRepository>();
For<IMovieRepository>().DecorateAllWith<LoggingMovieRepository>();

The outermost wrapper would now be a LoggingMovieRepository, which in turn would call an AuditingMovieRepository, which in turn would call a CachingMovieRepository that finally would call the MovieRepository and return some movies (hopefully).

This might sound weird and not make much sense if you're not into IoC or the decorator pattern, but let me continue the example by showing the IMovieRepository interface (simplified, of course) and some possible implementations of the different decorators.

public interface IMovieRepository {
    Movie GetById(int id);
}

The interface, simplified to one method.

public class MovieRepository : IMovieRepository {

    private readonly AppDbContext _dbContext;

    public MovieRepository(AppDbContext dbContext) {
        _dbContext = dbContext;
    }

    public Movie GetById(int id) {
        //Some db-call to fetch the movie and return it, for example:
        return _dbContext.Movies.Find(id);
    }
}

The only concrete implementation that actually does the heavy lifting of getting a movie from the database. Nothing really to see here.

public class CachingMovieRepository : IMovieRepository {

    private readonly IMovieRepository _repo;
    private readonly ICacheManager _cache;

    public CachingMovieRepository(IMovieRepository repo, ICacheManager cacheManager) {
        _repo = repo;
        _cache = cacheManager;
    }

    public Movie GetById(int id) {
        //We try to get the movie from the cache
        var movie = _cache.TryGetMovie(id);
        
        if(movie == null)
        {
            return _repo.GetById(id);
        }
        return movie;
    }
}

Now it's getting interesting: the caching repo implements the interface and takes another IMovieRepository as a constructor parameter to call in case it does not find the requested movie in the cache. It also needs an ICacheManager to do the actual cache lookup.
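The ICacheManager used above isn't shown in the post, so here's a minimal sketch of what an interface matching those calls could look like (the TryGetMovie and SetMovie names are purely illustrative):

public interface ICacheManager {
    //Returns the cached movie, or null if it isn't in the cache
    Movie TryGetMovie(int id);

    //Stores a movie in the cache for later lookups
    void SetMovie(int id, Movie movie);
}

In a real implementation the caching decorator would probably also call SetMovie after fetching a movie from the inner repository, so that the next lookup can be served from the cache. With that aside, here's the auditing decorator: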

public class AuditingMovieRepository : IMovieRepository {

    private readonly IMovieRepository _repo;
    private readonly IAuditManager _audit;

    public AuditingMovieRepository (IMovieRepository repo, IAuditManager audit){
        _repo = repo;
        _audit = audit;
    }

    public Movie GetById(int id) {
        //We audit the call
        _audit.AuditGetMovie(id);
        
        return _repo.GetById(id);
    }
}

And we can see the pattern forming: every IMovieRepository takes another IMovieRepository to call after it has completed its own operation. In this case that means auditing the call to the database, perhaps saving who accessed which table and which entity.
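The IAuditManager isn't defined in the post either; a minimal version matching the call above might simply be (AuditGetMovie is an assumed name):

public interface IAuditManager {
    //Record that a movie with the given id was read, e.g. to an audit log or table
    void AuditGetMovie(int id);
}

And here's the logging decorator: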

public class LoggingMovieRepository : IMovieRepository {

    private readonly IMovieRepository _repo;
    private readonly ILogManager _log;

    public LoggingMovieRepository (IMovieRepository repo, ILogManager log){
        _repo = repo;
        _log = log;
    }

    public Movie GetById(int id) {
        //We log the call
        _log.LogInfo(String.Format("Getting movie {0} from database", id));

        return _repo.GetById(id);
    }
}

And we now have four completely different implementations of IMovieRepository, each with its own unique responsibility. And the best part? They need to know absolutely nothing about each other. We could even switch the order of operations around by simply changing our StructureMap configuration, like so:

For<IMovieRepository>().Use<MovieRepository>();
For<IMovieRepository>().DecorateAllWith<CachingMovieRepository>();
For<IMovieRepository>().DecorateAllWith<LoggingMovieRepository>();
For<IMovieRepository>().DecorateAllWith<AuditingMovieRepository>();

And suddenly the auditing happens before the logging.

To be able to fully utilize this pattern I have composed a simple asynchronous and generic repository interface that looks like this:

public interface IRepository<TEntity>
    {
        Task<TEntity> GetByIdAsync(int id);
 
        IQueryable<TEntity> Find(Expression<Func<TEntity, bool>> predicate);
 
        IQueryable<TEntity> GetAll();
 
        Task UpdateAsync(TEntity entity);
 
        Task InsertAsync(TEntity entity);
 
        Task DeleteAsync(TEntity entity);
    }

This, combined with an abstract base class, forms a solid foundation for data access that can easily be extended with the decorator pattern. Here's the base class:

public abstract class BaseRepository<TEntity> : IRepository<TEntity> where TEntity : class
    {
        protected DbSet<TEntity> DbSet;
 
        protected readonly AppDbContext _dbContext;
 
        public BaseRepository(AppDbContext dbContext)
        {
            _dbContext = dbContext;
            DbSet = _dbContext.Set<TEntity>();
        }
 
        public virtual IQueryable<TEntity> GetAll()
        {
            return DbSet;
        }
 
        public virtual async Task<TEntity> GetByIdAsync(int id)
        {
            return await DbSet.FindAsync(id);
        }
 
        public virtual IQueryable<TEntity> Find(Expression<Func<TEntity, bool>> predicate)
        {
            return DbSet.Where(predicate);
        }
 
        public virtual async Task UpdateAsync(TEntity entity)
        {
            _dbContext.SetModified(entity);
            await _dbContext.SaveChangesAsync();
        }
 
        public virtual async Task InsertAsync(TEntity entity)
        {
            DbSet.Add(entity);
            await _dbContext.SaveChangesAsync();
        }
 
        public virtual async Task DeleteAsync(TEntity entity)
        {
            DbSet.Attach(entity);
            DbSet.Remove(entity);
            await _dbContext.SaveChangesAsync();
        }
    }

A very rudimentary implementation of the interface, but it works for most simple data access scenarios. One thing to note is that I have extended the regular DbContext with a virtual method, SetModified(entity). This is merely for unit testing purposes, as it is otherwise hard to unit test the non-virtual Entry() method.

Here's the SetModified method for the sake of completeness:

//This is here only for unit testing purposes
public virtual void SetModified(object entity)
    {
        Entry(entity).State = EntityState.Modified;
    }

There we go, all the plumbing done and over with... Let's say we now actually want to create a movie repository and start storing some movies.

Let's start by creating our movie class, a simple POCO (we'll use Entity Framework Code First to create the database).

public class Movie
{
    //Primary key - needed by EF Code First and by GetByIdAsync
    public int Id { get; set; }
    public string Name { get; set; }
    public int Length { get; set; }
    public string Description { get; set; }
}

A lot more should of course go into this class, like directors, genres etc., but let's keep it simple for now.

We need to add this as a DbSet to our DbContext class, something like this.

public class AppDbContext : DbContext {
    public DbSet<Movie> Movies { get; set; }
}

And now we just need an implementation of BaseRepository and we're ready to access some data.

public class MovieRepository : BaseRepository<Movie> {
    public MovieRepository(AppDbContext context) : base(context) {
    }
}
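Tying this back to the SetModified method from earlier: because Set<TEntity>(), SaveChangesAsync() and SetModified() are all virtual, the repository can now be unit tested against a mocked context without a real database. Here's a rough sketch of such a test, assuming NUnit and Moq (neither the test nor those libraries are part of the original setup):

using System.Data.Entity;
using System.Threading.Tasks;
using Moq;
using NUnit.Framework;

[TestFixture]
public class MovieRepositoryTests
{
    [Test]
    public async Task UpdateAsync_marks_entity_as_modified_and_saves()
    {
        //Arrange: mock the context; Set<Movie>(), SetModified and SaveChangesAsync
        //are all virtual, so Moq can intercept them
        var context = new Mock<AppDbContext>();
        var dbSet = new Mock<DbSet<Movie>>();
        context.Setup(c => c.Set<Movie>()).Returns(dbSet.Object);
        context.Setup(c => c.SaveChangesAsync()).Returns(Task.FromResult(1));

        var repository = new MovieRepository(context.Object);
        var movie = new Movie { Id = 1, Name = "Alien" };

        //Act
        await repository.UpdateAsync(movie);

        //Assert: we never have to touch the non-virtual Entry() method,
        //only our virtual SetModified wrapper
        context.Verify(c => c.SetModified(movie), Times.Once);
        context.Verify(c => c.SaveChangesAsync(), Times.Once);
    }
}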

Now, if we had an MVC controller where we needed to access a Movie and use it as a model, we would do something along these lines:

public class MoviesController : Controller {

    private readonly IRepository<Movie> _repo;
    
    public MoviesController(IRepository<Movie> repo) {
        _repo = repo;
    }

    public async Task<ActionResult> Index(int id) {
        var model = await _repo.GetByIdAsync(id);

        return View(model);
    }
}

We use constructor injection here to inject an IRepository<Movie> into our controller. This would be populated by StructureMap and could be wrapped by any number of decorators implementing the IRepository interface. And that's the beauty of it! The controller doesn't even need to know which implementation it will be calling or what that implementation will be doing. All it needs to know is that it will be passed an IRepository dependency and should expect to get a Movie entity from it.
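For completeness, the StructureMap wiring for the generic repository could look something like the lines below, in the same style as the registrations at the top of this post. CachingRepository<Movie> is hypothetical here: it would be an IRepository<Movie> decorator written along the same lines as CachingMovieRepository, just against the generic interface.

For<IRepository<Movie>>().Use<MovieRepository>();
For<IRepository<Movie>>().DecorateAllWith<CachingRepository<Movie>>(); //hypothetical decorator

With that in place StructureMap can hand the controller a fully decorated IRepository<Movie>, and the controller stays blissfully unaware of caching, logging or anything else going on behind the interface.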

I've been using several different ways of highlighting code on this blog over the years. Since I re-launched it about a year ago I've used only Windows Live Writer as my tool for authoring blog posts and have tried a number of plug-ins with varying success. The formatting of the code is usually quite alright, but the mark-up it spits out is just horrific. It's like the old days of pasting from Word documents into Dreamweaver and seeing something like this:

<p class=”MsoNormal”><p class=”MsoTac1><font size=”5” face=”Cambria”><span style=”font-family:Cambria;font-size:18.0pt;font-weight:bold”>The quick brown fox jumped all the hell around and spewed terrible html mark up to the left and right</span></font></p></p>

Absolutely terrifying. Anyway, I decided it was time for a spring cleaning and looked around at different options. I had a particular set of goals in mind:

  • Readability – It must be easy for consumers of the blog to read the code.
  • Extensibility – I must be able to extend the functionality of the highlighting as I see fit.
  • Simplicity – It must be simple for me to paste and edit code with it using WLW.
  • Speed – It must load fast and be lightweight so as not to impact the load time of the blog negatively.
  • Languages – It must support different languages such as JavaScript, C#, markup, XML etc.

I stumbled around and tried a few until I found my perfect match: Prism.js! I selected the dark (almost Atom-like) theme Okaidia by Ocodia just to get some contrast to the themes I used before.

The Prism.js core is less than 2k, blazingly fast and uses semantic, non-cluttering HTML5 mark-up. It's also fully open source and supports just about every programming language you can think of.

Here's what it looks like for some JavaScript:

var inputDelay = (function () {
    var timer = 0;
    return function (callback, ms) {
        clearTimeout(timer);
        timer = setTimeout(callback, ms);
    };
})();

Now that's nice, but what's even nicer is that I just copied this code snippet, pasted it into Windows Live Writer, and it automatically wrapped it in <pre> and <code> tags for me, which is all Prism.js needs to do its magic! In this case, as it's JavaScript, I had to manually add the class="language-javascript" attribute, but that's a minor price to pay. I have defaulted my language to C#, so if I were to paste some cool .NET stuff in here like this:

public enum Suits
{
    Spades,
    Hearts,
    Clubs,
    Diamonds,
    NumSuits
}

public void PrintAllSuits()
{
    foreach (string name in Enum.GetNames(typeof(Suits)))
    {
        System.Console.WriteLine(name);
    }
}

I wouldn’t even have to add the class. It just automatically works. Now that’s simplicity!

To the cloud!

While I was in the mood for changing things up I also decided to move the hosting of my website to Azure. Not because I was dissatisfied with my current web hosting at Binero, but because I got inspired by Troy Hunt and his website http://worldsgreatestazuredemo.com/. The process of setting up a new site in Azure is dead simple; it takes about 5 minutes to have a brand new site up and running. In my case I set it up to deploy from GitHub, which means I actually write and publish my blog posts locally and then commit them to Git. Azure automatically picks them up and deploys them. This brought one problem though… comments.

I previously handled comments as part of the blog post. Every comment would be stored in the XML representing a post, and that was all fine and dandy. However, as I am now pushing my posts from source control, every commit would overwrite any comments. How do we get around that?

The cloud to the rescue! There are several different cloud-based commenting solutions, but I decided to go with Disqus: a clean and simple API, support for Facebook, Twitter and Google logins, and an unobtrusive interface.

So what do you think of the new theme for code highlighting? Let me know in the (new) comments.

The new Inversion of Control pattern introduced with EPiServer 7 is, in my opinion, one of the best new features of EPiServer CMS. Anywhere in your application you can simply call ServiceLocator.Current.GetInstance<ISomeInterface>() and it will return the concrete implementation of your choosing for that interface.

This is of course nothing new, StructureMap has been around for years and there are dozens of other excellent frameworks for IoC.

However, in the EPiServer world this is still kind of new, and I've seen several EPiServer projects where the concept of IoC is still largely misunderstood. This is not a post about IoC as such, but if you're interested, here's an introduction to StructureMap; it's a pretty old post but still interesting.

I was working on a project recently where we created a couple of concrete implementations for cache handling: a CacheManager and a NullCacheManager. The CacheManager utilized HttpRuntime.Cache in the background, while the NullCacheManager simply cached nothing, as one would expect. They both implement the ICacheManager interface, shown highly simplified below.

public interface ICacheManager {

    T GetCachedItem<T>(string key, Func<T> uncachedMethod);

    void SetCachedItem<T>(string key, T item);

    void RemoveCachedItem(string key);

}
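The implementations themselves aren't shown, but the NullCacheManager is simple enough to sketch and illustrates the general shape (a sketch rather than the actual code from that project):

public class NullCacheManager : ICacheManager {

    //Nothing is ever cached, so always fall back to the uncached method
    public T GetCachedItem<T>(string key, Func<T> uncachedMethod) {
        return uncachedMethod();
    }

    //Deliberately does nothing
    public void SetCachedItem<T>(string key, T item) { }

    //Nothing to remove
    public void RemoveCachedItem(string key) { }
}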

This was all good and we could now easily decouple our design using code looking something like this.

public void SomeMethod() {

    var cacheManager = ServiceLocator.Current.GetInstance<ICacheManager>();

    var cachedItem = cacheManager.GetCachedItem<SomeObject>("key", SomeMethodThatFetchesSomeObjectUncached);

}

This is great and gives us a very loosely coupled design. We could easily create a new concrete type of ICacheManager such as SqlCacheManager or FileSystemCacheManager and switch between the implementations using a ConfigureableModule, something like this:

public class ConfigureableModule : IConfigurableModule
{
    public void ConfigureContainer(ServiceConfigurationContext context)
    {
        var container = context.Container;

        container.Configure(c => c.For<ICacheManager>().Use<CacheManager>());
    }
}

We could also have different concrete implementations for different environments by using profiles, like this:

public class ConfigureableModule : IConfigurableModule
{
    public void ConfigureContainer(ServiceConfigurationContext context)
    {
        var container = context.Container;

        container.Configure(c =>
        {
            c.Profile("debug", ctx => { ctx.For<ICacheManager>().Use<NullCacheManager>(); });

            c.Profile("release", ctx => { ctx.For<ICacheManager>().Use<CacheManager>(); });
        });
    }
}

But wouldn't it be pretty cool if we could switch the implementation at runtime? Turns out it's not too difficult to achieve… Let's create an admin plugin where an administrator can choose which implementation of ICacheManager the site should currently use.

We start out by creating a simple plugin:

[EPiServer.PlugIn.GuiPlugIn(Area = EPiServer.PlugIn.PlugInArea.AdminConfigMenu, Url = "/modules/samplesite/ChangeCacheManager/Index", DisplayName = "Cache management")]
[Authorize(Roles = "Administrators")]
public class ChangeCacheManagerController : Controller
 {
	public ActionResult Index() { return View(); }
 }

Remember that we need to add this nonsense to the episerver.shell part of our web.config as well for our module to be picked up:


<episerver.shell>
    <publicModules rootPath="~/modules/" autoDiscovery="Minimal">
      <add name="samplesite">
        <assemblies>
          <add assembly="WhateverYouNameYourAssembly" />
        </assemblies>
      </add>
   </publicModules>
</episerver.shell>

Also, please note that I've added an authorization attribute to the controller to make sure no one but an administrator stumbles upon it.

Add an empty index.cshtml view for now and type something awesome in it. Compile and run and make sure your plugin appears in the admin config menu.

Alright, next let's create a model for our view.

public class CacheManagerViewModel {

	public string SelectedManager { get; set; }

	public List<SelectListItem> ConfiguredManagers { get; set; }

	public CacheManagerViewModel() {
		ConfiguredManagers = new List<SelectListItem>();
	}
}

Simple: all our plugin needs is a list of the available cache managers and a string representing the currently selected one. OK, let's flesh out our view next.

@model Afa.web.AdminPlugins.Models.CacheManagerViewModel
<!DOCTYPE html>
<html>
    <head>
        <link rel="stylesheet" type="text/css" href="/EPiServer/Shell/7.11.1.0/ClientResources/epi/themes/legacy/ShellCore.css">
        <link rel="stylesheet" type="text/css" href="/EPiServer/Shell/7.11.1.0/ClientResources/epi/themes/legacy/ShellCoreLightTheme.css">
        <link href="../../../App_Themes/Default/Styles/ToolButton.css" type="text/css" rel="stylesheet">
        <title>Cache management</title>
    </head>
    <body>
        <div class="epi-contentContainer epi-padding">
            <h1>Select active cachemanager</h1>
            <p>Select which CacheManager implementation to use for this site.</p> 

            @using (Html.BeginForm())
            {
                <div class="epi-padding">
                    <p>Currently active cachemanager: <strong>@Model.SelectedManager</strong></p>
                    <div class="epi-size25">                       
                        <div>

                            <label>Select cachemanager: </label>

                            @Html.DropDownListFor(m => m.SelectedManager, Model.ConfiguredManagers, new { @class = "episize240" })

                        </div>
                    </div>
                </div> 

                    <div class="epi-buttonContainer">

                    <span class="epi-cmsButton">

                        <input class="epi-cmsButton-text epi-cmsButton-tools epi-cmsButton-Save" type="submit" value="Save" />

                    </span>

                </div>
            }
        </div>
    </body>
</html>

Now we have a model and a view, all that’s left is to add some code to our controller:


public ActionResult Index()
{
    return View(GetViewModel());
}

[HttpPost]
public ActionResult Index(CacheManagerViewModel model)
{
    var container = ServiceLocator.Current.GetInstance<IContainer>(); 

    var selectedManager = Type.GetType(model.SelectedManager); 

    container.Model.EjectAndRemoveTypes(t => t == selectedManager);

    var instance = (ICacheManager)container.GetInstance(selectedManager);

    container.Configure(x => x.For<ICacheManager>().Use(instance));

    return View(GetViewModel());
} 

private CacheManagerViewModel GetViewModel()
{
    var model = new CacheManagerViewModel();

    var currentActiveManager = ServiceLocator.Current.GetInstance<ICacheManager>();

    model.SelectedManager = currentActiveManager.GetType().Name;

    foreach (var manager in ServiceLocator.Current.GetAllInstances<ICacheManager>())
    {
        model.ConfiguredManagers.Add(new SelectListItem()
        {
            Text = manager.GetType().Name,
            Value = manager.GetType().AssemblyQualifiedName,
            Selected = currentActiveManager.GetType() == manager.GetType()
        });
    }
    return model;
}

There are a few interesting points in the code above. Let's first consider the GetViewModel() method. We simply get the currently active concrete implementation and then loop through all instances of ICacheManager that are currently registered in our IoC container. This brings us to an important point: all implementations of ICacheManager must have been registered in our container or they won't be returned by the GetAllInstances method. This can be done quite simply by using AddAllTypesOf<ICacheManager>() when we configure our container. Something like this:

public void ConfigureContainer(ServiceConfigurationContext context) {

	context.Container.Configure(c => c.Scan(s => s.AddAllTypesOf<ICacheManager>()));
}

Next, let's have a look at the post method. This method looks kind of peculiar; what's all this ejecting and removing stuff? It turns out that we can't really just change which configured cache manager to use with For().Use(), because that would add another instance of the same type rather than replace the existing one. That's why we first need to eject the selected type from the container model and then re-configure it.

And that should be it! We now have a working admin plugin that lets us change the fundamental inner workings of our application at runtime. We could substitute ICacheManager for IContentRepository and radically change how we fetch our content, or substitute it for IContentRenderer and add functionality such as detailed logging to the rendering of our content… There's no end to the possibilities!

But, as we've learned from modern classics such as the Bible and The Amazing Spider-Man:

spider

Have you ever tried running an EPiServer MVC website with the latest version of the ASP.NET MVC Framework and the latest versions of all the nuget packages? Well, it won't work very well…

jsonnetfail

Unfortunately EPiServer's EPiServer.Framework nuget package only supports Newtonsoft.Json versions >= 5.0.8 and < 6.0 (which is the same thing as saying it only supports 5.0.8, as there are no other releases in that span).

This means that we will have to downgrade our Newtonsoft.Json package to be able to install the EPiServer.Framework nuget package. Fortunately this is quite simple, and WebGrease (another package used by the Microsoft ASP.NET Web Optimization package) only requires Newtonsoft.Json >= 5.0.4, so this shouldn't be a problem.

Go to your Package Manager Console and type the command:

Update-Package Newtonsoft.Json -Version 5.0.8

This will downgrade your Json.Net installation to version 5.0.8. Now we can carry on installing the EPiServer.Framework nuget package without issue.

However, you might still run into this error:

Could not load file or assembly 'Newtonsoft.Json' or one of its dependencies. The located assembly's manifest definition does not match the assembly reference.

Turns out that the assembly binding redirect in web.config is still pointing at the 6.0.x version of Newtonsoft.Json. To fix this, don't just change the redirect to 5.0.8 and hope for the best; Json.Net packages below 6.0 shipped with assembly version 4.5.0.0, so redirecting to that version should work. However, simply removing the assembly redirect is a simpler and more direct solution. It serves no purpose anyway, since we know that the version we have installed is the one we want to use, so we simply put the assembly redirect tag out of its misery.

boromir

In a quite recent blog post, Per Bjurström of EPiServer wrote about a new database version for EPiServer CMS where they actually include the database schema changes in the nuget update package.

This is an awesome step in the right direction, where everything needed to upgrade an EPiServer site is included in the nuget package, but… how do we integrate this with our continuous integration process?

Turns out it’s not so difficult at all. Looking at this blog post by Paul Stovell, the man behind Octopus Deploy, we can get some inspiration as to how this could be achieved.

Let's start by adding a console application project to our solution and adding the DbUp nuget package to it. This will be the project that contains all of our database schema changes; in most cases probably only the SQL scripts from EPiServer, but we could of course have other databases that also need to update their schemas from time to time.

We then update our EPiServer nuget packages. If we try to run our site now we’ll get the old familiar yellow screen of death saying: “The database has not been updated to the version 7007.0, current database version is 7006.0.”

dberror

This is easily remedied by running the “update-epidatabase” command from the Nuget Package Manager Console; however, this will only fix our development database. The database will still be out of sync when we deploy this release to any other environment. We need to extract the SQL script from the nuget package by using the “export-epiupdates” command:

PM> Export-EPiUpdates

 
An Export package is created C:\EPiServer\AlloyDemo\wwwroot\EPiUpdatePackage

Exporting  epiupdates into EPiUpdatePackage\EPiServer.CMS.Core.7.8.2\epiupdates

This will create a folder in our project root where we can find the SQL scripts being run: EPiUpdatePackage\EPiServer.CMS.Core.7.8.2\epiupdates\sql

episql

 

We'll grab the 7.8.0.sql file from the folder, include it in our console application and make sure to set its build action to "Embedded Resource" so the SQL script becomes part of the generated .exe file.

episql2

Next we'll add our EPiServer database connection string to our App.config.

connectionstring

It doesn't really matter what you call the connection string here, and you could of course have several if you have multiple databases that need to be updated.

Then we write some code to trigger the DbUp update in the Main method of our console app. This code is taken directly from Paul Stovell's blog post and works just perfectly.

DbUp will automatically wrap all the calls in a transaction that will only be committed if all of the calls are successful. This is really nice as it means that we won’t have to worry about leaving our database in a half-upgraded state of some kind.


static int Main(string[] args)
        {
            //Grab a reference to our connectionstring
            var connectionString = ConfigurationManager.ConnectionStrings["DatabaseConnection"].ConnectionString;

            //DeployChanges is a fluent builder for creating databases.
            //There are lots of options other than executing scripts embedded in the assembly, 
            //including from a specified file location or manually created scripts.
            var upgrader =
                DeployChanges.To
                    .SqlDatabase(connectionString)
                    .WithScriptsEmbeddedInAssembly(Assembly.GetExecutingAssembly())
                    .LogToConsole()
                    .Build();

            var result = upgrader.PerformUpgrade();

            //If the result is unsuccessful we'll change the fore color to red and display the error

            if (!result.Successful)
            {
                Console.ForegroundColor = ConsoleColor.Red;
                Console.WriteLine(result.Error);
                Console.ResetColor();
                return -1;
            }

            Console.ForegroundColor = ConsoleColor.Green;
            Console.WriteLine("Success!");
            Console.ResetColor();
            return 0;
        }

DbUp is also clever enough to add a simple database table storing every script that has already been executed so we do not need to worry about unnecessarily running scripts multiple times.

We could run this application right now, as is, and it would upgrade our database for us, but that's not our goal right now. We want Octopus Deploy (or whichever deployment service you use) to be able to run this app and upgrade our databases automatically. That's why we also add a simple PowerShell script that Octopus Deploy will run as part of the deployment process. We put this in a Deploy.ps1 file and include it in our console project, making sure to set Copy to Output Directory to "Copy if newer". This file will be automatically picked up by Octopus and run when the project has been deployed.

deployps

The Deploy.ps1 file looks like this (again courtesy of Paul Stovell):

& .\OctoSample.Database.exe | Write-Host

As I mentioned in a previous blog post I currently use TeamCity as my build server of choice and let TeamCity run OctoPack to package my projects into nuget packages that Octopus Deploy then grabs and deploys. To be able to let TeamCity pack our console application we need to install the OctoPack nuget package in it as well. Then when we check in our changeset TeamCity will pick it up and run OctoPack automatically, creating a nuget package for every project that has OctoPack installed.

The last step to get the whole process to work is configuring Octopus Deploy. Simply add a step to the deployment process called “Update database” or something similar where you fetch the nuget package “EpiDbUp”, deploy it and execute the resulting .exe file to update the database.

octopusdeploy

The complete process now looks something like this:

  • Include the .sql file we want to run in our EPiDbUp project (making sure to embed it into the .exe).
  • Check it in to source control, which triggers a build on our TeamCity build server and, upon completion, the creation of a release in Octopus Deploy.
  • Choose which environment we wish to deploy the current release to.
  • The deployment process runs in two steps, first updating the database through our console project, then deploying our web application.

And that’s it! The next time a database schema change is included in an EPiServer update we’ll simply add the script to our database project and all our environments will be automatically updated on next deploy.