I had a strange experience when upgrading a client's site from EPiServer 7 to EPiServer 7.5. Suddenly the site performed noticeably worse, with loading times measured in seconds. At one point it was so bad the site was hardly usable.

Naturally, we initially blamed the new code we had deployed and tried to investigate where we had gone wrong performance-wise, but further investigation made it clear that the performance drain was not coming from any new code but rather from multiple calls to the EPiServer Dynamic Data Store (DDS). This was very strange, as we had made no changes to the code handling the DDS calls and had therefore assumed that performance would be similar to before the upgrade. The performance was so appallingly bad that we had to temporarily disable some of the site's functionality. We hurriedly rewrote parts of the application to use Entity Framework instead, and most of the performance issues went away.

While this might have been an edge case (thousands upon thousands of objects in the DDS, with many nested objects and relations), and the DDS implementation could certainly have been rewritten to perform better, it still raises an interesting question.

Just how badly does the DDS perform in comparison to EF 6.1?

I decided to create a few simple test scenarios and try to make a fair comparison.

First I created a few POCO objects to store in each ORM:

[EPiServerDataStore(AutomaticallyCreateStore = true, AutomaticallyRemapStore = true)]
public class DdsLike : IDynamicData
{
    public string User { get; set; }

    public int PageId { get; set; }

    public Identity Id { get; set; }
}

[EPiServerDataStore(AutomaticallyCreateStore = true, AutomaticallyRemapStore = true)]
public class DdsComment : IDynamicData
{
    public string User { get; set; }

    public int PageId { get; set; }

    public string Text { get; set; }

    public string Heading { get; set; }

    public Identity Id { get; set; }
}

public class EfComment
{
    public int Id { get; set; }

    public string User { get; set; }

    public int PageId { get; set; }

    public string Text { get; set; }

    public string Heading { get; set; }
}

public class EfLike
{
    public int Id { get; set; }

    public string User { get; set; }

    public int PageId { get; set; }
}

Likes and comments, the bread and butter of the web!

I then created 20 000 objects (10 000 of each entity) in both Entity Framework (using EF code first) and the DDS:

public void InitializeData()
{
    var itemCount = 10000;

    using (var alloyContext = new AlloyContext())
    {
        for (int i = 0; i < itemCount; i++)
        {
            alloyContext.Likes.Add(new EfLike()
            {
                PageId = i,
                User = "bla"
            });

            alloyContext.Comments.Add(new EfComment()
            {
                PageId = i,
                User = "bla",
                Heading = "The raven",
                Text = "Once upon a midnight dreary, while I pondered weak and weary"
            });
        }

        alloyContext.SaveChanges();
    }

    var likeStore = DynamicDataStoreFactory.Instance.GetStore(typeof(DdsLike));
    var commentsStore = DynamicDataStoreFactory.Instance.GetStore(typeof(DdsComment));

    for (int i = 0; i < itemCount; i++)
    {
        likeStore.Save(new DdsLike()
        {
            PageId = i,
            User = "bla"
        });

        commentsStore.Save(new DdsComment()
        {
            PageId = i,
            User = "bla",
            Heading = "The raven",
            Text = "Once upon a midnight dreary, while I pondered weak and weary"
        });
    }
}
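
For completeness, the AlloyContext used above is just a plain EF code-first DbContext. A minimal version looks something like this (the real context may of course configure connection strings and mappings):

using System.Data.Entity;

// Minimal code-first context exposing the two EF test entities.
// Code first creates the tables from these sets on first use.
public class AlloyContext : DbContext
{
    public DbSet<EfLike> Likes { get; set; }

    public DbSet<EfComment> Comments { get; set; }
}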

I then started measuring different calls. The procedure was to set up a scenario and run it 5 000 times each against EF and the DDS, measuring the average time across all calls made.
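
As a rough sketch of the kind of harness I used (the Stopwatch loop here is illustrative, not the exact test code):

using System;
using System.Diagnostics;

public static class MicroBenchmark
{
    // Runs the given action a number of times and returns the average
    // time per call in milliseconds. 5 000 iterations matches the tests below.
    public static double AverageMilliseconds(Action action, int iterations = 5000)
    {
        // One warm-up call so JIT compilation and connection setup
        // don't skew the measurements.
        action();

        var stopwatch = Stopwatch.StartNew();

        for (int i = 0; i < iterations; i++)
        {
            action();
        }

        stopwatch.Stop();

        return stopwatch.Elapsed.TotalMilliseconds / iterations;
    }
}

Here's the breakdown of the results: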

Procedure | DDS | EF
Load 20 000 items, return count | 19 ms | 12 ms
Load 20 000 items, filter on pageId > 5000, order by pageId, return count | 88 ms | 10 ms
Load 20 000 items, filter on pageId > 5000 && < 15000, order by pageId, return count | 104 ms | 11 ms
Load 20 000 items, filter on pageId > 5000 && < 15000, order by pageId, ToList, count | 840 ms | 11 ms
Load 20 000 items, get by id 5555, order by pageId, ToList, count | 40 ms | 8 ms
Load 20 000 items, order by pageId, take 20, ToList, count | 313 ms | 33 ms
Load 20 000 items, order by pageId, skip 80, take 20, ToList, count | 420 ms | 42 ms
Load 20 000 items, order by pageId, skip 5000, take 20, ToList, count | 390 ms | 190 ms

Returning a count of 20 000 items is of course no problem for either mapper, but it's interesting to note how quickly the performance of the DDS deteriorates as soon as we add some ordering and filtering. Especially noteworthy is the case where we load our 20 000 items, filter on pageId > 5000 and < 15 000, order by pageId and then call ToList(), effectively forcing the query to execute against the database: 840 ms vs 11 ms!
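
For reference, that 840 ms vs 11 ms scenario boils down to queries along these lines (a sketch; the method names are mine and the exact test code differed in the details):

using System.Linq;
using EPiServer.Data.Dynamic;

public static class QueryComparison
{
    // The DDS side of the scenario: LINQ over the big-table store
    // via DynamicDataStore.Items<T>().
    public static int DdsFilteredCount()
    {
        var store = DynamicDataStoreFactory.Instance.GetStore(typeof(DdsComment));

        return store.Items<DdsComment>()
            .Where(c => c.PageId > 5000 && c.PageId < 15000)
            .OrderBy(c => c.PageId)
            .ToList()
            .Count;
    }

    // The EF side: the same query against an ordinary table.
    public static int EfFilteredCount()
    {
        using (var alloyContext = new AlloyContext())
        {
            return alloyContext.Comments
                .Where(c => c.PageId > 5000 && c.PageId < 15000)
                .OrderBy(c => c.PageId)
                .ToList()
                .Count;
        }
    }
}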

To make things a bit more interesting I reran the tests, this time with some related objects added in both databases. I added two nested classes to the comment class for the DDS, and simple references to two other classes in EF.

public class DdsSomeNestedClass
{
    public int MyProperty { get; set; }

    public string MyString { get; set; }

    public Guid Id { get; set; }

    public int PageId { get; set; }

    public string SomethingElse { get; set; }

    public Guid GuidId { get; set; }
}

public class DdsSomeOtherClass
{
    public string MyProperty { get; set; }

    public int SomeId { get; set; }

    public int PageId { get; set; }

    public string Something { get; set; }

    public DateTime Date { get; set; }

    public DateTime ChangedDate { get; set; }

    public Guid GuidMaster { get; set; }
}
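
The comment entities were then extended to reference these classes, roughly along these lines (the property names, and the EfSomeNestedClass/EfSomeOtherClass counterparts on the EF side, are illustrative rather than my exact test code):

// DDS: the related objects are serialized and stored with the comment
// in the big table.
[EPiServerDataStore(AutomaticallyCreateStore = true, AutomaticallyRemapStore = true)]
public class DdsComment : IDynamicData
{
    public string User { get; set; }

    public int PageId { get; set; }

    public string Text { get; set; }

    public string Heading { get; set; }

    public Identity Id { get; set; }

    public DdsSomeNestedClass Nested { get; set; }

    public DdsSomeOtherClass Other { get; set; }
}

// EF: plain navigation properties that code first maps to foreign keys.
// EfSomeNestedClass and EfSomeOtherClass are hypothetical EF counterparts
// of the DDS classes above.
public class EfComment
{
    public int Id { get; set; }

    public string User { get; set; }

    public int PageId { get; set; }

    public string Text { get; set; }

    public string Heading { get; set; }

    public virtual EfSomeNestedClass Nested { get; set; }

    public virtual EfSomeOtherClass Other { get; set; }
}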

I ran all the tests again and performance went down quite dramatically for the DDS while EF kept going at about the same pace:

Procedure | DDS | EF
Load 20 000 items, return count | 11 ms | 12 ms
Load 20 000 items, filter on pageId > 5000, order by pageId, return count | 78 ms | 8 ms
Load 20 000 items, filter on pageId > 5000 && < 15000, order by pageId, return count | 78 ms | 13 ms
Load 20 000 items, filter on pageId > 5000 && < 15000, order by pageId, ToList, count | 1346 ms | 212 ms
Load 20 000 items, get by id 5555, order by pageId, ToList, count | 42 ms | 6 ms
Load 20 000 items, order by pageId, take 20, ToList, count | 407 ms | 33 ms
Load 20 000 items, order by pageId, skip 80, take 20, ToList, count | 420 ms | 36 ms
Load 20 000 items, order by pageId, skip 5000, take 20, ToList, count | 500 ms | 40 ms

There's absolutely nothing scientific about these tests, but it's still interesting to note how poorly the DDS performs in comparison on certain calls, especially when filtering and sorting are involved.

To conclude, we can say that using EF (or some other ORM that does not rely on a big-table solution) is always preferable. If you're going to use the DDS, make sure to cache your calls and try to keep them as simple as possible for best performance.
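
As a final note, here is a minimal sketch of the kind of caching I mean, using MemoryCache from System.Runtime.Caching (the cache key and the five-minute expiry are arbitrary choices for illustration):

using System;
using System.Collections.Generic;
using System.Linq;
using System.Runtime.Caching;
using EPiServer.Data.Dynamic;

public static class CommentRepository
{
    // Caches the DDS query result so repeated page views don't hit
    // the big table every time. Key and expiry are illustrative.
    public static List<DdsComment> GetComments(int pageId)
    {
        var cacheKey = "comments:" + pageId;
        var cached = MemoryCache.Default.Get(cacheKey) as List<DdsComment>;

        if (cached != null)
        {
            return cached;
        }

        var store = DynamicDataStoreFactory.Instance.GetStore(typeof(DdsComment));

        var comments = store.Items<DdsComment>()
            .Where(c => c.PageId == pageId)
            .ToList();

        MemoryCache.Default.Set(cacheKey, comments, DateTimeOffset.Now.AddMinutes(5));

        return comments;
    }
}

If you go down this route, remember to evict the cache entry whenever a comment is saved or deleted, or you will serve stale data for up to the expiry period.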