Having launched an e-commerce store for 2B Scientific a few months ago, I am reminded of a quote from one of Roald Dahl's books (yes, I’m a child at heart):
"Never do anything by halves… Be outrageous. Go the whole hog. Make sure everything you do is so completely crazy it’s unbelievable."
If I had to describe the e-commerce build for 2B Scientific, that quote sums it all up. Utilising Kentico 10 as the underlying platform, we created an online store capable of serving around 1.7 million products, each carrying over forty facets of information, with the catalogue expected to grow to the 2 million mark. Crazy!
I highly recommend using Kentico for a standard website or e-commerce build. You are provided with out-of-the-box features to get a site with key functionality running very quickly, ready to expand the existing feature set and adapt to a client's custom requirements. This was one of those Kentico projects where quite a bit of customisation was carried out using the Kentico API framework.
Even with Kentico at the very heart of our application, this kind of development wasn't for the faint of heart. The question on all our minds at the start of the project was: could Kentico take on the sheer load of this many products?
From our internal testing, we found database performance degraded once the product count hovered around the 500k mark, even when hosted on a mid-range Azure database tier. This was only encountered when managing products from within the CMS Administration area. Kentico provides a very useful best-practice approach to building sites of this scale, which allowed the site to perform very well when browsed as a user. I'll be writing about Kentico performance, and how we overcame certain obstacles, in greater detail in a future blog post.
For this post, I'd like to highlight some of my favourite pieces of functionality this project had to offer…
With a site of this scale, managing and updating products is quite a mission in itself and requires some form of automation. The client receives regular spreadsheets from suppliers that encompass price and product attribute updates. A custom-developed area within the CMS was created that allows the client to upload spreadsheets containing around 65k rows of data and then field-map the data they want to import.
To allow for maximum flexibility, field mapping functionality was introduced to allow the import to adapt to different supplier spreadsheet structures as well as allowing for partial updates to existing products.
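As a rough illustration, the mapping idea can be sketched in a few lines (Python here for brevity; the real implementation was built on the Kentico API, and the column and field names below are hypothetical, not the client's actual schema):

```python
def apply_field_mapping(row, mapping):
    """Translate one spreadsheet row into product fields via a user-defined mapping.

    mapping: {spreadsheet_column: product_field}. Columns absent from the
    mapping are simply ignored, which is what makes partial updates possible
    across differently structured supplier spreadsheets.
    """
    return {field: row[column] for column, field in mapping.items() if column in row}

# Hypothetical supplier row and mapping, purely for illustration.
row = {"Supplier SKU": "AB-123", "List Price": "19.99", "Pack Size": "10"}
mapping = {"Supplier SKU": "SKUNumber", "List Price": "SKUPrice"}
apply_field_mapping(row, mapping)
# {"SKUNumber": "AB-123", "SKUPrice": "19.99"}
```

The same mapping structure serves both full imports and partial updates: a mapping covering only the price column updates prices and leaves every other product field untouched.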
Due to the nature of the data, many integrity and validation checks were carried out, as well as normalisation processes to structure the data in a form the e-commerce store could use (I am a big fan of database design and the normalisation of data!). If any irregularities were picked up during the import process, the user was notified, in a report format, of the line number and field that caused the issue.
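A minimal sketch of that per-line validation and reporting idea, with hypothetical field names and rules standing in for the real checks:

```python
def validate_rows(rows, required_fields):
    """Check each row and collect (line_number, field, problem) tuples for the report.

    Line numbers start at 2 on the assumption that row 1 of the spreadsheet
    is the header. The rules here are illustrative, not the client's.
    """
    errors = []
    for line_number, row in enumerate(rows, start=2):
        for field in required_fields:
            if not row.get(field, "").strip():
                errors.append((line_number, field, "missing value"))
        price = row.get("Price", "")
        if price:
            try:
                float(price)
            except ValueError:
                errors.append((line_number, "Price", "not a number"))
    return errors

rows = [{"SKU": "A-1", "Price": "1.50"}, {"SKU": "", "Price": "abc"}]
validate_rows(rows, ["SKU"])
# [(3, "SKU", "missing value"), (3, "Price", "not a number")]
```

Collecting every error rather than stopping at the first means one report round-trip per spreadsheet instead of one per mistake, which matters when a file is 65k rows long.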
The only downside of carrying out an import job is the performance impact it can have on both the database and website. To get around this, a separate application that ran outside the website was used to carry out all the validation checks and data restructuring tasks, before being pushed to the Kentico store.
Even though Kentico 11 has provided strong Azure Search integration, it’s unfortunate we were already in the midst of some hardcore development many months before its 11th December 2017 release date. Thus, a custom approach was integrated into our Kentico build that would send product details for indexing whenever a product was either created or updated.
Specific measures had to be taken when there were bulk imports of products. This was purely down to the service limits Azure Search places on the maximum number of items that can be sent in a batch at any one time.
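Azure Search documents a limit of 1,000 documents per indexing batch, so a bulk import has to be chunked before it is pushed to the index. The chunking itself is generic enough to sketch in a few lines:

```python
def chunk(items, batch_size=1000):
    """Yield successive batches no larger than the service limit.

    batch_size defaults to Azure Search's documented 1,000-documents-per-batch
    indexing limit; each yielded batch would be one upload request.
    """
    for start in range(0, len(items), batch_size):
        yield items[start:start + batch_size]

# A 65k-row import becomes 65 full batches of 1,000 documents each.
batches = list(chunk(list(range(65000))))
```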
One of the major benefits of using Azure Search is its ability to pull back multiple facets of information, allowing for responsive searching against vast amounts of data and their attributes. Our integration worked so closely with Kentico that we were able to reduce the number of database calls by pulling product data back directly from the index itself. I'm intrigued to see Kentico 11's approach to using Azure Search.
There was a requirement for a delivery markup to be added to products belonging to a specific manufacturer, which meant the base SKU price had to be adjusted accordingly. In addition, when products were bulk imported, prices could be converted into pounds based on the currency used in the spreadsheet.
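The pricing rule can be illustrated roughly as follows. The exchange rates and markup percentage are placeholder inputs, not the client's actual figures, and the real logic sat inside the import pipeline:

```python
from decimal import Decimal, ROUND_HALF_UP

def adjusted_price(base_price, currency, rates_to_gbp, markup_percent=0):
    """Convert a supplier price to GBP, then apply a delivery markup.

    rates_to_gbp maps a currency code to its GBP conversion rate;
    markup_percent is the manufacturer-specific delivery markup, if any.
    Decimal avoids the rounding surprises floats cause with money.
    """
    gbp = Decimal(base_price) * rates_to_gbp[currency]
    gbp *= (Decimal(100) + Decimal(markup_percent)) / Decimal(100)
    return gbp.quantize(Decimal("0.01"), rounding=ROUND_HALF_UP)

# Illustrative rate: $100 at 0.75 USD->GBP with a 10% delivery markup.
adjusted_price("100", "USD", {"USD": Decimal("0.75")}, markup_percent=10)
# Decimal("82.50")
```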
Due to the nature of the business, the client needed the ability to bulk delete products from a particular supplier on demand. Generally, this would be very straightforward on e-commerce sites with a smaller number of products. But once you take Azure Search indexing, mass deletions across multiple database tables and cache control into consideration, it's not so straightforward.
A secondary background task was developed, similar to the approach taken for mass importing products, only this time to carry out the complete opposite: delete!
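In outline, the delete task has to keep three things in step: the database rows, the Azure Search documents and the cached pages. A hedged sketch, with placeholder callables standing in for the real database, index and cache operations:

```python
def delete_supplier_products(product_ids, delete_from_db, delete_from_index,
                             clear_cache, batch_size=1000):
    """Delete a supplier's products in batches, then invalidate the cache once.

    The three callables are placeholders for the real operations; batching
    respects the same per-request limits as the import side.
    """
    for start in range(0, len(product_ids), batch_size):
        batch = product_ids[start:start + batch_size]
        delete_from_db(batch)     # remove rows across the product tables
        delete_from_index(batch)  # remove the matching search documents
    clear_cache()                 # one cache purge after the batches finish

# Illustrative dry run that just records what would be deleted.
db_calls, index_calls, cache_purges = [], [], []
delete_supplier_products([1, 2, 3], db_calls.append, index_calls.append,
                         lambda: cache_purges.append(True), batch_size=2)
```

Running the task outside the website process, as with the import job, keeps the heavy deletion work off the database connections serving live visitors.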
If you haven’t done so already, head on over to www.2BScientific.com to see all the hard work both the technical and design team poured into the site.