Order fulfillment planning engine

The Problem

Sourcing items, storing them in a warehouse, and delivering them to a customer's door is a complex problem to solve. Many companies spend millions of dollars building supply chain systems to handle it. SPUD is a Vancouver-based company that started down that journey. SPUD had a local warehouse with well-trained staff who excelled at the specific requirements of picking and packing and all the rules associated with that process. However, that expertise alone could not scale, so they decided to automate a vital step of the order fulfillment process: order planning. To do this, they needed an efficient planning engine.

Client Profile

SPUD.ca is a company focused on sustainably connecting local producers with local consumers. To that end, they have a sharp eye for technology and business integrity.


The Approach

When we engaged with SPUD, they were in the early stages of designing the rules and had started down the path of using Azure Logic Apps to build “business-facing” rule components that could be tweaked by warehouse operations staff. While on the surface a low-code (or no-code) solution can appear appealing for self-management, for critical business systems these solutions often degrade quickly. We reviewed the requirements and proposed a code-based serverless solution with a focus on test automation and deployment automation, so that we could maintain a degree of agility while still preserving rule fidelity.

To create a planning engine for orders that could contain various combinations of items with specific restrictions, we decided to take an iterative approach: starting with a simple proof of concept and iterating toward a full rules engine.

The Process

The idea was to eliminate as much risk as possible early on, and to prove that we could deploy and deliver a basic engine that performed well when planning thousands of orders. In addition, we discovered early on that if we made each rule self-contained, we could change which rules were applied at runtime, giving the system more flexibility.

We started by breaking down each rule as it was described to us, understanding its impact, and implementing it in code, bugs and all. Then, with basic unit tests, we arrived at a simple, functional rule. This made generating the required data more straightforward and self-contained, because we could immediately see how the data interacted with the rules.

Next, we built a basic way to compose those rules and an HTTP trigger as part of an Azure Function App so that we could test automated build, test, and deployment.
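To make the approach concrete, here is a minimal sketch of how self-contained rules and a simple composer might fit together. All type and member names here are hypothetical illustrations, not the actual SPUD codebase:

```csharp
using System.Collections.Generic;
using System.Linq;

// Hypothetical sketch: each rule is self-contained behind a common
// interface, so the set of active rules can be changed at runtime.
public class Item
{
    public string Sku;
    public bool IsFrozen;
}

public class Tote
{
    public List<Item> Items = new List<Item>();
}

public interface IPlanningRule
{
    // Takes a tentative set of totes and returns an adjusted set.
    IEnumerable<Tote> Apply(IEnumerable<Tote> totes);
}

// Example rule: frozen items must not share a tote with ambient items.
public class SeparateFrozenRule : IPlanningRule
{
    public IEnumerable<Tote> Apply(IEnumerable<Tote> totes)
    {
        foreach (var tote in totes)
        {
            var frozen = tote.Items.Where(i => i.IsFrozen).ToList();
            var ambient = tote.Items.Where(i => !i.IsFrozen).ToList();
            if (frozen.Count == 0 || ambient.Count == 0)
            {
                yield return tote; // nothing to split
                continue;
            }
            yield return new Tote { Items = ambient };
            yield return new Tote { Items = frozen };
        }
    }
}

// Composes rules in order; swapping the list changes behavior at runtime.
public class RulePipeline
{
    private readonly List<IPlanningRule> _rules = new List<IPlanningRule>();

    public RulePipeline Add(IPlanningRule rule)
    {
        _rules.Add(rule);
        return this;
    }

    public IEnumerable<Tote> Plan(IEnumerable<Tote> totes)
    {
        foreach (var rule in _rules)
            totes = rule.Apply(totes).ToList(); // materialize each stage
        return totes;
    }
}
```

Because each rule depends only on the shared interface, rules can be added, removed, or reordered without touching the others, which is what makes runtime flexibility possible.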

Then we started to add tests to see how the rules interacted with each other. Very early on, this uncovered side-effects of composing individual rules together and ways in which they conflicted, prompting further redesign and refactoring of the rules and of how they were composed.

Finally, after about two months of development, we deployed and integrated the rule engine with the development environment of the warehouse management system. After that, we continued to iterate until we identified further issues upstream; but the warehouse management system is another case study.

The Solution

Azure DevOps for Build and Deployment Automation

Very early on, we emphasized the need for build and deployment automation. We have participated in quite a few projects where this was deferred, and as a result deployments became a source of stress and risk.

Azure DevOps provides effective tools for build and deployment automation at a reasonable cost, along with cloud-based build and release agents. SPUD was already using Azure DevOps, so we integrated our process with their tools.

This made integrating with the Azure Function App for release a very straightforward process that required minimal iteration.
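As an illustration of the general shape (not SPUD's actual pipeline), a YAML build pipeline for a .NET Framework Function App with NUnit tests might look something like this; the task names are standard Azure DevOps tasks, while the paths and branch name are placeholders:

```yaml
# Illustrative azure-pipelines.yml for a .NET Framework Function App.
trigger:
  - main

pool:
  vmImage: 'windows-latest'

steps:
  - task: NuGetToolInstaller@1

  - task: NuGetCommand@2
    inputs:
      restoreSolution: '**/*.sln'

  - task: VSBuild@1
    inputs:
      solution: '**/*.sln'
      configuration: 'Release'

  # Runs the NUnit tests via the Visual Studio test adapter
  - task: VSTest@2
    inputs:
      testAssemblyVer2: '**/*Tests.dll'

  - task: PublishBuildArtifacts@1
    inputs:
      PathtoPublish: '$(Build.ArtifactStagingDirectory)'
      ArtifactName: 'drop'
```

A separate release pipeline can then pick up the published artifact and deploy it to the Function App, keeping build and release concerns independent.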

GitLab for Source Control Management

Git is a great SCM tool; at Calico, we use GitLab for hosting our repositories and some of our automation, so we decided to stick with this approach until SPUD was ready to take over hosting the codebase.

Azure Function App for RESTful API Endpoint

The Azure Functions Core Tools provide a simple way to build out Azure Function applications. Getting started with a prototype for the rule engine was simply a matter of writing the C# code for the required rules. This eliminated the need to focus on deployment and configuration early on, and allowed the team to focus on the problem at hand: order planning and rule decomposition.
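For context, the Core Tools workflow for scaffolding and running a Function App locally looks roughly like this (project and function names are placeholders for illustration):

```shell
# Scaffold a new Function App project (the runtime is chosen when prompted)
func init PlanningEngine
cd PlanningEngine

# Add an HTTP-triggered function from the built-in template
func new --template "HTTP trigger" --name PlanOrder

# Run the app locally; the endpoint is served on localhost for testing
func start
```

Being able to run the same HTTP endpoint locally that would later be deployed made the prototype loop very fast.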

NUnit for Unit Test Automation

Once we started creating rules, it became clear that we needed to test each rule both independently and as part of a subset of rules. Testing for side-effects and interactions that we didn't expect became critical for us.
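As a sketch of what such tests look like in spirit (the rule and all names are hypothetical, not the production rules), an NUnit fixture can exercise a single rule in isolation:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using NUnit.Framework;

// Hypothetical rule for illustration: splits any tote holding more
// than a maximum number of items into smaller totes.
public class MaxItemsPerToteRule
{
    private readonly int _max;

    public MaxItemsPerToteRule(int max)
    {
        if (max < 1) throw new ArgumentOutOfRangeException(nameof(max));
        _max = max;
    }

    public IEnumerable<List<string>> Apply(IEnumerable<List<string>> totes)
    {
        foreach (var tote in totes)
            for (int i = 0; i < tote.Count; i += _max)
                yield return tote.Skip(i).Take(_max).ToList();
    }
}

[TestFixture]
public class MaxItemsPerToteRuleTests
{
    [Test]
    public void Splits_an_oversized_tote()
    {
        var rule = new MaxItemsPerToteRule(2);
        var result = rule.Apply(new[] { new List<string> { "a", "b", "c" } }).ToList();
        Assert.That(result.Count, Is.EqualTo(2));
        Assert.That(result[1], Is.EqualTo(new List<string> { "c" }));
    }

    [Test]
    public void Leaves_a_small_tote_untouched()
    {
        var rule = new MaxItemsPerToteRule(2);
        var result = rule.Apply(new[] { new List<string> { "a" } }).ToList();
        Assert.That(result.Single(), Is.EqualTo(new List<string> { "a" }));
    }
}
```

The same pattern extends to interaction tests: compose two or more rules, run them over the same fixture data, and assert that the combined output still satisfies every rule's invariant.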

Lessons Learned

  • Start testing as early as possible

  • Get real-world test data; it will pay off in the long run

  • Automate deployment early; it accelerates development and builds confidence

  • Automate testing to make deployment more reliable

  • Release and integrate as early as you can

  • Decompose the problem statement to reduce complexity and take steps to arrive at a complete solution

  • Understand the business context for the technology

Technology Summary

  • Azure Function App

  • Azure DevOps Build Pipeline (YAML)

  • Azure DevOps Release Pipeline

  • .NET Framework 4.7.2 (Azure Function Library)

  • JSON and RESTful API

  • GitLab

  • NUnit Unit Testing Framework