Microsoft gives us a fair number of options to seamlessly connect machine learning models to our production code, and I honestly love using them all. AzureML is fantastic for many use cases, and with the Data Factory, Databricks and Data Lakes combo virtually every possible scenario can be covered really nicely.
Except, of course, if the model you need to use is hosted by a third party that doesn’t support any of these services. Then again, you might want to quickly test a few models in a POC context before committing to “productizing” them in AzureML. Or perhaps you just don’t want all your eggs in one vendor basket, or all your raindrops in a single cloud.
Worse, you might have a requirement to call an R API from D365 FinOps. In this blog post I’ll show you how.
First things first, let’s build a simple R model using the Prophet library from Facebook to do forecasting. Prophet expects a data frame with two columns: ds (the timestamps) and y (the values to forecast). Prophet also supports a lot of parameters for seasonality and such, and I suggest reading up on it.
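To make that concrete, here’s a minimal sketch of fitting Prophet in R; the sample data and the 30-day horizon are made up for illustration:

```r
library(prophet)

# Prophet wants exactly these two column names: ds (time) and y (value)
df <- data.frame(
  ds = seq(as.Date("2023-01-01"), by = "day", length.out = 100),
  y  = cumsum(rnorm(100))
)

m <- prophet(df)                                  # fit the model
future   <- make_future_dataframe(m, periods = 30) # extend 30 days ahead
forecast <- predict(m, future)                     # yhat, yhat_lower, yhat_upper

plot(m, forecast)                    # forecast plot
prophet_plot_components(m, forecast) # trend and seasonality plots
```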
For our example I’ll keep things simple and assume the R script won’t be doing any munging or wrangling as such. A clean data frame goes in, Prophet predicts, but instead of returning the y-hat values (Ŷ) we’ll make it interesting and return a set of base64-encoded PNG plots containing the forecast and seasonality trends instead.
So there are a number of challenges for us:
- We need to host this R model as an API
- We need to grab the resulting plot predictions created by Prophet
- Encode the plots to base64 and return them from the API as JSON
- Call and display this all in D365 from a form
The best way I’ve found to host R as an API is the Plumber library. So I’ve deployed a Linux environment in my cloud of choice, installed all the required R libraries (including Plumber), and set up NGINX to route incoming traffic on port 80 to Plumber, which listens on port 8000. To call this API from D365 you’ll also need to install a certificate, as only HTTPS will do between D365 and our Linux box.
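As a sketch, the Plumber side of that setup boils down to a small launcher script; the file name `plumber.R` is an assumption, and NGINX simply proxies port 80 to the port given here:

```r
# run_api.R -- start the Plumber API on port 8000 (NGINX proxies :80 -> :8000)
library(plumber)

pr <- plumb("plumber.R")             # file containing the annotated endpoints
pr$run(host = "0.0.0.0", port = 8000)
```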
The R code is shown below, detailing how we grab the plots and encode them to base64. We also receive our data frame as part of the call, so we need to URL-decode it. This will do for small data sets; for a large data set, use a different mechanism, such as passing a reference to the data, or a POST call with the data as JSON in the body. In our case the API returns JSON containing three base64-encoded plots.
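The original snippet isn’t reproduced here, so what follows is a hedged reconstruction of what such a Plumber endpoint could look like; the route, parameter name, and JSON field names are my assumptions, not the original code, and this version returns two encoded images (forecast and components) to show the mechanism rather than the post’s three:

```r
# plumber.R -- hedged reconstruction of the forecasting endpoint
library(plumber)
library(prophet)
library(base64enc)

# Draw a plot into a temporary PNG file and return it base64-encoded
to_base64_png <- function(draw) {
  f <- tempfile(fileext = ".png")
  png(f, width = 800, height = 600)
  draw()                      # render onto the PNG device
  dev.off()
  base64encode(f)
}

#* Fit Prophet on a URL-encoded CSV of ds,y pairs and return plots as JSON
#* @param data URL-encoded CSV with columns ds and y
#* @get /forecast
function(data) {
  df <- read.csv(text = URLdecode(data), stringsAsFactors = FALSE)
  df$ds <- as.Date(df$ds)

  m <- prophet(df)
  future   <- make_future_dataframe(m, periods = 30)
  forecast <- predict(m, future)

  list(
    # plot() returns a ggplot object, so it needs an explicit print();
    # prophet_plot_components() draws to the device as a side effect
    forecast   = to_base64_png(function() print(plot(m, forecast))),
    components = to_base64_png(function() prophet_plot_components(m, forecast))
  )
}
```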
Here is the code for our BuildControl; just a single CSV property is defined:
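Since the original listing isn’t included here, below is a sketch along the lines of Microsoft’s extensible-control pattern; the class and property names are illustrative, not the original code:

```xpp
// Hedged sketch of the build class exposing a single CSV design property
[FormDesignControlAttribute('RForecastControl')]
class RForecastControlBuild extends FormBuildControl
{
    str csv;

    [FormDesignPropertyAttribute('Csv', 'Data')]
    public str parmCsv(str _csv = csv)
    {
        csv = _csv;
        return csv;
    }
}
```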
Next up is the code for our Control class, which exposes the CSV property that we will populate from our X++ form.
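Again as a hedged sketch following the standard extensible-control runtime pattern, with all identifiers (class, template ID, resource path) being assumptions:

```xpp
// Hedged sketch of the runtime control class; names are illustrative
[FormControlAttribute('RForecastControl', '/Resources/html/RForecastControl',
    classstr(RForecastControlBuild))]
class RForecastControl extends FormTemplateControl
{
    FormProperty csv;

    public void new(FormBuildControl _build, FormRun _formRun)
    {
        super(_build, _formRun);
        this.setTemplateId('RForecastControl');
        this.setResourceBundleName('/Resources/html/RForecastControl');

        csv = this.addProperty(methodStr(RForecastControl, parmCsv), Types::String);
    }

    // Bound property: read from the control's JavaScript, set from X++
    [FormPropertyAttribute(FormPropertyKind::Value, 'Csv')]
    public str parmCsv(str _value = csv.parmValue())
    {
        if (!prmIsDefault(_value))
        {
            csv.setValueOrBinding(_value);
        }
        return csv.parmValue();
    }
}
```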
So far it’s all fairly simple, and we can add a demo form in X++ to use our extensible control. We’ll grab some sales orders from D365, URL-encode them manually, and then send them off to our extensible control to pass to our R API sitting somewhere outside the D365 cloud.
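A hedged sketch of that form-side logic might look like the following; the table and field choices, the control name on the form, and the method name are all illustrative:

```xpp
// Build a ds,y CSV from sales lines and push it to the extensible control
void sendToForecastControl()
{
    SalesLine salesLine;
    str csv = 'ds,y';

    while select CreatedDateTime, LineAmount from salesLine
        order by CreatedDateTime
    {
        csv += strFmt('\n%1,%2',
            date2Str(DateTimeUtil::date(salesLine.CreatedDateTime),
                321, DateDay::Digits2, DateSeparator::Hyphen,
                DateMonth::Digits2, DateSeparator::Hyphen, DateYear::Digits4),
            salesLine.LineAmount);
    }

    // URL-encode via .NET interop and hand it to the control's CSV property
    RForecastControl1.parmCsv(System.Uri::EscapeDataString(csv));
}
```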
A second or two later and we receive our plots.
Pretty simple stuff. We can extend this further by passing various parameters to the R API, for example, which time-series model we would like to use, whether to return the predicted values (Ŷ) or not, seasonality parameters and anything else we need.