As part of the Microsoft Azure 2018 Challenge, we hosted the Wildfire AWARE application entirely on Microsoft Azure and set up a full CI/CD pipeline.
Wildfire AWARE consists of two separate parts: a web application built on Python’s Django framework, and a data retrieval, storage and analysis process that runs daily. Both were set up on the same Microsoft Azure instance because they require shared file storage: the web application’s SQLite database is frequently updated with the output of the daily analysis process (the machine learning model applied to newly retrieved data). Because of this, the standard web hosting solution provided by Microsoft Azure could not be used as-is. We got around it by writing a custom .cmd deployment script that runs at bootup/deployment time. Simple examples of how this works can be seen here: https://azure.microsoft.com/en-gb/resources/templates/django-app/
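To illustrate the shared-storage arrangement, the sketch below shows how a daily job could write model output into the web application’s SQLite database. This is a minimal, hypothetical example: the table name, columns and `store_predictions` helper are illustrative, not the actual Wildfire AWARE schema.

```python
import sqlite3
from datetime import date

def store_predictions(db_path, predictions):
    """Write (region, risk_score) pairs into the app's SQLite database.

    Hypothetical daily job: the Django web app reads from the same
    SQLite file, so the analysis process can update it in place.
    """
    conn = sqlite3.connect(db_path)
    try:
        conn.execute(
            "CREATE TABLE IF NOT EXISTS fire_risk "
            "(region TEXT, day TEXT, risk REAL)"
        )
        conn.executemany(
            "INSERT INTO fire_risk (region, day, risk) VALUES (?, ?, ?)",
            [(r, date.today().isoformat(), s) for r, s in predictions],
        )
        conn.commit()
    finally:
        conn.close()
```

Because SQLite is a single file on the shared storage, both processes see the same data without a separate database server, which is what makes the single-instance setup necessary.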
We followed Azure’s detailed documentation to set up a standard full continuous integration and deployment (CI/CD) pipeline. Whenever we merge changes into the master branch, both our end-to-end and unit tests are run; the updated version of the application is then automatically deployed and booted using our custom .cmd deployment script. Up-to-date documentation for this setup can be found here: https://docs.microsoft.com/en-us/azure/devops-project/azure-devops-project-github
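The flow above can be sketched as an Azure Pipelines configuration fragment. This is a hedged illustration only: the app name, service connection and test command are placeholders, not our actual pipeline definition.

```yaml
# Hypothetical azure-pipelines.yml fragment: test on every merge to
# master, then deploy the Django app to Azure App Service.
trigger:
  branches:
    include:
      - master

pool:
  vmImage: 'ubuntu-latest'

steps:
  - script: python -m pip install -r requirements.txt
    displayName: 'Install dependencies'
  - script: python manage.py test
    displayName: 'Run unit and end-to-end tests'
  - task: AzureWebApp@1
    inputs:
      azureSubscription: '<service-connection>'  # placeholder
      appName: 'wildfire-aware'                  # placeholder app name
      package: '$(System.DefaultWorkingDirectory)'
```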
Finally, our gradient boosted decision tree model was developed using Microsoft’s LightGBM library rather than TensorFlow or Python’s scikit-learn. A major benefit of LightGBM’s implementation of gradient boosted decision trees over earlier implementations is its efficiency on larger data sets: its authors’ experiments show training speed-ups of over 20 times while maintaining the same accuracy. This was especially useful for this project, as model training was mostly conducted on our personal computers. A paper with details on the efficiency of LightGBM can be found here: https://papers.nips.cc/paper/2017/hash/6449f44a102fde848669bdd9eb6b76fa-Abstract.html