Nvidia Docker: microservices dockerization and automation for city-wide video surveillance

Our client operates 900+ video cameras located all over the city. The idea of the project was to collect the recorded video and store it on dedicated servers, where analysis scripts were run against it. The infrastructure had to be scalable, with a fully automated development process that allowed dynamic changes to those scripts. The work therefore centered on the dockerization, orchestration, and automation of software that uses video cards and GPUs for its computations.

[Figure: nvidia-docker layers]

The purpose of this project was to build a scalable infrastructure for microservices that rely on Nvidia CUDA (Compute Unified Device Architecture).
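As a rough sketch, launching one such CUDA-backed microservice might look like the following. The image and container names are hypothetical, and this assumes the nvidia-docker wrapper and the Nvidia driver are installed on the host:

```shell
# Hypothetical image name; assumes the nvidia-docker wrapper (v1)
# is installed on the host alongside the Nvidia driver.
# NV_GPU selects which GPU(s) the container may use.
NV_GPU=0 nvidia-docker run -d \
  --name video-analyzer \
  ourregistry/video-analyzer:latest

# Sanity check: confirm the driver and GPU are visible from
# inside a stock CUDA container.
nvidia-docker run --rm nvidia/cuda nvidia-smi
```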

We configured Continuous Integration & Continuous Delivery so that whenever an updated script is pushed to the production environment on all servers, there is always a one-click way to roll back to the previous version in case of a code error.
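One way to picture that one-click rollback is a tag-based scheme like the sketch below (the registry, image name, and build numbers are hypothetical): each CI build pushes an immutable version tag, a moving `production` tag points at the live build, and rolling back means re-pointing that tag at the previous build and redeploying.

```shell
# Deploy: each CI build is pushed under an immutable version tag,
# then the moving "production" tag is re-pointed at it.
docker pull registry.example.com/analyzer:42
docker tag  registry.example.com/analyzer:42 \
            registry.example.com/analyzer:production
docker push registry.example.com/analyzer:production

# Rollback: re-point "production" at the previous build and redeploy.
docker tag  registry.example.com/analyzer:41 \
            registry.example.com/analyzer:production
docker push registry.example.com/analyzer:production
```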

One of our tasks was to orchestrate Docker containers with Rancher, but we ran into an issue: Rancher does not work correctly with the Nvidia Docker plugin, so we developed a custom solution to work around the problem.
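The custom solution itself isn't published here, but the general shape of such a workaround is to bypass the plugin and hand the GPU to a plain `docker run` manually. In the sketch below the device nodes and the driver library path are host-specific examples, and the image name is hypothetical:

```shell
# Sketch: expose the Nvidia device nodes and driver libraries to a
# plain Docker container by hand, without the nvidia-docker plugin.
# Device names and the driver directory vary from host to host.
docker run -d \
  --device /dev/nvidiactl \
  --device /dev/nvidia-uvm \
  --device /dev/nvidia0 \
  -v /usr/lib/nvidia:/usr/local/nvidia:ro \
  ourregistry/video-analyzer:latest
```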

We also implemented monitoring with a visual interface showing the stats and various metrics of the servers and their components, ranging from hardware stats such as temperature and power distribution to software stats such as interface load, process state, etc.
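As an illustration of the kind of hardware metric collection involved, the snippet below parses the per-GPU CSV that `nvidia-smi` can emit for temperature and power draw. The sample values are hard-coded so the sketch runs without a GPU; on a real host the same pipeline would read from `nvidia-smi` directly.

```shell
# On a GPU host the sample below would come from:
#   nvidia-smi --query-gpu=index,temperature.gpu,power.draw \
#              --format=csv,noheader,nounits
# Hard-coded sample values so the sketch runs anywhere.
sample='0, 61, 114.52
1, 58, 98.07'

# Turn the raw CSV into one metric line per GPU, flagging hot cards.
echo "$sample" | awk -F', ' '{
  status = ($2 > 80) ? "HOT" : "ok"
  printf "gpu=%s temp=%sC power=%sW %s\n", $1, $2, $3, status
}'
```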

The Author

Pavel Konobeyev

Tags

Continuous Delivery
Continuous Integration
cuda
DevOps
Docker
nvidia
video surveillance