Nvidia Docker: dockerized CUDA driver for video surveillance


Nvidia Docker: microservices dockerization and automation for city-wide video surveillance

Our client has 900+ video cameras located all over the city. The idea of this project was to collect recorded video and store it on dedicated servers, where analysis scripts were executed. The infrastructure needed to be scalable, with a fully automated development process that allowed dynamic changes to the scripts. So the focus was on dockerization, orchestration, and automation of the software under development, which uses video cards (GPUs) for its computations.


The purpose of this project was to build a scalable infrastructure for microservices that rely on Nvidia CUDA (Compute Unified Device Architecture).
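As an illustration, launching such a CUDA-enabled microservice comes down to starting a container with GPU access. The helper below builds the `docker run` invocation in its modern `--gpus` form (at the time of this project the legacy `nvidia-docker run` wrapper served the same purpose); the image and container names are hypothetical:

```python
def cuda_run_command(image, gpus="all", name=None):
    """Build a `docker run` command line for a CUDA-enabled microservice.

    Uses the modern `--gpus` flag; the legacy nvidia-docker plugin
    achieved the same effect by injecting devices and driver volumes.
    """
    parts = ["docker", "run", "-d", "--gpus", gpus]
    if name:
        parts += ["--name", name]
    parts.append(image)
    return " ".join(parts)

# Hypothetical analysis service image:
print(cuda_run_command("example/video-analyzer:latest", name="analyzer-01"))
# → docker run -d --gpus all --name analyzer-01 example/video-analyzer:latest
```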
We configured Continuous Integration and Continuous Delivery so that an updated script is pushed to the production environment on all servers, and in case of a code error there is always the option to roll back to the previous version in one click.
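The one-click rollback idea can be sketched as a small deployment history per service: each release appends an image tag, and rolling back discards the faulty tag and redeploys the prior one. This is an illustrative model, not the actual CI/CD tooling used; all names are assumptions:

```python
class DeploymentHistory:
    """Track deployed image tags per service so a one-click rollback
    can return to the previous version (illustrative sketch only)."""

    def __init__(self):
        self._history = {}

    def deploy(self, service, tag):
        # Record each release so earlier versions stay reachable.
        self._history.setdefault(service, []).append(tag)
        return tag

    def rollback(self, service):
        tags = self._history.get(service, [])
        if len(tags) < 2:
            raise RuntimeError("no previous version to roll back to")
        tags.pop()       # discard the faulty release
        return tags[-1]  # the tag to redeploy

history = DeploymentHistory()
history.deploy("analyzer", "v1.0")
history.deploy("analyzer", "v1.1")
print(history.rollback("analyzer"))  # → v1.0
```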
One of our tasks was to orchestrate Docker with Rancher, but we faced one issue: Rancher didn't work correctly with the Nvidia Docker plugin, so we developed a custom solution that handled this problem.
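The gist of such a workaround is to do by hand what the nvidia-docker plugin normally does: add the Nvidia device nodes and the driver volume to each container's configuration before the orchestrator launches it. The sketch below illustrates that idea on a plain dict; the field names are illustrative and do not reflect Rancher's actual API or the client's real solution:

```python
# Device nodes the Nvidia driver exposes on the host (the numbered
# /dev/nvidiaN entries depend on how many GPUs are installed).
NVIDIA_DEVICES = ["/dev/nvidiactl", "/dev/nvidia-uvm", "/dev/nvidia0"]


def inject_gpu_support(container_config):
    """Return a copy of a container config with Nvidia devices and the
    driver volume added, approximating what nvidia-docker injects."""
    cfg = dict(container_config)
    cfg["devices"] = list(cfg.get("devices", [])) + NVIDIA_DEVICES
    cfg["volumes"] = list(cfg.get("volumes", [])) + [
        "nvidia_driver:/usr/local/nvidia:ro"  # driver libs and binaries
    ]
    return cfg
```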
We also implemented monitoring with a visual interface showing stats and various metrics of the servers and their components, ranging from hardware metrics such as temperature and power draw to software metrics such as interface load and process state.
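On the GPU side, such hardware metrics are typically collected by querying `nvidia-smi` in CSV mode (e.g. `nvidia-smi --query-gpu=index,temperature.gpu,power.draw,utilization.gpu --format=csv,noheader,nounits`). A minimal parser for that output might look like this; it is a sketch of one collector, not the monitoring stack actually deployed:

```python
import csv
import io


def parse_gpu_metrics(nvidia_smi_csv):
    """Parse noheader/nounits CSV output from nvidia-smi into dicts,
    one per GPU, keyed by metric name (assumed query field order)."""
    fields = ["index", "temperature_c", "power_draw_w", "utilization_pct"]
    rows = []
    for row in csv.reader(io.StringIO(nvidia_smi_csv)):
        rows.append(dict(zip(fields, (value.strip() for value in row))))
    return rows


# Example output for two GPUs (made-up sample values):
sample = "0, 64, 118.4, 87\n1, 59, 95.2, 42\n"
print(parse_gpu_metrics(sample)[0]["temperature_c"])  # → 64
```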

If you need help configuring such an infrastructure, just ping us and let us take care of all the tasks on the infrastructure side. Focus on your business and let the professionals do their job!