
How to choose a best-fit AI platform?

  By: Leadtek AI Expert

AI applications are multiplying, and most development frameworks and platforms are open source. How can a beginner choose a development environment that suits them?

For a development environment that uses Linux as the operating system, the software stack can be divided by layer into management software, development platforms, and frameworks.

  • Management software generally refers to tools for managing hardware, versions of Python and other tools, and monitoring systems; the most common are Kubernetes, Docker, and Anaconda.
  • The development platform is a higher-level system whose main function is to provide an environment in which users can write programs easily. Common choices include Jupyter Notebook, JupyterLab, PyCharm, and Visual Studio Code.
  • The framework is a development kit that implements machine learning or deep learning at a still higher level. There are many frameworks, and the right one depends on personal habits and the type of application; examples include TensorFlow, PyTorch, and Caffe. We introduce them below according to different user environments.

Because there is a huge variety of management software, development platforms, and frameworks, this article does not introduce them all one by one. Instead, we focus on the best-fit AI environment for different types of users.

Individual Developer

An individual developer mostly installs the system locally. A common setup is Anaconda, which is used to switch between and manage project environments; deep learning frameworks are then installed into each environment with pip install.
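As a sketch, that Anaconda workflow might look like the following (the environment names and versions are illustrative, not prescriptive):

```shell
# Create an isolated environment with a specific Python version
conda create -n my-tf-project python=3.7 -y

# Activate it, then install the framework with pip
conda activate my-tf-project
pip install tensorflow==1.15

# Switch to another project's environment at any time
conda deactivate
conda activate my-other-project
```

Each environment keeps its own Python and package versions, which is what makes switching between projects painless.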


Advantages:

  • Can switch between different Python or framework versions


Disadvantages:

  • Requires large hard-drive space
  • Some pre-installed kits may never be used
  • When different framework versions require different CUDA Toolkit versions, the developer must configure the CUDA Toolkit settings separately inside each framework's virtual environment

Individual developers run into the disadvantages above easily. For example, a previous project may use TensorFlow 1.12 with CUDA Toolkit 10.0, while a new project uses TensorFlow 1.15 and requires upgrading the CUDA Toolkit to 10.2. For the old and new projects to run at the same time, the CUDA Toolkit version each uses must be set separately in the two virtual environments.
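One common way to do that per-environment configuration is with conda activation scripts that point each environment at its own CUDA Toolkit. The sketch below assumes the toolkits are installed under /usr/local; the environment name and paths are illustrative:

```shell
# Inside the old project's environment (TensorFlow 1.12 / CUDA 10.0)
conda activate tf112-project
mkdir -p "$CONDA_PREFIX/etc/conda/activate.d"

# This script runs automatically every time the environment is activated
cat > "$CONDA_PREFIX/etc/conda/activate.d/cuda.sh" <<'EOF'
export CUDA_HOME=/usr/local/cuda-10.0
export PATH="$CUDA_HOME/bin:$PATH"
export LD_LIBRARY_PATH="$CUDA_HOME/lib64:$LD_LIBRARY_PATH"
EOF

# Repeat in the new project's environment, pointing at its own toolkit,
# e.g. CUDA_HOME=/usr/local/cuda-10.2
```

This works, but it is exactly the kind of manual bookkeeping that Docker, discussed next, avoids.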

Therefore, Leadtek AI Expert recommends using Docker as the management software. Docker offers the advantages of Anaconda while saving the hassle of setting up the environment.


Advantages:

  • Each development environment includes a lightweight OS, like a virtual machine, so every project has an independent environment
  • Official open-source images of mainstream frameworks can be downloaded directly from Docker Hub, eliminating complicated installation processes
  • NVIDIA provides the nvidia-docker plugin, so Docker containers can use GPU resources
  • Docker can be combined with common development platforms such as Jupyter Notebook or Visual Studio Code
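As a sketch, pulling an official framework image from Docker Hub and starting it with GPU access might look like this (the image tag is illustrative; check Docker Hub for current tags):

```shell
# Pull an official GPU-enabled TensorFlow image from Docker Hub
docker pull tensorflow/tensorflow:latest-gpu-jupyter

# Start it with GPU access and expose the built-in Jupyter Notebook on port 8888
docker run --gpus all -p 8888:8888 tensorflow/tensorflow:latest-gpu-jupyter
```

The framework, CUDA libraries, and notebook server all arrive pre-installed inside the image, which is the "eliminating complicated installation processes" advantage in practice.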

Are there any disadvantages or precautions when using Docker? A Docker environment contains the lightweight OS, development platform, and framework mentioned above, and usually takes 4-7 GB of space; installing additional packages consumes still more hard-drive space. This is important to note when setting up any virtual environment.

As for choosing Jupyter Notebook or Visual Studio Code for the upper layer, both have their own user bases. Visual Studio Code inherits the strengths of the Visual Studio platform and makes debugging easy, while Jupyter Notebook's cell-by-cell execution and powerful annotation features make it well suited to tutorials. Today, Visual Studio Code can also work with Jupyter notebooks directly, which is a big plus for developers.

(reference of Jupyter notebook: https://code.visualstudio.com/docs/python/jupyter-support)

As for deep learning frameworks, each has its habitual users and functions, which this article does not cover in detail. The frameworks most developers currently use are TensorFlow and PyTorch. The former is developed and maintained by the Google team, supports multiple languages, and is currently the most widely used framework. The latter was developed by Facebook's AI team and also has many supporters; it has grown rapidly in the past two years and has the momentum to catch up with TensorFlow. Whatever framework is used, Python remains the programming language most required for AI development.

Small Research Unit

Small research units generally have AI development teams of 2 to 10 people. Deep learning consumes a large amount of hardware resources during model building, and the most important of these is the GPU; allocating GPU resources well is essential for the team's work to proceed smoothly.

The most recommended management software for small teams is also Docker. In addition to allocating hardware resources such as GPUs, CPU, and RAM to virtual containers, Docker also manages framework and application versions. Without such management software, the resources of a multi-GPU system are hard to allocate (if no GPU is specified, jobs usually run on the GPU with ID 0), or the team must first negotiate who uses which hardware; otherwise some GPUs are easily preempted while others sit idle.

Docker can resolve this by allocating hardware resources in advance. Unlike Kubernetes, which is common in large enterprises, different Docker containers can be configured on the same set of GPU hardware, so resources are used more effectively.
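As a sketch, assigning specific GPUs to each team member's container might look like this (the container names, image tag, and device IDs are illustrative):

```shell
# Give one container exclusive use of GPU 0
docker run --gpus '"device=0"' --name alice-dev -d tensorflow/tensorflow:latest-gpu

# Give another container GPUs 1 and 2
docker run --gpus '"device=1,2"' --name bob-dev -d tensorflow/tensorflow:latest-gpu

# Containers can also share a GPU when its utilization allows
docker run --gpus '"device=0"' --name carol-dev -d tensorflow/tensorflow:latest-gpu
```

Because the allocation is declared when each container starts, nobody's job silently lands on GPU 0, and idle GPUs are visible at a glance with docker ps and nvidia-smi.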

As for the choice of development platform, it is the same as the situation of individual developers. You can choose either Visual Studio Code or Jupyter notebook according to personal development habits.

If you take into consideration that the developing programs will be incorporated into teaching, Jupyter notebook can be used as a teaching material because it can execute code in blocks and easily write annotations.

However, if it is based on project system development, the debug function of Visual Studio Code is more suitable for system development.

No matter what platform or tool you choose, the most important thing is that developers can focus on researching and developing models and systems; hardware and management software can be left to the team's system administrator. Docker is therefore a very suitable choice. The installation links are attached below. Remember to install the Docker Engine before installing the nvidia-docker package.

Docker Engine installation guide: https://docs.docker.com/install/

nvidia-docker installation guide: https://github.com/NVIDIA/nvidia-docker
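After both installs, a quick way to confirm that containers can see the GPUs is to run nvidia-smi inside a CUDA base image (the image tag is illustrative; pick one matching your driver):

```shell
# Should print the same GPU table that nvidia-smi prints on the host
docker run --rm --gpus all nvidia/cuda:11.0-base nvidia-smi
```

If this command lists your GPUs, the Docker Engine and nvidia-docker plugin are wired up correctly.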

As mentioned earlier, the GPU is an important part of a smooth AI project. In the next article, we will show you how to select the right driver for your GPU.

Next Sharing Coming Up on 2/5: How to select the right GPU driver.

Don't forget to come back on 2/5 and learn with us.
