System-level container images are available to all users. You can obtain images from a Lenovo salesperson or import your own images into LiCO as system-level container images. This section describes how to create and import system-level container images.
LiCO is released with image bootstrap files for commonly used AI frameworks. These bootstrap files are included in the compressed package you obtained (https://hpc.lenovo.com/lico/downloads/6.4/images/k8s/image_bootstrap.zip). Users can use these files to create images.
The table below lists the image bootstrap files.
File name | Images | Framework | CPU/GPU | Comments |
---|---|---|---|---|
caffe-1.0-cpu | caffe:caffe-1.0-cpu | Caffe | CPU | |
caffe-1.0-gpu-cuda92 | caffe:caffe-1.0-gpu-cuda92 | Caffe | CUDA 9.2 | Supports P100 and V100. Caffe does not officially support CUDA 9.0 |
NVCaffe-0.17.3-gpu-cuda102 | caffe:NVCaffe-0.17.3-gpu-cuda102 | Caffe | CUDA 10.2 | Supports P100 and V100 |
chainer-6.7.0-gpu-cuda101 | chainer:chainer-6.7.0-gpu-cuda101 | Chainer | CUDA 10.1 | Supports P100, V100, RTX5000, RTX8000 and T4 |
intel-caffe-1.1.3-cpu | intel-caffe:intel-caffe-1.1.3-cpu | Intel-caffe | CPU | |
intel-python | intel-python:intel-python | Other | CPU | |
intel-pytorch-1.7.0-cpu | intel-pytorch:intel-pytorch-1.7.0-cpu | PyTorch | CPU | |
intel-tensorflow-1.15.2-cpu | intel-tensorflow:intel-tensorflow-1.15.2-cpu | TensorFlow | CPU | |
intel-tensorflow-2.3.0-cpu | intel-tensorflow:intel-tensorflow-2.3.0-cpu | TensorFlow | CPU | |
jupyter-default | jupyter:jupyter-default | Jupyter | CUDA 11.2 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
jupyter-py37 | jupyter:jupyter-py37 | Jupyter | CUDA 11.0 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
jupyter-py38 | jupyter:jupyter-py38 | Jupyter | CUDA 11.0 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
letrain-1.6.0-cuda110 | lico:letrain-1.6.0-cuda110 | LeTrain | CPU | |
lico-ai-scripts | lico-k8s-client:latest | Other | CPU | Indispensable |
lico-file-manager | lico-file-manager:latest | Other | CPU | Indispensable |
mxnet-1.5.0-cpu-mkl | mxnet:mxnet-1.5.0-cpu-mkl | Mxnet | CPU | |
mxnet-1.5.0-gpu-mkl-cuda101 | mxnet:mxnet-1.5.0-gpu-mkl-cuda101 | Mxnet | CUDA 10.1 | Supports P100, V100, RTX5000, RTX8000 and T4 |
neon-2.6-cpu | neon:neon-2.6-cpu | Neon | CPU | |
pytorch-1.11.0-cuda115 | pytorch:pytorch-1.11.0-cuda115 | PyTorch | CUDA 11.5 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
rstudio | rstudio:rstudio | RStudio | CUDA 11.2 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
scikit-single-cpu | scikit:scikit-single-cpu | Scikit | CPU | |
tensorflow-1.15.3-cuda110 | tensorflow:tensorflow-1.15.3-cuda110 | TensorFlow | CPU, CUDA 11.0 | Supports Keras (2.2.4). Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
tensorflow-1.15.3-cuda110-hbase | tensorflow:tensorflow-1.15.3-cuda110-hbase | TensorFlow | CPU, CUDA 11.0 | Supports HBase. Supports P100, V100, RTX5000, RTX8000 and T4 |
tensorflow-1.15.3-cuda110-keras | tensorflow:tensorflow-1.15.3-cuda110-keras | TensorFlow | CPU, CUDA 11.0 | Supports Keras (2.2.4). Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
tensorflow-1.15.3-cuda110-mongodb | tensorflow:tensorflow-1.15.3-cuda110-mongodb | TensorFlow | CPU, CUDA 11.0 | Supports MongoDB. Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
tensorflow-1.15.2-mkl | tensorflow:tensorflow-1.15.2-mkl | TensorFlow | CPU, CUDA 10.0 | Supports P100, V100, RTX5000, RTX8000 and T4 |
tensorflow-2.5.0-cuda114 | tensorflow:tensorflow-2.5.0-cuda114 | TensorFlow | CPU, CUDA 11.4 | Supports P100, V100, RTX5000, RTX8000, T4 and A100 |
tensorrt-7.1.3.4-cuda110 | tensorrt:tensorrt-7.1.3-cuda110 | TensorRT | CUDA 11.0 | Supports P100, V100 and A100 |
cvat | cvat:cvat-1.7.0 | Other | CPU | |
Step 1. Prepare a build node with a minimum storage of 100 GB.
Note: /opt/images and /var/tmp cannot be NFS mounts.
Step 2. Upload the compressed image bootstrap package image_bootstrap.zip that you obtained (https://hpc.lenovo.com/lico/downloads/6.4/images/k8s/image_bootstrap.zip) to the build node. For example, upload the compressed package to the new directory /opt/images.
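Before continuing, you can optionally confirm the Step 1 requirements on the build node. The commands below assume the example directory /opt/images from Step 2; adjust the paths to your environment:
df -h /opt/images /var/tmp                # at least 100 GB should be free for the image builds
findmnt -T /opt/images -o TARGET,FSTYPE   # the filesystem type must not be nfs/nfs4
findmnt -T /var/tmp -o TARGET,FSTYPE      # the filesystem type must not be nfs/nfs4
ls -lh /opt/images/image_bootstrap.zip    # confirm the uploaded package is in place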
Step 3. On the build node, run the following commands to decompress the image bootstrap package:
cd /opt/images
unzip image_bootstrap.zip
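Optionally, verify that the extraction created the image_bootstrap directory; the make targets used in the next step are defined there:
ls /opt/images/image_bootstrap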
Step 4. On the build node, do one of the following to create images.
Run the following commands to create all images at once:
cd /opt/images/image_bootstrap
make all
Run the following commands to create a specific group of images:
cd /opt/images/image_bootstrap
make caffe
make cvat
make NVCaffe
make intel-caffe
make intel-python
make tensorflow
make neon
make chainer
make mxnet
make pytorch
make letrain
make jupyter
make lico-file-manager
make lico-k8s-client
make rstudio
make scikit
make tensorrt
make intel-pytorch
make intel-tensorflow
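After the make targets complete, you can list the images in the local Docker cache to confirm that the builds succeeded; the names and tags should match the Images column in the table above:
docker images                                  # list all locally built images
docker images | grep -E 'tensorflow|jupyter'   # example filter for specific frameworks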
Step 5. Push the created Docker images to an existing Docker repository. The repository can be a Docker registry (https://docs.docker.com/registry/), a Harbor registry (https://goharbor.io/), or Docker Hub (https://hub.docker.com/); just make sure the Kubernetes nodes can access it. For example:
docker tag lico-k8s-client:latest 10.240.212.106:5000/lico-k8s-client:latest
docker push 10.240.212.106:5000/lico-k8s-client:latest
...
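The tag-and-push pattern above is repeated for every image you built. If you prefer to push them in a single pass, a minimal shell sketch along these lines can be used; the registry address and the image list are examples and must be adjusted to your environment:
REGISTRY=10.240.212.106:5000   # example registry; replace with your repository URL
for IMG in lico-k8s-client:latest lico-file-manager:latest \
    caffe:caffe-1.0-cpu tensorflow:tensorflow-2.5.0-cuda114; do   # extend the list with the other images from the table
  docker tag "$IMG" "$REGISTRY/$IMG"
  docker push "$REGISTRY/$IMG"
done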
Step 6. Run the following commands on the LiCO node to import the images from the Docker repository. As shown in the examples, each command specifies the image name to use in LiCO, the image location in the repository, the framework, and optional -t tags:
lico import_image kube-tools 10.240.212.106:5000/lico-k8s-client other
lico import_image lico-file-manager 10.240.212.106:5000/lico-file-manager other
lico import_image caffe-cpu 10.240.212.106:5000/caffe:caffe-1.0-cpu caffe
lico import_image caffe-gpu 10.240.212.106:5000/caffe:caffe-1.0-gpu-cuda92 caffe
lico import_image NVCaffe 10.240.212.106:5000/caffe:NVCaffe-0.17.3-gpu-cuda102 caffe
lico import_image pytorch 10.240.212.106:5000/pytorch:pytorch-1.11.0-cuda115 pytorch
lico import_image intel-pytorch \
10.240.212.106:5000/intel-pytorch:intel-pytorch-1.7.0-cpu pytorch
lico import_image letrain 10.240.212.106:5000/lico:letrain-1.6.0-cuda110 letrain
lico import_image jupyter-default \
10.240.212.106:5000/jupyter:jupyter-default jupyter -t py38 -t cpu -t gpu
lico import_image jupyter-py37 \
10.240.212.106:5000/jupyter:jupyter-py37 jupyter -t py37 -t cpu -t gpu
lico import_image jupyter-py38 \
10.240.212.106:5000/jupyter:jupyter-py38 jupyter -t py38 -t cpu -t gpu
lico import_image scikit-cpu 10.240.212.106:5000/scikit:scikit-single-cpu scikit
lico import_image tensorflow2 \
10.240.212.106:5000/tensorflow:tensorflow-2.5.0-cuda114 tensorflow2
lico import_image intel-tensorflow \
10.240.212.106:5000/intel-tensorflow:intel-tensorflow-1.15.2-cpu tensorflow
lico import_image intel-tensorflow2 \
10.240.212.106:5000/intel-tensorflow:intel-tensorflow-2.3.0-cpu tensorflow2
lico import_image tensorflow \
10.240.212.106:5000/tensorflow:tensorflow-1.15.3-cuda110 tensorflow
lico import_image tensorflow-mkl \
10.240.212.106:5000/tensorflow:tensorflow-1.15.2-mkl tensorflow
lico import_image tensorflow-hbase \
10.240.212.106:5000/tensorflow:tensorflow-1.15.3-cuda110-hbase tensorflow
lico import_image tensorflow-keras \
10.240.212.106:5000/tensorflow:tensorflow-1.15.3-cuda110-keras tensorflow
lico import_image tensorflow-mongodb \
10.240.212.106:5000/tensorflow:tensorflow-1.15.3-cuda110-mongodb tensorflow
lico import_image intel-caffe \
10.240.212.106:5000/intel-caffe:intel-caffe-1.1.3-cpu intel-caffe
lico import_image intel-python 10.240.212.106:5000/intel-python:intel-python other
lico import_image chainer-gpu \
10.240.212.106:5000/chainer:chainer-6.7.0-gpu-cuda101 chainer
lico import_image mxnet-cpu 10.240.212.106:5000/mxnet:mxnet-1.5.0-cpu-mkl mxnet
lico import_image mxnet-gpu 10.240.212.106:5000/mxnet:mxnet-1.5.0-gpu-mkl-cuda101 mxnet
lico import_image neon-cpu 10.240.212.106:5000/neon:neon-2.6-cpu neon
lico import_image rstudio 10.240.212.106:5000/rstudio:rstudio -t cpu -t gpu
lico import_image tensorrt 10.240.212.106:5000/tensorrt:tensorrt-7.1.3-cuda110 tensorrt -t tensorrt
lico import_image cvat 10.240.212.106:5000/cvat:cvat-1.7.0 other
Attention: Change 10.240.212.106:5000 to the actual URL of your Docker repository.
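If you push to a Harbor registry instead of a plain Docker registry, the image reference usually includes a project name. A hypothetical example, where harbor.example.com and the lico project are placeholders:
lico import_image tensorflow2 harbor.example.com/lico/tensorflow:tensorflow-2.5.0-cuda114 tensorflow2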