System-level container images are available to all users. You can obtain images from a Lenovo salesperson and import them into LiCO as system-level container images. This section describes how to create and import system-level container images.
Download image_bootstrap.zip from https://hpc.lenovo.com/lico/downloads/7.2.2/images/image_bootstrap.zip. The package contains the following Singularity bootstrap (definition) files, which are used to create the images.
| File name | Framework | CPU/GPU | Comments |
|---|---|---|---|
| caffe-1.0-cpu | Caffe | CPU | |
| NVCaffe-0.17.3-gpu-cuda102 | Caffe | CUDA 10.2 | |
| cvat-2.3.0 | Other | CPU | |
| intel-caffe-1.1.6-cpu | Intel-caffe | CPU | |
| intel-python-3.10 | Other | CPU | |
| intel-pytorch-2.0.1-cpu | PyTorch | CPU | |
| intel-tensorflow-1.15.2-cpu | TensorFlow | CPU | |
| intel-tensorflow-2.10.0-cpu | TensorFlow | CPU | |
| jupyter-default | Jupyter | CPU CUDA 12.0.1 | |
| jupyter-py39 | Jupyter | CPU CUDA 12.0.1 | |
| jupyter-py310 | Jupyter | CPU CUDA 12.0.1 | |
| jupyterlab-default | JupyterLab | CPU CUDA 12.0.1 | |
| jupyterlab-py39 | JupyterLab | CPU CUDA 12.0.1 | |
| jupyterlab-py310 | JupyterLab | CPU CUDA 12.0.1 | |
| llama2-finetuning-cu123 | Other | CUDA 12.3 | |
| llama2-testing-cu123 | Other | CUDA 12.3 | |
| letrain-2.1.0-cuda120 | LeTrain | CPU CUDA 12.0.1 | |
| letrain-2.0.0-xpu | LeTrain | XPU | |
| mxnet-1.9.1-cpu | Mxnet | CPU | |
| mxnet-1.9.1-gpu-cuda112 | Mxnet | CUDA 11.2 | |
| neon-2.6-cpu | Neon | CPU | |
| onnx-1.17.1-cuda122 | ONNX | CPU CUDA 12.2 | |
| paddle-2.4.0-cuda120 | paddlepaddle | CPU CUDA 12.0.1 | |
| pytorch-2.1.0-cuda121 | PyTorch | CPU CUDA 12.1.1 | |
| rstudio | RStudio | CPU CUDA 12.0.1 | |
| scikit-single-cpu | Scikit | CPU | |
| tensorflow-1.15.5-cuda121 | TensorFlow | CPU CUDA 12.1.0 | |
| tensorflow-1.15.5-cuda121-hbase | TensorFlow | CPU CUDA 12.1.0 | Supports HBase |
| tensorflow-1.15.5-cuda121-keras | TensorFlow | CPU CUDA 12.1.0 | Supports Keras(2.11.0) |
| tensorflow-1.15.5-cuda121-mongodb | TensorFlow | CPU CUDA 12.1.0 | Supports MongoDB |
| tensorflow-1.15.2-mkl | TensorFlow | CPU | |
| tensorflow-2.12.0-cuda121 | TensorFlow | CPU CUDA 12.1.1 | |
| tensorrt-8.5.3-cuda120 | TensorRT | CUDA 12.0.1 | |
Step 1. Prepare a build node with at least 100 GB of storage.
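As a quick check before building, you can confirm the free space and filesystem types on the build node. This is only a sketch; the 100 GB requirement and the no-NFS restriction come from this procedure, while the exact mount points may differ on your system.

```
# Check free space and filesystem types on the build node;
# /opt and /var/tmp are used in the following steps and must not be NFS mounts.
df -hT /opt /var/tmp
```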
Notes: The build node needs Internet access during image creation, because the bootstrap files download base images and packages when you run make.
Step 2. On the build node, ensure that squashfs-tools, libarchive, and make are installed.
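For example, on a RHEL-compatible build node (an assumption; use your distribution's package manager and package names if they differ):

```
# Install the build prerequisites (RHEL/Rocky-style package names assumed)
dnf install -y squashfs-tools libarchive make
```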
Step 3. Upload the image bootstrap package image_bootstrap.zip to the build node, for example, to a new directory /opt/images. If the directory does not exist, create it manually. Note that neither this directory nor /var/tmp can be an NFS mount.
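A minimal sketch, assuming the build node can reach the download site directly; otherwise download the package on another machine and copy it over, for example with scp.

```
# Create the working directory and fetch the bootstrap package directly on the build node
mkdir -p /opt/images
cd /opt/images
wget https://hpc.lenovo.com/lico/downloads/7.2.2/images/image_bootstrap.zip
```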
Step 4. On the build node, run the following commands to extract the compressed package.
```
cd /opt/images
unzip image_bootstrap.zip
```

Step 5. On the build node, do one of the following to create the images. The created image files are placed in the dist folder of the current directory.
Run the following commands to create all images at once:
```
cd /opt/images/image_bootstrap
make all
```

Run the following commands to create images group by group:
```
cd /opt/images/image_bootstrap
make caffe
make intel-caffe
make intel-python
make intel-pytorch
make intel-tensorflow
make jupyter
make jupyterlab
make llama2
make mxnet
make neon
make onnx
make paddlepaddle
make pytorch
make rstudio
make scikit
make tensorflow
make tensorrt
make cvat
make letrain
```

Notes: Check the network connection when one of the following errors is displayed:
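After a build completes, you can check the generated .image files in the dist folder; for example:

```
# List the images produced by make (file names follow the bootstrap files in the table above)
ls /opt/images/image_bootstrap/dist/
```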
To import the created images into LiCO, do the following:
Step 1. Copy the created images to the management node.
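For example, from the build node; this is a sketch in which the host name mgt01 is a placeholder for your management node, and /home matches the SHARE_DIR example below.

```
# Copy the built images from the build node to the shared directory on the management node
scp /opt/images/image_bootstrap/dist/*.image root@mgt01:/home/
```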
Attention:
- The share directory refers to the SHARE_DIR configuration item in /etc/lico/lico.ini. The following commands use /home as an example.
- /var/tmp cannot be an NFS mount.
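To confirm the configured share directory before running the import commands, you can inspect the configuration file, for example:

```
# Show the SHARE_DIR setting used by LiCO
grep -i share_dir /etc/lico/lico.ini
```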
Step 2. Run the following commands to import the images into LiCO:

```
SHARE_DIR='/home'
docker exec -it lico lico import_image caffe-cpu $SHARE_DIR/caffe-1.0-cpu.image caffe
docker exec -it lico lico import_image NVCaffe $SHARE_DIR/NVCaffe-0.17.3-gpu-cuda102.image caffe
docker exec -it lico lico import_image intel-caffe $SHARE_DIR/intel-caffe-1.1.6-cpu.image intel-caffe
docker exec -it lico lico import_image intel-python $SHARE_DIR/intel-python-3.10.image other
docker exec -it lico lico import_image tensorflow $SHARE_DIR/tensorflow-1.15.5-cuda121.image tensorflow
docker exec -it lico lico import_image tensorflow-mkl $SHARE_DIR/tensorflow-1.15.2-mkl.image tensorflow
docker exec -it lico lico import_image tensorflow-hbase $SHARE_DIR/tensorflow-1.15.5-cuda121-hbase.image tensorflow
docker exec -it lico lico import_image tensorflow-keras $SHARE_DIR/tensorflow-1.15.5-cuda121-keras.image tensorflow
docker exec -it lico lico import_image tensorflow-mongodb $SHARE_DIR/tensorflow-1.15.5-cuda121-mongodb.image tensorflow
docker exec -it lico lico import_image tensorflow2 $SHARE_DIR/tensorflow-2.12.0-cuda121.image tensorflow2
docker exec -it lico lico import_image intel-tensorflow $SHARE_DIR/intel-tensorflow-1.15.2-cpu.image tensorflow
docker exec -it lico lico import_image intel-tensorflow2-cpu $SHARE_DIR/intel-tensorflow-2.10.0-cpu.image tensorflow2
docker exec -it lico lico import_image mxnet-cpu $SHARE_DIR/mxnet-1.9.1-cpu.image mxnet
docker exec -it lico lico import_image mxnet-gpu $SHARE_DIR/mxnet-1.9.1-gpu-cuda112.image mxnet
docker exec -it lico lico import_image neon $SHARE_DIR/neon-2.6-cpu.image neon
docker exec -it lico lico import_image letrain $SHARE_DIR/letrain-2.1.0-cuda120.image letrain
docker exec -it lico lico import_image letrain-xpu $SHARE_DIR/letrain-2.0.0-xpu.image letrain
docker exec -it lico lico import_image jupyter-default $SHARE_DIR/jupyter-default.image jupyter -t py38 -t cpu -t gpu
docker exec -it lico lico import_image jupyter-py39 $SHARE_DIR/jupyter-py39.image jupyter -t py39 -t cpu -t gpu
docker exec -it lico lico import_image jupyter-py310 $SHARE_DIR/jupyter-py310.image jupyter -t py310 -t cpu -t gpu
docker exec -it lico lico import_image jupyterlab-default $SHARE_DIR/jupyterlab-default.image jupyterlab -t py38 -t cpu -t gpu
docker exec -it lico lico import_image jupyterlab-py39 $SHARE_DIR/jupyterlab-py39.image jupyterlab -t py39 -t cpu -t gpu
docker exec -it lico lico import_image jupyterlab-py310 $SHARE_DIR/jupyterlab-py310.image jupyterlab -t py310 -t cpu -t gpu
docker exec -it lico lico import_image pytorch $SHARE_DIR/pytorch-2.1.0-cuda121.image pytorch
docker exec -it lico lico import_image intel-pytorch-cpu $SHARE_DIR/intel-pytorch-2.0.1-cpu.image pytorch
docker exec -it lico lico import_image rstudio $SHARE_DIR/rstudio.image rstudio -t cpu -t gpu
docker exec -it lico lico import_image scikit $SHARE_DIR/scikit-single-cpu.image scikit
docker exec -it lico lico import_image tensorrt8 $SHARE_DIR/tensorrt-8.5.3-cuda120.image tensorrt -t tensorrt
docker exec -it lico lico import_image cvat $SHARE_DIR/cvat-2.3.0.image other
docker exec -it lico lico import_image paddlepaddle $SHARE_DIR/paddle-2.4.0-cuda120.image paddlepaddle
docker exec -it lico lico import_image onnx $SHARE_DIR/onnx-1.17.1-cuda122.image onnx
docker exec -it lico lico import_image llama2-finetuning $SHARE_DIR/llama2-finetuning-cu123.image other -t llama
docker exec -it lico lico import_image llama2-testing $SHARE_DIR/llama2-testing-cu123.image other -t llama
rm -f $SHARE_DIR/*.image
```