The base image is `ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0`, which contains the following software and packages:

| Software or Package | Version |
| --- | --- |
| CUDA | 11.1.1 |
| python | 3.9.13 |
| cos-python-sdk-v5 | 1.9.14 |
| coscmd | 1.8.6.24 |
| numpy | 1.23.1 |
| msgpack | 1.0.5 |
| opencv-python | 4.6.0.66 |
| opencv-contrib-python | 4.6.0.66 |
| pandas | 1.4.3 |
| Pillow | 9.4.0 |
| tiinfer | 0.1.1 |
| mosec-tiinfer | 0.0.6 |
```bash
#!/bin/bash
source /etc/profile
source /root/.bashrc
export LD_LIBRARY_PATH=/usr/local/python3/lib/python3.8/site-packages/torch/lib:/usr/local/openmpi/lib:/usr/local/nccl/lib:/usr/local/cuda/lib64:/usr/local/python3/lib:/usr/local/python3/lib64:/usr/local/openmpi/lib:/usr/local/gcc/lib:/usr/local/gcc/lib64
MODEL_DIR=/data/model
echo "================== code path ${MODEL_DIR}=========="
cd ${MODEL_DIR}
if [ -f "requirements.txt" ]; then
    echo "============== install python requirements ======================"
    echo "python3 -m pip install -r requirements.txt"
    python3 -m pip install -r requirements.txt
    echo "============== install python requirements done ================="
fi
echo "====================== start serving ============================"
echo "python3 -m tiinfer"
export TI_MODEL_DIR=${MODEL_DIR}
python3 -m tiinfer --timeout 30000
```
The launch script checks whether a `requirements.txt` file exists in the `${MODEL_DIR}` directory, and uses pip to install the Python dependency packages it specifies. It then starts tiinfer, which loads the model from the directory given by the `${TI_MODEL_DIR}` environment variable (set to `${MODEL_DIR}` by the script). After loading the model, it starts an HTTP service and listens on the port defined by the environment variable `${REST_PORT}`.

To build a custom inference image, start your Dockerfile with `FROM ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0`.

Note: when a service starts, the platform downloads the model to the `/data/model` directory of the service instance. Therefore, custom code and data cannot be placed in the `/data/model` directory. Otherwise, they will be overwritten by the platform. The example below places its code outside the `/data/model` directory.

The example consists of the following files:

| File | Function |
| --- | --- |
| model_service.py | An adder model written as required by tiinfer. |
| entrypoint.sh | The launch script. You can extend it to install more dependency packages. |
| Dockerfile | Copies the first two files into the image. |
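Since the launch script installs `requirements.txt` automatically when present, extra packages can be shipped by placing that file alongside `model_service.py` (and copying it into the model directory in the Dockerfile). A hypothetical example, pinning two versions already present in the base image:

```text
numpy==1.23.1
pandas==1.4.3
```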
```python
from typing import Dict

import tiinfer


class AdderModel(tiinfer.Model):
    def __init__(self, model_dir: str):
        super().__init__(model_dir)

    def load(self) -> bool:
        self.ready = True
        return self.ready

    def preprocess(self, request: Dict) -> Dict:
        return request

    def predict(self, request: Dict) -> Dict:
        return {'result': request['a'] + request['b']}

    def postprocess(self, result: Dict) -> Dict:
        return result
```
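The serving framework invokes the three hooks in the order preprocess → predict → postprocess. A minimal self-contained sketch of that flow, using a plain stand-in class in place of `tiinfer.Model` (an assumption made so the example runs without tiinfer installed):

```python
from typing import Dict


class StubModel:
    """Stand-in for tiinfer.Model; the real base class is provided by tiinfer."""

    def __init__(self, model_dir: str):
        self.model_dir = model_dir
        self.ready = False


class AdderModel(StubModel):
    def load(self) -> bool:
        self.ready = True
        return self.ready

    def preprocess(self, request: Dict) -> Dict:
        return request

    def predict(self, request: Dict) -> Dict:
        return {'result': request['a'] + request['b']}

    def postprocess(self, result: Dict) -> Dict:
        return result


model = AdderModel("/opt/model")
model.load()
# The framework chains the hooks: preprocess -> predict -> postprocess
response = model.postprocess(model.predict(model.preprocess({"a": 1, "b": 2})))
print(response)  # {'result': 3}
```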
```bash
#!/bin/bash
source /etc/profile
source /root/.bashrc
export LD_LIBRARY_PATH=/usr/local/python3/lib/python3.8/site-packages/torch/lib:/usr/local/openmpi/lib:/usr/local/nccl/lib:/usr/local/cuda/lib64:/usr/local/python3/lib:/usr/local/python3/lib64:/usr/local/openmpi/lib:/usr/local/gcc/lib:/usr/local/gcc/lib64
MODEL_DIR=/opt/model
echo "================== code path ${MODEL_DIR}=========="
cd ${MODEL_DIR}
if [ -f "requirements.txt" ]; then
    echo "============== install python requirements ======================"
    echo "python3 -m pip install -r requirements.txt"
    python3 -m pip install -r requirements.txt
    echo "============== install python requirements done ================="
fi
echo "====================== start serving ============================"
echo "python3 -m tiinfer"
export TI_MODEL_DIR=${MODEL_DIR}
python3 -m tiinfer --timeout 30000
```
Note that the line `MODEL_DIR=/opt/model` in the script above changes the startup directory from the default `/data/model` to `/opt/model` to avoid being overwritten by the platform.

The Dockerfile:

```dockerfile
FROM ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0
COPY model_service.py /opt/model/model_service.py
COPY entrypoint.sh ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
```
The Dockerfile copies `model_service.py` into the `/opt/model` directory instead of the default `/data/model` directory to avoid it being overwritten by the platform. Build the image:

```bash
docker build . --tag ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME
```
To verify the image locally before pushing it:

1. Run `docker run -d --name myinfer ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME` to start the service.
2. Run `docker exec -it myinfer bash` to enter the container.
3. Run `curl http://127.0.0.1:8501/v1/models/m:predict -d '{"a": 1, "b": 2}'` in the container; a correct return result is `{"result": 3}`.
4. Run `docker push ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME` to push the image to the registry.
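The same check can be issued from Python instead of curl. A minimal client sketch using only the standard library; the endpoint path and port follow the curl example above, and the host would need adjusting for a remote deployment:

```python
import json
from urllib import request as urlrequest


def predict(a: int, b: int,
            url: str = "http://127.0.0.1:8501/v1/models/m:predict") -> dict:
    """POST a JSON request body to the predict endpoint and decode the JSON reply."""
    body = json.dumps({"a": a, "b": b}).encode("utf-8")
    req = urlrequest.Request(
        url, data=body, headers={"Content-Type": "application/json"}
    )
    with urlrequest.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read().decode("utf-8"))
```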

