The prebuilt base image `ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0` ships with the following software:

| Software / Package | Version |
| --- | --- |
| CUDA | 11.1.1 |
| python | 3.9.13 |
| cos-python-sdk-v5 | 1.9.14 |
| coscmd | 1.8.6.24 |
| numpy | 1.23.1 |
| msgpack | 1.0.5 |
| opencv-python | 4.6.0.66 |
| opencv-contrib-python | 4.6.0.66 |
| pandas | 1.4.3 |
| Pillow | 9.4.0 |
| tiinfer | 0.1.1 |
| mosec-tiinfer | 0.0.6 |
The image's default startup script is:

```shell
#!/bin/bash
source /etc/profile
source /root/.bashrc
export LD_LIBRARY_PATH=/usr/local/python3/lib/python3.8/site-packages/torch/lib:/usr/local/openmpi/lib:/usr/local/nccl/lib:/usr/local/cuda/lib64:/usr/local/python3/lib:/usr/local/python3/lib64:/usr/local/openmpi/lib:/usr/local/gcc/lib:/usr/local/gcc/lib64
MODEL_DIR=/data/model
echo "================== code path ${MODEL_DIR}=========="
cd ${MODEL_DIR}
if [ -f "requirements.txt" ]; then
  echo "============== install python requirements ======================"
  echo "python3 -m pip install -r requirements.txt"
  python3 -m pip install -r requirements.txt
  echo "============== install python requirements done ================="
fi
echo "====================== start serving ============================"
echo "python3 -m tiinfer"
export TI_MODEL_DIR=${MODEL_DIR}
python3 -m tiinfer --timeout 30000
```
On startup, this script:

1. Checks for a `requirements.txt` file under `${MODEL_DIR}` and, if present, installs the Python packages it lists with pip.
2. Loads the model from the files under `${MODEL_DIR}`, then starts an HTTP service listening on the port defined by the environment variable `${REST_PORT}`.

A custom image should be built on top of the base image:

```dockerfile
FROM ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0
```

When a service starts, the platform mounts the model files into the `/data/model` directory. Custom code and data must therefore not be placed under `/data/model`, or they will be overwritten by the platform.

The example below builds a custom image from three files:

| File | Purpose |
| --- | --- |
| model_service.py | Implements an adder model following the tiinfer interface. |
| entrypoint.sh | Startup script; additional dependency packages can be installed here. |
| Dockerfile | Copies the two files above into the image. |
```python
from typing import Dict

import tiinfer


class AdderModel(tiinfer.Model):
    def __init__(self, model_dir: str):
        super().__init__(model_dir)

    def load(self) -> bool:
        self.ready = True
        return self.ready

    def preprocess(self, request: Dict) -> Dict:
        return request

    def predict(self, request: Dict) -> Dict:
        return {'result': request['a'] + request['b']}

    def postprocess(self, result: Dict) -> Dict:
        return result
```
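The preprocess → predict → postprocess chain can be sanity-checked locally even without the tiinfer package installed. The sketch below substitutes a hypothetical minimal stand-in for `tiinfer.Model` (the `Model` stub is an assumption for local testing only, not the real base class):

```python
from typing import Dict


# Hypothetical stand-in for tiinfer.Model, used only to exercise the
# AdderModel logic outside the base image.
class Model:
    def __init__(self, model_dir: str):
        self.model_dir = model_dir
        self.ready = False


class AdderModel(Model):
    def load(self) -> bool:
        self.ready = True
        return self.ready

    def preprocess(self, request: Dict) -> Dict:
        return request

    def predict(self, request: Dict) -> Dict:
        return {'result': request['a'] + request['b']}

    def postprocess(self, result: Dict) -> Dict:
        return result


model = AdderModel('/opt/model')
assert model.load()
# Run the full inference chain the server drives for each request.
response = model.postprocess(model.predict(model.preprocess({'a': 1, 'b': 2})))
print(response)  # {'result': 3}
```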
```shell
#!/bin/bash
source /etc/profile
source /root/.bashrc
export LD_LIBRARY_PATH=/usr/local/python3/lib/python3.8/site-packages/torch/lib:/usr/local/openmpi/lib:/usr/local/nccl/lib:/usr/local/cuda/lib64:/usr/local/python3/lib:/usr/local/python3/lib64:/usr/local/openmpi/lib:/usr/local/gcc/lib:/usr/local/gcc/lib64
MODEL_DIR=/opt/model
echo "================== code path ${MODEL_DIR}=========="
cd ${MODEL_DIR}
if [ -f "requirements.txt" ]; then
  echo "============== install python requirements ======================"
  echo "python3 -m pip install -r requirements.txt"
  python3 -m pip install -r requirements.txt
  echo "============== install python requirements done ================="
fi
echo "====================== start serving ============================"
echo "python3 -m tiinfer"
export TI_MODEL_DIR=${MODEL_DIR}
python3 -m tiinfer --timeout 30000
```
Note the line `MODEL_DIR=/opt/model` in entrypoint.sh: it changes the startup directory from the default `/data/model` to `/opt/model` so that the code is not overwritten by the platform.

The Dockerfile:

```dockerfile
FROM ccr.ccs.tencentyun.com/tione-public-images/ti-cloud-gpu-base-tiinfer:py38-cu111-1.0.0
COPY model_service.py /opt/model/model_service.py
COPY entrypoint.sh ./entrypoint.sh
RUN chmod +x ./entrypoint.sh
```
Here model_service.py is copied into the `/opt/model` directory rather than the default `/data/model`, again to avoid being overwritten by the platform. Build the image with:

```shell
docker build . --tag ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME
```
Test the image locally before pushing:

1. Start the service: `docker run -d --name myinfer ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME`
2. Enter the container: `docker exec -it myinfer bash`
3. Send a test request: `curl http://127.0.0.1:8501/v1/models/m:predict -d '{"a": 1, "b": 2}'` and confirm the correct response: `{"result": 3}`
4. Push the image: `docker push ccr.ccs.tencentyun.com/YOUR_NAMESPACE/YOUR_IMAGENAME`
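The curl request above can also be issued from Python. A minimal sketch using only the standard library, assuming the container's port 8501 is reachable from the caller and the model path `/v1/models/m:predict` is unchanged:

```python
import json
import urllib.request


def predict(a, b, url="http://127.0.0.1:8501/v1/models/m:predict"):
    """POST {"a": ..., "b": ...} as JSON and return the parsed response dict."""
    body = json.dumps({"a": a, "b": b}).encode("utf-8")
    req = urllib.request.Request(
        url, data=body, headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req, timeout=30) as resp:
        return json.loads(resp.read())


# Against a running container, predict(1, 2) should return {'result': 3}.
```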


