diff --git a/docs/modelarts/umn/modelarts_23_0091.html b/docs/modelarts/umn/modelarts_23_0091.html
index a2c25f38..2f833c02 100644
--- a/docs/modelarts/umn/modelarts_23_0091.html
+++ b/docs/modelarts/umn/modelarts_23_0091.html
@@ -2,7 +2,7 @@
When you import models in Model Management, if the meta model is imported from OBS or a container image, the model package must meet the following specifications:
-ModelArts also provides custom script examples of common AI engines. For details, see Examples of Custom Scripts.
@@ -16,7 +16,7 @@
 | │ │ ├── variables.index Mandatory
 | │ │ ├── variables.data-00000-of-00001 Mandatory
 | │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
 When publishing the model, you only need to specify the resnet directory.
 OBS bucket/directory name
 |── resnet
@@ -25,7 +25,7 @@
 | │ ├── resnet-50-symbol.json (Mandatory) Model definition file, which contains the neural network description of the model
 | │ ├── resnet-50-0000.params (Mandatory) Model variable parameter file, which contains parameter and weight information
 | │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
When publishing the model, you only need to specify the resnet directory.
 OBS bucket/directory name
 |── resnet
@@ -38,7 +38,7 @@
 | │ ├── <<Custom Python package>> (Optional) User's Python package, which can be directly referenced in the model inference code
 | │ ├── spark_model (Mandatory) Model directory, which contains the model content saved by PySpark
 | │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
When publishing the model, you only need to specify the resnet directory.
 OBS bucket/directory name
 |── resnet
@@ -46,7 +46,7 @@
 | │ ├── <<Custom Python package>> (Optional) User's Python package, which can be directly referenced in the model inference code
 | │ ├── resnet50.pth (Mandatory) PyTorch model file, which contains variable and weight information and is saved as state_dict
 | │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
 OBS bucket/directory name
 |── resnet
 | |── model (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
@@ -54,7 +54,7 @@
 | | |── deploy.prototxt (Mandatory) Caffe model file, which contains information such as the model network structure
 | | |── resnet.caffemodel (Mandatory) Caffe model file, which contains variable and weight information
 | | |── config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
 OBS bucket/directory name
 |── resnet
@@ -62,7 +62,7 @@
 | | |── <<Custom Python package>> (Optional) User's Python package, which can be directly referenced in the model inference code
 | | |── *.m (Mandatory): Model file whose extension name is .m
 | | |── config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
 OBS bucket/directory name
 |── resnet
@@ -70,7 +70,7 @@
 | | |── <<Custom Python package>> (Optional) User's Python package, which can be directly referenced in the model inference code
 | | |── *.m (Mandatory): Model file whose extension name is .m
 | | |── config.json (Mandatory) Model configuration file. The file name is fixed to config.json. Only one model configuration file is supported.
-| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
+| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.
The following code uses the TensorFlow engine as an example. You can modify the model_type parameter based on the actual engine type.
Value: image files
-```
-{
+{
   "detection_classes": [
     "face",
     "arm"
@@ -432,10 +431,8 @@
     ]
   ],
   "detection_scores": [0.99, 0.73]
-}
-```
-
-```
-{
+}
+
+{
   "model_type": "TensorFlow",
   "model_algorithm": "object_detection",
   "metrics": {
@@ -505,24 +502,20 @@
       }
     ]
   }]
-}
-```
+}
The following code uses the TensorFlow engine as an example. You can modify the model_type parameter based on the actual engine type.
Value: image files
-```
-{
+{
   "predicted_label": "flower",
   "scores": [
     ["rose", 0.99],
     ["begonia", 0.01]
   ]
-}
-```
-
-```
-{
+}
+
+{
   "model_type": "TensorFlow",
   "model_algorithm": "image_classification",
   "metrics": {
@@ -588,13 +581,11 @@
       }
     ]
   }]
-}
-```
+}
The following code uses the TensorFlow engine as an example. You can modify the model_type parameter based on the actual engine type.
-```
-{
+{
   "data": {
     "req_data": [
       {
@@ -617,10 +608,8 @@
       }
    ]
  }
-}
-```
-
-```
-{
+}
+
+{
   "data": {
     "resp_data": [
       {
@@ -631,10 +620,8 @@
      }
    ]
  }
-}
-```
-
-```
-{
+}
+
+{
   "model_type": "TensorFlow",
   "model_algorithm": "predict_analysis",
   "metrics": {
@@ -708,8 +695,7 @@
    "package_name": "Pillow"
  }]
 }]
-}
-```
+}
The model input and output are similar to those in Example of the Object Detection Model Configuration File.
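A note for reviewers of this change: the config.json examples touched by the hunks above all share the same top-level shape (`model_type`, `model_algorithm`, `metrics`). The sketch below shows a pre-upload sanity check for that shape; the key list is inferred only from the examples in this file, not from an official ModelArts schema.

```python
import json

# Top-level keys observed in the config.json examples above; treated as
# mandatory here for illustration only, not as an official ModelArts schema.
MANDATORY_KEYS = {"model_type", "model_algorithm", "metrics"}

def missing_config_keys(text: str) -> list:
    """Parse a config.json string and return its missing top-level keys, sorted."""
    config = json.loads(text)
    return sorted(MANDATORY_KEYS - config.keys())

good = json.dumps({
    "model_type": "TensorFlow",
    "model_algorithm": "object_detection",
    "metrics": {},
})
print(missing_config_keys(good))               # []
print(missing_config_keys('{"metrics": {}}'))  # ['model_algorithm', 'model_type']
```

Such a check could run before packaging the model directory, so a malformed or incomplete config.json is caught locally rather than at import time.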