Compare commits


1 Commit

Author SHA1 Message Date
laiweijian daf6e6f312 fix Model Package Specifications 2023-05-04 16:57:39 +08:00
2 changed files with 22 additions and 36 deletions

View File

@@ -2,7 +2,7 @@
<h1 class="topictitle1">Model Package Specifications</h1>
<div id="body8662426"><p id="modelarts_23_0091__en-us_topic_0172466148_p102525456382">When you import models in <strong id="modelarts_23_0091__en-us_topic_0172466148_b996033001618">Model Management</strong>, if the meta model is imported from OBS or a container image, the model package must meet the following specifications:</p>
<ul id="modelarts_23_0091__en-us_topic_0172466148_ul1225294533816"><li id="modelarts_23_0091__en-us_topic_0172466148_li11253445183818">The model package must contain the <strong id="modelarts_23_0091__en-us_topic_0172466148_b2984171213407">model</strong> directory. The <strong id="modelarts_23_0091__en-us_topic_0172466148_b1134410594404">model</strong> directory stores the model file, model configuration file, and model inference code.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li2253114515389">The model configuration file must exist and its name is fixed to <strong id="modelarts_23_0091__en-us_topic_0172466148_b10641123284110">config.json</strong>. There exists only one model configuration file. For details about how to compile the model configuration file, see <a href="modelarts_23_0092.html">Specifications for Compiling the Model Configuration File</a>.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li17160121145519">The model inference code file is optional. If this file is required, the file name is fixed to <strong id="modelarts_23_0091__en-us_topic_0172466148_b27036550441">customize_service.py</strong>. There must be one and only one such file. For details about how to compile the model inference code, see <a href="modelarts_23_0093.html">Specifications for Compiling Model Inference Code</a>.<div class="note" id="modelarts_23_0091__en-us_topic_0172466148_note15587134105518"><img src="public_sys-resources/note_3.0-en-us.png"><span class="notetitle"> </span><div class="notebody"><ul id="modelarts_23_0091__en-us_topic_0172466148_ul1968882345614"><li id="modelarts_23_0091__en-us_topic_0172466148_li17688172395615">The <strong id="modelarts_23_0091__en-us_topic_0172466148_b872474511576">.py</strong> file on which <strong id="modelarts_23_0091__en-us_topic_0172466148_b759319475197">customize_service.py</strong> depends can be directly stored in the <strong id="modelarts_23_0091__en-us_topic_0172466148_b2794155042410">model</strong> directory. 
Use the Python import mode to import the custom package.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li17163192616567">The other files on which <strong id="modelarts_23_0091__en-us_topic_0172466148_b1069811016555">customize_service.py</strong> depends can be stored in the <strong id="modelarts_23_0091__en-us_topic_0172466148_b136995005520">model</strong> directory. You must use absolute paths to access these files. For more details, see <a href="modelarts_23_0093.html#modelarts_23_0093__en-us_topic_0172466150_li135956421288">Obtaining an Absolute Path</a>.</li></ul>
<ul id="modelarts_23_0091__en-us_topic_0172466148_ul1225294533816"><li id="modelarts_23_0091__en-us_topic_0172466148_li11253445183818">The model package must contain the <strong id="modelarts_23_0091__en-us_topic_0172466148_b2984171213407">model</strong> directory. The <strong id="modelarts_23_0091__en-us_topic_0172466148_b1134410594404">model</strong> directory stores the model file, model configuration file, and model inference code.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li2253114515389">The model configuration file must exist and its name is fixed to <strong id="modelarts_23_0091__en-us_topic_0172466148_b10641123284110">config.json</strong>. There exists only one model configuration file. For details about how to compile the model configuration file, see <a href="modelarts_23_0092.html">Specifications for Compiling the Model Configuration File</a>.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li17160121145519">The model inference code file is mandatory. The file name is fixed to <strong id="modelarts_23_0091__en-us_topic_0172466148_b27036550441">customize_service.py</strong>. There must be one and only one such file. For details about how to compile the model inference code, see <a href="modelarts_23_0093.html">Specifications for Compiling Model Inference Code</a>.<div class="note" id="modelarts_23_0091__en-us_topic_0172466148_note15587134105518"><img src="public_sys-resources/note_3.0-en-us.png"><span class="notetitle"> </span><div class="notebody"><ul id="modelarts_23_0091__en-us_topic_0172466148_ul1968882345614"><li id="modelarts_23_0091__en-us_topic_0172466148_li17688172395615">The <strong id="modelarts_23_0091__en-us_topic_0172466148_b872474511576">.py</strong> file on which <strong id="modelarts_23_0091__en-us_topic_0172466148_b759319475197">customize_service.py</strong> depends can be directly stored in the <strong id="modelarts_23_0091__en-us_topic_0172466148_b2794155042410">model</strong> directory. 
Use the Python import mode to import the custom package.</li><li id="modelarts_23_0091__en-us_topic_0172466148_li17163192616567">The other files on which <strong id="modelarts_23_0091__en-us_topic_0172466148_b1069811016555">customize_service.py</strong> depends can be stored in the <strong id="modelarts_23_0091__en-us_topic_0172466148_b136995005520">model</strong> directory. You must use absolute paths to access these files. For more details, see <a href="modelarts_23_0093.html#modelarts_23_0093__en-us_topic_0172466150_li135956421288">Obtaining an Absolute Path</a>.</li></ul>
</div></div>
</li></ul>
<p id="modelarts_23_0091__en-us_topic_0172466148_p1161310181139">ModelArts also provides custom script examples of common AI engines. For details, see <a href="modelarts_23_0173.html">Examples of Custom Scripts</a>.</p>
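The packaging rules above (a fixed <strong>model</strong> directory holding exactly one <strong>config.json</strong> and, per this change, a mandatory <strong>customize_service.py</strong>) can be sanity-checked before uploading to OBS. This is a sketch; the <code>validate_model_package</code> helper is hypothetical and not part of ModelArts:

```python
from pathlib import Path

def validate_model_package(package_root):
    """Check a package against the layout rules above: a 'model' subdirectory
    containing config.json and customize_service.py. Returns a list of errors."""
    root = Path(package_root)
    errors = []
    model_dir = root / "model"
    if not model_dir.is_dir():
        errors.append("missing mandatory 'model' directory")
        return errors
    if not (model_dir / "config.json").is_file():
        errors.append("missing mandatory model configuration file 'config.json'")
    if not (model_dir / "customize_service.py").is_file():
        errors.append("missing mandatory inference code 'customize_service.py'")
    return errors
```

An empty list means the layout is satisfied; dependency .py files for customize_service.py may sit alongside it inside the model directory.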
@@ -16,7 +16,7 @@
| │ │ ├── variables.index Mandatory
| │ │ ├── variables.data-00000-of-00001 Mandatory
| │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b1779665794211">config.json</strong>. Only one model configuration file is supported.
| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li5831202416390">Structure of the MXNet-based model package<p id="modelarts_23_0091__en-us_topic_0172466148_p14573124164020"><a name="modelarts_23_0091__en-us_topic_0172466148_li5831202416390"></a><a name="en-us_topic_0172466148_li5831202416390"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath13165220164112"><b>resnet</b></span> directory.</p>
<pre class="screen" id="modelarts_23_0091__screen143831518154311">OBS bucket/directory name
|── resnet
@@ -25,7 +25,7 @@
| │ ├── resnet-50-symbol.json (Mandatory) Model definition file, which contains the neural network description of the model
| │ ├── resnet-50-0000.params (Mandatory) Model variable parameter file, which contains parameter and weight information
| │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b5903132334314">config.json</strong>. Only one model configuration file is supported.
| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li737020312406">Structure of the Image-based model package<p id="modelarts_23_0091__en-us_topic_0172466148_p6955191510401"><a name="modelarts_23_0091__en-us_topic_0172466148_li737020312406"></a><a name="en-us_topic_0172466148_li737020312406"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath1333910163416"><b>resnet</b></span> directory.</p>
<pre class="screen" id="modelarts_23_0091__screen938803011437">OBS bucket/directory name
|── resnet
@@ -38,7 +38,7 @@
| │ ├── &lt;&lt;Custom Python package&gt;&gt; (Optional) User's Python package, which can be directly referenced in the model inference code
| │ ├── spark_model (Mandatory) Model directory, which contains the model content saved by PySpark
| │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b171785954315">config.json</strong>. Only one model configuration file is supported.
| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li610313145402">Structure of the PyTorch-based model package<p id="modelarts_23_0091__en-us_topic_0172466148_p164232524010"><a name="modelarts_23_0091__en-us_topic_0172466148_li610313145402"></a><a name="en-us_topic_0172466148_li610313145402"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath1454313122419"><b>resnet</b></span> directory.</p>
<pre class="screen" id="modelarts_23_0091__screen3368560443">OBS bucket/directory name
|── resnet
@@ -46,7 +46,7 @@
| │ ├── &lt;&lt;Custom Python package&gt;&gt; (Optional) User's Python package, which can be directly referenced in the model inference code
| │ ├── resnet50.pth (Mandatory) PyTorch model file, which contains variable and weight information and is saved as <strong id="modelarts_23_0091__b181831137442">state_dict</strong>
| │ ├──config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b0183113164416">config.json</strong>. Only one model configuration file is supported.
| │ ├──customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
| │ ├──customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory.</pre>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li191744350532">Structure of the Caffe-based model package<div class="p" id="modelarts_23_0091__en-us_topic_0172466148_p098165213533"><a name="modelarts_23_0091__en-us_topic_0172466148_li191744350532"></a><a name="en-us_topic_0172466148_li191744350532"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath186316825317"><b>resnet</b></span> directory.<pre class="screen" id="modelarts_23_0091__screen8378119114415">OBS bucket/directory name
|── resnet
| |── model (Mandatory) Name of a fixed subdirectory, which is used to store model-related files
@@ -54,7 +54,7 @@
| | |── deploy.prototxt (Mandatory) Caffe model file, which contains information such as the model network structure
| | |── resnet.caffemodel (Mandatory) Caffe model file, which contains variable and weight information
| | |── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b96102917440">config.json</strong>. Only one model configuration file is supported.
| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory. </pre>
| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory. </pre>
</div>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li14705162955213">Structure of the XGBoost-based model package<div class="p" id="modelarts_23_0091__en-us_topic_0172466148_p9705182925215"><a name="modelarts_23_0091__en-us_topic_0172466148_li14705162955213"></a><a name="en-us_topic_0172466148_li14705162955213"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath10976344133219"><b>resnet</b></span> directory.<pre class="screen" id="modelarts_23_0091__screen136252345448">OBS bucket/directory name
|── resnet
@@ -62,7 +62,7 @@
| | |── &lt;&lt;Custom Python package&gt;&gt; (Optional) User's Python package, which can be directly referenced in the model inference code
| | |── *.m (Mandatory) Model file whose extension is <strong id="modelarts_23_0091__b2138154134413">.m</strong>
| | |── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b5138141114415">config.json</strong>. Only one model configuration file is supported.
| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which <strong id="modelarts_23_0091__b111381541124415">customize_service.py</strong> depends can be directly stored in the <strong id="modelarts_23_0091__b5138154184418">model</strong> directory. </pre>
| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which <strong id="modelarts_23_0091__b111381541124415">customize_service.py</strong> depends can be directly stored in the <strong id="modelarts_23_0091__b5138154184418">model</strong> directory. </pre>
</div>
</li><li id="modelarts_23_0091__en-us_topic_0172466148_li1783019205541">Structure of the Scikit_Learn-based model package<div class="p" id="modelarts_23_0091__en-us_topic_0172466148_p16830122065411"><a name="modelarts_23_0091__en-us_topic_0172466148_li1783019205541"></a><a name="en-us_topic_0172466148_li1783019205541"></a>When publishing the model, you only need to specify the <span class="filepath" id="modelarts_23_0091__en-us_topic_0172466148_filepath812312063516"><b>resnet</b></span> directory.<pre class="screen" id="modelarts_23_0091__screen10985104813441">OBS bucket/directory name
|── resnet
@@ -70,7 +70,7 @@
| | |── &lt;&lt;Custom Python package&gt;&gt; (Optional) User's Python package, which can be directly referenced in the model inference code
| | |── *.m (Mandatory) Model file whose extension is <strong id="modelarts_23_0091__b346375814444">.m</strong>
| | |── config.json (Mandatory) Model configuration file. The file name is fixed to <strong id="modelarts_23_0091__b74631058124413">config.json</strong>. Only one model configuration file is supported.
| | |── customize_service.py (Optional) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory. </pre>
| | |── customize_service.py (Mandatory) Model inference code. The file name is fixed to customize_service.py. Only one model inference code file exists. The files on which customize_service.py depends can be directly stored in the model directory. </pre>
</div>
</li></ul>
</div>
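Each framework-specific listing above shares the same skeleton under the OBS directory. As an illustration, a sketch that materializes the TensorFlow-style tree from the listing; the file names are taken from the listing, and the empty files are placeholders, not a real model:

```python
from pathlib import Path

# File layout copied from the TensorFlow listing above; placeholders only.
TF_PACKAGE_LAYOUT = [
    "model/variables/variables.index",
    "model/variables/variables.data-00000-of-00001",
    "model/config.json",
    "model/customize_service.py",
]

def scaffold_package(obs_root):
    """Create the directory skeleton under obs_root (e.g. the 'resnet' directory)
    and return the relative paths of the files created."""
    root = Path(obs_root)
    for rel in TF_PACKAGE_LAYOUT:
        path = root / rel
        path.parent.mkdir(parents=True, exist_ok=True)
        path.touch()
    return sorted(p.relative_to(root).as_posix()
                  for p in root.rglob("*") if p.is_file())
```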

View File

@@ -411,8 +411,7 @@
<div class="section" id="modelarts_23_0092__en-us_topic_0172466149_section218715919415"><a name="modelarts_23_0092__en-us_topic_0172466149_section218715919415"></a><a name="en-us_topic_0172466149_section218715919415"></a><h4 class="sectiontitle">Example of the Object Detection Model Configuration File</h4><p id="modelarts_23_0092__en-us_topic_0172466149_p18323116198">The following code uses the TensorFlow engine as an example. You can modify the <strong id="modelarts_23_0092__en-us_topic_0172466149_b0482153318122">model_type</strong> parameter based on the actual engine type.</p>
<ul id="modelarts_23_0092__en-us_topic_0172466149_ul19743491621"><li id="modelarts_23_0092__en-us_topic_0172466149_li7744495210">Model input<p id="modelarts_23_0092__en-us_topic_0172466149_p9447730219"><a name="modelarts_23_0092__en-us_topic_0172466149_li7744495210"></a><a name="en-us_topic_0172466149_li7744495210"></a>Key: images</p>
<p id="modelarts_23_0092__en-us_topic_0172466149_p1447931723">Value: image files</p>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li101181414522">Model output<pre class="screen" id="modelarts_23_0092__screen13371131194514">```
{
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li101181414522">Model output<pre class="screen" id="modelarts_23_0092__screen13371131194514">{
"detection_classes": [
"face",
"arm"
@@ -432,10 +431,8 @@
]
],
"detection_scores": [0.99, 0.73]
}
```</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li1315171322">Configuration file<pre class="screen" id="modelarts_23_0092__screen9347330134510">```
{
}</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li1315171322">Configuration file<pre class="screen" id="modelarts_23_0092__screen9347330134510">{
"model_type": "TensorFlow",
"model_algorithm": "object_detection",
"metrics": {
@@ -505,24 +502,20 @@
}
]
}]
}
```</pre>
}</pre>
</li></ul>
</div>
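In the object-detection output above, the three arrays are parallel: one class, one box, and one score per detected object. A minimal sketch of that invariant, using the classes and scores from the example; the box coordinates are hypothetical placeholders, since the full box values are truncated in this diff:

```python
def check_detection_output(output):
    """Verify the parallel-array invariant of the detection response."""
    n = len(output["detection_classes"])
    return (len(output["detection_boxes"]) == n
            and len(output["detection_scores"]) == n)

sample = {
    "detection_classes": ["face", "arm"],     # from the example above
    "detection_boxes": [[33, 45, 175, 190],   # placeholder coordinates
                        [110, 20, 310, 300]],
    "detection_scores": [0.99, 0.73],         # from the example above
}
```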
<div class="section" id="modelarts_23_0092__en-us_topic_0172466149_section8806755101916"><h4 class="sectiontitle">Example of the Image Classification Model Configuration File</h4><p id="modelarts_23_0092__en-us_topic_0172466149_p8229421192015">The following code uses the TensorFlow engine as an example. You can modify the <strong id="modelarts_23_0092__en-us_topic_0172466149_b11456645121215">model_type</strong> parameter based on the actual engine type.</p>
<ul id="modelarts_23_0092__en-us_topic_0172466149_ul07931028734"><li id="modelarts_23_0092__en-us_topic_0172466149_li97936281538">Model input<p id="modelarts_23_0092__en-us_topic_0172466149_p115813457310"><a name="modelarts_23_0092__en-us_topic_0172466149_li97936281538"></a><a name="en-us_topic_0172466149_li97936281538"></a>Key: images</p>
<p id="modelarts_23_0092__en-us_topic_0172466149_p9581945833">Value: image files</p>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li1261510310318">Model output<pre class="screen" id="modelarts_23_0092__screen9126125264512">```
{
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li1261510310318">Model output<pre class="screen" id="modelarts_23_0092__screen9126125264512">{
"predicted_label": "flower",
"scores": [
["rose", 0.99],
["begonia", 0.01]
]
}
```</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li79592332316">Configuration file<pre class="screen" id="modelarts_23_0092__screen67712564619">```
{
}</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li79592332316">Configuration file<pre class="screen" id="modelarts_23_0092__screen67712564619">{
"model_type": "TensorFlow",
"model_algorithm": "image_classification",
"metrics": {
@@ -588,13 +581,11 @@
}
]
}]
}
```</pre>
}</pre>
</li></ul>
</div>
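The classification output above pairs each candidate label with a score, and <strong>predicted_label</strong> is simply the top-scoring entry. A minimal sketch using the example's values (the helper name is illustrative only):

```python
def top_label(scores):
    """Return the label with the highest score from a [[label, score], ...] list."""
    label, _ = max(scores, key=lambda pair: pair[1])
    return label

scores = [["rose", 0.99], ["begonia", 0.01]]  # from the example above
```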
<div class="section" id="modelarts_23_0092__en-us_topic_0172466149_section1490393064512"><h4 class="sectiontitle">Example of the Predictive Analytics Model Configuration File</h4><p id="modelarts_23_0092__en-us_topic_0172466149_p992432616208">The following code uses the TensorFlow engine as an example. You can modify the <strong id="modelarts_23_0092__en-us_topic_0172466149_b79601842191214">model_type</strong> parameter based on the actual engine type.</p>
<ul id="modelarts_23_0092__en-us_topic_0172466149_ul629110581646"><li id="modelarts_23_0092__en-us_topic_0172466149_li6291125817415">Model input<pre class="screen" id="modelarts_23_0092__screen7118435194613">```
{
<ul id="modelarts_23_0092__en-us_topic_0172466149_ul629110581646"><li id="modelarts_23_0092__en-us_topic_0172466149_li6291125817415">Model input<pre class="screen" id="modelarts_23_0092__screen7118435194613">{
"data": {
"req_data": [
{
@@ -617,10 +608,8 @@
}
]
}
}
```</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li10051551">Model output<pre class="screen" id="modelarts_23_0092__screen85537128475">```
{
}</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li10051551">Model output<pre class="screen" id="modelarts_23_0092__screen85537128475">{
"data": {
"resp_data": [
{
@@ -631,10 +620,8 @@
}
]
}
}
```</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li372010210513">Configuration file<pre class="screen" id="modelarts_23_0092__screen082917395471">```
{
}</pre>
</li><li id="modelarts_23_0092__en-us_topic_0172466149_li372010210513">Configuration file<pre class="screen" id="modelarts_23_0092__screen082917395471">{
"model_type": "TensorFlow",
"model_algorithm": "predict_analysis",
"metrics": {
@@ -708,8 +695,7 @@
"package_name": "Pillow"
}]
}]
}
```</pre>
}</pre>
</li></ul>
</div>
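The predictive-analytics request and response above both wrap a list of records inside a <strong>data</strong> envelope (<strong>req_data</strong> on input, <strong>resp_data</strong> on output). A sketch of building and reading such payloads; the record field name is hypothetical, since the example's field list is truncated in this diff:

```python
import json

def build_request(records):
    """Wrap input records in the data/req_data envelope shown above."""
    return json.dumps({"data": {"req_data": records}})

def read_response(body):
    """Extract the records from the data/resp_data envelope shown above."""
    return json.loads(body)["data"]["resp_data"]
```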
<div class="section" id="modelarts_23_0092__en-us_topic_0172466149_section9113122232018"><a name="modelarts_23_0092__en-us_topic_0172466149_section9113122232018"></a><a name="en-us_topic_0172466149_section9113122232018"></a><h4 class="sectiontitle">Example of the Custom Image Model Configuration File</h4><p id="modelarts_23_0092__en-us_topic_0172466149_p184212308818">The model input and output are similar to those in <a href="#modelarts_23_0092__en-us_topic_0172466149_section218715919415">Example of the Object Detection Model Configuration File</a>.</p>