Xiong, Chen Xiao 3bc19c4f14 DataArts UMN 20240301 version
Reviewed-by: Pruthi, Vineet <vineet.pruthi@t-systems.com>
Co-authored-by: Xiong, Chen Xiao <chenxiaoxiong@huawei.com>
Co-committed-by: Xiong, Chen Xiao <chenxiaoxiong@huawei.com>
2024-03-01 14:11:08 +00:00
[
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Service Overview",
"uri":"dataartsstudio_12_0001.html",
"doc_type":"usermanual",
"p_code":"",
"code":"1"
},
{
"desc":"Enterprises often face challenges in the following aspects when managing data:GovernanceInconsistent data system standards impact data exchange and sharing between differ",
"product_code":"dataartsstudio",
"title":"What Is DataArts Studio?",
"uri":"dataartsstudio_07_001.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"2"
},
{
"desc":"A DataArts Studio instance is the minimum unit of compute resources provided for users. You can create, access, and manage multiple DataArts Studio instances at the same ",
"product_code":"dataartsstudio",
"title":"Basic Concepts",
"uri":"dataartsstudio_07_004.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"3"
},
{
"desc":"DataArts Migration can help you seamlessly migrate batch data between 20+ homogeneous or heterogeneous data sources. You can use it to ingest data from both on-premises a",
"product_code":"dataartsstudio",
"title":"Functions",
"uri":"dataartsstudio_07_005.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"4"
},
{
"desc":"DataArts Studio is a one-stop data operations platform that allows you to perform many operations, including integrating data from every domain and connecting data from d",
"product_code":"dataartsstudio",
"title":"Advantages",
"uri":"dataartsstudio_07_002.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"5"
},
{
"desc":"You can use DataArts Studio to migrate offline data to the cloud and integrate the data into big data services. On the DataArts Studio management console, you can use the",
"product_code":"dataartsstudio",
"title":"Application Scenarios",
"uri":"dataartsstudio_07_003.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"6"
},
{
"desc":"If you need to assign different permissions to employees in your enterprise to access your DataArts Studio resources, IAM is a good choice for fine-grained permissions ma",
"product_code":"dataartsstudio",
"title":"DataArts Studio Permissions Management",
"uri":"dataartsstudio_07_012.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"7"
},
{
"desc":"A workspace member can be assigned the role of admin, developer, operator, or viewer. This topic describes the permissions of each role.Admin: Users with this role have t",
"product_code":"dataartsstudio",
"title":"DataArts Studio Permissions",
"uri":"dataartsstudio_07_013.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"8"
},
{
"desc":"The following table lists the recommended browser for logging in to DataArts Studio.Browser compatibilityBrowser VersionDescriptionGoogle Chrome 93.x or laterRecommendedB",
"product_code":"dataartsstudio",
"title":"Constraints",
"uri":"dataartsstudio_07_006.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"9"
},
{
"desc":"DataArts Studio uses Identity and Access Management (IAM) for authentication and authorization.DataArts Studio uses Cloud Trace Service (CTS) to audit users' non-query op",
"product_code":"dataartsstudio",
"title":"Related Services",
"uri":"dataartsstudio_07_007.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"10"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Preparations",
"uri":"dataartsstudio_12_0002.html",
"doc_type":"usermanual",
"p_code":"",
"code":"11"
},
{
"desc":"To use DataArts Studio, create a cloud platform account, create a DataArts Studio instance, and authorize a user to use DataArts Studio.For details about the preparatio",
"product_code":"dataartsstudio",
"title":"Preparations",
"uri":"dataartsstudio_01_0003.html",
"doc_type":"usermanual",
"p_code":"11",
"code":"12"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Creating DataArts Studio Instances",
"uri":"dataartsstudio_01_1028.html",
"doc_type":"usermanual",
"p_code":"11",
"code":"13"
},
{
"desc":"Only cloud platform account users with the DAYU Administrator or Tenant Administrator permissions can create DataArts Studio instances or DataArts Studio incremental pa",
"product_code":"dataartsstudio",
"title":"Creating a DataArts Studio Basic Package",
"uri":"dataartsstudio_01_0115_0.html",
"doc_type":"usermanual",
"p_code":"13",
"code":"14"
},
{
"desc":"DataArts Studio provides basic and incremental packages. If the basic package cannot meet your requirements, you can create an incremental package. Before you create an i",
"product_code":"dataartsstudio",
"title":"(Optional) Creating a DataArts Studio Incremental Package",
"uri":"dataartsstudio_01_0119.html",
"doc_type":"usermanual",
"p_code":"13",
"code":"15"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Managing a Workspace",
"uri":"dataartsstudio_01_0011.html",
"doc_type":"usermanual",
"p_code":"11",
"code":"16"
},
{
"desc":"By default, a workspace will be automatically created after you create a DataArts Studio instance. You will be automatically assigned the admin role and can use the defau",
"product_code":"dataartsstudio",
"title":"Creating and Managing a Workspace",
"uri":"dataartsstudio_01_0116_0.html",
"doc_type":"usermanual",
"p_code":"16",
"code":"17"
},
{
"desc":"By default, job logs and Data Lake Insight (DLI) dirty data are stored in an Object Storage Service (OBS) bucket named dlf-log-{Project ID}. You can customize a log stora",
"product_code":"dataartsstudio",
"title":"(Optional) Changing the Job Log Storage Path",
"uri":"dataartsstudio_01_0530.html",
"doc_type":"usermanual",
"p_code":"16",
"code":"18"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Authorizing Users to Use DataArts Studio",
"uri":"dataartsstudio_01_0118_0.html",
"doc_type":"usermanual",
"p_code":"11",
"code":"19"
},
{
"desc":"Identity and Access Management (IAM) can be used for fine-grained permissions management on your DataArts Studio resources. With IAM, you can:Create IAM users for employe",
"product_code":"dataartsstudio",
"title":"Creating an IAM User and Assigning DataArts Studio Permissions",
"uri":"dataartsstudio_01_0004.html",
"doc_type":"usermanual",
"p_code":"19",
"code":"20"
},
{
"desc":"If you want to allow another IAM user to use your DataArts Studio instance, create an IAM user by referring to Creating an IAM User and Assigning DataArts Studio Permissi",
"product_code":"dataartsstudio",
"title":"Adding a Member and Assigning a Role",
"uri":"dataartsstudio_01_0117_0.html",
"doc_type":"usermanual",
"p_code":"19",
"code":"21"
},
{
"desc":"When creating OBS links, making API calls, or locating issues, you may need to obtain information such as access keys, project IDs, and endpoints. This section describes ",
"product_code":"dataartsstudio",
"title":"(Optional) Obtaining Authentication Information",
"uri":"dataartsstudio_01_0006.html",
"doc_type":"usermanual",
"p_code":"11",
"code":"22"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"User Guide",
"uri":"dataartsstudio_12_0004.html",
"doc_type":"usermanual",
"p_code":"",
"code":"23"
},
{
"desc":"Before using DataArts Studio, you must conduct data and business surveys and select an appropriate data governance model.Then, make the following preparations by referrin",
"product_code":"dataartsstudio",
"title":"Preparations Before Using DataArts Studio",
"uri":"dataartsstudio_01_0134.html",
"doc_type":"usermanual",
"p_code":"23",
"code":"24"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Management Center",
"uri":"dataartsstudio_01_0008.html",
"doc_type":"usermanual",
"p_code":"23",
"code":"25"
},
{
"desc":"Before using DataArts Studio, select a cloud service or data warehouse as the data lake. The data lake stores raw data and data generated during data governance and serve",
"product_code":"dataartsstudio",
"title":"Data Sources",
"uri":"dataartsstudio_01_0005.html",
"doc_type":"usermanual",
"p_code":"25",
"code":"26"
},
{
"desc":"You can create data connections by configuring data sources. Based on the data connections of the Management Center, DataArts Studio performs data development, governance",
"product_code":"dataartsstudio",
"title":"Creating Data Connections",
"uri":"dataartsstudio_01_0009.html",
"doc_type":"usermanual",
"p_code":"25",
"code":"27"
},
{
"desc":"To migrate resources in one workspace to another, you can use the resource migration function provided by DataArts Studio.The resources that can be migrated include the d",
"product_code":"dataartsstudio",
"title":"Migrating Resources",
"uri":"dataartsstudio_01_0010.html",
"doc_type":"usermanual",
"p_code":"25",
"code":"28"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Tutorials",
"uri":"dataartsstudio_01_0350.html",
"doc_type":"usermanual",
"p_code":"25",
"code":"29"
},
{
"desc":"This section describes how to create an MRS Hive connection between DataArts Studio and the data lake base.You have created a data lake to connect, for example, a databas",
"product_code":"dataartsstudio",
"title":"Creating an MRS Hive Connection",
"uri":"dataartsstudio_01_0351.html",
"doc_type":"usermanual",
"p_code":"29",
"code":"30"
},
{
"desc":"This section describes how to create a DWS connection between DataArts Studio and the data lake base.You have created a data lake to connect, for example, a database or c",
"product_code":"dataartsstudio",
"title":"Creating a DWS Connection",
"uri":"dataartsstudio_01_0352.html",
"doc_type":"usermanual",
"p_code":"29",
"code":"31"
},
{
"desc":"This section describes how to create a MySQL connection between DataArts Studio and the data lake base.You have created a data lake to connect, for example, a database or",
"product_code":"dataartsstudio",
"title":"Creating a MySQL Connection",
"uri":"dataartsstudio_01_0353.html",
"doc_type":"usermanual",
"p_code":"29",
"code":"32"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"DataArts Migration",
"uri":"dataartsstudio_01_0012.html",
"doc_type":"usermanual",
"p_code":"23",
"code":"33"
},
{
"desc":"DataArts Migration is an efficient and easy-to-use data integration service. Based on the big data migration to the cloud and intelligent data lake solutions, CDM provide",
"product_code":"dataartsstudio",
"title":"Overview",
"uri":"dataartsstudio_01_0013.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"34"
},
{
"desc":"You cannot modify the flavor of an existing cluster. If you require a higher flavor, create a cluster with your desired flavor.Arm CDM clusters do not support agents. The",
"product_code":"dataartsstudio",
"title":"Constraints",
"uri":"dataartsstudio_01_0015.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"35"
},
{
"desc":"CDM provides the following migration modes which support different data sources:Table/File migration in the import of data into a data lake or migration of data to the cl",
"product_code":"dataartsstudio",
"title":"Supported Data Sources",
"uri":"dataartsstudio_01_0014.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"36"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Managing Clusters",
"uri":"dataartsstudio_01_0017.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"37"
},
{
"desc":"CDM provides independent clusters for secure and reliable data migration. Clusters are isolated from each other and cannot access each other.CDM clusters can be used in t",
"product_code":"dataartsstudio",
"title":"Creating a CDM Cluster",
"uri":"dataartsstudio_01_0576.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"38"
},
{
"desc":"After creating a CDM cluster, you can bind an EIP to or unbind an EIP from the cluster.If CDM needs to access a local or Internet data source, or a cloud service in anoth",
"product_code":"dataartsstudio",
"title":"Binding or Unbinding an EIP",
"uri":"dataartsstudio_01_0020.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"39"
},
{
"desc":"After modifying some configurations (for example, disabling user isolation), you must restart the cluster to make the modification take effect.You have created a CDM clus",
"product_code":"dataartsstudio",
"title":"Restarting a Cluster",
"uri":"dataartsstudio_01_0578.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"40"
},
{
"desc":"You can delete a CDM cluster that you no longer use.After a CDM cluster is deleted, the cluster and its data are destroyed and cannot be restored. Exercise caution when p",
"product_code":"dataartsstudio",
"title":"Deleting a Cluster",
"uri":"dataartsstudio_01_0579.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"41"
},
{
"desc":"This section describes how to obtain cluster logs to view the job running history and locate job failure causes.You have created a CDM cluster.The Source column is displa",
"product_code":"dataartsstudio",
"title":"Downloading Cluster Logs",
"uri":"dataartsstudio_01_0022.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"42"
},
{
"desc":"After creating a CDM cluster, you can view its basic information and modify its configurations.You can view the following basic cluster information:Cluster information: c",
"product_code":"dataartsstudio",
"title":"Viewing Basic Cluster Information and Modifying Cluster Configurations",
"uri":"dataartsstudio_01_0021.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"43"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Viewing Metrics",
"uri":"dataartsstudio_01_0121.html",
"doc_type":"usermanual",
"p_code":"37",
"code":"44"
},
{
"desc":"You have obtained required Cloud Eye permissions.This section describes metrics reported by CDM to Cloud Eye as well as their namespaces and dimensions. You can use APIs ",
"product_code":"dataartsstudio",
"title":"CDM Metrics",
"uri":"dataartsstudio_01_0122.html",
"doc_type":"usermanual",
"p_code":"44",
"code":"45"
},
{
"desc":"Set alarm rules to customize the monitored objects and notification policies. Then, you can learn the CDM running status in a timely manner.A CDM alarm rule includes the alarm ru",
"product_code":"dataartsstudio",
"title":"Configuring Alarm Rules",
"uri":"dataartsstudio_01_0123.html",
"doc_type":"usermanual",
"p_code":"44",
"code":"46"
},
{
"desc":"You can use Cloud Eye to monitor the running status of a CDM cluster. You can view the monitoring metrics on the Cloud Eye console.Monitored data takes some time for tran",
"product_code":"dataartsstudio",
"title":"Querying Metrics",
"uri":"dataartsstudio_01_0124.html",
"doc_type":"usermanual",
"p_code":"44",
"code":"47"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Managing Links",
"uri":"dataartsstudio_01_0023.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"48"
},
{
"desc":"Before creating a data migration job, create a link to enable the CDM cluster to read data from and write data to a data source. A migration job requires a source link an",
"product_code":"dataartsstudio",
"title":"Creating Links",
"uri":"dataartsstudio_01_0024.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"49"
},
{
"desc":"The Java Database Connectivity (JDBC) provides programmatic access to relational databases. Applications can execute SQL statements and retrieve data using the JDBC API.B",
"product_code":"dataartsstudio",
"title":"Managing Drivers",
"uri":"dataartsstudio_01_0132.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"50"
},
{
"desc":"If your data is stored in HDFS or a relational database, you can deploy an agent on the source network. CDM pulls data from your internal data sources through an agent bu",
"product_code":"dataartsstudio",
"title":"Managing Agents",
"uri":"dataartsstudio_01_0128.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"51"
},
{
"desc":"On the Cluster Configurations page, you can create, edit, or delete Hadoop cluster configurations.When creating a Hadoop link, the Hadoop cluster configurations can simpl",
"product_code":"dataartsstudio",
"title":"Managing Cluster Configurations",
"uri":"dataartsstudio_01_1096.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"52"
},
{
"desc":"Common relational databases include: Data Warehouse Service (DWS), RDS for MySQL, RDS for PostgreSQL, RDS for SQL Server, PostgreSQL, Microsoft SQL Server, IBM Db2, and S",
"product_code":"dataartsstudio",
"title":"Link to a Common Relational Database",
"uri":"dataartsstudio_01_0044.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"53"
},
{
"desc":"Sharding refers to a link that connects to multiple backend data sources at the same time. The link can be used as the job source to migrate data from multiple data sources to other ",
"product_code":"dataartsstudio",
"title":"Link to a Database Shard",
"uri":"dataartsstudio_01_1214.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"54"
},
{
"desc":"Table 1 lists the parameters for a link to a MySQL database.",
"product_code":"dataartsstudio",
"title":"Link to a MySQL Database",
"uri":"dataartsstudio_01_1211.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"55"
},
{
"desc":"Table 1 lists the parameters for a link to an Oracle database.Parameters for a link to an Oracle databaseParameterDescriptionExample ValueNameLink name, which should be d",
"product_code":"dataartsstudio",
"title":"Link to an Oracle Database",
"uri":"dataartsstudio_01_1212.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"56"
},
{
"desc":"When connecting CDM to DLI, configure the parameters as described in Table 1.",
"product_code":"dataartsstudio",
"title":"Link to DLI",
"uri":"dataartsstudio_01_0036.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"57"
},
{
"desc":"CDM supports the following Hive data sources:MRS HiveFusionInsight HiveApache HiveMRS HiveFusionInsight HiveApache HiveYou can view a table during field mapping only when",
"product_code":"dataartsstudio",
"title":"Link to Hive",
"uri":"dataartsstudio_01_0026.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"58"
},
{
"desc":"CDM supports the following HBase data sources:MRS HBaseFusionInsight HBaseApache HBaseMRS HBaseFusionInsight HBaseApache HBaseWhen connecting CDM to HBase of MRS, configu",
"product_code":"dataartsstudio",
"title":"Link to HBase",
"uri":"dataartsstudio_01_0039.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"59"
},
{
"desc":"CDM supports the following HDFS data sources:MRS HDFSFusionInsight HDFSApache HDFSMRS HDFSFusionInsight HDFSApache HDFSWhen connecting CDM to HDFS of MRS, configure the p",
"product_code":"dataartsstudio",
"title":"Link to HDFS",
"uri":"dataartsstudio_01_0040.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"60"
},
{
"desc":"When connecting CDM to the destination OBS bucket, you need to grant read and write permissions on the destination OBS bucket; file authentication is not required.W",
"product_code":"dataartsstudio",
"title":"Link to OBS",
"uri":"dataartsstudio_01_0045.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"61"
},
{
"desc":"The FTP/SFTP link is used to migrate files from an on-premises file server or ECS to a database.Only FTP servers running Linux are supported.When connecting CDM to an FTP ",
"product_code":"dataartsstudio",
"title":"Link to an FTP or SFTP Server",
"uri":"dataartsstudio_01_0028.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"62"
},
{
"desc":"The Redis link is applicable to data migration of Redis created in the local data center or ECS. It is used to load data in the database or files to Redis.The DCS link is",
"product_code":"dataartsstudio",
"title":"Link to Redis/DCS",
"uri":"dataartsstudio_01_0032.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"63"
},
{
"desc":"The DDS link is used to synchronize data from Document Database Service (DDS) on cloud to a big data platform.When connecting CDM to DDS, configure the parameters as desc",
"product_code":"dataartsstudio",
"title":"Link to DDS",
"uri":"dataartsstudio_01_0031.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"64"
},
{
"desc":"When connecting CDM to CloudTable, configure the parameters as described in Table 1.Click Show Advanced Attributes, and then click Add to add configuration attributes of ",
"product_code":"dataartsstudio",
"title":"Link to CloudTable",
"uri":"dataartsstudio_01_0027.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"65"
},
{
"desc":"When connecting CDM to CloudTable OpenTSDB, configure the parameters as described in Table 1.",
"product_code":"dataartsstudio",
"title":"Link to CloudTable OpenTSDB",
"uri":"dataartsstudio_01_0037.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"66"
},
{
"desc":"This link is used to transfer data from a third-party cloud MongoDB service or MongoDB created in the on-premises data center or ECS to a big data platform.When connectin",
"product_code":"dataartsstudio",
"title":"Link to MongoDB",
"uri":"dataartsstudio_01_0030.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"67"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Link to Cassandra",
"uri":"dataartsstudio_01_004501.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"68"
},
{
"desc":"When connecting CDM to Kafka of MRS, configure the parameters as described in Table 1.Click Show Advanced Attributes, and then click Add to add configuration attributes o",
"product_code":"dataartsstudio",
"title":"Link to Kafka",
"uri":"dataartsstudio_01_0033.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"69"
},
{
"desc":"When connecting CDM to DMS Kafka, configure the parameters as described in Table 1.",
"product_code":"dataartsstudio",
"title":"Link to DMS Kafka",
"uri":"dataartsstudio_01_0038.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"70"
},
{
"desc":"The Elasticsearch link is applicable to data migration of Elasticsearch services and Elasticsearch created in the local data center or ECS.The Elasticsearch connector sup",
"product_code":"dataartsstudio",
"title":"Link to Elasticsearch/CSS",
"uri":"dataartsstudio_01_0035.html",
"doc_type":"usermanual",
"p_code":"48",
"code":"71"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Managing Jobs",
"uri":"dataartsstudio_01_0081.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"72"
},
{
"desc":"CDM supports table and file migration between homogeneous or heterogeneous data sources. For details about supported data sources, see Data Sources Supported by Table/Fil",
"product_code":"dataartsstudio",
"title":"Table/File Migration Jobs",
"uri":"dataartsstudio_01_0046.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"73"
},
{
"desc":"CDM supports entire DB migration between homogeneous and heterogeneous data sources. The migration principles are the same as those in Table/File Migration Jobs. Each typ",
"product_code":"dataartsstudio",
"title":"Creating an Entire Database Migration Job",
"uri":"dataartsstudio_01_0075.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"74"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Source Job Parameters",
"uri":"dataartsstudio_01_0047.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"75"
},
{
"desc":"If the source link of a job is the Link to OBS, configure the source job parameters based on Table 1.Advanced attributes are optional and not displayed by default. You ca",
"product_code":"dataartsstudio",
"title":"From OBS",
"uri":"dataartsstudio_01_0048.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"76"
},
{
"desc":"When the source link of a job is the Link to HDFS, that is, when data is exported from MRS HDFS, FusionInsight HDFS, or Apache HDFS, configure the source job parameters b",
"product_code":"dataartsstudio",
"title":"From HDFS",
"uri":"dataartsstudio_01_0049.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"77"
},
{
"desc":"When the source link of a job is the Link to HBase or Link to CloudTable, that is, when data is exported from MRS HBase, FusionInsight HBase, CloudTable, or Apache HBase,",
"product_code":"dataartsstudio",
"title":"From HBase/CloudTable",
"uri":"dataartsstudio_01_0050.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"78"
},
{
"desc":"If the source link of a job is the Link to Hive, configure the source job parameters based on Table 1.If the data source is Hive, CDM will automatically partition data us",
"product_code":"dataartsstudio",
"title":"From Hive",
"uri":"dataartsstudio_01_0051.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"79"
},
{
"desc":"If the source link of a job is the Link to DLI, configure the source job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"From DLI",
"uri":"dataartsstudio_01_0120.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"80"
},
{
"desc":"If the source link of a job is the Link to an FTP or SFTP Server, configure the source job parameters based on Table 1.Advanced attributes are optional and not displayed ",
"product_code":"dataartsstudio",
"title":"From FTP/SFTP",
"uri":"dataartsstudio_01_0052.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"81"
},
{
"desc":"When the source link of a job is the HTTP link, configure the source job parameters based on Table 1. Currently, data can only be exported from HTTP URLs.",
"product_code":"dataartsstudio",
"title":"From HTTP",
"uri":"dataartsstudio_01_0053.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"82"
},
{
"desc":"Common relational databases that can serve as the source include GaussDB(DWS), RDS for MySQL, RDS for PostgreSQL, RDS for SQL Server, FusionInsight LibrA, PostgreSQL, Mi",
"product_code":"dataartsstudio",
"title":"From a Common Relational Database",
"uri":"dataartsstudio_01_0054.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"83"
},
{
"desc":"If the source link of a job is the Link to a MySQL Database, configure the source job parameters based on Table 1.In a migration from MySQL to DWS, the constraints on the",
"product_code":"dataartsstudio",
"title":"From MySQL",
"uri":"dataartsstudio_01_1254.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"84"
},
{
"desc":"If the source link of a job is the Link to an Oracle Database, configure the source job parameters based on Table 1.When an Oracle database is the migration source, if Pa",
"product_code":"dataartsstudio",
"title":"From Oracle",
"uri":"dataartsstudio_01_1255.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"85"
},
{
"desc":"If the source link of a job is the Link to a Database Shard, configure the source job parameters based on Table 1.If the Source Link Name is the backend link of the shard",
"product_code":"dataartsstudio",
"title":"From a Database Shard",
"uri":"dataartsstudio_01_1256.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"86"
},
{
"desc":"When you migrate MongoDB or DDS data, CDM reads the first row of the collection as an example of the field list. If the first row of data does not contain all fields of t",
"product_code":"dataartsstudio",
"title":"From MongoDB/DDS",
"uri":"dataartsstudio_01_0055.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"87"
},
{
"desc":"Because DCS restricts the commands for obtaining keys, it cannot serve as the migration source but can be the migration destination. The Redis service of the third-party ",
"product_code":"dataartsstudio",
"title":"From Redis",
"uri":"dataartsstudio_01_0056.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"88"
},
{
"desc":"If the source link of a job is the Link to Kafka or Link to DMS Kafka, configure the source job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"From Kafka/DMS Kafka",
"uri":"dataartsstudio_01_0058.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"89"
},
{
"desc":"If the source link of a job is the Link to Elasticsearch/CSS, configure the source job parameters based on Table 1.On the Map Field page, you can set custom fields for th",
"product_code":"dataartsstudio",
"title":"From Elasticsearch or CSS",
"uri":"dataartsstudio_01_0059.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"90"
},
{
"desc":"If the source link of a job is the Link to CloudTable OpenTSDB, configure the source job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"From OpenTSDB",
"uri":"dataartsstudio_01_0060.html",
"doc_type":"usermanual",
"p_code":"75",
"code":"91"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Destination Job Parameters",
"uri":"dataartsstudio_01_0061.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"92"
},
{
"desc":"If the destination link of a job is the Link to OBS, configure the destination job parameters based on Table 1. Advanced attributes are optional and not displayed by defau",
"product_code":"dataartsstudio",
"title":"To OBS",
"uri":"dataartsstudio_01_0062.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"93"
},
{
"desc":"If the destination link of a job is one of those listed in Link to HDFS, configure the destination job parameters based on Table 1. HDFS supports the UTF-8 encoding only. R",
"product_code":"dataartsstudio",
"title":"To HDFS",
"uri":"dataartsstudio_01_0063.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"94"
},
{
"desc":"If the destination link of a job is one of those listed in Link to HBase or Link to CloudTable, configure the destination job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"To HBase/CloudTable",
"uri":"dataartsstudio_01_0064.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"95"
},
{
"desc":"If the destination link of a job is the Link to Hive, configure the destination job parameters based on Table 1. When Hive serves as the destination end, a table whose sto",
"product_code":"dataartsstudio",
"title":"To Hive",
"uri":"dataartsstudio_01_0066.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"96"
},
{
"desc":"Common relational databases serving as the destination include RDS for MySQL, RDS for SQL Server, and RDS for PostgreSQL. To import data to the preceding data sources, con",
"product_code":"dataartsstudio",
"title":"To a Common Relational Database",
"uri":"dataartsstudio_01_0068.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"97"
},
{
"desc":"If the destination link of a job is a DWS link, configure the destination job parameters based on Table 1. Figure 1 describes the field mapping between DWS tables created ",
"product_code":"dataartsstudio",
"title":"To DWS",
"uri":"dataartsstudio_01_1251.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"98"
},
{
"desc":"If the destination link of a job is the Link to DDS, configure the destination job parameters based on Table 1. Parameter description: Parameter, Description, Example Value. Datab",
"product_code":"dataartsstudio",
"title":"To DDS",
"uri":"dataartsstudio_01_0069.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"99"
},
{
"desc":"If the data is imported to DCS, configure the destination job parameters based on Table 1. Parameter description: Parameter, Description, Example Value. Redis Key Prefix: Key prefix",
"product_code":"dataartsstudio",
"title":"To DCS",
"uri":"dataartsstudio_01_0070.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"100"
},
{
"desc":"If the destination link of a job is the Link to Elasticsearch/CSS, that is, when data is imported to CSS, configure the destination job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"To CSS",
"uri":"dataartsstudio_01_0071.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"101"
},
{
"desc":"If the destination link of a job is the Link to DLI, configure the destination job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"To DLI",
"uri":"dataartsstudio_01_0072.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"102"
},
{
"desc":"If the destination link of a job is the Link to CloudTable OpenTSDB, configure the destination job parameters based on Table 1.",
"product_code":"dataartsstudio",
"title":"To OpenTSDB",
"uri":"dataartsstudio_01_0074.html",
"doc_type":"usermanual",
"p_code":"92",
"code":"103"
},
{
"desc":"CDM supports scheduled execution of table/file migration jobs by minute, hour, day, week, and month. This section describes how to configure scheduled job parameters. When",
"product_code":"dataartsstudio",
"title":"Scheduling Job Execution",
"uri":"dataartsstudio_01_0082.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"104"
},
{
"desc":"On the Settings tab page, you can perform the following operations: Maximum Concurrent Extractors of Jobs; Scheduled Backup and Restoration of CDM Jobs; Environment Variables ",
"product_code":"dataartsstudio",
"title":"Job Configuration Management",
"uri":"dataartsstudio_01_0083.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"105"
},
{
"desc":"Existing CDM jobs can be viewed, modified, deleted, started, and stopped. This section describes how to view and modify a job. Viewing job status: The job status can be New,",
"product_code":"dataartsstudio",
"title":"Managing a Single Job",
"uri":"dataartsstudio_01_0084.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"106"
},
{
"desc":"This section describes how to manage CDM table/file migration jobs in batches. The following operations are involved: Manage jobs by group. Run jobs in batches. Delete jobs ",
"product_code":"dataartsstudio",
"title":"Managing Jobs in Batches",
"uri":"dataartsstudio_01_0085.html",
"doc_type":"usermanual",
"p_code":"72",
"code":"107"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Auditing",
"uri":"dataartsstudio_01_0125.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"108"
},
{
"desc":"CTS provides records of operations on cloud service resources. With CTS, you can query, audit, and backtrack those operations.",
"product_code":"dataartsstudio",
"title":"Key CDM Operations Recorded by CTS",
"uri":"dataartsstudio_01_0126.html",
"doc_type":"usermanual",
"p_code":"108",
"code":"109"
},
{
"desc":"After you enable CTS, the system starts to record the CDM operations. The management console of CTS stores the traces of the latest seven days. This section describes how ",
"product_code":"dataartsstudio",
"title":"Viewing Traces",
"uri":"dataartsstudio_01_0127.html",
"doc_type":"usermanual",
"p_code":"108",
"code":"110"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Tutorials",
"uri":"dataartsstudio_01_0086.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"111"
},
{
"desc":"MRS Hive links are applicable to the MapReduce Service (MRS). This tutorial describes how to create an MRS Hive link.You have created a CDM cluster.You have obtained the ",
"product_code":"dataartsstudio",
"title":"Creating an MRS Hive Link",
"uri":"dataartsstudio_01_0130.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"112"
},
{
"desc":"MySQL links are applicable to third-party cloud MySQL services and MySQL created in a local data center or ECS. This tutorial describes how to create a MySQL link.You hav",
"product_code":"dataartsstudio",
"title":"Creating a MySQL Link",
"uri":"dataartsstudio_01_0131.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"113"
},
{
"desc":"MRS provides enterprise-level big data clusters on the cloud. It contains HDFS, Hive, and Spark components and is applicable to massive data analysis of enterprises.Hive ",
"product_code":"dataartsstudio",
"title":"Migrating Data from MySQL to MRS Hive",
"uri":"dataartsstudio_01_0092.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"114"
},
{
"desc":"CDM supports table-to-OBS data migration. This section describes how to migrate tables from a MySQL database to OBS. The process is as follows: Creating a CDM Cluster and ",
"product_code":"dataartsstudio",
"title":"Migrating Data from MySQL to OBS",
"uri":"dataartsstudio_01_0100.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"115"
},
{
"desc":"CDM supports table-to-table data migration. This section describes how to migrate data from MySQL to DWS. The process is as follows: Creating a CDM Cluster and Binding an ",
"product_code":"dataartsstudio",
"title":"Migrating Data from MySQL to DWS",
"uri":"dataartsstudio_01_0101.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"116"
},
{
"desc":"This section describes how to migrate the entire on-premises MySQL database to RDS using CDM's entire DB migration function. Currently, CDM can migrate the entire on-p",
"product_code":"dataartsstudio",
"title":"Migrating an Entire MySQL Database to RDS",
"uri":"dataartsstudio_01_0098.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"117"
},
{
"desc":"Cloud Search Service provides users with structured and unstructured data search, statistics, and report capabilities. This section describes how to use CDM to migrate da",
"product_code":"dataartsstudio",
"title":"Migrating Data from Oracle to CSS",
"uri":"dataartsstudio_01_0091.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"118"
},
{
"desc":"CDM supports table-to-table migration. This section describes how to use CDM to migrate data from Oracle to Data Warehouse Service (DWS). The procedure is as follows: Crea",
"product_code":"dataartsstudio",
"title":"Migrating Data from Oracle to DWS",
"uri":"dataartsstudio_01_0133.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"119"
},
{
"desc":"CDM supports data migration between cloud services. This section describes how to use CDM to migrate data from OBS to CSS. The procedure is as follows: Creating a CDM Clus",
"product_code":"dataartsstudio",
"title":"Migrating Data from OBS to CSS",
"uri":"dataartsstudio_01_0088.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"120"
},
{
"desc":"DLI is a fully hosted big data query service. This section describes how to use CDM to migrate data from OBS to DLI. The procedure includes four steps: Creating a CDM Clus",
"product_code":"dataartsstudio",
"title":"Migrating Data from OBS to DLI",
"uri":"dataartsstudio_01_0089.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"121"
},
{
"desc":"CDM supports file-to-file data migration. This section describes how to migrate data from MRS HDFS to OBS. The process is as follows: Creating a CDM Cluster and Binding an",
"product_code":"dataartsstudio",
"title":"Migrating Data from MRS HDFS to OBS",
"uri":"dataartsstudio_01_0103.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"122"
},
{
"desc":"CSS provides users with structured and unstructured data search, statistics, and report capabilities. This section describes how to use CDM to migrate the entire Elastics",
"product_code":"dataartsstudio",
"title":"Migrating the Entire Elasticsearch Database to CSS",
"uri":"dataartsstudio_01_0099.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"123"
},
{
"desc":"CDM allows you to migrate data from DDS to other data sources. This section describes how to use CDM to migrate data from DDS to DWS. The procedure includes four steps: Cr",
"product_code":"dataartsstudio",
"title":"Migrating Data from DDS to DWS",
"uri":"dataartsstudio_01_0087.html",
"doc_type":"usermanual",
"p_code":"111",
"code":"124"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Advanced Operations",
"uri":"dataartsstudio_01_0110.html",
"doc_type":"usermanual",
"p_code":"33",
"code":"125"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Incremental Migration",
"uri":"dataartsstudio_01_0111.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"126"
},
{
"desc":"CDM supports incremental migration of file systems. After full migration is complete, all new files or only specified directories or files can be exported. Currently, CDM ",
"product_code":"dataartsstudio",
"title":"Incremental File Migration",
"uri":"dataartsstudio_01_0112.html",
"doc_type":"usermanual",
"p_code":"126",
"code":"127"
},
{
"desc":"CDM supports incremental migration of relational databases. After a full migration is complete, data in a specified period can be incrementally migrated. For example, dat",
"product_code":"dataartsstudio",
"title":"Incremental Migration of Relational Databases",
"uri":"dataartsstudio_01_0113.html",
"doc_type":"usermanual",
"p_code":"126",
"code":"128"
},
{
"desc":"During the creation of table/file migration jobs, CDM supports the macro variables of date and time in the following parameters of the source and destination links: Source",
"product_code":"dataartsstudio",
"title":"Using Macro Variables of Date and Time",
"uri":"dataartsstudio_01_0114.html",
"doc_type":"usermanual",
"p_code":"126",
"code":"129"
},
{
"desc":"You can use CDM to export data in a specified period of time from HBase (including MRS HBase, FusionInsight HBase, and Apache HBase) and CloudTable. The CDM scheduled job",
"product_code":"dataartsstudio",
"title":"HBase/CloudTable Incremental Migration",
"uri":"dataartsstudio_01_0115.html",
"doc_type":"usermanual",
"p_code":"126",
"code":"130"
},
{
"desc":"When a CDM job fails to be executed, CDM rolls back the data to the state before the job starts and automatically deletes data from the destination table. Parameter positi",
"product_code":"dataartsstudio",
"title":"Migration in Transaction Mode",
"uri":"dataartsstudio_01_0116.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"131"
},
{
"desc":"When you migrate files to a file system, CDM can encrypt and decrypt those files. Currently, CDM supports the following encryption modes: AES-256-GCM; KMS Encryption. AES-256-",
"product_code":"dataartsstudio",
"title":"Encryption and Decryption During File Migration",
"uri":"dataartsstudio_01_0117.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"132"
},
{
"desc":"CDM extracts data from the migration source and writes the data to the migration destination. Figure 1 shows the migration mode when files are migrated to OBS. During the ",
"product_code":"dataartsstudio",
"title":"MD5 Verification",
"uri":"dataartsstudio_01_0118.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"133"
},
{
"desc":"You can create a field converter on the Map Field page when creating a table/file migration job. Creating a field converter: Field mapping is not involved when the binary fo",
"product_code":"dataartsstudio",
"title":"Field Conversion",
"uri":"dataartsstudio_01_0104.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"134"
},
{
"desc":"You can migrate files (a maximum of 50) with specified names from FTP, SFTP, or OBS at a time. The exported files can only be written to the same directory on the migrati",
"product_code":"dataartsstudio",
"title":"Migrating Files with Specified Names",
"uri":"dataartsstudio_01_0105.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"135"
},
{
"desc":"During table/file migration, CDM uses delimiters to separate fields in CSV files. However, delimiters cannot be used in complex semi-structured data because the field val",
"product_code":"dataartsstudio",
"title":"Regular Expressions for Separating Semi-structured Text",
"uri":"dataartsstudio_01_0106.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"136"
},
{
"desc":"When you create a job on the CDM console to migrate tables or files of a relational database, you can add a field to record the time when they were written to the databas",
"product_code":"dataartsstudio",
"title":"Recording the Time When Data Is Written to the Database",
"uri":"dataartsstudio_01_0109.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"137"
},
{
"desc":"When creating a CDM job, you need to specify File Format in the job parameters of the migration source and destination in some scenarios. This section describes the appli",
"product_code":"dataartsstudio",
"title":"File Formats",
"uri":"dataartsstudio_01_0108.html",
"doc_type":"usermanual",
"p_code":"125",
"code":"138"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"DataArts Factory",
"uri":"dataartsstudio_01_0400.html",
"doc_type":"usermanual",
"p_code":"23",
"code":"139"
},
{
"desc":"DataArts Factory is a one-stop big data collaborative development platform that provides fully managed big data scheduling capabilities. It manages various big data servi",
"product_code":"dataartsstudio",
"title":"Overview",
"uri":"dataartsstudio_01_0401.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"140"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Data Management",
"uri":"dataartsstudio_01_0403.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"141"
},
{
"desc":"The data management function helps you quickly establish data models and provides you with data entities for script and job development. With data management, you can: Man",
"product_code":"dataartsstudio",
"title":"Data Management Process",
"uri":"dataartsstudio_01_0402.html",
"doc_type":"usermanual",
"p_code":"141",
"code":"142"
},
{
"desc":"After a data connection is created, you can perform data operations on DataArts Factory, for example, managing databases, namespaces, database schemas, and tables. With one",
"product_code":"dataartsstudio",
"title":"Creating a Data Connection",
"uri":"dataartsstudio_01_0404.html",
"doc_type":"usermanual",
"p_code":"141",
"code":"143"
},
{
"desc":"After creating a data connection, you can create a database on the console or using a SQL script. (Recommended) Console: You can directly create a database on the DataArts",
"product_code":"dataartsstudio",
"title":"Creating a Database",
"uri":"dataartsstudio_01_0405.html",
"doc_type":"usermanual",
"p_code":"141",
"code":"144"
},
{
"desc":"After creating a DWS data connection, you can manage the database schemas under the DWS data connection.A DWS data connection has been created. For details, see Creating ",
"product_code":"dataartsstudio",
"title":"(Optional) Creating a Database Schema",
"uri":"dataartsstudio_01_0412.html",
"doc_type":"usermanual",
"p_code":"141",
"code":"145"
},
{
"desc":"You can create a table on the DataArts Factory console, in DDL mode, or using a SQL script. (Recommended) Console: You can directly create a table on the DataArts Studio D",
"product_code":"dataartsstudio",
"title":"Creating a Table",
"uri":"dataartsstudio_01_0416.html",
"doc_type":"usermanual",
"p_code":"141",
"code":"146"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Script Development",
"uri":"dataartsstudio_01_0421.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"147"
},
{
"desc":"The script development function provides the following capabilities: Provides an online script editor for developing and debugging SQL, Python, and Shell scripts. Supports ",
"product_code":"dataartsstudio",
"title":"Script Development Process",
"uri":"dataartsstudio_01_0422.html",
"doc_type":"usermanual",
"p_code":"147",
"code":"148"
},
{
"desc":"DataArts Factory allows you to edit, debug, and run scripts online. You must create a script before developing it. Currently, you can create the following types of scripts",
"product_code":"dataartsstudio",
"title":"Creating a Script",
"uri":"dataartsstudio_01_0423.html",
"doc_type":"usermanual",
"p_code":"147",
"code":"149"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Developing Scripts",
"uri":"dataartsstudio_01_0406.html",
"doc_type":"usermanual",
"p_code":"147",
"code":"150"
},
{
"desc":"You can develop, debug, and run SQL scripts online. The developed scripts can be run in jobs. For details, see Developing a Job. A corresponding cloud service has been ena",
"product_code":"dataartsstudio",
"title":"Developing an SQL Script",
"uri":"dataartsstudio_01_0424.html",
"doc_type":"usermanual",
"p_code":"150",
"code":"151"
},
{
"desc":"You can develop, debug, and run shell scripts online. The developed scripts can be run in jobs. For details, see Developing a Job. A shell script has been added. For detai",
"product_code":"dataartsstudio",
"title":"Developing a Shell Script",
"uri":"dataartsstudio_01_0425.html",
"doc_type":"usermanual",
"p_code":"150",
"code":"152"
},
{
"desc":"You can develop, debug, and run Python scripts online. The developed scripts can be run in jobs. For details, see Developing a Job. A Python script has been added. For det",
"product_code":"dataartsstudio",
"title":"Developing a Python Script",
"uri":"dataartsstudio_01_4503.html",
"doc_type":"usermanual",
"p_code":"150",
"code":"153"
},
{
"desc":"This involves the version management and lock functions. Version management: traces script and job changes, and supports version comparison and rollback. The system retain",
"product_code":"dataartsstudio",
"title":"Submitting a Version and Unlocking the Script",
"uri":"dataartsstudio_01_0901.html",
"doc_type":"usermanual",
"p_code":"147",
"code":"154"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"(Optional) Managing Scripts",
"uri":"dataartsstudio_01_0407.html",
"doc_type":"usermanual",
"p_code":"147",
"code":"155"
},
{
"desc":"This section describes how to copy a script. A script has been developed. For details about how to develop scripts, see Developing Scripts. Log in to the DataArts Studio co",
"product_code":"dataartsstudio",
"title":"Copying a Script",
"uri":"dataartsstudio_01_0430.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"156"
},
{
"desc":"You can copy the name of a script and rename a script. A script has been developed. For details about how to develop scripts, see Developing Scripts. Log in to the DataArts",
"product_code":"dataartsstudio",
"title":"Copying the Script Name and Renaming a Script",
"uri":"dataartsstudio_01_0426.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"157"
},
{
"desc":"You can move a script file from one directory to another or move a script directory to another directory. A script has been developed. For details about how to develop scr",
"product_code":"dataartsstudio",
"title":"Moving a Script or Script Directory",
"uri":"dataartsstudio_01_0427.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"158"
},
{
"desc":"You can export one or more script files from the script directory. The exported files store the latest content in the development state. Click in the script directory and",
"product_code":"dataartsstudio",
"title":"Exporting and Importing a Script",
"uri":"dataartsstudio_01_0428.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"159"
},
{
"desc":"This section describes how to view the references of a script or all the scripts in a folder. A script has been developed. For details about how to develop scripts, see De",
"product_code":"dataartsstudio",
"title":"Viewing Script References",
"uri":"dataartsstudio_01_0471.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"160"
},
{
"desc":"If you do not need to use a script any more, perform the following operations to delete it. When you delete a script, the system checks whether the script is being referen",
"product_code":"dataartsstudio",
"title":"Deleting a Script",
"uri":"dataartsstudio_01_0429.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"161"
},
{
"desc":"DataArts Factory allows you to change the owner for scripts with a few clicks. Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed",
"product_code":"dataartsstudio",
"title":"Changing the Script Owner",
"uri":"dataartsstudio_01_1102.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"162"
},
{
"desc":"This section describes how to unlock scripts in batches. Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed page, locate a worksp",
"product_code":"dataartsstudio",
"title":"Unlocking Scripts",
"uri":"dataartsstudio_01_1107.html",
"doc_type":"usermanual",
"p_code":"155",
"code":"163"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Job Development",
"uri":"dataartsstudio_01_0431.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"164"
},
{
"desc":"The job development function provides the following capabilities: Provides a graphical designer that allows you to quickly build a data processing workflow by drag-and-dro",
"product_code":"dataartsstudio",
"title":"Job Development Process",
"uri":"dataartsstudio_01_0432.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"165"
},
{
"desc":"A job is composed of one or more nodes that are performed collaboratively to complete data operations. Before developing a job, create a new one. Each workspace can hold a",
"product_code":"dataartsstudio",
"title":"Creating a Job",
"uri":"dataartsstudio_01_0434.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"166"
},
{
"desc":"This section describes how to develop and configure a job. You have created a job. For details about how to create a job, see Creating a Job. You have locked the job. Other",
"product_code":"dataartsstudio",
"title":"Developing a Job",
"uri":"dataartsstudio_01_0435.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"167"
},
{
"desc":"This section describes how to set up scheduling for an orchestrated job. If the processing mode of a job is batch processing, configure scheduling types for jobs. Three sc",
"product_code":"dataartsstudio",
"title":"Setting Up Scheduling for a Job",
"uri":"dataartsstudio_01_0470.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"168"
},
{
"desc":"This involves the version management and lock functions. Version management: traces script and job changes, and supports version comparison and rollback. The system retain",
"product_code":"dataartsstudio",
"title":"Submitting a Version and Unlocking the Script",
"uri":"dataartsstudio_01_0902.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"169"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"(Optional) Managing Jobs",
"uri":"dataartsstudio_01_0408.html",
"doc_type":"usermanual",
"p_code":"164",
"code":"170"
},
{
"desc":"This section describes how to copy a job. A job has been developed. For details about how to develop a job, see Developing a Job. Log in to the DataArts Studio console. Loc",
"product_code":"dataartsstudio",
"title":"Copying a Job",
"uri":"dataartsstudio_01_0440.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"171"
},
{
"desc":"You can copy the name of a job and rename a job. A job has been developed. For details about how to develop a job, see Developing a Job. Log in to the DataArts Studio conso",
"product_code":"dataartsstudio",
"title":"Copying the Job Name and Renaming a Job",
"uri":"dataartsstudio_01_0436.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"172"
},
{
"desc":"You can move a job file from one directory to another or move a job directory to another directory. A job has been developed. For details about how to develop a job, see D",
"product_code":"dataartsstudio",
"title":"Moving a Job or Job Directory",
"uri":"dataartsstudio_01_0437.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"173"
},
{
"desc":"Exporting a job exports the latest saved content in the development state. After a job is imported, the content in the development state is overwritten and a new vers",
"product_code":"dataartsstudio",
"title":"Exporting and Importing a Job",
"uri":"dataartsstudio_01_0438.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"174"
},
{
"desc":"If you do not need to use a job any more, perform the following operations to delete it to reduce the quota usage of the job. Deleted jobs cannot be recovered. Exercise ca",
"product_code":"dataartsstudio",
"title":"Deleting a Job",
"uri":"dataartsstudio_01_0439.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"175"
},
{
"desc":"DataArts Factory allows you to change the owner for jobs with a few clicks. Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed pa",
"product_code":"dataartsstudio",
"title":"Changing the Job Owner",
"uri":"dataartsstudio_01_1101.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"176"
},
{
"desc":"This section describes how to unlock jobs in batches. Log in to the DataArts Studio console. Locate an instance and click Access. On the displayed page, locate a workspace",
"product_code":"dataartsstudio",
"title":"Unlocking Jobs",
"uri":"dataartsstudio_01_1108.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"177"
},
{
"desc":"The solution aims to provide users with convenient and systematic management operations and better meet service requirements and objectives. Each solution can contain one",
"product_code":"dataartsstudio",
"title":"Solution",
"uri":"dataartsstudio_01_0503.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"178"
},
{
"desc":"This section describes how to view the execution history of scripts, jobs, and nodes over a week. This function depends on OBS buckets. For details about how to configure ",
"product_code":"dataartsstudio",
"title":"Execution History",
"uri":"dataartsstudio_01_1105.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"179"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"O&M and Scheduling",
"uri":"dataartsstudio_01_0505.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"180"
},
{
"desc":"Choose Monitoring > Overview. On the Overview page, you can view the statistics of job instances in charts. Currently, you can view four types of statistics: Today's Job I",
"product_code":"dataartsstudio",
"title":"Overview",
"uri":"dataartsstudio_01_0506.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"181"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Monitoring a Job",
"uri":"dataartsstudio_01_0413.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"182"
},
{
"desc":"In the batch processing mode, data is processed periodically in batches based on the job-level scheduling plan, which is used in scenarios with low real-time requirements",
"product_code":"dataartsstudio",
"title":"Monitoring a Batch Job",
"uri":"dataartsstudio_01_0508.html",
"doc_type":"usermanual",
"p_code":"182",
"code":"183"
},
{
"desc":"In the real-time processing mode, data is processed in real time, which is used in scenarios with high real-time performance. This type of job is a pipeline that consists",
"product_code":"dataartsstudio",
"title":"Monitoring a Real-Time Job",
"uri":"dataartsstudio_01_0509.html",
"doc_type":"usermanual",
"p_code":"182",
"code":"184"
},
{
"desc":"Each time a job is executed, a job instance record is generated. In the navigation pane of the DataArts Factory console, choose Monitoring. On the Monitor Instance page, ",
"product_code":"dataartsstudio",
"title":"Monitoring an Instance",
"uri":"dataartsstudio_01_0511.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"185"
},
{
"desc":"In the navigation tree of the DataArts Factory console, choose Monitoring > Monitor PatchData. On the PatchData Monitoring page, you can view the task status, service date, n",
"product_code":"dataartsstudio",
"title":"Monitoring PatchData",
"uri":"dataartsstudio_01_0512.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"186"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Managing Notifications",
"uri":"dataartsstudio_01_0414.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"187"
},
{
"desc":"You can configure DLF to notify you of job success after the job is performed. Before configuring a notification for a job: Message notification has been enabled and a topic has",
"product_code":"dataartsstudio",
"title":"Managing a Notification",
"uri":"dataartsstudio_01_0514.html",
"doc_type":"usermanual",
"p_code":"187",
"code":"188"
},
{
"desc":"Notifications can be sent to specified personnel by day, week, or month, allowing related personnel to regularly understand job scheduling information about the quantity o",
"product_code":"dataartsstudio",
"title":"Cycle Overview",
"uri":"dataartsstudio_01_0515.html",
"doc_type":"usermanual",
"p_code":"187",
"code":"189"
},
{
"desc":"You can back up all jobs, scripts, resources, and environment variables on a daily basis. You can also restore assets that have been backed up, including jobs, scripts, re",
"product_code":"dataartsstudio",
"title":"Managing Backups",
"uri":"dataartsstudio_01_0516.html",
"doc_type":"usermanual",
"p_code":"180",
"code":"190"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Configuration and Management",
"uri":"dataartsstudio_01_0517.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"191"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Configuring Resources",
"uri":"dataartsstudio_01_0510.html",
"doc_type":"usermanual",
"p_code":"191",
"code":"192"
},
{
"desc":"This topic describes how to configure and use environment variables. Configure job parameters. If a parameter belongs to multiple jobs, you can extract this parameter as a",
"product_code":"dataartsstudio",
"title":"Configuring Environment Variables",
"uri":"dataartsstudio_01_0504.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"193"
},
{
"desc":"The execution history of scripts, jobs, and nodes is stored in OBS buckets. If no OBS bucket is available, you cannot view the execution history. This section describes h",
"product_code":"dataartsstudio",
"title":"Configuring an OBS Bucket",
"uri":"dataartsstudio_01_1106.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"194"
},
{
"desc":"Job labels are used to label jobs of the same or similar purposes to facilitate job management and query. This section describes how to manage job labels, including addin",
"product_code":"dataartsstudio",
"title":"Managing Job Labels",
"uri":"dataartsstudio_01_0532.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"195"
},
{
"desc":"The following problems may occur during job execution in DataArts Factory: The job execution mechanism of the DataArts Factory module is to execute the job as the user who",
"product_code":"dataartsstudio",
"title":"Configuring Agencies",
"uri":"dataartsstudio_01_0555.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"196"
},
{
"desc":"This section describes how to configure a default item. If a parameter is invoked by multiple jobs, you can use this parameter as the default configuration item. In this w",
"product_code":"dataartsstudio",
"title":"Configuring a Default Item",
"uri":"dataartsstudio_01_04501.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"197"
},
{
"desc":"You can upload custom code or text files as resources on Manage Resource and schedule them when running nodes. Nodes that can invoke resources include DLI Spark, MRS Spar",
"product_code":"dataartsstudio",
"title":"Managing Resources",
"uri":"dataartsstudio_01_0519.html",
"doc_type":"usermanual",
"p_code":"191",
"code":"198"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Node Reference",
"uri":"dataartsstudio_01_0441.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"199"
},
{
"desc":"A node defines the operations performed on data. DataArts Factory provides nodes used for data integration, computing and analysis, database operations, and resource mana",
"product_code":"dataartsstudio",
"title":"Node Overview",
"uri":"dataartsstudio_01_0442.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"200"
},
{
"desc":"The CDM Job node is used to run a predefined CDM job for data migration. Table 1, Table 2, and Table 3 describe the parameters of the CDM Job node. Configure the lineage t",
"product_code":"dataartsstudio",
"title":"CDM Job",
"uri":"dataartsstudio_01_0443.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"201"
},
{
"desc":"The Rest Client node is used to respond to RESTful requests. Only the RESTful requests that have been authenticated by using IAM tokens are supported. If some APIs of ",
"product_code":"dataartsstudio",
"title":"Rest Client",
"uri":"dataartsstudio_01_0447.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"202"
},
{
"desc":"The Import GES node is used to import files from an OBS bucket to a GES graph. Table 1 and Table 2 describe the parameters of the Import GES node.",
"product_code":"dataartsstudio",
"title":"Import GES",
"uri":"dataartsstudio_01_0448.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"203"
},
{
"desc":"The MRS Kafka node is used to query the number of messages that are not consumed by a topic. Table 1 and Table 2 describe the parameters of the MRS Kafka node.",
"product_code":"dataartsstudio",
"title":"MRS Kafka",
"uri":"dataartsstudio_01_0537.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"204"
},
{
"desc":"The Kafka Client node is used to send data to Kafka topics. Table 1 describes the parameters of the Kafka Client node.",
"product_code":"dataartsstudio",
"title":"Kafka Client",
"uri":"dataartsstudio_01_0538.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"205"
},
{
"desc":"The ROMA FDI Job node executes a predefined ROMA Connect data integration task to implement data integration and conversion between the source and destination. This node e",
"product_code":"dataartsstudio",
"title":"ROMA FDI Job",
"uri":"dataartsstudio_01_1098.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"206"
},
{
"desc":"The DLI Flink Job node is used to execute a predefined DLI job for real-time analysis of streaming data. This node enables you to start a DLI job or query whether a DLI jo",
"product_code":"dataartsstudio",
"title":"DLI Flink Job",
"uri":"dataartsstudio_01_0536.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"207"
},
{
"desc":"The DLI SQL node is used to transfer SQL statements to DLI for data source analysis and exploration. This node enables you to execute DLI statements during periodical or r",
"product_code":"dataartsstudio",
"title":"DLI SQL",
"uri":"dataartsstudio_01_0450.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"208"
},
{
"desc":"The DLI Spark node is used to execute a predefined Spark job. Table 1, Table 2, and Table 3 describe the parameters of the DLI Spark node.",
"product_code":"dataartsstudio",
"title":"DLI Spark",
"uri":"dataartsstudio_01_0451.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"209"
},
{
"desc":"The DWS SQL node is used to transfer SQL statements to DWS. For details about how to use the DWS SQL operator, see Developing a DWS SQL Job. This node enables you to execut",
"product_code":"dataartsstudio",
"title":"DWS SQL",
"uri":"dataartsstudio_01_0452.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"210"
},
{
"desc":"The MRS Spark SQL node is used to execute a predefined SparkSQL statement on MRS. Table 1, Table 2, and Table 3 describe the parameters of the MRS Spark SQL node.",
"product_code":"dataartsstudio",
"title":"MRS Spark SQL",
"uri":"dataartsstudio_01_0453.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"211"
},
{
"desc":"The MRS Hive SQL node is used to execute a predefined Hive SQL script on DLF. Table 1, Table 2, and Table 3 describe the parameters of the MRS Hive SQL node.",
"product_code":"dataartsstudio",
"title":"MRS Hive SQL",
"uri":"dataartsstudio_01_0454.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"212"
},
{
"desc":"The MRS Presto SQL node is used to execute the Presto SQL script predefined in DataArts Factory. Table 1, Table 2, and Table 3 describe the parameters of the MRS Presto SQ",
"product_code":"dataartsstudio",
"title":"MRS Presto SQL",
"uri":"dataartsstudio_01_1099.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"213"
},
{
"desc":"The MRS Spark node is used to execute a predefined Spark job on MRS. Table 1, Table 2, and Table 3 describe the parameters of the MRS Spark node.",
"product_code":"dataartsstudio",
"title":"MRS Spark",
"uri":"dataartsstudio_01_0455.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"214"
},
{
"desc":"The MRS Spark Python node is used to execute a predefined Spark Python job on MRS. For details about how to use the MRS Spark Python operator, see Developing an MRS Spark ",
"product_code":"dataartsstudio",
"title":"MRS Spark Python",
"uri":"dataartsstudio_01_0456.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"215"
},
{
"desc":"The MRS Flink node is used to execute predefined Flink jobs in MRS. Table 1 and Table 2 describe the parameters of the MRS Flink node.",
"product_code":"dataartsstudio",
"title":"MRS Flink Job",
"uri":"dataartsstudio_01_0554.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"216"
},
{
"desc":"The MRS MapReduce node is used to execute a predefined MapReduce program on MRS. Table 1 and Table 2 describe the parameters of the MRS MapReduce node.",
"product_code":"dataartsstudio",
"title":"MRS MapReduce",
"uri":"dataartsstudio_01_0457.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"217"
},
{
"desc":"The CSS node is used to process CSS requests and enable online distributed searching. Table 1 and Table 2 describe the parameters of the CSS node.",
"product_code":"dataartsstudio",
"title":"CSS",
"uri":"dataartsstudio_01_0458.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"218"
},
{
"desc":"The Shell node is used to execute a shell script. With EL expression #{Job.getNodeOutput()}, you can obtain the desired content (4000 characters at most and counted backwa",
"product_code":"dataartsstudio",
"title":"Shell",
"uri":"dataartsstudio_01_0459.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"219"
},
{
"desc":"The RDS SQL node is used to transfer SQL statements to RDS. Table 1 and Table 2 describe the parameters of the RDS SQL node.",
"product_code":"dataartsstudio",
"title":"RDS SQL",
"uri":"dataartsstudio_01_0460.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"220"
},
{
"desc":"The ETL Job node is used to extract data from a specified data source, preprocess the data, and import the data to the target data source. Table 1, Table 2, and Table 3 de",
"product_code":"dataartsstudio",
"title":"ETL Job",
"uri":"dataartsstudio_01_0461.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"221"
},
{
"desc":"The Python node is used to execute Python statements. Before using a Python node, ensure that the host connected to the node has an environment for executing Python script",
"product_code":"dataartsstudio",
"title":"Python",
"uri":"dataartsstudio_01_4504.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"222"
},
{
"desc":"This function depends on OBS. The Create OBS node is used to create buckets and directories on OBS. Table 1 and Table 2 describe the parameters of the Create OBS node.",
"product_code":"dataartsstudio",
"title":"Create OBS",
"uri":"dataartsstudio_01_0462.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"223"
},
{
"desc":"This function depends on OBS. The Delete OBS node is used to delete a bucket or directory on OBS. Table 1 and Table 2 describe the parameters of the Delete OBS node.",
"product_code":"dataartsstudio",
"title":"Delete OBS",
"uri":"dataartsstudio_01_0463.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"224"
},
{
"desc":"This function depends on OBS. The OBS Manager node is used to move or copy files from an OBS bucket to a specified directory. Table 1, Table 2, and Table 3 describe the par",
"product_code":"dataartsstudio",
"title":"OBS Manager",
"uri":"dataartsstudio_01_0464.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"225"
},
{
"desc":"You can use the Open/Close Resource node to enable or disable services as required. Table 1 and Table 2 describe the parameters of the Open/Close Resource node.",
"product_code":"dataartsstudio",
"title":"Open/Close Resource",
"uri":"dataartsstudio_01_0465.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"226"
},
{
"desc":"The Data Quality Monitor node is used to monitor the quality of running data. Table 1 and Table 2 describe the parameters of the Data Quality Monitor node.",
"product_code":"dataartsstudio",
"title":"Data Quality Monitor",
"uri":"dataartsstudio_01_0472.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"227"
},
{
"desc":"The Subjob node is used to call a batch job that does not contain a Subjob node. Table 1 and Table 2 describe the parameters of the Subjob node.",
"product_code":"dataartsstudio",
"title":"Subjob",
"uri":"dataartsstudio_01_0467.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"228"
},
{
"desc":"The For Each node specifies a subjob to be executed cyclically and assigns values to variables in a subjob with a dataset. Table 1 describes the parameters of the For Each",
"product_code":"dataartsstudio",
"title":"For Each",
"uri":"dataartsstudio_01_0535.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"229"
},
{
"desc":"The SMN node is used to send notifications to users. Table 1 and Table 2 describe the parameters of the SMN node.",
"product_code":"dataartsstudio",
"title":"SMN",
"uri":"dataartsstudio_01_0468.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"230"
},
{
"desc":"The Dummy node is empty and does not perform any operations. It is used to simplify the complex connection relationships of nodes. Figure 1 shows an example. Table 1 descr",
"product_code":"dataartsstudio",
"title":"Dummy",
"uri":"dataartsstudio_01_0469.html",
"doc_type":"usermanual",
"p_code":"199",
"code":"231"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"EL Expression Reference",
"uri":"dataartsstudio_01_0493.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"232"
},
{
"desc":"Node parameter values in a DataArts Factory job can be dynamically generated based on the running environment by using Expression Language (EL). You can determine whether",
"product_code":"dataartsstudio",
"title":"Expression Overview",
"uri":"dataartsstudio_01_0494.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"233"
},
{
"desc":"EL supports most of the arithmetic and logic operators provided by Java. If variable a is empty, default is returned. If variable a is not empty, a itself is returned. The",
"product_code":"dataartsstudio",
"title":"Basic Operators",
"uri":"dataartsstudio_01_0495.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"234"
},
{
"desc":"The date and time in the EL expression can be displayed in a user-specified format. The date and time format is specified by the date and time mode character string. The ",
"product_code":"dataartsstudio",
"title":"Date and Time Mode",
"uri":"dataartsstudio_01_0496.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"235"
},
{
"desc":"An Env embedded object provides a method of obtaining an environment variable value. The EL expression used to obtain the value of environment variable test is as follows:",
"product_code":"dataartsstudio",
"title":"Env Embedded Objects",
"uri":"dataartsstudio_01_0497.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"236"
},
{
"desc":"A job object provides properties and methods of obtaining the output message, job scheduling plan time, and job execution time of the previous node in a job. The expressio",
"product_code":"dataartsstudio",
"title":"Job Embedded Objects",
"uri":"dataartsstudio_01_0498.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"237"
},
{
"desc":"A StringUtil embedded object provides methods of operating character strings, for example, truncating a substring from a character string. StringUtil is implemented throug",
"product_code":"dataartsstudio",
"title":"StringUtil Embedded Objects",
"uri":"dataartsstudio_01_0499.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"238"
},
{
"desc":"A DateUtil embedded object provides methods of formatting time and calculating time. The previous day of the job scheduling plan time is used as the subdirectory name to g",
"product_code":"dataartsstudio",
"title":"DateUtil Embedded Objects",
"uri":"dataartsstudio_01_0500.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"239"
},
{
"desc":"A JSONUtil embedded object provides JSON object methods. The content of variable str is as follows: The expression for obtaining the area code of city1 is as follows:",
"product_code":"dataartsstudio",
"title":"JSONUtil Embedded Objects",
"uri":"dataartsstudio_01_0501.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"240"
},
{
"desc":"You can use Loop embedded objects to obtain data from the For Each dataset. The EL expression for the Foreach operator to cyclically obtain the first column of the output ",
"product_code":"dataartsstudio",
"title":"Loop Embedded Objects",
"uri":"dataartsstudio_01_0534.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"241"
},
{
"desc":"The OBSUtil embedded objects provide a series of OBS operation methods, for example, checking whether an OBS file or directory exists. The following is the EL expression f",
"product_code":"dataartsstudio",
"title":"OBSUtil Embedded Objects",
"uri":"dataartsstudio_01_0553.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"242"
},
{
"desc":"With this example, you can understand how to use EL expressions in the following applications: using variables in the SQL script of DataArts Factory; transferring parameters",
"product_code":"dataartsstudio",
"title":"Expression Use Example",
"uri":"dataartsstudio_01_0502.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"243"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Usage Guidance",
"uri":"dataartsstudio_01_0520.html",
"doc_type":"usermanual",
"p_code":"139",
"code":"244"
},
{
"desc":"You can set a job that meets the scheduling period conditions as a dependency job for a job that is scheduled periodically. For details about how to set a dependency j",
"product_code":"dataartsstudio",
"title":"Job Dependency",
"uri":"dataartsstudio_01_0580.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"245"
},
{
"desc":"When developing and orchestrating jobs in DataArts Factory, you can use IF statements to determine the branch to execute. This section describes how to use IF statements i",
"product_code":"dataartsstudio",
"title":"IF Statements",
"uri":"dataartsstudio_01_0583.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"246"
},
{
"desc":"The Rest Client node can execute RESTful requests. This tutorial describes how to obtain the return value of the Rest Client node, covering the following two application s",
"product_code":"dataartsstudio",
"title":"Obtaining the Return Value of a Rest Client Node",
"uri":"dataartsstudio_01_0581.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"247"
},
{
"desc":"During job development, if some jobs have different parameters but the same processing logic, you can use For Each nodes to avoid repeated job development. You can use a F",
"product_code":"dataartsstudio",
"title":"Using For Each Nodes",
"uri":"dataartsstudio_01_0582.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"248"
},
{
"desc":"This section describes how to develop and execute a Python script using DataArts Factory. An ECS named ecs-dgc has been created. In this example, the ECS uses the CentOS 8.",
"product_code":"dataartsstudio",
"title":"Developing a Python Script",
"uri":"dataartsstudio_01_0529.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"249"
},
{
"desc":"This section describes how to use the DWS SQL operator to develop a job on DataArts Factory. This tutorial describes how to develop a DWS job to collect the sales volume o",
"product_code":"dataartsstudio",
"title":"Developing a DWS SQL Job",
"uri":"dataartsstudio_01_0524.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"250"
},
{
"desc":"This section introduces how to develop Hive SQL scripts on DataArts Factory. As a one-stop big data development platform, DataArts Factory supports development of multiple",
"product_code":"dataartsstudio",
"title":"Developing a Hive SQL Job",
"uri":"dataartsstudio_01_0522.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"251"
},
{
"desc":"This section introduces how to develop a DLI Spark job on DataArts Factory. In most cases, SQL is used to analyze and process data when using Data Lake Insight (DLI). Howe",
"product_code":"dataartsstudio",
"title":"Developing a DLI Spark Job",
"uri":"dataartsstudio_01_0521.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"252"
},
{
"desc":"This section describes how to develop an MRS Flink job on DataArts Factory. Use an MRS Flink job to count the number of words. You have the permission to access OBS paths.",
"product_code":"dataartsstudio",
"title":"Developing an MRS Flink Job",
"uri":"dataartsstudio_01_0526.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"253"
},
{
"desc":"This section describes how to develop an MRS Spark Python job on DataArts Factory. Prerequisites: You have the permission to access OBS paths. Data preparation: Prepare the script ",
"product_code":"dataartsstudio",
"title":"Developing an MRS Spark Python Job",
"uri":"dataartsstudio_01_0525.html",
"doc_type":"usermanual",
"p_code":"244",
"code":"254"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"FAQs",
"uri":"dataartsstudio_12_0005.html",
"doc_type":"usermanual",
"p_code":"",
"code":"255"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Consultation and Billing",
"uri":"dataartsstudio_03_0002.html",
"doc_type":"usermanual",
"p_code":"255",
"code":"256"
},
{
"desc":"We use a region to identify the location of a data center. You can create resources in a specific region. A region is a physical data center. Each region is completely ind",
"product_code":"dataartsstudio",
"title":"Regions",
"uri":"dataartsstudio_03_0052.html",
"doc_type":"usermanual",
"p_code":"256",
"code":"257"
},
{
"desc":"Check whether the user has been added to the workspace. If not, perform the following steps to add the user: Log in to the DataArts Studio console and access the Workspace",
"product_code":"dataartsstudio",
"title":"What Should I Do If a User Cannot View Existing Workspaces After I Have Assigned the Required Policy to the User?",
"uri":"dataartsstudio_03_0061.html",
"doc_type":"usermanual",
"p_code":"256",
"code":"258"
},
{
"desc":"After workspaces are created, they cannot be deleted. You can disable workspaces when they are no longer needed. You can enable them again when you need these workspaces.",
"product_code":"dataartsstudio",
"title":"Can I Delete DataArts Studio Workspaces?",
"uri":"dataartsstudio_03_0222.html",
"doc_type":"usermanual",
"p_code":"256",
"code":"259"
},
{
"desc":"No. The purchased or trial instance cannot be transferred to another account.",
"product_code":"dataartsstudio",
"title":"Can I Transfer a Purchased or Trial Instance to Another Account?",
"uri":"dataartsstudio_03_0131.html",
"doc_type":"usermanual",
"p_code":"256",
"code":"260"
},
{
"desc":"No. You cannot downgrade a purchased DataArts Studio instance.",
"product_code":"dataartsstudio",
"title":"Does DataArts Studio Support Version Downgrade?",
"uri":"dataartsstudio_03_0087.html",
"doc_type":"usermanual",
"p_code":"256",
"code":"261"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Management Center",
"uri":"dataartsstudio_03_0022.html",
"doc_type":"usermanual",
"p_code":"255",
"code":"262"
},
{
"desc":"For details about the data connections supported by DataArts Studio, see Data Sources Supported by DataArts Studio.",
"product_code":"dataartsstudio",
"title":"Which Data Connections Are Supported by DataArts Studio?",
"uri":"dataartsstudio_03_0008.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"263"
},
{
"desc":"When creating a DWS, MRS Hive, RDS, or SparkSQL data connection, you must bind an agent provided by the CDM cluster. Currently, a version of the CDM cluster earlier than",
"product_code":"dataartsstudio",
"title":"What Are the Precautions for Creating Data Connections?",
"uri":"dataartsstudio_03_0009.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"264"
},
{
"desc":"The possible cause is that the CDM cluster is stopped or a concurrency conflict occurs. You can switch to another agent to temporarily avoid this issue. To resolve this is",
"product_code":"dataartsstudio",
"title":"Why Do DWS/Hive/HBase Data Connections Fail to Obtain the Information About Database or Tables?",
"uri":"dataartsstudio_03_0016.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"265"
},
{
"desc":"Possible causes are as follows: Hive/HBase components were not selected during MRS cluster creation. The network between the CDM cluster and MRS cluster was disconnected wh",
"product_code":"dataartsstudio",
"title":"Why Are MRS Hive/HBase Clusters Not Displayed on the Page for Creating Data Connections?",
"uri":"dataartsstudio_03_0017.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"266"
},
{
"desc":"The failure may be caused by the rights separation function of the DWS cluster. On the DWS console, click the corresponding cluster, choose Security Settings, and disable",
"product_code":"dataartsstudio",
"title":"What Should I Do If the Connection Test Fails When I Enable the SSL Connection During the Creation of a DWS Data Connection?",
"uri":"dataartsstudio_03_0054.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"267"
},
{
"desc":"Multiple data connections of the same type or different types can be created in the same workspace, but their names must be unique.",
"product_code":"dataartsstudio",
"title":"Can I Create Multiple Data Connections in a Workspace in Proxy Mode?",
"uri":"dataartsstudio_03_0089.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"268"
},
{
"desc":"You are advised to choose a proxy connection.",
"product_code":"dataartsstudio",
"title":"Should I Choose a Direct or a Proxy Connection When Creating a DWS Connection?",
"uri":"dataartsstudio_03_0137.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"269"
},
{
"desc":"You can export the jobs in DataArts Factory and then import them to DataArts Factory in another workspace. You can export data connections on the Migrate Resources page of",
"product_code":"dataartsstudio",
"title":"How Do I Migrate the Data Development Jobs and Data Connections from One Workspace to Another?",
"uri":"dataartsstudio_03_0153.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"270"
},
{
"desc":"No, but you can change the names of workspaces.",
"product_code":"dataartsstudio",
"title":"Can I Delete Workspaces?",
"uri":"dataartsstudio_03_0154.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"271"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"DataArts Migration",
"uri":"dataartsstudio_03_0027.html",
"doc_type":"usermanual",
"p_code":"255",
"code":"272"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"General",
"uri":"dataartsstudio_03_0138.html",
"doc_type":"usermanual",
"p_code":"272",
"code":"273"
},
{
"desc":"CDM is developed based on a distributed computing framework and leverages the parallel data processing technology. Table 1 details the advantages of CDM.",
"product_code":"dataartsstudio",
"title":"What Are the Advantages of CDM?",
"uri":"dataartsstudio_03_0139.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"274"
},
{
"desc":"CDM is a fully hosted service that provides the following capabilities to protect user data security: Instance isolation: CDM users can use only their own instances. Insta",
"product_code":"dataartsstudio",
"title":"What Are the Security Protection Mechanisms of CDM?",
"uri":"dataartsstudio_03_0140.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"275"
},
{
"desc":"When migrating data over the public network, use NAT Gateway to share the EIPs with other ECSs in the subnet. In this way, data in the on-premises data center or third-",
"product_code":"dataartsstudio",
"title":"How Do I Reduce the Cost of Using CDM?",
"uri":"dataartsstudio_03_0099.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"276"
},
{
"desc":"No. To use a cluster of a later version, create a new one.",
"product_code":"dataartsstudio",
"title":"Can I Upgrade a CDM Cluster?",
"uri":"dataartsstudio_03_0302.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"277"
},
{
"desc":"Theoretically, a cdm.large CDM instance can migrate 1 TB to 8 TB of data per day. The actual transmission rate is affected by factors such as the Internet bandwidth, cluster",
"product_code":"dataartsstudio",
"title":"How Is the Migration Performance of CDM?",
"uri":"dataartsstudio_03_0141.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"278"
},
{
"desc":"Table 1 lists the number of concurrent jobs for different CDM cluster versions.You are advised to use multiple CDM clusters in the following and other scenarios as needed",
"product_code":"dataartsstudio",
"title":"What Is the Number of Concurrent Jobs for Different CDM Cluster Versions?",
"uri":"dataartsstudio_03_0124.html",
"doc_type":"usermanual",
"p_code":"273",
"code":"279"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Functions",
"uri":"dataartsstudio_03_0142.html",
"doc_type":"usermanual",
"p_code":"272",
"code":"280"
},
{
"desc":"CDM supports incremental data migration. With scheduled jobs and macro variables of date and time, CDM provides incremental data migration in the following scenarios: Incr",
"product_code":"dataartsstudio",
"title":"Does CDM Support Incremental Data Migration?",
"uri":"dataartsstudio_03_0069.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"281"
},
{
"desc":"Yes. CDM supports the following field converters: Anonymization, Trim, Reverse String, Replace String, and Expression Conversion. You can create a field converter on the Map Field page ",
"product_code":"dataartsstudio",
"title":"Does CDM Support Field Conversion?",
"uri":"dataartsstudio_03_0028.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"282"
},
{
"desc":"The recommended component versions can be used as both the source and destination.",
"product_code":"dataartsstudio",
"title":"What Component Versions Are Recommended for Migrating Hadoop Data Sources?",
"uri":"dataartsstudio_03_0107.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"283"
},
{
"desc":"CDM can read and write data in SequenceFile, TextFile, ORC, or Parquet format from the Hive data source.",
"product_code":"dataartsstudio",
"title":"What Data Formats Are Supported When the Data Source Is Hive?",
"uri":"dataartsstudio_03_0029.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"284"
},
{
"desc":"CDM does not support direct job migration across clusters. However, you can use the batch job import and export function to indirectly implement cross-cluster migration a",
"product_code":"dataartsstudio",
"title":"Can I Synchronize Jobs to Other Clusters?",
"uri":"dataartsstudio_03_0030.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"285"
},
{
"desc":"CDM supports batch job creation with the help of the batch import function. You can create jobs in batches as follows: Create a job manually. Export the job and save the jo",
"product_code":"dataartsstudio",
"title":"Can I Create Jobs in Batches?",
"uri":"dataartsstudio_03_0031.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"286"
},
{
"desc":"Yes. Access the DataArts Factory module of the DataArts Studio service. In the navigation pane of the DataArts Factory homepage, choose Data Development > Develop Job to cr",
"product_code":"dataartsstudio",
"title":"Can I Schedule Jobs in Batches?",
"uri":"dataartsstudio_03_0100.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"287"
},
{
"desc":"Yes. If you do not need to use the CDM cluster for a long time, you can stop or delete it to reduce costs. Before the deletion, you can use the batch export function of CD",
"product_code":"dataartsstudio",
"title":"How Do I Back Up CDM Jobs?",
"uri":"dataartsstudio_03_0032.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"288"
},
{
"desc":"To ensure that CDM can communicate with the HANA cluster, perform the following operations:Disable Statement Routing of the HANA cluster. Note that this will increase the",
"product_code":"dataartsstudio",
"title":"How Do I Configure the Connection If Only Some Nodes in the HANA Cluster Can Communicate with the CDM Cluster?",
"uri":"dataartsstudio_03_0119.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"289"
},
{
"desc":"CDM provides RESTful APIs to implement automatic job creation or execution control by program invocation.The following describes how to use CDM to migrate data from table",
"product_code":"dataartsstudio",
"title":"How Do I Use Java to Invoke CDM RESTful APIs to Create Data Migration Jobs?",
"uri":"dataartsstudio_03_0101.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"290"
},
{
"desc":"Many enterprises deploy key data sources on the intranet, such as databases and file servers. CDM runs on the cloud. To migrate the intranet data to the cloud using CDM, ",
"product_code":"dataartsstudio",
"title":"How Do I Connect the On-Premises Intranet or Third-Party Private Network to CDM?",
"uri":"dataartsstudio_03_0033.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"291"
},
{
"desc":"The number of concurrent extractors in a CDM migration job is related to the cluster specifications and table size. The value range is 1 to 300. If the value is too large",
"product_code":"dataartsstudio",
"title":"How Do I Set the Number of Concurrent Extractors for a CDM Migration Job?",
"uri":"dataartsstudio_03_0336.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"292"
},
{
"desc":"No. If data is written to the source during the migration, an error may occur.",
"product_code":"dataartsstudio",
"title":"Does CDM Support Real-Time Migration of Dynamic Data?",
"uri":"dataartsstudio_03_0337.html",
"doc_type":"usermanual",
"p_code":"280",
"code":"293"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Troubleshooting",
"uri":"dataartsstudio_03_0143.html",
"doc_type":"usermanual",
"p_code":"272",
"code":"294"
},
{
"desc":"When CDM is used to import data from OBS to SQL Server, the job fails to be executed and error message \"Unable to execute the SQL statement. Cause: \"String or binary data",
"product_code":"dataartsstudio",
"title":"What Can I Do If Error Message \"Unable to execute the SQL statement\" Is Displayed When I Import Data from OBS to SQL Server?",
"uri":"dataartsstudio_03_0106.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"295"
},
{
"desc":"When CDM is used to migrate Oracle data to DWS, an error is reported, as shown in Figure 1. During data migration, if the entire table is queried and the table contains a ",
"product_code":"dataartsstudio",
"title":"Why Is Error ORA-01555 Reported During Migration from Oracle to DWS?",
"uri":"dataartsstudio_03_0071.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"296"
},
{
"desc":"By default, the userAdmin role has only the permissions to manage roles and users and does not have the read and write permissions on a database. If the MongoDB connection",
"product_code":"dataartsstudio",
"title":"What Should I Do If the MongoDB Connection Migration Fails?",
"uri":"dataartsstudio_03_0072.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"297"
},
{
"desc":"Manually stop the Hive migration job and add the following attribute settings to the Hive data connection: Attribute Name: hive.server2.idle.operation.timeout; Value: 10m. In ",
"product_code":"dataartsstudio",
"title":"What Should I Do If a Hive Migration Job Is Suspended for a Long Period of Time?",
"uri":"dataartsstudio_03_0093.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"298"
},
{
"desc":"When you use CDM to migrate data to DWS, the migration job fails and the error message \"value too long for type character varying\" is displayed in the execution log. The p",
"product_code":"dataartsstudio",
"title":"What Should I Do If an Error Is Reported Because the Field Type Mapping Does Not Match During Data Migration Using CDM?",
"uri":"dataartsstudio_03_0109.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"299"
},
{
"desc":"The following error message is displayed during MySQL migration: \"Unable to connect to the database server. Cause: connect timed out.\"The table has a large data volume, a",
"product_code":"dataartsstudio",
"title":"What Should I Do If a JDBC Connection Timeout Error Is Reported During MySQL Migration?",
"uri":"dataartsstudio_03_0110.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"300"
},
{
"desc":"You are advised to clear historical data and try again. In addition, when creating a migration job, you are advised to enable the system to clear historical data. This gr",
"product_code":"dataartsstudio",
"title":"What Should I Do If a CDM Migration Job Fails After a Link from Hive to DWS Is Created?",
"uri":"dataartsstudio_03_0121.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"301"
},
{
"desc":"CDM does not support this operation. You are advised to manually export a MySQL data file, enable the SFTP service on the server, and create a CDM job with SFTP as the so",
"product_code":"dataartsstudio",
"title":"How Do I Use CDM to Export MySQL Data to an SQL File and Upload the File to an OBS Bucket?",
"uri":"dataartsstudio_03_0122.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"302"
},
{
"desc":"Dirty data writing is configured, but no dirty data exists. You need to decrease the number of concurrent tasks to avoid this issue.",
"product_code":"dataartsstudio",
"title":"What Should I Do If CDM Fails to Migrate Data from OBS to DLI?",
"uri":"dataartsstudio_03_0123.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"303"
},
{
"desc":"This error is reported because the customer's certificate has expired. Update the certificate and reconfigure the connector.",
"product_code":"dataartsstudio",
"title":"What Should I Do If a CDM Connector Reports the Error \"Configuration Item [linkConfig.iamAuth] Does Not Exist\"?",
"uri":"dataartsstudio_03_0132.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"304"
},
{
"desc":"If you create a link or save a job in a CDM cluster of an earlier version, and then access a CDM cluster of a later version, this error occurs occasionally. Manually clear",
"product_code":"dataartsstudio",
"title":"What Should I Do If Error Message \"Configuration Item [throttlingConfig.concurrentSubJobs] Does Not Exist\" Is Displayed During Job Creation?",
"uri":"dataartsstudio_03_0333.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"305"
},
{
"desc":"This failure occurs because you do not have the required permissions. Create another service user, grant the required permissions to it, and try again. To create a data co",
"product_code":"dataartsstudio",
"title":"What Should I Do If Message \"CORE_0031:Connect time out. (Cdm.0523)\" Is Displayed During the Creation of an MRS Hive Link?",
"uri":"dataartsstudio_03_0166.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"306"
},
{
"desc":"The cause is that the database table name contains special characters, resulting in incorrect syntax. You can resolve this issue by renaming the database table according ",
"product_code":"dataartsstudio",
"title":"What Should I Do If Message \"CDM Does Not Support Auto Creation of an Empty Table with No Column\" Is Displayed When I Enable Auto Table Creation?",
"uri":"dataartsstudio_03_0167.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"307"
},
{
"desc":"This may be because you have uploaded the latest ORACLE_8 driver (for example, Oracle Database 21c (21.3) driver), which is not supported yet. You are advised to use the ",
"product_code":"dataartsstudio",
"title":"What Should I Do If I Cannot Obtain the Schema Name When Creating an Oracle Relational Database Migration Job?",
"uri":"dataartsstudio_03_0334.html",
"doc_type":"usermanual",
"p_code":"294",
"code":"308"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"DataArts Factory",
"uri":"dataartsstudio_03_0035.html",
"doc_type":"usermanual",
"p_code":"255",
"code":"309"
},
{
"desc":"By default, each user can create a maximum of 10,000 jobs, and each job can contain a maximum of 200 nodes.In addition, the system allows you to adjust the maximum quota ",
"product_code":"dataartsstudio",
"title":"How Many Jobs Can Be Created in DataArts Factory? Is There a Limit on the Number of Nodes in a Job?",
"uri":"dataartsstudio_03_0036.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"310"
},
{
"desc":"On the Running History page, there is a large difference between Job Execution Time and Start Time, as shown in the figure below. Job Execution Time is the time when the ",
"product_code":"dataartsstudio",
"title":"Why Is There a Large Difference Between Job Execution Time and Start Time of a Job?",
"uri":"dataartsstudio_03_0041.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"311"
},
{
"desc":"The subsequent jobs may be suspended, continued, or terminated, depending on the configuration. In this case, do not stop the job. You can rerun the failed job instance or",
"product_code":"dataartsstudio",
"title":"Will Subsequent Jobs Be Affected If a Job Fails to Be Executed During Scheduling of Dependent Jobs? What Should I Do?",
"uri":"dataartsstudio_03_0042.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"312"
},
{
"desc":"Lock management is unavailable for DLI and MRS. Therefore, if you perform read and write operations on the tables simultaneously, data conflict will occur and the operati",
"product_code":"dataartsstudio",
"title":"What Should I Pay Attention to When Using DataArts Studio to Schedule Big Data Services?",
"uri":"dataartsstudio_03_0149.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"313"
},
{
"desc":"Parameters can be set in environment variables, job parameters, and script parameters, but their application scopes are different. If there is a conflict when parameters ",
"product_code":"dataartsstudio",
"title":"What Are the Differences and Connections Among Environment Variables, Job Parameters, and Script Parameters?",
"uri":"dataartsstudio_03_0150.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"314"
},
{
"desc":"Error logs are stored in OBS. The current account must have the OBS read permissions to view logs. You can check the OBS permissions and OBS bucket policies in IAM. When y",
"product_code":"dataartsstudio",
"title":"What Do I Do If Node Error Logs Cannot Be Viewed When a Job Fails?",
"uri":"dataartsstudio_03_0050.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"315"
},
{
"desc":"When a workspace- or job-level agency is configured, the following error is reported when the agency list is viewed: Policy doesn't allow iam:agencies:listAgencies to be p",
"product_code":"dataartsstudio",
"title":"What Should I Do If the Agency List Fails to Be Obtained During Agency Configuration?",
"uri":"dataartsstudio_03_0051.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"316"
},
{
"desc":"If the number of daily executed nodes exceeds the upper limit, it may be caused by frequent job scheduling. Perform the following operations: In the left navigation tree o",
"product_code":"dataartsstudio",
"title":"How Do I Locate Job Scheduling Nodes with a Large Number?",
"uri":"dataartsstudio_03_0055.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"317"
},
{
"desc":"Ensure that the current instance and peripheral resources are in the same region and IAM project. If the enterprise project function is enabled for your account, the curr",
"product_code":"dataartsstudio",
"title":"Why Cannot Specified Peripheral Resources Be Selected When a Data Connection Is Created in Data Development?",
"uri":"dataartsstudio_03_0056.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"318"
},
{
"desc":"On the Data Development page, choose Monitoring > Monitor Job to check whether the target job is being scheduled. A job can be scheduled only within the scheduling period. Vi",
"product_code":"dataartsstudio",
"title":"Why Is There No Job Running Scheduling Log on the Monitor Instance Page After Periodic Scheduling Is Configured for a Job?",
"uri":"dataartsstudio_03_0058.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"319"
},
{
"desc":"Check whether the data connection used by the Hive SQL and Spark SQL scripts is a direct or a proxy connection. In direct connection mode, DataArts Studio users sub",
"product_code":"dataartsstudio",
"title":"Why Does the GUI Display Only the Failure Result but Not the Specific Error Cause After Hive SQL and Spark SQL Scripts Fail to Be Executed?",
"uri":"dataartsstudio_03_0059.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"320"
},
{
"desc":"Check whether the permissions of the current user in IAM are changed, whether the user is removed from the user group, or whether the permission policy of the user group ",
"product_code":"dataartsstudio",
"title":"What Do I Do If the Token Is Invalid During the Running of a Data Development Node?",
"uri":"dataartsstudio_03_0060.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"321"
},
{
"desc":"Method 1: After the node test is complete, right-click the current node and choose View Log from the shortcut menu. Method 2: Click Monitor in the upper part of the canvas,",
"product_code":"dataartsstudio",
"title":"How Do I View Run Logs After a Job Is Tested?",
"uri":"dataartsstudio_03_0062.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"322"
},
{
"desc":"Jobs scheduled by month depend on jobs scheduled by day. Why does a job scheduled by month start running before the job scheduled by day is complete? Although jobs schedul",
"product_code":"dataartsstudio",
"title":"Why Does a Job Scheduled by Month Start Running Before the Job Scheduled by Day Is Complete?",
"uri":"dataartsstudio_03_0063.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"323"
},
{
"desc":"Check whether the current user has the DLI Service User or DLI Service Admin permissions in IAM.",
"product_code":"dataartsstudio",
"title":"What Should I Do If Invalid Authentication Is Reported When I Run a DLI Script?",
"uri":"dataartsstudio_03_0065.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"324"
},
{
"desc":"Check whether the CDM cluster is stopped. If it is stopped, restart it.",
"product_code":"dataartsstudio",
"title":"Why Cannot I Select the Desired CDM Cluster in Proxy Mode When Creating a Data Connection?",
"uri":"dataartsstudio_03_0066.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"325"
},
{
"desc":"Daily scheduling is configured for the job, but there is no job scheduling record in the instance. Cause 1: Check whether the job scheduling is started. If not, the job wi",
"product_code":"dataartsstudio",
"title":"Why Is There No Job Running Scheduling Record After Daily Scheduling Is Configured for the Job?",
"uri":"dataartsstudio_03_0111.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"326"
},
{
"desc":"There is no content in the job log. Check whether the user has the global permission of Object Storage Service (OBS) in IAM to ensure that the user can creat",
"product_code":"dataartsstudio",
"title":"What Do I Do If No Content Is Displayed in Job Logs?",
"uri":"dataartsstudio_03_0112.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"327"
},
{
"desc":"Two jobs are created, but the dependency relationship cannot be established. Check whether the recurrences of the two jobs are both every week or every month. Currently, if the t",
"product_code":"dataartsstudio",
"title":"Why Do I Fail to Establish a Dependency Between Two Jobs?",
"uri":"dataartsstudio_03_0113.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"328"
},
{
"desc":"An error is reported when DataArts Studio executes scheduling: The job does not have a submitted version. Submit the job version first. The job scheduling process begins befor",
"product_code":"dataartsstudio",
"title":"What Should I Do If an Error Is Displayed During DataArts Studio Scheduling: The Job Does Not Have a Submitted Version?",
"uri":"dataartsstudio_03_0114.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"329"
},
{
"desc":"An error is reported when DataArts Studio executes scheduling: The script associated with node XXX in the job is not submitted. The job scheduling process begins before the sc",
"product_code":"dataartsstudio",
"title":"What Do I Do If an Error Is Displayed During DataArts Studio Scheduling: The Script Associated with Node XXX in the Job Is Not Submitted?",
"uri":"dataartsstudio_03_0115.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"330"
},
{
"desc":"After a job is submitted for scheduling, the job fails to be executed and the following error is displayed: \"depend job [XXX] is not running or pause\". The upstream depende",
"product_code":"dataartsstudio",
"title":"What Should I Do If a Job Fails to Be Executed After Being Submitted for Scheduling and an Error Is Displayed: Depend Job [XXX] Is Not Running or Pause?",
"uri":"dataartsstudio_03_0116.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"331"
},
{
"desc":"Databases and data tables can be created in DLI. A database does not correspond to a data connection. A data connection is a connection channel for creating DataArts Studi",
"product_code":"dataartsstudio",
"title":"How Do I Create a Database and Data Table? Is a Database a Data Connection?",
"uri":"dataartsstudio_03_0127.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"332"
},
{
"desc":"Solution: Clear the cached data and use a direct connection to display the data.",
"product_code":"dataartsstudio",
"title":"Why Is No Result Displayed After a Hive Task Is Executed?",
"uri":"dataartsstudio_03_0129.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"333"
},
{
"desc":"The last instance status indicates that a job has been executed, and the status can only be successful or failed. The Monitor Instance page displays all statuses of the job, i",
"product_code":"dataartsstudio",
"title":"Why Does the Last Instance Status on the Monitor Instance Page Only Display Succeeded or Failed?",
"uri":"dataartsstudio_03_0135.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"334"
},
{
"desc":"Choose Monitoring > Monitor Job and click the Batch Job Monitoring tab. Select the jobs to be configured and click Configure Notification. Creating a notification: Set notifi",
"product_code":"dataartsstudio",
"title":"How Do I Create a Notification for All Jobs?",
"uri":"dataartsstudio_03_0148.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"335"
},
{
"desc":"The following table lists the number of nodes that can be executed concurrently in each DataArts Studio version.",
"product_code":"dataartsstudio",
"title":"How Many Nodes Can Be Executed Concurrently in Each DataArts Studio Version?",
"uri":"dataartsstudio_03_0200.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"336"
},
{
"desc":"The system obtains permissions for the job agency, workspace agency, and execution user in sequence, and then executes jobs with the permissions. By default, a job is exec",
"product_code":"dataartsstudio",
"title":"What Is the Priority of the Startup User, Execution User, Workspace Agency, and Job Agency?",
"uri":"dataartsstudio_03_0201.html",
"doc_type":"usermanual",
"p_code":"309",
"code":"337"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dataartsstudio",
"title":"Change History",
"uri":"dataartsstudio_12_0006.html",
"doc_type":"usermanual",
"p_code":"",
"code":"338"
}
]