[
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Service Overview",
"uri":"dli_01_0538.html",
"doc_type":"usermanual",
"p_code":"",
"code":"1"
},
{
"desc":"Data Lake Insight (DLI) is a serverless data processing and analysis service fully compatible with Apache Spark and Apache Flink ecosystems. It frees you from managing an",
"product_code":"dli",
"title":"What Is Data Lake Insight?",
"uri":"dli_01_0378.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"2"
},
{
"desc":"You do not need a background in big data to use DLI for data analysis. You only need to know SQL, and you are good to go. The SQL syntax is fully compatible with the stan",
"product_code":"dli",
"title":"Advantages",
"uri":"dli_07_0007.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"3"
},
{
"desc":"DLI is applicable to large-scale log analysis, federated analysis of heterogeneous data sources, and big data ETL processing.Gaming operations data analysisDifferent depa",
"product_code":"dli",
"title":"Application Scenarios",
"uri":"dli_07_0002.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"4"
},
{
"desc":"Only the latest 100 jobs are displayed on DLI's SparkUI.A maximum of 1,000 job results can be displayed on the console. To view more or all jobs, export the job data to O",
"product_code":"dli",
"title":"Notes and Constraints",
"uri":"dli_07_0005.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"5"
},
{
"desc":"If you need to assign different permissions to employees in your enterprise to access your DLI resources, IAM is a good choice for fine-grained permissions management. IA",
"product_code":"dli",
"title":"Permissions Management",
"uri":"dli_07_0006.html",
"doc_type":"usermanual",
"p_code":"1",
"code":"6"
},
{
"desc":"A quota limits the quantity of a resource available to users, thereby preventing spikes in the usage of the resource.You can also request for an increased quota if your e",
|
||
"product_code":"dli",
|
||
"title":"Quotas",
|
||
"uri":"dli_07_0009.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"1",
|
||
"code":"7"
|
||
},
|
||
{
|
||
"desc":"DLI allows multiple organizations, departments, or applications to share resources. A logical entity, also called a tenant, is provided to use diverse resources and servi",
|
||
"product_code":"dli",
|
||
"title":"Basic Concepts",
|
||
"uri":"dli_07_0003.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"1",
|
||
"code":"8"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Getting Started",
|
||
"uri":"dli_01_0220.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"9"
|
||
},
|
||
{
|
||
"desc":"DLI can query data stored in OBS. This section describes how to us a Spark SQL job on DLI to query OBS data.You can use DLI to submit a Spark SQL job to query data. The g",
|
||
"product_code":"dli",
|
||
"title":"Creating and Submitting a Spark SQL Job",
|
||
"uri":"dli_01_0002.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"9",
|
||
"code":"10"
|
||
},
|
||
{
|
||
"desc":"DLI allows you to customize query templates or save frequently used SQL statements as templates to facilitate SQL operations. After templates are saved, you do not need t",
|
||
"product_code":"dli",
|
||
"title":"Developing and Submitting a Spark SQL Job Using the TPC-H Sample Template",
|
||
"uri":"dli_01_0512.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"9",
|
||
"code":"11"
|
||
},
|
||
{
|
||
"desc":"DLI can query data stored in OBS. This section describes how to use a Spark Jar job on DLI to query OBS data in real time.You can use DLI to submit Spark jobs for real-ti",
|
||
"product_code":"dli",
|
||
"title":"Creating and Submitting a Spark Jar Job",
|
||
"uri":"dli_01_0375.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"9",
|
||
"code":"12"
|
||
},
|
||
{
|
||
"desc":"DLI Flink jobs can use other cloud services as data sources and sink streams for real-time compute. This example describes how to create and submit a Flink Opensource SQL",
|
||
"product_code":"dli",
|
||
"title":"Creating and Submitting a Flink OpenSource SQL Job",
|
||
"uri":"dli_01_0531.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"9",
|
||
"code":"13"
|
||
},
|
||
{
|
||
"desc":"The Overview page of the DLI console provides you with the DLI workflow and resource usage.The process of using DLI is as follows:Create a queue.Queues are DLI's compute ",
|
||
"product_code":"dli",
|
||
"title":"DLI Console Overview",
|
||
"uri":"dli_01_0377.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"14"
|
||
},
|
||
{
|
||
"desc":"You can edit and run SQL statements in the SQL job editor to execute data query.The editor supports SQL:2003 and is compatible with Spark SQL. For details about the synta",
|
||
"product_code":"dli",
|
||
"title":"SQL Editor",
|
||
"uri":"dli_01_0320.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"15"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Job Management",
|
||
"uri":"dli_01_0001.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"16"
|
||
},
|
||
{
|
||
"desc":"DLI provides the following job types:SQL job: SQL jobs provide you with standard SQL statements and are compatible with Spark SQL and Presto SQL (based on Presto). You ca",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0567.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"16",
|
||
"code":"17"
|
||
},
|
||
{
|
||
"desc":"SQL jobs allow you to execute SQL statements in the SQL job editing window, import data, and export data.SQL job management provides the following functions:Searching for",
|
||
"product_code":"dli",
|
||
"title":"SQL Job Management",
|
||
"uri":"dli_01_0017.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"16",
|
||
"code":"18"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Flink Job Management",
|
||
"uri":"dli_01_0389.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"16",
|
||
"code":"19"
|
||
},
|
||
{
|
||
"desc":"On the Job Management page of Flink jobs, you can submit a Flink job. Currently, the following job types are supported:Flink SQL uses SQL statements to define jobs and ca",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0403.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"20"
|
||
},
|
||
{
|
||
"desc":"You can isolate Flink jobs allocated to different users by setting permissions to ensure data query performance.The administrator and job creator have all permissions, wh",
|
||
"product_code":"dli",
|
||
"title":"Managing Flink Job Permissions",
|
||
"uri":"dli_01_0479.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"21"
|
||
},
|
||
{
|
||
"desc":"To create a Flink job, you need to enter the data source and data output channel, that is, source and sink. To use another service as the source or sink stream, you need ",
|
||
"product_code":"dli",
|
||
"title":"Preparing Flink Job Data",
|
||
"uri":"dli_01_0454.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"22"
|
||
},
|
||
{
|
||
"desc":"This section describes how to create a Flink OpenSource SQL job.DLI Flink OpenSource SQL jobs are fully compatible with the syntax of Flink 1.10 and 1.12 provided by the ",
|
||
"product_code":"dli",
|
||
"title":"(Recommended) Creating a Flink OpenSource SQL Job",
|
||
"uri":"dli_01_0498.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"23"
|
||
},
|
||
{
|
||
"desc":"This section describes how to create a Flink SQL job. You can use Flink SQLs to develop jobs to meet your service requirements. Using SQL statements simplifies logic impl",
|
||
"product_code":"dli",
|
||
"title":"Creating a Flink SQL Job",
|
||
"uri":"dli_01_0455.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"24"
|
||
},
|
||
{
|
||
"desc":"This section describes how to create a Flink Jar job. You can perform secondary development based on Flink APIs, build your own JAR file, and submit the JAR file to DLI q",
|
||
"product_code":"dli",
|
||
"title":"Creating a Flink Jar Job",
|
||
"uri":"dli_01_0457.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"25"
|
||
},
|
||
{
|
||
"desc":"After a job is created, you can perform operations on the job as required.Editing a JobStarting a JobStopping a JobDeleting a JobExporting a JobImporting a JobModifying N",
|
||
"product_code":"dli",
|
||
"title":"Performing Operations on a Flink Job",
|
||
"uri":"dli_01_0461.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"26"
|
||
},
|
||
{
|
||
"desc":"After creating a job, you can view the job details to learn about the following information:Viewing Job DetailsChecking Job Monitoring InformationViewing the Task List of",
|
||
"product_code":"dli",
|
||
"title":"Flink Job Details",
|
||
"uri":"dli_01_0462.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"27"
|
||
},
|
||
{
|
||
"desc":"A tag is a key-value pair customized by users and used to identify cloud resources. It helps users to classify and search for cloud resources. A tag consists of a tag key",
|
||
"product_code":"dli",
|
||
"title":"Tag Management",
|
||
"uri":"dli_01_0463.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"28"
|
||
},
|
||
{
|
||
"desc":"In actual job operations, the compute resources required by a job vary depending on the data volume. As a result, compute resources are wasted when the volume is small an",
|
||
"product_code":"dli",
|
||
"title":"Enabling Dynamic Scaling for Flink Jobs",
|
||
"uri":"dli_01_0534.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"19",
|
||
"code":"29"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Spark Job Management",
|
||
"uri":"dli_01_0465.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"16",
|
||
"code":"30"
|
||
},
|
||
{
|
||
"desc":"Based on the open-source Spark, DLI optimizes performance and reconstructs services to be compatible with the Apache Spark ecosystem and interfaces, and executes batch pr",
|
||
"product_code":"dli",
|
||
"title":"Spark Job Management",
|
||
"uri":"dli_01_0385.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"30",
|
||
"code":"31"
|
||
},
|
||
{
|
||
"desc":"DLI provides fully-managed Spark computing services by allowing you to execute Spark jobs.On the Overview page, click Create Job in the upper right corner of the Spark Jo",
|
||
"product_code":"dli",
|
||
"title":"Creating a Spark Job",
|
||
"uri":"dli_01_0384.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"30",
|
||
"code":"32"
|
||
},
|
||
{
|
||
"desc":"In actual job operations, jobs have different importance and urgency levels. So, the compute resources required for normal operations of important and urgent jobs need to",
|
||
"product_code":"dli",
|
||
"title":"Setting the Priority for a Job",
|
||
"uri":"dli_01_0535.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"16",
|
||
"code":"33"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Queue Management",
|
||
"uri":"dli_01_0012.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"34"
|
||
},
|
||
{
|
||
"desc":"Queues in DLI are computing resources, which are the basis for using DLI. All executed jobs require computing resources.Currently, DLI provides two types of queues: For S",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0402.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"35"
|
||
},
|
||
{
|
||
"desc":"You can isolate queues allocated to different users by setting permissions to ensure data query performance.The administrator and queue owner have all permissions, which ",
|
||
"product_code":"dli",
|
||
"title":"Queue Permission Management",
|
||
"uri":"dli_01_0015.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"36"
|
||
},
|
||
{
|
||
"desc":"Before executing a job, you need to create a queue.If you use a sub-account to create a queue for the first time, log in to the DLI management console using the main acco",
|
||
"product_code":"dli",
|
||
"title":"Creating a Queue",
|
||
"uri":"dli_01_0363.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"37"
|
||
},
|
||
{
|
||
"desc":"You can delete a queue based on actual conditions.This operation will fail if there are jobs in the Submitting or Running state on this queue.Deleting a queue does not ca",
|
||
"product_code":"dli",
|
||
"title":"Deleting a Queue",
|
||
"uri":"dli_01_0016.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"38"
|
||
},
|
||
{
|
||
"desc":"You can create enterprise projects matching the organizational structure of your enterprises to centrally manage cloud resources across regions by project. Then you can c",
|
||
"product_code":"dli",
|
||
"title":"Allocating a Queue to an Enterprise Project",
|
||
"uri":"dli_01_0565.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"39"
|
||
},
|
||
{
|
||
"desc":"If the CIDR block of the DLI queue conflicts with that of the user data source, you can change the CIDR block of the queue.If the queue whose CIDR block is to be modified",
|
||
"product_code":"dli",
|
||
"title":"Modifying the CIDR Block",
|
||
"uri":"dli_01_0443.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"40"
|
||
},
|
||
{
|
||
"desc":"Elastic scaling can be performed for a newly created queue only when there were jobs running in this queue.Queues with 16 CUs do not support scale-out or scale-in.Queues ",
|
||
"product_code":"dli",
|
||
"title":"Elastic Scaling of Queues",
|
||
"uri":"dli_01_0487.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"41"
|
||
},
|
||
{
|
||
"desc":"When services are busy, you might need to use more compute resources to process services in a period. After this period, you do not require the same amount of resources. ",
|
||
"product_code":"dli",
|
||
"title":"Scheduling CU Changes",
|
||
"uri":"dli_01_0488.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"42"
|
||
},
|
||
{
|
||
"desc":"It can be used to test the connectivity between the DLI queue and the peer IP address specified by the user in common scenarios, or the connectivity between the DLI queue",
|
||
"product_code":"dli",
|
||
"title":"Testing Address Connectivity",
|
||
"uri":"dli_01_0489.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"43"
|
||
},
|
||
{
|
||
"desc":"Once you have created an SMN topic, you can easily subscribe to it by going to the Topic Management > Topics page of the SMN console. You can choose to receive notificati",
|
||
"product_code":"dli",
|
||
"title":"Creating an SMN Topic",
|
||
"uri":"dli_01_0421.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"44"
|
||
},
|
||
{
|
||
"desc":"A tag is a key-value pair that you can customize to identify cloud resources. It helps you to classify and search for cloud resources. A tag consists of a tag key and a t",
|
||
"product_code":"dli",
|
||
"title":"Managing Queue Tags",
|
||
"uri":"dli_01_0022.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"45"
|
||
},
|
||
{
|
||
"desc":"DLI allows you to set properties for queues.You can set Spark driver parameters to improve the scheduling efficiency of queues.This section describes how to set queue pro",
|
||
"product_code":"dli",
|
||
"title":"Setting Queue Properties",
|
||
"uri":"dli_01_0563.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"34",
|
||
"code":"46"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Elastic Resource Pool",
|
||
"uri":"dli_01_0508.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"47"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Before You Start",
|
||
"uri":"dli_01_0528.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"47",
|
||
"code":"48"
|
||
},
|
||
{
|
||
"desc":"An elastic resource pool provides compute resources (CPU and memory) for running DLI jobs. The unit is CU. One CU contains one CPU and 4 GB memory.You can create multiple",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0504.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"48",
|
||
"code":"49"
|
||
},
|
||
{
|
||
"desc":"This section walks you through the procedure of adding a queue to an elastic resource pool and binding an enhanced datasource connection to the elastic resource pool.Proc",
|
||
"product_code":"dli",
|
||
"title":"Creating an Elastic Resource Pool and Running a Job",
|
||
"uri":"dli_01_0515.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"48",
|
||
"code":"50"
|
||
},
|
||
{
|
||
"desc":"A company has multiple departments that perform data analysis in different periods during a day.Department A requires a large number of compute resources from 00:00 a.m. ",
|
||
"product_code":"dli",
|
||
"title":"Configuring Scaling Policies for Queues",
|
||
"uri":"dli_01_0516.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"48",
|
||
"code":"51"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Regular Operations",
|
||
"uri":"dli_01_0529.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"47",
|
||
"code":"52"
|
||
},
|
||
{
|
||
"desc":"For details about the application scenarios of elastic resource pools, see the Overview. This section describes how to create an elastic resource pool.If you use an enhan",
|
||
"product_code":"dli",
|
||
"title":"Creating an Elastic Resource Pool",
|
||
"uri":"dli_01_0505.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"53"
|
||
},
|
||
{
|
||
"desc":"Administrators can assign permissions of different operation scopes to users for each elastic resource pool.The administrator and elastic resource pool owner have all per",
|
||
"product_code":"dli",
|
||
"title":"Managing Permissions",
|
||
"uri":"dli_01_0526.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"54"
|
||
},
|
||
{
|
||
"desc":"You can add one or more queues to an elastic resource pool to run jobs. This section describes how to add a queue to an elastic resource pool.Automatic scaling of an elas",
|
||
"product_code":"dli",
|
||
"title":"Adding a Queue",
|
||
"uri":"dli_01_0509.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"55"
|
||
},
|
||
{
|
||
"desc":"If you want a queue to use resources in an elastic resource pool, bind the queue to the pool.You can click Associate Queue on the Resource Pool page to bind a queue to an",
|
||
"product_code":"dli",
|
||
"title":"Binding a Queue",
|
||
"uri":"dli_01_0530.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"56"
|
||
},
|
||
{
|
||
"desc":"Multiple queues can be added to an elastic resource pool. For details about how to add a queue, see Adding a Queue. You can configure the number of CUs you want based on ",
|
||
"product_code":"dli",
|
||
"title":"Managing Queues",
|
||
"uri":"dli_01_0506.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"57"
|
||
},
|
||
{
|
||
"desc":"CU settings are used to control the maximum and minimum CU ranges for elastic resource pools to avoid unlimited resource scaling.For example, an elastic resource pool has",
|
||
"product_code":"dli",
|
||
"title":"Setting CUs",
|
||
"uri":"dli_01_0507.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"58"
|
||
},
|
||
{
|
||
"desc":"If the current specifications of your elastic resource pool do not meet your service needs, you can modify them using the change specifications function.In the navigation",
|
||
"product_code":"dli",
|
||
"title":"Modifying Specifications",
|
||
"uri":"dli_01_0524.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"59"
|
||
},
|
||
{
|
||
"desc":"A tag is a key-value pair that you can customize to identify cloud resources. It helps you to classify and search for cloud resources. A tag consists of a tag key and a t",
|
||
"product_code":"dli",
|
||
"title":"Managing Tags",
|
||
"uri":"dli_01_0525.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"60"
|
||
},
|
||
{
|
||
"desc":"If you added a queue to or deleted one from an elastic resource pool, or you scaled an added queue, the CU quantity of the elastic resource pool may be changed. You can v",
|
||
"product_code":"dli",
|
||
"title":"Viewing Scaling History",
|
||
"uri":"dli_01_0532.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"61"
|
||
},
|
||
{
|
||
"desc":"You can create enterprise projects matching the organizational structure of your enterprises to centrally manage cloud resources across regions by project. Then you can c",
|
||
"product_code":"dli",
|
||
"title":"Allocating to an Enterprise Project",
|
||
"uri":"dli_01_0566.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"52",
|
||
"code":"62"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Data Management",
|
||
"uri":"dli_01_0004.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"63"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Databases and Tables",
|
||
"uri":"dli_01_0390.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"63",
|
||
"code":"64"
|
||
},
|
||
{
|
||
"desc":"DLI database and table management provide the following functions:Database Permission ManagementTable Permission ManagementCreating a Database or a TableDeleting a Databa",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0228.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"65"
|
||
},
|
||
{
|
||
"desc":"By setting permissions, you can assign varying database permissions to different users.The administrator and database owner have all permissions, which cannot be set or m",
|
||
"product_code":"dli",
|
||
"title":"Managing Database Permissions",
|
||
"uri":"dli_01_0447.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"66"
|
||
},
|
||
{
|
||
"desc":"By setting permissions, you can assign varying table permissions to different users.The administrator and table owner have all permissions, which cannot be set or modifie",
|
||
"product_code":"dli",
|
||
"title":"Managing Table Permissions",
|
||
"uri":"dli_01_0448.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"67"
|
||
},
|
||
{
|
||
"desc":"A database, built on the computer storage device, is a data warehouse where data is organized, stored, and managed based on its structure.The table is an important part o",
|
||
"product_code":"dli",
|
||
"title":"Creating a Database or a Table",
|
||
"uri":"dli_01_0005.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"68"
|
||
},
|
||
{
|
||
"desc":"You can delete unnecessary databases and tables based on actual conditions.You are not allowed to delete databases or tables that are being used for running jobs.The admi",
|
||
"product_code":"dli",
|
||
"title":"Deleting a Database or a Table",
|
||
"uri":"dli_01_0011.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"69"
|
||
},
|
||
{
|
||
"desc":"During actual use, developers create databases and tables and submit them to test personnel for testing. After the test is complete, the databases and tables are transfer",
|
||
"product_code":"dli",
|
||
"title":"Modifying the Owners of Databases and Tables",
|
||
"uri":"dli_01_0376.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"70"
|
||
},
|
||
{
|
||
"desc":"You can import data from OBS to a table created in DLI.Only one path can be specified during data import. The path cannot contain commas (,).To import data in CSV format ",
|
||
"product_code":"dli",
|
||
"title":"Importing Data to the Table",
|
||
"uri":"dli_01_0253.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"71"
|
||
},
|
||
{
|
||
"desc":"You can export data from a DLI table to OBS. During the export, a folder is created in OBS or the content in the existing folder is overwritten.The exported file can be i",
|
||
"product_code":"dli",
|
||
"title":"Exporting Data from DLI to OBS",
|
||
"uri":"dli_01_0010.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"72"
|
||
},
|
||
{
|
||
"desc":"Metadata is used to define data types. It describes information about the data, including the source, size, format, and other data features. In database fields, metadata ",
|
||
"product_code":"dli",
|
||
"title":"Viewing Metadata",
|
||
"uri":"dli_01_0008.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"73"
|
||
},
|
||
{
|
||
"desc":"The Preview page displays the first 10 records in the table.You can preview data on either the Data Management page or the SQL Editor page.To preview data on the Data Man",
|
||
"product_code":"dli",
|
||
"title":"Previewing Data",
|
||
"uri":"dli_01_0007.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"74"
|
||
},
|
||
{
|
||
"desc":"A tag is a key-value pair that you can customize to identify cloud resources. It helps you to classify and search for cloud resources. A tag consists of a tag key and a t",
|
||
"product_code":"dli",
|
||
"title":"Managing Tags",
|
||
"uri":"dli_01_0552.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"64",
|
||
"code":"75"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Package Management",
|
||
"uri":"dli_01_0366.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"63",
|
||
"code":"76"
|
||
},
|
||
{
|
||
"desc":"Package management provides the following functions:Managing Package PermissionsCreating a PackageDeleting a PackageYou can delete program packages in batches.You can del",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0407.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"77"
|
||
},
|
||
{
|
||
"desc":"You can isolate package groups or packages allocated to different users by setting permissions to ensure data query performance.The administrator and the owner of a packa",
|
||
"product_code":"dli",
|
||
"title":"Managing Permissions on Packages and Package Groups",
|
||
"uri":"dli_01_0477.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"78"
|
||
},
|
||
{
|
||
"desc":"DLI allows you to submit program packages in batches to the general-use queue for running.If you need to update a package, you can use the same package or file to upload ",
|
||
"product_code":"dli",
|
||
"title":"Creating a Package",
|
||
"uri":"dli_01_0367.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"79"
|
||
},
|
||
{
|
||
"desc":"You can delete a package based on actual conditions.On the left of the management console, choose Data Management > Package Management.Click Delete in the Operation colum",
|
||
"product_code":"dli",
|
||
"title":"Deleting a Package",
|
||
"uri":"dli_01_0369.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"80"
|
||
},
|
||
{
|
||
"desc":"To change the owner of a package, click More > Modify Owner in the Operation column of a package on the Package Management page.If the package has been grouped, you can m",
|
||
"product_code":"dli",
|
||
"title":"Modifying the Owner",
|
||
"uri":"dli_01_0478.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"81"
|
||
},
|
||
{
|
||
"desc":"DLI built-in dependencies are provided by the platform by default. In case of conflicts, you do not need to upload them when packaging JAR packages of Spark or Flink Jar ",
|
||
"product_code":"dli",
|
||
"title":"Built-in Dependencies",
|
||
"uri":"dli_01_0397.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"76",
|
||
"code":"82"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Job Templates",
|
||
"uri":"dli_01_0379.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"83"
|
||
},
|
||
{
|
||
"desc":"To facilitate SQL operation execution, DLI allows you to customize query templates or save the SQL statements in use as templates. After templates are saved, you do not n",
|
||
"product_code":"dli",
|
||
"title":"Managing SQL Templates",
|
||
"uri":"dli_01_0021.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"83",
|
||
"code":"84"
|
||
},
|
||
{
|
||
"desc":"Flink templates include sample templates and custom templates. You can modify an existing sample template to meet the actual job logic requirements and save time for edit",
|
||
"product_code":"dli",
|
||
"title":"Managing Flink Templates",
|
||
"uri":"dli_01_0464.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"83",
|
||
"code":"85"
|
||
},
|
||
{
|
||
"desc":"You can modify a sample template to meet the Spark job requirements, saving time for editing SQL statements.Currently, the cloud platform does not provide preset Spark te",
|
||
"product_code":"dli",
|
||
"title":"Managing Spark SQL Templates",
|
||
"uri":"dli_01_0551.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"83",
|
||
"code":"86"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Appendix",
|
||
"uri":"dli_01_05110.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"83",
|
||
"code":"87"
|
||
},
|
||
{
|
||
"desc":"TPC-H is a test set developed by the Transaction Processing Performance Council (TPC) to simulate decision-making support applications. It is widely used in academia and ",
|
||
"product_code":"dli",
|
||
"title":"TPC-H Sample Data in the SQL Template",
|
||
"uri":"dli_01_05111.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"87",
|
||
"code":"88"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Enhanced Datasource Connections",
|
||
"uri":"dli_01_0426.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"89"
|
||
},
|
||
{
|
||
"desc":"In cross-source data analysis scenarios, DLI needs to connect to external data sources. However, due to the different VPCs between the data source and DLI, the network ca",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0003.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"90"
|
||
},
|
||
{
|
||
"desc":"If DLI needs to access external data sources, you need to establish enhanced datasource connections to enable the network between DLI and the data sources, and then devel",
|
||
"product_code":"dli",
|
||
"title":"Cross-Source Analysis Development Methods",
|
||
"uri":"dli_01_0410.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"91"
|
||
},
|
||
{
|
||
"desc":"Create an enhanced datasource connection for DLI to access, import, query, and analyze data of other data sources.For example, to connect DLI to the MRS, RDS, CSS, Kafka,",
|
||
"product_code":"dli",
|
||
"title":"Creating an Enhanced Datasource Connection",
|
||
"uri":"dli_01_0006.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"92"
|
||
},
|
||
{
|
||
"desc":"VPC sharing allows sharing VPC resources created in one account with other accounts using Resource Access Manager (RAM). For example, account A can share its VPC and subn",
|
||
"product_code":"dli",
|
||
"title":"Establishing a Network Connection Between DLI and Resources in a Shared VPC",
|
||
"uri":"dli_01_0624.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"93"
|
||
},
|
||
{
|
||
"desc":"Delete an enhanced datasource connection that is no longer used on the console.Log in to the DLI management console.In the left navigation pane, choose Datasource Connect",
|
||
"product_code":"dli",
|
||
"title":"Deleting an Enhanced Datasource Connection",
|
||
"uri":"dli_01_0553.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"94"
|
||
},
|
||
{
|
||
"desc":"Host information is the mapping between host IP addresses and domain names. After you configure host information, jobs can only use the configured domain names to access ",
|
||
"product_code":"dli",
|
||
"title":"Modifying Host Information in an Elastic Resource Pool",
|
||
"uri":"dli_01_0013.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"95"
|
||
},
|
||
{
|
||
"desc":"The CIDR block of the DLI queue that is bound with a datasource connection cannot overlap with that of the data source.The default queue cannot be bound with a connection",
|
||
"product_code":"dli",
|
||
"title":"Binding and Unbinding a Queue",
|
||
"uri":"dli_01_0514.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"96"
|
||
},
|
||
{
|
||
"desc":"A route is configured with the destination, next hop type, and next hop to determine where the network traffic is directed. Routes are classified into system routes and c",
|
||
"product_code":"dli",
|
||
"title":"Adding a Route",
|
||
"uri":"dli_01_0014.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"97"
|
||
},
|
||
{
|
||
"desc":"Delete a route that is no longer used.A custom route table cannot be deleted if it is associated with a subnet.Log in to the DLI management console.In the left navigation",
|
||
"product_code":"dli",
|
||
"title":"Deleting a Route",
|
||
"uri":"dli_01_0556.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"98"
|
||
},
|
||
{
|
||
"desc":"Enhanced connections support user authorization by project. After authorization, users in the project have the permission to perform operations on the enhanced connection",
|
||
"product_code":"dli",
|
||
"title":"Enhanced Connection Permission Management",
|
||
"uri":"dli_01_0018.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"99"
|
||
},
|
||
{
|
||
"desc":"A tag is a key-value pair customized by users and used to identify cloud resources. It helps users to classify and search for cloud resources. A tag consists of a tag key",
|
||
"product_code":"dli",
|
||
"title":"Enhanced Datasource Connection Tag Management",
|
||
"uri":"dli_01_0019.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"89",
|
||
"code":"100"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Datasource Authentication",
|
||
"uri":"dli_01_0422.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"101"
|
||
},
|
||
{
|
||
"desc":"When analyzing across multiple sources, it is not recommended to configure authentication information directly in a job as it can lead to password leakage. Instead, you a",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0561.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"102"
|
||
},
|
||
{
|
||
"desc":"Create a CSS datasource authentication on the DLI console to store the authentication information of the CSS security cluster to DLI. This will allow you to access to the",
|
||
"product_code":"dli",
|
||
"title":"Creating a CSS Datasource Authentication",
|
||
"uri":"dli_01_0427.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"103"
|
||
},
|
||
{
|
||
"desc":"Create a Kerberos datasource authentication on the DLI console to store the authentication information of the data source to DLI. This will allow you to access to the dat",
|
||
"product_code":"dli",
|
||
"title":"Creating a Kerberos Datasource Authentication",
|
||
"uri":"dli_01_0558.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"104"
|
||
},
|
||
{
|
||
"desc":"Create a Kafka_SSL datasource authentication on the DLI console to store the Kafka authentication information to DLI. This will allow you to access to Kafka instances wit",
|
||
"product_code":"dli",
|
||
"title":"Creating a Kafka_SSL Datasource Authentication",
|
||
"uri":"dli_01_0560.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"105"
|
||
},
|
||
{
|
||
"desc":"Create a password datasource authentication on the DLI console to store passwords of the GaussDB(DWS), RDS, DCS, and DDS data sources to DLI. This will allow you to acces",
|
||
"product_code":"dli",
|
||
"title":"Creating a Password Datasource Authentication",
|
||
"uri":"dli_01_0559.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"106"
|
||
},
|
||
{
|
||
"desc":"Grant permissions on a datasource authentication to users so multiple user jobs can use the datasource authentication without affecting each other.The administrator and t",
|
||
"product_code":"dli",
|
||
"title":"Datasource Authentication Permission Management",
|
||
"uri":"dli_01_0480.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"101",
|
||
"code":"107"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Global Configuration",
|
||
"uri":"dli_01_0485.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"108"
|
||
},
|
||
{
|
||
"desc":"DLI allows you to set variables that are frequently used during job development as global variables on the DLI management console. This avoids repeated definitions during",
|
||
"product_code":"dli",
|
||
"title":"Global Variables",
|
||
"uri":"dli_01_0476.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"108",
|
||
"code":"109"
|
||
},
|
||
{
|
||
"desc":"You can grant permissions on a global variable to users.The administrator and the global variable owner have all permissions. You do not need to set permissions for them,",
|
||
"product_code":"dli",
|
||
"title":"Permission Management for Global Variables",
|
||
"uri":"dli_01_0533.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"108",
|
||
"code":"110"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Permissions Management",
|
||
"uri":"dli_01_0408.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"111"
|
||
},
|
||
{
|
||
"desc":"DLI has a comprehensive permission control mechanism and supports fine-grained authentication through Identity and Access Management (IAM). You can create policies in IAM",
|
||
"product_code":"dli",
|
||
"title":"Overview",
|
||
"uri":"dli_01_0440.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"112"
|
||
},
|
||
{
|
||
"desc":"You can use Identity and Access Management (IAM) to implement fine-grained permissions control on DLI resources. For details, see Overview.If your cloud account does not ",
|
||
"product_code":"dli",
|
||
"title":"Creating an IAM User and Granting Permissions",
|
||
"uri":"dli_01_0418.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"113"
|
||
},
|
||
{
|
||
"desc":"Custom policies can be created as a supplement to the system policies of DLI. You can add actions to custom policies. For the actions supported for custom policies, see \"",
|
||
"product_code":"dli",
|
||
"title":"Creating a Custom Policy",
|
||
"uri":"dli_01_0451.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"114"
|
||
},
|
||
{
|
||
"desc":"A resource is an object that exists within a service. You can select DLI resources by specifying their paths.",
|
||
"product_code":"dli",
|
||
"title":"DLI Resources",
|
||
"uri":"dli_01_0417.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"115"
|
||
},
|
||
{
|
||
"desc":"Request conditions are useful in determining when a custom policy takes effect. A request condition consists of a condition key and operator. Condition keys are either gl",
|
||
"product_code":"dli",
|
||
"title":"DLI Request Conditions",
|
||
"uri":"dli_01_0475.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"116"
|
||
},
|
||
{
|
||
"desc":"Table 1 lists the common operations supported by each system policy of DLI. Choose proper system policies according to this table. For details about the SQL statement per",
|
||
"product_code":"dli",
|
||
"title":"Common Operations Supported by DLI System Policy",
|
||
"uri":"dli_01_0441.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"111",
|
||
"code":"117"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Other Common Operations",
|
||
"uri":"dli_01_0513.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"118"
|
||
},
|
||
{
|
||
"desc":"On the DLI management console, you can import data stored in OBS into DLI tables.To import OBS data to a DLI table, either choose Data Management > Databases and Tables i",
|
||
"product_code":"dli",
|
||
"title":"Importing Data to a DLI Table",
|
||
"uri":"dli_01_0420.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"118",
|
||
"code":"119"
|
||
},
|
||
{
|
||
"desc":"This section describes metrics reported by DLI to Cloud Eye as well as their namespaces and dimensions. You can use the management console or APIs provided by Cloud Eye t",
|
||
"product_code":"dli",
|
||
"title":"Viewing Monitoring Metrics",
|
||
"uri":"dli_01_0445.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"118",
|
||
"code":"120"
|
||
},
|
||
{
|
||
"desc":"With CTS, you can record operations associated with DLI for later query, audit, and backtrack operations.",
|
||
"product_code":"dli",
|
||
"title":"DLI Operations That Can Be Recorded by CTS",
|
||
"uri":"dli_01_0318.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"118",
|
||
"code":"121"
|
||
},
|
||
{
|
||
"desc":"A quota limits the quantity of a resource available to users, thereby preventing spikes in the usage of the resource.You can also request for an increased quota if your e",
|
||
"product_code":"dli",
|
||
"title":"Quota Management",
|
||
"uri":"dli_01_0550.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"118",
|
||
"code":"122"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"FAQ",
|
||
"uri":"dli_01_0539.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"",
|
||
"code":"123"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Flink Jobs",
|
||
"uri":"dli_03_0037.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"123",
|
||
"code":"124"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Usage",
|
||
"uri":"dli_03_0137.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"124",
|
||
"code":"125"
|
||
},
|
||
{
|
||
"desc":"DLI Flink jobs support the following data formats:Avro, Avro_merge, BLOB, CSV, EMAIL, JSON, ORC, Parquet, and XML.DLI Flink jobs support data from the following data sour",
|
||
"product_code":"dli",
|
||
"title":"What Data Formats and Data Sources Are Supported by DLI Flink Jobs?",
|
||
"uri":"dli_03_0083.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"126"
|
||
},
|
||
{
|
||
"desc":"A sub-user can view queues but cannot view Flink jobs. You can authorize the sub-user using DLI or IAM.Authorization on DLILog in to the DLI console using a tenant accoun",
|
||
"product_code":"dli",
|
||
"title":"How Do I Authorize a Subuser to View Flink Jobs?",
|
||
"uri":"dli_03_0139.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"127"
|
||
},
|
||
{
|
||
"desc":"DLI Flink jobs are highly available. You can enable the automatic restart function to automatically restart your jobs after short-time faults of peripheral services are r",
|
||
"product_code":"dli",
|
||
"title":"How Do I Set Auto Restart upon Exception for a Flink Job?",
|
||
"uri":"dli_03_0090.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"128"
|
||
},
|
||
{
|
||
"desc":"When you create a Flink SQL job or Flink Jar job, you can select Save Job Log on the job editing page to save job running logs to OBS.To set the OBS bucket for storing th",
|
||
"product_code":"dli",
|
||
"title":"How Do I Save Flink Job Logs?",
|
||
"uri":"dli_03_0099.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"129"
|
||
},
|
||
{
|
||
"desc":"DLI can output Flink job results to DIS. You can view the results in DIS. For details, see \"Obtaining Data from DIS\" in Data Ingestion Service User Guide.DLI can output F",
|
||
"product_code":"dli",
|
||
"title":"How Can I Check Flink Job Results?",
|
||
"uri":"dli_03_0043.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"130"
|
||
},
|
||
{
|
||
"desc":"Choose Job Management > Flink Jobs. In the Operation column of the target job, choose More > Permissions. When a new user is authorized, No such user. userName:xxxx. is d",
|
||
"product_code":"dli",
|
||
"title":"Why Is Error \"No such user. userName:xxxx.\" Reported on the Flink Job Management Page When I Grant Permission to a User?",
|
||
"uri":"dli_03_0160.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"131"
|
||
},
|
||
{
|
||
"desc":"Checkpoint was enabled when a Flink job is created, and the OBS bucket for storing checkpoints was specified. After a Flink job is manually stopped, no message is display",
|
||
"product_code":"dli",
|
||
"title":"How Do I Know Which Checkpoint the Flink Job I Stopped Will Be Restored to When I Start the Job Again?",
|
||
"uri":"dli_03_0180.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"132"
|
||
},
|
||
{
|
||
"desc":"When you set running parameters of a DLI Flink job, you can enable Alarm Generation upon Job Exception to receive alarms when the job runs abnormally or is in arrears.If ",
|
||
"product_code":"dli",
|
||
"title":"Why Is a Message Displayed Indicating That the SMN Topic Does Not Exist When I Use the SMN Topic in DLI?",
|
||
"uri":"dli_03_0036.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"125",
|
||
"code":"133"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Flink SQL",
|
||
"uri":"dli_03_0131.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"124",
|
||
"code":"134"
|
||
},
|
||
{
|
||
"desc":"The consumption capability of a Flink SQL job depends on the data source transmission, queue size, and job parameter settings. The peak consumption is 10 Mbit/s.",
|
||
"product_code":"dli",
|
||
"title":"How Much Data Can Be Processed in a Day by a Flink SQL Job?",
|
||
"uri":"dli_03_0130.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"135"
|
||
},
|
||
{
|
||
"desc":"The temp stream in Flink SQL is similar to a subquery. It is a logical stream used to simplify the SQL logic and does not generate data storage. Therefore, there is no ne",
|
||
"product_code":"dli",
|
||
"title":"Does Data in the Temporary Stream of Flink SQL Need to Be Cleared Periodically? How Do I Clear the Data?",
|
||
"uri":"dli_03_0061.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"136"
|
||
},
|
||
{
|
||
"desc":"When you create a Flink SQL job and configure the parameters, you select an OBS bucket you have created. The system displays a message indicating that the OBS bucket is n",
|
||
"product_code":"dli",
|
||
"title":"Why Is a Message Displayed Indicating That the OBS Bucket Is Not Authorized When I Select an OBS Bucket for a Flink SQL Job?",
|
||
"uri":"dli_03_0138.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"137"
|
||
},
|
||
{
|
||
"desc":"When using a Flink SQL job, you need to create an OBS partition table for subsequent batch processing.In the following example, the day field is used as the partition fie",
|
||
"product_code":"dli",
|
||
"title":"How Do I Create an OBS Partitioned Table for a Flink SQL Job?",
|
||
"uri":"dli_03_0089.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"138"
|
||
},
|
||
{
|
||
"desc":"In this example, the day field is used as the partition field with the parquet encoding format (only the parquet format is supported currently) to dump car_info data to O",
|
||
"product_code":"dli",
|
||
"title":"How Do I Dump Data to OBS and Create an OBS Partitioned Table?",
|
||
"uri":"dli_03_0075.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"139"
|
||
},
|
||
{
|
||
"desc":"When I run the creation statement with an EL expression in the table name in a Flink SQL job, the following error message is displayed:DLI.0005: AnalysisException: t_user",
|
||
"product_code":"dli",
|
||
"title":"Why Is Error Message \"DLI.0005\" Displayed When I Use an EL Expression to Create a Table in a Flink SQL Job?",
|
||
"uri":"dli_03_0167.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"140"
|
||
},
|
||
{
|
||
"desc":"After data is written to OBS through the Flink job output stream, data cannot be queried from the DLI table created in the OBS file path.For example, use the following Fl",
|
||
"product_code":"dli",
|
||
"title":"Why Is No Data Queried in the DLI Table Created Using the OBS File Path When Data Is Written to OBS by a Flink Job Output Stream?",
|
||
"uri":"dli_03_0168.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"134",
|
||
"code":"141"
|
||
},
|
||
{
|
||
"desc":"After a Flink SQL job is submitted on DLI, the job fails to be executed. The following error information is displayed in the job log:connect to DIS failed java.lang.Illeg",
|
||
"product_code":"dli",
|
||
"title":"Why Does a Flink SQL Job Fails to Be Executed, and Is \"connect to DIS failed java.lang.IllegalArgumentException: Access key cannot be null\" Displayed in the Log?",
"uri":"dli_03_0174.html",
"doc_type":"usermanual",
"p_code":"134",
"code":"142"
},
{
"desc":"Semantic verification for a Flink SQL job (reading DIS data) fails. The following information is displayed when the job fails:Get dis channel xxxinfo failed. error info: ",
"product_code":"dli",
"title":"Why Is Error \"Not authorized\" Reported When a Flink SQL Job Reads DIS Data?",
"uri":"dli_03_0176.html",
"doc_type":"usermanual",
"p_code":"134",
"code":"143"
},
{
"desc":"After a Flink SQL job consumed Kafka and sent data to the Elasticsearch cluster, the job was successfully executed, but no data is available.Possible causes are as follow",
"product_code":"dli",
"title":"Data Writing Fails After a Flink SQL Job Consumed Kafka and Sank Data to the Elasticsearch Cluster",
"uri":"dli_03_0232.html",
"doc_type":"usermanual",
"p_code":"134",
"code":"144"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Flink Jar Jobs",
"uri":"dli_03_0132.html",
"doc_type":"usermanual",
"p_code":"124",
"code":"145"
},
{
"desc":"You can upload configuration files for custom jobs (Jar).Upload the configuration file to DLI through Package Management.In the Other Dependencies area of the Flink Jar j",
"product_code":"dli",
"title":"Can I Upload Configuration Files for Flink Jar Jobs?",
"uri":"dli_03_0044.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"146"
},
{
"desc":"The dependency of your Flink job conflicts with a built-in dependency of the DLI Flink platform. As a result, the job submission fails.Delete your JAR file that is the sa",
"product_code":"dli",
"title":"Why Does a Flink Jar Package Conflict Result in Submission Failure?",
"uri":"dli_03_0119.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"147"
},
{
"desc":"When a Flink Jar job is submitted to access GaussDB(DWS), an error message is displayed indicating that the job fails to be started. The job log contains the following er",
"product_code":"dli",
"title":"Why Does a Flink Jar Job Fail to Access GaussDB(DWS) and a Message Is Displayed Indicating Too Many Client Connections?",
"uri":"dli_03_0161.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"148"
},
{
"desc":"An exception occurred when a Flink Jar job is running. The following error information is displayed in the job log:org.apache.flink.shaded.curator.org.apache.curator.Conn",
"product_code":"dli",
"title":"Why Is Error Message \"Authentication failed\" Displayed During Flink Jar Job Running?",
"uri":"dli_03_0165.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"149"
},
{
"desc":"The storage path of the Flink Jar job checkpoints was set to an OBS bucket. The job failed to be submitted, and an error message indicating an invalid OBS bucket name was",
"product_code":"dli",
"title":"Why Is Error Invalid OBS Bucket Name Reported After a Flink Job Submission Failed?",
"uri":"dli_03_0233.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"150"
},
{
"desc":"Flink Job submission failed. The exception information is as follows:Flink JAR files conflicted. The submitted Flink JAR file conflicted with the HDFS JAR file of the DLI",
"product_code":"dli",
"title":"Why Does the Flink Submission Fail Due to Hadoop JAR File Conflict?",
"uri":"dli_03_0234.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"151"
},
{
"desc":"You can use Flink Jar to connect to Kafka with SASL SSL authentication enabled.",
"product_code":"dli",
"title":"How Do I Connect a Flink jar Job to SASL_SSL?",
"uri":"dli_03_0266.html",
"doc_type":"usermanual",
"p_code":"145",
"code":"152"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Performance Tuning",
"uri":"dli_03_0133.html",
"doc_type":"usermanual",
"p_code":"124",
"code":"153"
},
{
"desc":"Data Stacking in a Consumer GroupThe accumulated data of a consumer group can be calculated by the following formula: Total amount of data to be consumed by the consumer ",
"product_code":"dli",
"title":"How Do I Optimize Performance of a Flink Job?",
"uri":"dli_03_0106.html",
"doc_type":"usermanual",
"p_code":"153",
"code":"154"
},
{
"desc":"Add the following SQL statements to the Flink job:",
"product_code":"dli",
"title":"How Do I Write Data to Different Elasticsearch Clusters in a Flink Job?",
"uri":"dli_03_0048.html",
"doc_type":"usermanual",
"p_code":"153",
"code":"155"
},
{
"desc":"The DLI Flink checkpoint/savepoint mechanism is complete and reliable. You can use this mechanism to prevent data loss when a job is manually restarted or restarted due t",
"product_code":"dli",
"title":"How Do I Prevent Data Loss After Flink Job Restart?",
"uri":"dli_03_0096.html",
"doc_type":"usermanual",
"p_code":"153",
"code":"156"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"O&M Guide",
"uri":"dli_03_0135.html",
"doc_type":"usermanual",
"p_code":"124",
"code":"157"
},
{
"desc":"On the Flink job management page, hover the cursor on the status of the job that fails to be submitted to view the brief information about the failure.The possible causes",
"product_code":"dli",
"title":"How Do I Locate a Flink Job Submission Error?",
"uri":"dli_03_0103.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"158"
},
{
"desc":"On the Flink job management, click Edit in the Operation column of the target job. On the displayed page, check whether Save Job Log in the Running Parameters tab is enab",
"product_code":"dli",
"title":"How Do I Locate a Flink Job Running Error?",
"uri":"dli_03_0105.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"159"
},
{
"desc":"Flink's checkpointing is a fault tolerance and recovery mechanism. This mechanism ensures that real-time programs can self-recover in case of exceptions or machine issues",
"product_code":"dli",
"title":"How Can I Check if a Flink Job Can Be Restored From a Checkpoint After Restarting It?",
"uri":"dli_03_0136.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"160"
},
{
"desc":"To rectify this fault, perform the following steps:Log in to the DIS management console. In the navigation pane, choose Stream Management. View the Flink job SQL statemen",
"product_code":"dli",
"title":"Why Does DIS Stream Not Exist During Job Semantic Check?",
"uri":"dli_03_0040.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"161"
},
{
"desc":"If the OBS bucket selected for a job is not authorized, perform the following steps:Select Enable Checkpointing or Save Job Log.Specify OBS Bucket.Select Authorize OBS.",
"product_code":"dli",
"title":"Why Is the OBS Bucket Selected for Job Not Authorized?",
"uri":"dli_03_0045.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"162"
},
{
"desc":"Mode for storing generated job logs when a DLI Flink job fails to be submitted or executed. The options are as follows:If the submission fails, a submission log is genera",
"product_code":"dli",
"title":"Why Are Logs Not Written to the OBS Bucket After a DLI Flink Job Fails to Be Submitted for Running?",
"uri":"dli_03_0064.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"163"
},
{
"desc":"The Flink/Spark UI was displayed with incomplete information.When the queue is used to run a job, the system releases the cluster and takes about 10 minutes to create a n",
"product_code":"dli",
"title":"Why Is Information Displayed on the FlinkUI/Spark UI Page Incomplete?",
"uri":"dli_03_0235.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"164"
},
{
"desc":"JobManager and TaskManager heartbeats timed out. As a result, the Flink job is abnormal.Check whether the network is intermittently disconnected and whether the cluster l",
"product_code":"dli",
"title":"Why Is the Flink Job Abnormal Due to Heartbeat Timeout Between JobManager and TaskManager?",
"uri":"dli_03_0236.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"165"
},
{
"desc":"Test address connectivity.If the network is unreachable, rectify the network connection first. Ensure that the network connection between the DLI queue and the external d",
"product_code":"dli",
"title":"Why Is Error \"Timeout expired while fetching topic metadata\" Repeatedly Reported in Flink JobManager Logs?",
"uri":"dli_03_0265.html",
"doc_type":"usermanual",
"p_code":"157",
"code":"166"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Problems Related to SQL Jobs",
"uri":"dli_03_0020.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"167"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Usage",
"uri":"dli_03_0216.html",
"doc_type":"usermanual",
"p_code":"167",
"code":"168"
},
{
"desc":"A temporary table is used to store intermediate results. When a transaction or session ends, the data in the temporary table can be automatically deleted. For example, in",
"product_code":"dli",
"title":"SQL Jobs",
"uri":"dli_03_0200.html",
"doc_type":"usermanual",
"p_code":"168",
"code":"169"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Job Development",
"uri":"dli_03_0204.html",
"doc_type":"usermanual",
"p_code":"167",
"code":"170"
},
{
"desc":"If a large number of small files are generated during SQL execution, job execution and table query will take a long time. In this case, you should merge small files.Set t",
"product_code":"dli",
"title":"How Do I Merge Small Files?",
"uri":"dli_03_0086.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"171"
},
{
"desc":"When creating an OBS table, you must specify a table path in the database. The path format is as follows: obs://xxx/database name/table name.If the specified path is akdc",
"product_code":"dli",
"title":"How Do I Specify an OBS Path When Creating an OBS Table?",
"uri":"dli_03_0092.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"172"
},
{
"desc":"DLI allows you to associate JSON data in an OBS bucket to create tables in asynchronous mode.The statement for creating the table is as follows:",
"product_code":"dli",
"title":"How Do I Create a Table Using JSON Data in an OBS Bucket?",
"uri":"dli_03_0108.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"173"
},
{
"desc":"You can use the where condition statement in the select statement to filter data. For example:",
"product_code":"dli",
"title":"How Do I Set Local Variables in SQL Statements?",
"uri":"dli_03_0087.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"174"
},
{
"desc":"The correct method for using the count function to perform aggregation is as follows:OrIf an incorrect method is used, an error will be reported.",
"product_code":"dli",
"title":"How Can I Use the count Function to Perform Aggregation?",
"uri":"dli_03_0069.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"175"
},
{
"desc":"You can use the cross-region replication function of OBS. The procedure is as follows:Export the DLI table data in region 1 to the user-defined OBS bucket.Use the OBS cro",
"product_code":"dli",
"title":"How Do I Synchronize DLI Table Data from One Region to Another?",
"uri":"dli_03_0072.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"176"
},
{
"desc":"Currently, DLI does not allow you to insert table data into specific fields. To insert table data, you must insert data of all table fields at a time.",
"product_code":"dli",
"title":"How Do I Insert Table Data into Specific Fields of a Table Using a SQL Job?",
"uri":"dli_03_0191.html",
"doc_type":"usermanual",
"p_code":"170",
"code":"177"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Job O&M Errors",
"uri":"dli_03_0206.html",
"doc_type":"usermanual",
"p_code":"167",
"code":"178"
},
{
"desc":"Create an OBS directory with a unique name. Alternatively, you can manually delete the existing OBS directory and submit the job again. However, exercise caution when del",
"product_code":"dli",
"title":"Why Is Error \"path obs://xxx already exists\" Reported When Data Is Exported to OBS?",
"uri":"dli_03_0014.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"179"
},
{
"desc":"This message indicates that the two tables to be joined contain the same column, but the owner of the column is not specified when the command is executed.For example, ta",
"product_code":"dli",
"title":"Why Is Error \"SQL_ANALYSIS_ERROR: Reference 't.id' is ambiguous, could be: t.id, t.id.;\" Displayed When Two Tables Are Joined?",
"uri":"dli_03_0066.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"180"
},
{
"desc":"Check if your account is in arrears and top it up if necessary.If the same error message persists after the top-up, log out of your account and log back in.",
"product_code":"dli",
"title":"Why Is Error \"The current account does not have permission to perform this operation,the current account was restricted. Restricted for no budget.\" Reported when a SQL Statement Is Executed?",
"uri":"dli_03_0071.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"181"
},
{
"desc":"Cause AnalysisWhen you query the partitioned table XX.YYY, the partition column is not specified in the search criteria.A partitioned table can be queried only when the q",
"product_code":"dli",
"title":"Why Is Error \"There should be at least one partition pruning predicate on partitioned table XX.YYY\" Reported When a Query Statement Is Executed?",
"uri":"dli_03_0145.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"182"
},
{
"desc":"The following error message is displayed when the LOAD DATA command is executed by a Spark SQL job to import data to a DLI table:In some cases ,the following error messag",
"product_code":"dli",
"title":"Why Is Error \"IllegalArgumentException: Buffer size too small. size\" Reported When Data Is Loaded to an OBS Foreign Table?",
"uri":"dli_03_0169.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"183"
},
{
"desc":"An error is reported during SQL job execution:Please contact DLI service. DLI.0002: FileNotFoundException: getFileStatus on obs://xxx: status [404]Check whether there is ",
"product_code":"dli",
"title":"Why Is Error \"DLI.0002 FileNotFoundException\" Reported During SQL Job Running?",
"uri":"dli_03_0189.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"184"
},
{
"desc":"Currently, DLI supports the Hive syntax for creating tables of the TEXTFILE, SEQUENCEFILE, RCFILE, ORC, AVRO, and PARQUET file types. If the file format specified for cre",
"product_code":"dli",
"title":"Why Is a Schema Parsing Error Reported When I Create a Hive Table Using CTAS?",
"uri":"dli_03_0046.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"185"
},
{
"desc":"When you run a DLI SQL script on DataArts Studio, the log shows that the statements fail to be executed. The error information is as follows:DLI.0999: RuntimeException: o",
"product_code":"dli",
"title":"Why Is Error \"org.apache.hadoop.fs.obs.OBSIOException\" Reported When I Run DLI SQL Scripts on DataArts Studio?",
"uri":"dli_03_0173.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"186"
},
{
"desc":"After the migration job is submitted, the following error information is displayed in the log:org.apache.sqoop.common.SqoopException:UQUERY_CONNECTOR_0001:Invoke DLI serv",
"product_code":"dli",
"title":"Why Is Error \"UQUERY_CONNECTOR_0001:Invoke DLI service api failed\" Reported in the Job Log When I Use CDM to Migrate Data to DLI?",
"uri":"dli_03_0172.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"187"
},
{
"desc":"Error message \"File not Found\" is displayed when a SQL job is accessed.Generally, the file cannot be found due to a read/write conflict. Check whether a job is overwritin",
"product_code":"dli",
"title":"Why Is Error \"File not Found\" Reported When I Access a SQL Job?",
"uri":"dli_03_0207.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"188"
},
{
"desc":"Error message \"DLI.0003: AccessControlException XXX\" is reported when a SQL job is accessed.Check the OBS bucket written in the AccessControlException to confirm if your ",
"product_code":"dli",
"title":"Why Is Error \"DLI.0003: AccessControlException XXX\" Reported When I Access a SQL Job?",
"uri":"dli_03_0208.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"189"
},
{
"desc":"Error message \"DLI.0001: org.apache.hadoop.security.AccessControlException: verifyBucketExists on {{bucket name}}: status [403]\" is reported when a SQL job is Accessed.Yo",
"product_code":"dli",
"title":"Why Is Error \"DLI.0001: org.apache.hadoop.security.AccessControlException: verifyBucketExists on {{bucket name}}: status [403]\" Reported When I Access a SQL Job?",
"uri":"dli_03_0209.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"190"
},
{
"desc":"Error message \"The current account does not have permission to perform this operation,the current account was restricted.\" is reported during SQL statement execution.Chec",
"product_code":"dli",
"title":"Why Is Error \"The current account does not have permission to perform this operation,the current account was restricted. Restricted for no budget\" Reported During SQL Statement Execution? Restricted for no budget.",
"uri":"dli_03_0210.html",
"doc_type":"usermanual",
"p_code":"178",
"code":"191"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"O&M Guide",
"uri":"dli_03_0211.html",
"doc_type":"usermanual",
"p_code":"167",
"code":"192"
},
{
"desc":"If the job runs slowly, perform the following steps to find the causes and rectify the fault:Check whether the problem is caused by FullGC.Log in to the DLI console. In t",
"product_code":"dli",
"title":"How Do I Troubleshoot Slow SQL Jobs?",
"uri":"dli_03_0196.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"193"
},
{
"desc":"You can view SQL job logs for routine O&M.Obtain the ID of the DLI job executed on the DataArts Studio console.Job IDOn the DLI console, choose Job Management > SQL Jobs.",
"product_code":"dli",
"title":"How Do I View DLI SQL Logs?",
"uri":"dli_03_0091.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"194"
},
{
"desc":"You can view the job execution records when a job is running.Log in to the DLI management console.In the navigation pane on the left, choose Job Management > SQL Jobs.Ent",
"product_code":"dli",
"title":"How Do I View SQL Execution Records?",
"uri":"dli_03_0116.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"195"
},
{
"desc":"Data skew is a common issue during the execution of SQL jobs. When data is unevenly distributed, some compute nodes process significantly more data than others, which can",
"product_code":"dli",
"title":"How Do I Do When Data Skew Occurs During the Execution of a SQL Job?",
"uri":"dli_03_0093.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"196"
},
{
"desc":"A DLI table exists but cannot be queried on the DLI console.If a table exists but cannot be queried, there is a high probability that the current user does not have the p",
"product_code":"dli",
"title":"What Can I Do If a Table Cannot Be Queried on the DLI Console?",
"uri":"dli_03_0184.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"197"
},
{
"desc":"A high compression ratio of OBS tables in the Parquet or ORC format (for example, a compression ratio of 5 or higher compared with text compression) will lead to large da",
"product_code":"dli",
"title":"The Compression Ratio of OBS Tables Is Too High",
"uri":"dli_03_0013.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"198"
},
{
"desc":"DLI supports only UTF-8-encoded texts. Ensure that data is encoded using UTF-8 during table creation and import.",
"product_code":"dli",
"title":"How Can I Avoid Garbled Characters Caused by Inconsistent Character Codes?",
"uri":"dli_03_0009.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"199"
},
{
"desc":"User A created the testTable table in a database through a SQL job and granted user B the permission to insert and delete table data. User A deleted the testTable table a",
"product_code":"dli",
"title":"Do I Need to Grant Table Permissions to a User and Project After I Delete a Table and Create One with the Same Name?",
"uri":"dli_03_0175.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"200"
},
{
"desc":"A CSV file is imported to a DLI partitioned table, but the imported file data does not contain the data in the partitioning column. The partitioning column needs to be sp",
"product_code":"dli",
"title":"Why Can't I Query Table Data After Data Is Imported to a DLI Partitioned Table Because the File to Be Imported Does Not Contain Data in the Partitioning Column?",
"uri":"dli_03_0177.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"201"
},
{
"desc":"When an OBS foreign table is created, a field in the specified OBS file contains a carriage return line feed (CRLF) character. As a result, the data is incorrect.The stat",
"product_code":"dli",
"title":"How Do I Fix the Data Error Caused by CRLF Characters in a Field of the OBS File Used to Create an External OBS Table?",
"uri":"dli_03_0181.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"202"
},
{
"desc":"A SQL job contains join operations. After the job is submitted, it is stuck in the Running state and no result is returned.When a Spark SQL job has join operations on sma",
"product_code":"dli",
"title":"Why Does a SQL Job That Has Join Operations Stay in the Running State?",
"uri":"dli_03_0182.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"203"
},
{
"desc":"The on clause was not added to the SQL statement for joining tables. As a result, the Cartesian product query occurs due to multi-table association, and the queue resourc",
"product_code":"dli",
"title":"The on Clause Is Not Added When Tables Are Joined. Cartesian Product Query Causes High Resource Usage of the Queue, and the Job Fails to Be Executed",
"uri":"dli_03_0187.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"204"
},
{
"desc":"Partition data is manually uploaded to a partition of an OBS table. However, the data cannot be queried using DLI SQL editor.After manually adding partition data, you nee",
"product_code":"dli",
"title":"Why Can't I Query Data After I Manually Add Data to the Partition Directory of an OBS Table?",
"uri":"dli_03_0190.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"205"
},
{
"desc":"To dynamically overwrite the specified partitioned data in the DataSource table, set dli.sql.dynamicPartitionOverwrite.enabled to true and then run the insert overwrite s",
"product_code":"dli",
"title":"Why Is All Data Overwritten When insert overwrite Is Used to Overwrite Partitioned Table?",
"uri":"dli_03_0212.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"206"
},
{
"desc":"The possible causes and solutions are as follows:After you purchase a DLI queue and submit a SQL job for the first time, wait for 5 to 10 minutes. After the cluster is st",
"product_code":"dli",
"title":"Why Is a SQL Job Stuck in the Submitting State?",
"uri":"dli_03_0213.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"207"
},
{
"desc":"Spark does not have the datetime type and uses the TIMESTAMP type instead.You can use a function to convert data types.The following is an example.select cast(create_date",
"product_code":"dli",
"title":"Why Is the create_date Field in the RDS Table Is a Timestamp in the DLI query result?",
"uri":"dli_03_0214.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"208"
},
{
"desc":"If the table name is changed immediately after SQL statements are executed, the data size of the table may be incorrect.If you need to change the table name, change it 5 ",
"product_code":"dli",
"title":"What Can I Do If datasize Cannot Be Changed After the Table Name Is Changed in a Finished SQL Job?",
"uri":"dli_03_0215.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"209"
},
{
"desc":"When DLI is used to insert data into an OBS temporary table, only part of data is imported.Possible causes are as follows:The amount of data read during job execution is ",
"product_code":"dli",
"title":"Why Is the Data Volume Changes When Data Is Imported from DLI to OBS?",
"uri":"dli_03_0231.html",
"doc_type":"usermanual",
"p_code":"192",
"code":"210"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Problems Related to Spark Jobs",
"uri":"dli_03_0021.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"211"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Usage",
"uri":"dli_03_0163.html",
"doc_type":"usermanual",
"p_code":"211",
"code":"212"
},
{
"desc":"DLI Spark does not support job scheduling. You can use other services, such as DataArts Studio, or use APIs or SDKs to customize job schedule.The Spark SQL syntax does no",
"product_code":"dli",
"title":"Spark Jobs",
"uri":"dli_03_0201.html",
"doc_type":"usermanual",
"p_code":"212",
"code":"213"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Job Development",
"uri":"dli_03_0217.html",
"doc_type":"usermanual",
"p_code":"211",
"code":"214"
},
{
"desc":"To use Spark to write data into a DLI table, configure the following parameters:fs.obs.access.keyfs.obs.secret.keyfs.obs.implfs.obs.endpointThe following is an example:",
"product_code":"dli",
"title":"How Do I Use Spark to Write Data into a DLI Table?",
"uri":"dli_03_0107.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"215"
},
{
"desc":"To obtain the AK/SK, set the parameters as follows:Create a SparkContext using code.val sc: SparkContext = new SparkContext()\nsc.hadoopConfiguration.set(\"fs.obs.access.ke",
"product_code":"dli",
"title":"How Do I Set the AK/SK for a Queue to Operate an OBS Table?",
"uri":"dli_03_0017.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"216"
},
{
"desc":"Log in to the DLI console. In the navigation pane, choose Job Management > Spark Jobs. In the job list, locate the target job and click next to Job ID to view the parame",
"product_code":"dli",
"title":"How Do I View the Resource Usage of DLI Spark Jobs?",
"uri":"dli_03_0102.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"217"
},
{
"desc":"If the pymysql module is missing, check whether the corresponding EGG package exists. If the package does not exist, upload the pyFile package on the Package Management p",
"product_code":"dli",
"title":"How Do I Use Python Scripts to Access the MySQL Database If the pymysql Module Is Missing from the Spark Job Results Stored in MySQL?",
"uri":"dli_03_0076.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"218"
},
{
"desc":"DLI natively supports PySpark.For most cases, Python is preferred for data analysis, and PySpark is the best choice for big data analysis. Generally, JVM programs are pac",
"product_code":"dli",
"title":"How Do I Run a Complex PySpark Program in DLI?",
"uri":"dli_03_0082.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"219"
},
{
"desc":"You can use DLI Spark jobs to access data in the MySQL database using either of the following methods:Solution 1: Buy a queue, create an enhanced datasource connection, a",
"product_code":"dli",
"title":"How Does a Spark Job Access a MySQL Database?",
"uri":"dli_03_0127.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"220"
},
{
"desc":"When shuffle statements, such as GROUP BY and JOIN, are executed in Spark jobs, data skew occurs, which slows down the job execution.To solve this problem, you can config",
"product_code":"dli",
"title":"How Do I Use JDBC to Set the spark.sql.shuffle.partitions Parameter to Improve the Task Concurrency?",
"uri":"dli_03_0068.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"221"
},
{
"desc":"You can use SparkFiles to read the file submitted using –-file form a local path: SparkFiles.get(\"Name of the uploaded file\").The file path in the Driver is different fro",
"product_code":"dli",
"title":"How Do I Read Uploaded Files for a Spark Jar Job?",
"uri":"dli_03_0118.html",
"doc_type":"usermanual",
"p_code":"214",
"code":"222"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Job O&M Errors",
"uri":"dli_03_0218.html",
"doc_type":"usermanual",
"p_code":"211",
"code":"223"
},
{
"desc":"The following error is reported when a Spark job accesses OBS data:Set the AK/SK to enable Spark jobs to access OBS data. For details, see How Do I Set the AK/SK for a Qu",
"product_code":"dli",
"title":"Why Are Errors \"ResponseCode: 403\" and \"ResponseStatus: Forbidden\" Reported When a Spark Job Accesses OBS Data?",
"uri":"dli_03_0156.html",
"doc_type":"usermanual",
"p_code":"223",
"code":"224"
},
{
"desc":"Check whether the OBS bucket is used to store DLI logs on the Global Configuration > Job Configurations page. The job log bucket cannot be used for other purpose.",
"product_code":"dli",
"title":"Why Is Error \"verifyBucketExists on XXXX: status [403]\" Reported When I Use a Spark Job to Access an OBS Bucket That I Have Access Permission?",
"uri":"dli_03_0164.html",
"doc_type":"usermanual",
"p_code":"223",
"code":"225"
},
{
"desc":"When a Spark job accesses a large amount of data, for example, accessing data in a GaussDB(DWS) database, you are advised to set the number of concurrent tasks and enable",
"product_code":"dli",
"title":"Why Is a Job Running Timeout Reported When a Spark Job Runs a Large Amount of Data?",
"uri":"dli_03_0157.html",
"doc_type":"usermanual",
"p_code":"223",
"code":"226"
},
{
"desc":"Spark jobs cannot access SFTP. Upload the files you want to access to OBS and then you can analyze the data using Spark jobs.",
"product_code":"dli",
"title":"Why Does the Job Fail to Be Executed and the Log Shows that the File Directory Is Abnormal When I Use a Spark Job to Access Files in SFTP?",
"uri":"dli_03_0188.html",
"doc_type":"usermanual",
"p_code":"223",
"code":"227"
},
{
"desc":"When a Spark job is running, an error message is displayed, indicating that the user does not have the database permission. The error information is as follows:org.apache",
"product_code":"dli",
"title":"Why Does the Job Fail to Be Executed Due to Insufficient Database and Table Permissions?",
"uri":"dli_03_0192.html",
"doc_type":"usermanual",
"p_code":"223",
"code":"228"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"O&M Guide",
"uri":"dli_03_0219.html",
"doc_type":"usermanual",
"p_code":"211",
"code":"229"
},
{
"desc":"I cannot find the specified Python environment after adding the Python 3 package.Set spark.yarn.appMasterEnv.PYSPARK_PYTHON to python3 in the conf file to specify the Pyt",
"product_code":"dli",
"title":"Why Can't I Find the Specified Python Environment After Adding the Python Package?",
"uri":"dli_03_0077.html",
"doc_type":"usermanual",
"p_code":"229",
"code":"230"
},
{
"desc":"The remaining CUs in the queue may be insufficient. As a result, the job cannot be submitted.To view the remaining CUs of a queue, perform the following steps:Check the C",
"product_code":"dli",
"title":"Why Is a Spark Jar Job Stuck in the Submitting State?",
"uri":"dli_03_0220.html",
"doc_type":"usermanual",
"p_code":"229",
"code":"231"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Product Consultation",
"uri":"dli_03_0001.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"232"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Usage",
"uri":"dli_03_0221.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"233"
},
{
"desc":"DLI supports the following data formats:ParquetCSVORCJsonAvro",
"product_code":"dli",
"title":"Which Data Formats Does DLI Support?",
"uri":"dli_03_0025.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"234"
},
{
"desc":"The Spark component of DLI is a fully managed service. You can only use the DLI Spark through its APIs. .The Spark component of MRS is built on the VM in an MRS cluster. ",
"product_code":"dli",
"title":"What Are the Differences Between MRS Spark and DLI Spark?",
"uri":"dli_03_0115.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"235"
},
{
"desc":"DLI data can be stored in either of the following:OBS: Data used by SQL jobs, Spark jobs, and Flink jobs can be stored in OBS, reducing storage costs.DLI: The column-base",
"product_code":"dli",
"title":"Where Can DLI Data Be Stored?",
"uri":"dli_03_0029.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"236"
},
{
"desc":"DLI tables store data within the DLI service, and you do not need to know the data storage path.OBS tables store data in your OBS buckets, and you need to manage the sour",
"product_code":"dli",
"title":"What Are the Differences Between DLI Tables and OBS Tables?",
"uri":"dli_03_0117.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"237"
},
{
"desc":"Currently, DLI supports analysis only on the data uploaded to the cloud. In scenarios where regular (for example, on a per day basis) one-off analysis on incremental data",
"product_code":"dli",
"title":"How Can I Use DLI If Data Is Not Uploaded to OBS?",
"uri":"dli_03_0010.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"238"
},
{
"desc":"Data in the OBS bucket shared by IAM users under the same account can be imported. You cannot import data in the OBS bucket shared with other IAM account.",
"product_code":"dli",
"title":"Can I Import OBS Bucket Data Shared by Other Tenants into DLI?",
"uri":"dli_03_0129.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"239"
},
{
"desc":"Log in to the management console.Click in the upper left corner and select a region and a project.Click the My Quota icon in the upper right corner of the page.The Serv",
"product_code":"dli",
"title":"Why Is Error \"Failed to create the database. {\"error_code\":\"DLI.1028\";\"error_msg\":\"Already reached the maximum quota of databases:XXX\".\" Reported?",
"uri":"dli_03_0264.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"240"
},
{
"desc":"No, a global variable can only be used by the user who created it. Global variables can be used to simplify complex parameters. For example, long and difficult variables ",
"product_code":"dli",
"title":"Can a Member Account Use Global Variables Created by Other Member Accounts?",
"uri":"dli_03_0263.html",
"doc_type":"usermanual",
"p_code":"233",
"code":"241"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Job Management",
"uri":"dli_03_0222.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"242"
},
{
"desc":"If you are suggested to perform following operations to run a large number of DLI jobs:Group the DLI jobs by type, and run each group on a queue.Alternatively, create IAM",
"product_code":"dli",
"title":"How Do I Manage Tens of Thousands of Jobs Running on DLI?",
"uri":"dli_03_0126.html",
"doc_type":"usermanual",
"p_code":"242",
"code":"243"
},
{
"desc":"The field names of tables that have been created cannot be changed.You can create a table, define new table fields, and migrate data from the old table to the new one.",
"product_code":"dli",
"title":"How Do I Change the Name of a Field in a Created Table?",
"uri":"dli_03_0162.html",
"doc_type":"usermanual",
"p_code":"242",
"code":"244"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Privacy and Security",
"uri":"dli_03_0261.html",
"doc_type":"usermanual",
"p_code":"232",
"code":"245"
},
{
"desc":"No. The spark.acls.enable configuration item is not used in DLI. The Apache Spark command injection vulnerability (CVE-2022-33891) does not exist in DLI.",
"product_code":"dli",
"title":"Does DLI Have the Apache Spark Command Injection Vulnerability (CVE-2022-33891)?",
"uri":"dli_03_0260.html",
"doc_type":"usermanual",
"p_code":"245",
"code":"246"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Quota",
"uri":"dli_03_0053.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"247"
},
{
"desc":"Log in to the management console.Click in the upper left corner and select Region and Project.Click (the My Quotas icon) in the upper right corner.The Service Quota pag",
"product_code":"dli",
"title":"How Do I View My Quotas?",
"uri":"dli_03_0031.html",
"doc_type":"usermanual",
"p_code":"247",
"code":"248"
},
{
"desc":"The system does not support online quota adjustment. To increase a resource quota, dial the hotline or send an email to the customer service. We will process your applica",
"product_code":"dli",
"title":"How Do I Increase a Quota?",
"uri":"dli_03_0032.html",
"doc_type":"usermanual",
"p_code":"247",
"code":"249"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Permission",
"uri":"dli_03_0054.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"250"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Usage",
"uri":"dli_03_0223.html",
"doc_type":"usermanual",
"p_code":"250",
"code":"251"
},
{
"desc":"DLI has a comprehensive permission control mechanism and supports fine-grained authentication through Identity and Access Management (IAM). You can create policies in IAM",
"product_code":"dli",
"title":"How Do I Manage Fine-Grained DLI Permissions?",
"uri":"dli_03_0100.html",
"doc_type":"usermanual",
"p_code":"251",
"code":"252"
},
{
"desc":"You cannot perform permission-related operations on the partition column of a partitioned table.However, when you grant the permission of any non-partition column in a pa",
"product_code":"dli",
"title":"What Is Column Permission Granting of a DLI Partition Table?",
"uri":"dli_03_0008.html",
"doc_type":"usermanual",
"p_code":"251",
"code":"253"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"O&M Guide",
"uri":"dli_03_0226.html",
"doc_type":"usermanual",
"p_code":"250",
"code":"254"
},
{
"desc":"When you submit a job, a message is displayed indicating that the job fails to be submitted due to insufficient permission caused by arrears. In this case, you need to ch",
"product_code":"dli",
"title":"Why Does My Account Have Insufficient Permissions Due to Arrears?",
"uri":"dli_03_0140.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"255"
},
{
"desc":"When the user update an existing program package, the following error information is displayed:\"error_code\"*DLI.0003\",\"error_msg\":\"Permission denied for resource 'resourc",
"product_code":"dli",
"title":"Why Does the System Display a Message Indicating Insufficient Permissions When I Update a Program Package?",
"uri":"dli_03_0195.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"256"
},
{
"desc":"When the SQL query statement is executed, the system displays a message indicating that the user does not have the permission to query resources.Error information: DLI.00",
"product_code":"dli",
"title":"Why Is Error \"DLI.0003: Permission denied for resource...\" Reported When I Run a SQL Statement?",
"uri":"dli_03_0227.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"257"
},
{
"desc":"The table permission has been granted and verified. However, after a period of time, an error is reported indicating that the table query fails.There are two possible rea",
"product_code":"dli",
"title":"Why Can't I Query Table Data After I've Been Granted Table Permissions?",
"uri":"dli_03_0228.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"258"
},
{
"desc":"If a table inherits database permissions, you do not need to regrant the inherited permissions to the table.When you grant permissions on a table on the console:If you se",
"product_code":"dli",
"title":"Will an Error Be Reported if the Inherited Permissions Are Regranted to a Table That Inherits Database Permissions?",
"uri":"dli_03_0057.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"259"
},
{
"desc":"User A created Table1.User B created View1 based on Table1.After the Select Table permission on Table1 is granted to user C, user C fails to query View1.User B does not h",
"product_code":"dli",
"title":"Why Can't I Query a View After I'm Granted the Select Table Permission on the View?",
"uri":"dli_03_0067.html",
"doc_type":"usermanual",
"p_code":"254",
"code":"260"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Queue",
"uri":"dli_03_0049.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"261"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Usage",
"uri":"dli_03_0229.html",
"doc_type":"usermanual",
"p_code":"261",
"code":"262"
},
{
"desc":"Currently, you are not allowed to modify the description of a created queue. You can add the description when purchasing the queue.",
"product_code":"dli",
"title":"Does the Description of a DLI Queue Can Be Modified?",
"uri":"dli_03_0109.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"263"
},
{
"desc":"Deleting a queue does not cause table data loss in your database.",
"product_code":"dli",
"title":"Will Table Data in My Database Be Lost If I Delete a Queue?",
"uri":"dli_03_0166.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"264"
},
{
"desc":"You need to develop a mechanism to retry failed jobs. When a faulty queue is recovered, your application tries to submit the failed jobs to the queue again.",
"product_code":"dli",
"title":"How Does DLI Ensure the Reliability of Spark Jobs When a Queue Is Abnormal?",
"uri":"dli_03_0170.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"265"
},
{
"desc":"DLI allows you to subscribe to an SMN topic for failed jobs.Log in to the DLI console.In the navigation pane on the left, choose Queue Management.On the Queue Management ",
"product_code":"dli",
"title":"How Do I Monitor Queue Exceptions?",
"uri":"dli_03_0098.html",
"doc_type":"usermanual",
"p_code":"262",
"code":"266"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"O&M Guide",
"uri":"dli_03_0230.html",
"doc_type":"usermanual",
"p_code":"261",
"code":"267"
},
{
"desc":"To check the running status of the DLI queue and determine whether to run more jobs on that queue, you need to check the queue load.Search for Cloud Eye on the console.In",
"product_code":"dli",
"title":"How Do I View DLI Queue Load?",
"uri":"dli_03_0095.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"268"
},
{
"desc":"You need to check the large number of jobs in the Submitting and Running states on the queue.Use Cloud Eye to view jobs in different states on the queue. The procedure is",
"product_code":"dli",
"title":"How Do I Determine Whether There Are Too Many Jobs in the Current Queue?",
"uri":"dli_03_0183.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"269"
},
{
"desc":"Currently, DLI provides two types of queues, For SQL and For general use. SQL queues are used to run SQL jobs. General-use queues are compatible with Spark queues of earl",
"product_code":"dli",
"title":"How Do I Switch an Earlier-Version Spark Queue to a General-Purpose Queue?",
"uri":"dli_03_0065.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"270"
},
{
"desc":"DLI queues do not use resources or bandwidth when no job is running. In this case, the running status of DLI queues is not displayed on CES.",
"product_code":"dli",
"title":"Why Cannot I View the Resource Running Status of DLI Queues on Cloud Eye?",
"uri":"dli_03_0193.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"271"
},
{
"desc":"In DLI, 64 CU = 64 cores and 256 GB memory.In a Spark job, if the driver occupies 4 cores and 16 GB memory, the executor can occupy 60 cores and 240 GB memory.",
"product_code":"dli",
"title":"How Do I Allocate Queue Resources for Running Spark Jobs If I Have Purchased 64 CUs?",
"uri":"dli_03_0088.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"272"
},
{
"desc":"Queue plans create failed. The plan xxx target cu is out of quota is displayed when you create a scheduled scaling task.The CU quota of the current account is insufficien",
"product_code":"dli",
"title":"Why Is Error \"Queue plans create failed. The plan xxx target cu is out of quota\" Reported When I Schedule CU Changes?",
"uri":"dli_03_0159.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"273"
},
{
"desc":"After a SQL job was submitted to the default queue, the job runs abnormally. The job log reported that the execution timed out. The exception logs are as follows:[ERROR] ",
"product_code":"dli",
"title":"Why Is a Timeout Exception Reported When a DLI SQL Statement Fails to Be Executed on the Default Queue?",
"uri":"dli_03_0171.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"274"
},
{
"desc":"In daily big data analysis work, it is important to allocate and manage compute resources properly to provide a good job execution environment.You can allocate resources ",
"product_code":"dli",
"title":"How Can I Check the Actual and Used CUs for an Elastic Resource Pool as Well as the Required CUs for a Job?",
"uri":"dli_03_0276.html",
"doc_type":"usermanual",
"p_code":"267",
"code":"275"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Datasource Connections",
"uri":"dli_03_0022.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"276"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Datasource Connections",
"uri":"dli_03_0110.html",
"doc_type":"usermanual",
"p_code":"276",
"code":"277"
},
{
"desc":"You need to create a VPC peering connection to enable network connectivity. Take MRS as an example. If DLI and MRS clusters are in the same VPC, and the security group is",
"product_code":"dli",
"title":"Why Do I Need to Create a VPC Peering Connection for an Enhanced Datasource Connection?",
"uri":"dli_03_0128.html",
"doc_type":"usermanual",
"p_code":"277",
"code":"278"
},
{
"desc":"An enhanced datasource connection failed to pass the network connectivity test. Datasource connection cannot be bound to a queue. The following error information is displ",
"product_code":"dli",
"title":"Failed to Bind a Queue to an Enhanced Datasource Connection",
"uri":"dli_03_0237.html",
"doc_type":"usermanual",
"p_code":"277",
"code":"279"
},
{
"desc":"The outbound rule had been configured for the security group of the queue associated with the enhanced datasource connection. The datasource authentication used a passwor",
"product_code":"dli",
"title":"DLI Failed to Connect to GaussDB(DWS) Through an Enhanced Datasource Connection",
"uri":"dli_03_0238.html",
"doc_type":"usermanual",
"p_code":"277",
"code":"280"
},
{
"desc":"A datasource connection is created and bound to a queue. The connectivity test fails and the following error information is displayed:failed to connect to specified addre",
"product_code":"dli",
"title":"How Do I Do if the Datasource Connection Is Created But the Network Connectivity Test Fails?",
"uri":"dli_03_0179.html",
"doc_type":"usermanual",
"p_code":"277",
"code":"281"
},
{
"desc":"Configuring the Connection Between a DLI Queue and a Data Source in a Private NetworkIf your DLI job needs to connect to a data source, for example, MRS, RDS, CSS, Kafka,",
"product_code":"dli",
"title":"How Do I Configure the Network Between a DLI Queue and a Data Source?",
"uri":"dli_03_0186.html",
"doc_type":"usermanual",
"p_code":"277",
"code":"282"
},
{
"desc":"The possible causes and solutions are as follows:If you have created a queue, do not bind it to a datasource connection immediately. Wait for 5 to 10 minutes. After the c",
"product_code":"dli",
"title":"What Can I Do If a Datasource Connection Is Stuck in Creating State When I Try to Bind a Queue to It?",
|
||
"uri":"dli_03_0257.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"277",
|
||
"code":"283"
|
||
},
|
||
{
|
||
"desc":"DLI enhanced datasource connection uses VPC peering to directly connect the VPC networks of the desired data sources for point-to-point data exchanges.",
|
||
"product_code":"dli",
|
||
"title":"How Do I Connect DLI to Data Sources?",
|
||
"uri":"dli_03_0259.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"277",
|
||
"code":"284"
|
||
},
|
||
{
|
||
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
|
||
"product_code":"dli",
|
||
"title":"Cross-Source Analysis",
|
||
"uri":"dli_03_0112.html",
|
||
"doc_type":"usermanual",
|
||
"p_code":"276",
|
||
"code":"285"
|
||
},
|
||
{
|
||
"desc":"To perform query on data stored on services rather than DLI, perform the following steps:Assume that the data to be queried is stored on multiple services (for example, O",
"product_code":"dli",
"title":"How Can I Perform Query on Data Stored on Services Rather Than DLI?",
"uri":"dli_03_0011.html",
"doc_type":"usermanual",
"p_code":"285",
"code":"286"
},
{
"desc":"Connect VPCs in different regions. Create an enhanced datasource connection on DLI and bind it to a queue. Add a DLI route.",
"product_code":"dli",
"title":"How Can I Access Data Across Regions?",
"uri":"dli_03_0085.html",
"doc_type":"usermanual",
"p_code":"285",
"code":"287"
},
{
"desc":"When data is inserted into DLI, set the ID field to NULL.",
"product_code":"dli",
"title":"How Do I Set the Auto-increment Primary Key or Other Fields That Are Automatically Filled in the RDS Table When Creating a DLI and Associating It with the RDS Table?",
"uri":"dli_03_0028.html",
"doc_type":"usermanual",
"p_code":"285",
"code":"288"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Datasource Connection O&M",
"uri":"dli_03_0256.html",
"doc_type":"usermanual",
"p_code":"276",
"code":"289"
},
{
"desc":"Possible CausesThe network connectivity is abnormal. Check whether the security group is correctly selected and whether the VPC is correctly configured.The network connec",
"product_code":"dli",
"title":"Why Is the Error Message \"communication link failure\" Displayed When I Use a Newly Activated Datasource Connection?",
"uri":"dli_03_0047.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"290"
},
{
"desc":"The cluster host information is not added to the datasource connection. As a result, the KRB authentication fails, the connection times out, and no error is recorded in l",
"product_code":"dli",
"title":"Connection Times Out During MRS HBase Datasource Connection, and No Error Is Recorded in Logs",
"uri":"dli_03_0080.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"291"
},
{
"desc":"When you create a VPC peering connection for the datasource connection, the following error information is displayed:Before you create a datasource connection, check whet",
"product_code":"dli",
"title":"Why Can't I Find the Subnet When Creating a DLI Datasource Connection?",
"uri":"dli_03_0111.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"292"
},
{
"desc":"A datasource RDS table was created in the DataArts Studio, and the insert overwrite statement was executed to write data into RDS. DLI.0999: BatchUpdateException: Incorre",
"product_code":"dli",
"title":"Error Message \"Incorrect string value\" Is Displayed When insert overwrite Is Executed on a Datasource RDS Table",
"uri":"dli_03_0239.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"293"
},
{
"desc":"The system failed to create a datasource RDS table, and a null pointer error was reported. The following table creation statement was used: The RDS database is in a PostGre c",
"product_code":"dli",
"title":"Null Pointer Error Is Displayed When the System Creates a Datasource RDS Table",
"uri":"dli_03_0250.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"294"
},
{
"desc":"The system failed to execute insert overwrite on the datasource GaussDB(DWS) table, and org.postgresql.util.PSQLException: ERROR: tuple concurrently updated was displayed",
"product_code":"dli",
"title":"Error Message \"org.postgresql.util.PSQLException: ERROR: tuple concurrently updated\" Is Displayed When the System Executes insert overwrite on a Datasource GaussDB(DWS) Table",
"uri":"dli_03_0251.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"295"
},
{
"desc":"A datasource table was used to import data to a CloudTable HBase table. This HBase table contains a column family and a rowkey for 100 million simulating data records. Th",
"product_code":"dli",
"title":"RegionTooBusyException Is Reported When Data Is Imported to a CloudTable HBase Table Through a Datasource Table",
"uri":"dli_03_0252.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"296"
},
{
"desc":"A table was created on GaussDB(DWS) and then a datasource connection was created on DLI to read and write data. An error message was displayed during data writing, indica",
"product_code":"dli",
"title":"A Null Value Is Written Into a Non-Null Field When a DLI Datasource Connection Is Used to Connect to a GaussDB(DWS) Table",
"uri":"dli_03_0253.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"297"
},
{
"desc":"A datasource GaussDB(DWS) table and the datasource connection were created in DLI, and the schema of the source table in GaussDB(DWS) were updated. During the job executi",
"product_code":"dli",
"title":"An Insert Operation Failed After the Schema of the GaussDB(DWS) Source Table Is Updated",
"uri":"dli_03_0254.html",
"doc_type":"usermanual",
"p_code":"289",
"code":"298"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"APIs",
"uri":"dli_03_0056.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"299"
},
{
"desc":"In the REST API provided by DLI, the request header can be added to the request URI, for example, Content-Type. Content-Type indicates the request body type or format. The",
"product_code":"dli",
"title":"Why Is Error \"unsupported media Type\" Reported When I Subimt a SQL Job?",
"uri":"dli_03_0060.html",
"doc_type":"usermanual",
"p_code":"299",
"code":"300"
},
{
"desc":"When different IAM users call an API under the same enterprise project in the same region, the project ID is the same.",
"product_code":"dli",
"title":"Is the Project ID Fixed when Different IAM Users Call an API?",
"uri":"dli_03_0125.html",
"doc_type":"usermanual",
"p_code":"299",
"code":"301"
},
{
"desc":"When the API call for submitting a SQL job times out, and the following error information is displayed:There are currently no resources tracked in the state, so there is ",
"product_code":"dli",
"title":"What Can I Do If an Error Is Reported When the Execution of the API for Creating a SQL Job Times Out?",
"uri":"dli_03_0178.html",
"doc_type":"usermanual",
"p_code":"299",
"code":"302"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"SDKs",
"uri":"dli_03_0058.html",
"doc_type":"usermanual",
"p_code":"123",
"code":"303"
},
{
"desc":"When you query the SQL job results using SDK, the system checks the job status when the job is submitted. The timeout interval set in the system is 300s. If the job is no",
"product_code":"dli",
"title":"How Do I Set the Timeout Duration for Querying SQL Job Results Using SDK?",
"uri":"dli_03_0073.html",
"doc_type":"usermanual",
"p_code":"303",
"code":"304"
},
{
"desc":"Run the ping command to check whether dli.xxx can be accessed.If dli.xxx can be accessed, check whether DNS resolution is correctly configured.If dli.xxx can be accessed,",
"product_code":"dli",
"title":"How Do I Handle the dli.xxx,unable to resolve host address Error?",
"uri":"dli_03_0255.html",
"doc_type":"usermanual",
"p_code":"303",
"code":"305"
},
{
"desc":"HUAWEI CLOUD Help Center presents technical documents to help you quickly get started with HUAWEI CLOUD services. The technical documents include Service Overview, Price Details, Purchase Guide, User Guide, API Reference, Best Practices, FAQs, and Videos.",
"product_code":"dli",
"title":"Change History",
"uri":"dli_01_00006.html",
"doc_type":"usermanual",
"p_code":"",
"code":"306"
}
]