Before creating a data migration job, create a link to enable the CDM cluster to read data from and write data to a data source. A migration job requires a source link and a destination link. For details on the data sources that can be exported (source links) and imported (destination links) in different migration modes (table/file migration), see Supported Data Sources.
The link configurations depend on the data source. This section describes how to create these links.
If the connected data source changes (for example, the MRS cluster capacity is expanded), you need to edit and save the link.
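Link parameters differ between database types largely because each type uses its own JDBC URL format. As a rough illustration only (these are standard JDBC URL conventions, not CDM-specific settings; the host, port, and database values are placeholders):

```python
# Illustrative only: standard JDBC URL templates for a few common
# database types. The values filled in below are placeholders, not
# real link parameters.
JDBC_TEMPLATES = {
    "mysql": "jdbc:mysql://{host}:{port}/{database}",
    "oracle": "jdbc:oracle:thin:@{host}:{port}/{service}",
    "postgresql": "jdbc:postgresql://{host}:{port}/{database}",
}

def jdbc_url(db_type: str, **params: str) -> str:
    """Fill in the JDBC URL template for the given database type."""
    try:
        template = JDBC_TEMPLATES[db_type]
    except KeyError:
        raise ValueError(f"unsupported database type: {db_type!r}")
    return template.format(**params)

print(jdbc_url("mysql", host="192.168.0.10", port="3306", database="sales"))
# jdbc:mysql://192.168.0.10:3306/sales
```

Because the URL shape and driver class are the only structural differences for many relational databases, their link parameters can share one description, as the table below notes.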
The connectors are classified based on the type of the data source to be connected. All supported data source types are listed in the following table.
| Connector | Description |
| --- | --- |
| Common relational databases | Because the JDBC drivers used to connect to these relational databases are the same, the parameters to be configured are also the same. For details, see Link to a Common Relational Database. |
| MySQL | For details about the parameters, see Link to a MySQL Database. |
| Oracle | For details about the parameters, see Link to an Oracle Database. |
| Database Sharding | For details about the parameters, see Link to a Database Shard. |
| Object Storage Service (OBS) | For details about the parameters, see Link to OBS. |
| HDFS | If the data source is HDFS of MRS, Apache Hadoop, or FusionInsight HD, see Link to HDFS. |
| HBase | If the data source is HBase of MRS, Apache Hadoop, or FusionInsight HD, see Link to HBase. |
| Hive | If the data source is Hive of MRS, Apache Hadoop, or FusionInsight HD, see Link to Hive. |
| CloudTable Service | If the data source is CloudTable, see Link to CloudTable. |
| FTP/SFTP | If the data source is an FTP or SFTP server, see Link to an FTP or SFTP Server. |
| HTTP | These connectors read files addressed by an HTTP/HTTPS URL, such as public files on a third-party object storage system or a web disk. When creating an HTTP link, you only need to configure the link name; the URL is configured during job creation. |
| MongoDB | If the data source is a local MongoDB database, see Link to MongoDB. |
| Document Database Service (DDS) | If the data source is DDS, see Link to DDS. |
| Redis/DCS | If the data source is Redis or DCS, see Link to Redis/DCS. |
| Kafka | If the data source is MRS Kafka or Apache Kafka, see Link to Kafka. |
| Cloud Search Service (CSS)/Elasticsearch | If the data source is CSS or Elasticsearch, see Link to Elasticsearch/CSS. |
| Data Lake Insight (DLI) | If the data source is DLI, see Link to DLI. |
| DMS Kafka | If the data source is DMS Kafka, see Link to DMS Kafka. |
| Cassandra | If the data source is Cassandra, see Link to Cassandra. |
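The HTTP connector's job, reading a file addressed by an HTTP/HTTPS URL, is conceptually equivalent to the following sketch. This is plain Python standard-library code for illustration, not a CDM API; the function name is an assumption:

```python
from urllib.parse import urlsplit
from urllib.request import urlopen

def read_public_file(url: str, timeout: float = 30.0) -> bytes:
    """Download a publicly readable file over HTTP/HTTPS.

    Mirrors the HTTP connector conceptually: the URL is supplied per
    job, so it is a call parameter here rather than part of a stored
    link configuration.
    """
    scheme = urlsplit(url).scheme
    if scheme not in ("http", "https"):
        raise ValueError(f"unsupported scheme: {scheme!r}")
    with urlopen(url, timeout=timeout) as resp:
        return resp.read()
```

This also shows why the link itself holds no URL: only the scheme restriction (HTTP/HTTPS) is fixed, while the target file changes from job to job.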
If the network connection is poor or the data source contains a large amount of data, the link test may take 30 to 60 seconds.
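Before running a slow link test, it can save time to first confirm that the data source's host and port are reachable at the TCP level from a machine in the same network as the CDM cluster. A minimal pre-check sketch (the host and port in the example comment are placeholders):

```python
import socket

def is_reachable(host: str, port: int, timeout: float = 5.0) -> bool:
    """Return True if a TCP connection to host:port succeeds within timeout.

    A quick network sanity check only; it does not validate
    credentials or protocol-level access to the data source.
    """
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Example: check a MySQL endpoint before creating the link.
# is_reachable("192.168.0.10", 3306)
```

If this check fails, the link test will fail too, so fixing network connectivity (security groups, routes, firewall rules) first avoids waiting out the longer test.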
Before managing a link, ensure that the link is not used by any job, to avoid affecting running jobs. The procedure for managing links is as follows: