CREATE SERVER creates an external server.
An external server stores the connection information of HDFS clusters, OBS servers, DLI connections, or other homogeneous clusters.
By default, only the system administrator can create a foreign server. Otherwise, creating a server requires USAGE permission on the foreign data wrapper being used. Use the following syntax to grant the permission:
GRANT USAGE ON FOREIGN DATA WRAPPER fdw_name TO username;
fdw_name is the name of the foreign data wrapper, and username is the name of the user who will create the server.
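For example, to allow a hypothetical user jack (an illustrative name) to create servers based on hdfs_fdw:
GRANT USAGE ON FOREIGN DATA WRAPPER hdfs_fdw TO jack;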
Syntax
CREATE SERVER server_name FOREIGN DATA WRAPPER fdw_name
    OPTIONS ( { option_name 'value' } [, ...] );
Parameters
server_name
Name of the foreign server to be created. The server name must be unique within a database.
Value range: a string of no more than 63 characters.
fdw_name
Specifies the name of the foreign data wrapper.
Value range: foreign data wrappers created during database initialization. Currently, fdw_name can be hdfs_fdw or dfs_fdw (used for HDFS clusters, OBS, and DLI), or gc_fdw (used for other homogeneous clusters).
OPTIONS
Specifies the options for the server. The detailed descriptions are as follows:
address
Specifies the endpoint of the OBS service or the address of the HDFS cluster.
OBS: address is the endpoint of the OBS service.
HDFS: address is the IP address and port number of a NameNode (metadata node) in the HDFS cluster. For other homogeneous clusters, it is the IP address and port number of a CN.
HDFS NameNodes are deployed in primary/secondary mode for HA. Add the addresses of both the primary and secondary NameNodes to address; when accessing HDFS, GaussDB(DWS) dynamically locates the active NameNode.
The address option is mandatory.
If the server type is DLI, address is the address of the OBS data stored on DLI.
hdfscfgpath
This option is available only when type is HDFS.
Set hdfscfgpath to the path of the HDFS configuration files. GaussDB(DWS) accesses the HDFS cluster using the connection mode and security mode specified in the configuration files stored in that path. If the HDFS cluster is connected in non-secure mode, data transmission encryption is not supported.
If the address option is not specified, the addresses recorded in the configuration files under hdfscfgpath are used by default (a sketch of this usage appears after the first example below).
encrypt
Specifies whether data is encrypted. This option is available only when type is OBS. The default value is off.
Value range:
on: data is encrypted.
off: data is not encrypted.
(A sketch with encryption enabled appears after the OBS example below.)
access_key
Specifies the access key (AK) used for the OBS access protocol (obtained by users from the OBS console). When a foreign table is created, the AK value is encrypted and saved to the metadata table of the database. This option is available only when type is OBS.
secret_access_key
Specifies the secret access key (SK) used for the OBS access protocol (obtained by users from the OBS console). When a foreign table is created, the SK value is encrypted and saved to the metadata table of the database. This option is available only when type is OBS.
type
Specifies the dfs_fdw connection type.
Value range:
OBS: data is stored in OBS.
HDFS: data is stored in an HDFS cluster.
DLI: data is stored on DLI.
dli_address
Specifies the endpoint of the DLI service. This option is available only when type is DLI.
dli_access_key
Specifies the access key (AK) used for the DLI access protocol (obtained by users from the DLI console). When a foreign table is created, the AK value is encrypted and saved to the metadata table of the database. This option is available only when type is DLI.
dli_secret_access_key
Specifies the secret access key (SK) used for the DLI access protocol (obtained by users from the DLI console). When a foreign table is created, the SK value is encrypted and saved to the metadata table of the database. This option is available only when type is DLI.
dbname
Specifies the name of the database in the remote cluster to be connected. This option is used for collaborative analysis.
username
Specifies the username for the remote cluster to be connected. This option is used for collaborative analysis.
password
Specifies the user password for the remote cluster to be connected. This option is used for collaborative analysis.
When an on-premises cluster is migrated to the cloud, the password in the server definition exported from the on-premises cluster is in ciphertext. Because the encryption and decryption keys of the on-premises cluster differ from those of the cloud cluster, executing that CREATE SERVER statement on the cloud cluster fails with a decryption error. In this case, manually replace the password in the CREATE SERVER statement with the plaintext password.
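As an illustrative sketch of this manual fix (all names and values are placeholders), substitute the plaintext password for the exported ciphertext before executing the statement on the cloud cluster:
-- Exported from the on-premises cluster; the ciphertext password cannot be decrypted by the cloud cluster:
-- CREATE SERVER server_remote FOREIGN DATA WRAPPER GC_FDW OPTIONS ( ..., password '<ciphertext>' );
-- Rewritten with the plaintext password for the cloud cluster:
CREATE SERVER server_remote FOREIGN DATA WRAPPER GC_FDW
OPTIONS (
    address '10.10.0.100:25000,10.10.0.101:25000',
    dbname 'test',
    username 'test',
    password 'plaintext_password'
);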
Examples
Create a server named hdfs_server, using hdfs_fdw as the built-in foreign data wrapper.
CREATE SERVER hdfs_server FOREIGN DATA WRAPPER HDFS_FDW
OPTIONS (
    address '10.10.0.100:25000,10.10.0.101:25000',
    hdfscfgpath '/opt/hadoop_client/HDFS/hadoop/etc/hadoop',
    type 'HDFS'
);
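As a variation of the preceding example, a minimal sketch of a server definition that omits address and relies on the NameNode addresses recorded in the configuration files under hdfscfgpath (the server name hdfs_server_cfg is illustrative):
CREATE SERVER hdfs_server_cfg FOREIGN DATA WRAPPER HDFS_FDW
OPTIONS (
    hdfscfgpath '/opt/hadoop_client/HDFS/hadoop/etc/hadoop',
    type 'HDFS'
);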
Create a server named obs_server, using dfs_fdw as the built-in foreign data wrapper.
CREATE SERVER obs_server FOREIGN DATA WRAPPER DFS_FDW
OPTIONS (
    address 'obs.xxx.xxx.com',
    access_key 'xxxxxxxxx',
    secret_access_key 'yyyyyyyyyyyyy',
    type 'obs'
);
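As a sketch building on the preceding example, and assuming the on/off value range described for the encrypt option, an OBS server with transmission encryption enabled might be defined as follows (the server name and credentials are placeholders):
CREATE SERVER obs_server_encrypted FOREIGN DATA WRAPPER DFS_FDW
OPTIONS (
    address 'obs.xxx.xxx.com',
    access_key 'xxxxxxxxx',
    secret_access_key 'yyyyyyyyyyyyy',
    type 'obs',
    encrypt 'on'
);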
Create a server named dli_server, using dfs_fdw as the built-in foreign data wrapper.
CREATE SERVER dli_server FOREIGN DATA WRAPPER DFS_FDW
OPTIONS (
    address 'obs.xxx.xxx.com',
    access_key 'xxxxxxxxx',
    secret_access_key 'yyyyyyyyyyyyy',
    type 'dli',
    dli_address 'dli.xxx.xxx.com',
    dli_access_key 'xxxxxxxxx',
    dli_secret_access_key 'yyyyyyyyyyyyy'
);
Create a server named server_remote for another homogeneous cluster, using gc_fdw as the foreign data wrapper in the database.
CREATE SERVER server_remote FOREIGN DATA WRAPPER GC_FDW
OPTIONS (
    address '10.10.0.100:25000,10.10.0.101:25000',
    dbname 'test',
    username 'test',
    password 'xxxxxxxx'
);
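To confirm that a server was created and to inspect its stored options, you can query the pg_foreign_server system catalog (server_remote is the name used in the preceding example):
SELECT srvname, srvoptions
FROM pg_foreign_server
WHERE srvname = 'server_remote';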
Helpful Links