


Use the CREATE SERVER command to define a connection to a foreign server. The syntax is:
CREATE SERVER server_name FOREIGN DATA WRAPPER hdfs_fdw
[OPTIONS (option 'value' [, ...])]
The role that defines the server is the owner of the server; use the ALTER SERVER command to reassign ownership of a foreign server. To create a foreign server, you must have USAGE privilege on the foreign-data wrapper specified in the CREATE SERVER command.
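The ownership and privilege rules above can be sketched as follows; the role name hdfs_admin and the server name hdfs_server are hypothetical placeholders, not names from this document:

```sql
-- Grant the privilege required to create a foreign server
-- (hdfs_admin is a hypothetical role).
GRANT USAGE ON FOREIGN DATA WRAPPER hdfs_fdw TO hdfs_admin;

-- Later, reassign ownership of an existing foreign server
-- (hdfs_server is a hypothetical server name).
ALTER SERVER hdfs_server OWNER TO hdfs_admin;
```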
Use server_name to specify a name for the foreign server. The server name must be unique within the database.
Include the FOREIGN DATA WRAPPER clause to specify that the server should use the hdfs_fdw foreign data wrapper when connecting to the cluster.
Use the OPTIONS clause of the CREATE SERVER command to specify connection information for the foreign server. You can include:
client_type
Specify hiveserver2 or spark as the client type. To use the ANALYZE statement on Spark, you must specify a value of spark; if you do not specify a value for client_type, the default value is hiveserver2.
auth_type
The authentication type of the client; specify LDAP or NOSASL. If you do not specify an auth_type, the data wrapper will decide the auth_type value on the basis of the user mapping: if the user mapping includes a user name and password, the data wrapper will use LDAP authentication; otherwise, the data wrapper will use NOSASL authentication.
connect_timeout
Use connect_timeout to specify the number of seconds after which an attempt to connect to the Hive server will time out.
fetch_size
Use fetch_size to specify the number of rows that should be fetched from the remote server in a single request.
log_remote_sql
If true, logging will include SQL commands executed on the remote Hive server and the number of times that a scan is repeated. The default is false.
query_timeout
Use query_timeout to provide the number of seconds after which a request will time out if it is not satisfied by the Hive server. Query timeout is not supported by the Hive JDBC driver.
use_remote_estimate
Include the use_remote_estimate option to instruct the server to use EXPLAIN commands on the remote server when estimating processing costs. By default, use_remote_estimate is false, and remote tables are assumed to have 1000 rows.
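The options above might be combined as in the following sketch; the server name and host address are placeholders, and the values shown are illustrative only:

```sql
-- Hypothetical server definition combining several options;
-- hdfs_spark_server and the host address are placeholders.
CREATE SERVER hdfs_spark_server FOREIGN DATA WRAPPER hdfs_fdw
    OPTIONS (host '203.0.113.10',
             client_type 'spark',       -- required to use ANALYZE on Spark
             auth_type 'NOSASL',
             connect_timeout '300',
             fetch_size '10000',
             use_remote_estimate 'true');
```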
The following command creates a foreign server named hdfs_server that uses the hdfs_fdw foreign data wrapper to connect to a host with an IP address of 170.11.2.148:
CREATE SERVER hdfs_server FOREIGN DATA WRAPPER hdfs_fdw
OPTIONS (host '170.11.2.148', auth_type 'LDAP');
The foreign server uses the default port (10000) for the connection to the client on the Hadoop cluster; the connection uses an LDAP server.
For more information about using the CREATE SERVER command, see the CREATE SERVER entry in the PostgreSQL core documentation.

