The Ranger administrator can use Ranger to set permissions for Spark2x users.
- Enable Ranger authorization: spark.ranger.plugin.authorization.enable=true
- Disable Ranger authorization: spark.ranger.plugin.authorization.enable=false
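For reference, this is an ordinary Spark configuration property. The snippet below is a minimal sketch of how it might appear in a client-side spark-defaults.conf; on a managed cluster the value is typically changed through the Spark2x service configuration instead, and the service usually needs to be restarted for the change to take effect (both details are assumptions about your environment).

```
# Sketch of a spark-defaults.conf entry (file location and management method depend on your deployment)
spark.ranger.plugin.authorization.enable  true
```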
| Parameter | Description |
| --- | --- |
| Policy Name | Policy name, which can be customized and must be unique within the service. |
| Policy Conditions | IP address filtering policy, which can be customized. You can enter one or more IP addresses or IP address segments. An IP address can contain the wildcard character (*), for example, 192.168.1.10, 192.168.1.20, or 192.168.1.*. |
| Policy Label | Label specified for the current policy. You can search for reports and filter policies based on labels. |
| database | Name of the Spark2x database to which the policy applies. Include applies the policy to the specified object; Exclude applies it to all objects except the specified one. |
| table | Name of the Spark2x table to which the policy applies. To add a UDF-based policy, switch to UDF and enter the UDF name. Include applies the policy to the specified object; Exclude applies it to all objects except the specified one. |
| column | Name of the column to which the policy applies. The value * indicates all columns. Include applies the policy to the specified object; Exclude applies it to all objects except the specified one. |
| Description | Policy description. |
| Audit Logging | Whether to audit the policy. |
| Allow Conditions | Allow condition of the policy, used to configure the permissions and exceptions allowed by the policy. In the Select Role, Select Group, and Select User columns, select the role, user group, or user to which the permissions are to be granted, click Add Conditions to add the IP address range to which the policy applies, and click Add Permissions to add the corresponding permissions. To add multiple permission control rules, click the add button. If a user or user group in the current condition needs to manage this policy, select Delegate Admin; these users become delegated administrators, who can update and delete this policy and create child policies based on it. |
| Deny Conditions | Deny condition of the policy, used to configure the permissions and exceptions to be denied by the policy. The configuration method is similar to that of Allow Conditions. |
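For context (all object names here are hypothetical), the statements below are typical of what such an access policy governs; whether each one succeeds for a given user depends on the permissions granted in Allow Conditions and withheld in Deny Conditions.

```sql
-- Hypothetical objects: database sales_db, table orders.
SELECT order_id, amount FROM sales_db.orders;   -- needs query (select) access to the referenced columns
INSERT INTO sales_db.orders VALUES (1, 100.0);  -- needs write access to the table
DROP TABLE sales_db.orders;                     -- needs permission to drop the table
```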
| Task | Operation |
| --- | --- |
| role admin operation | NOTE: After being bound to the Hive administrator role, perform the corresponding operation during each maintenance operation. |
| Creating a database table | |
| Deleting a table | |
| ALTER operation | |
| LOAD operation | |
| INSERT operation | |
| GRANT operation | |
| ADD JAR operation | |
| VIEW and INDEX permissions | |
| Operations on other user database tables | |
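The tasks listed above correspond to ordinary Spark SQL statements. The following is a brief sketch, with hypothetical object names, of the kinds of statements these policies authorize; the set role admin step is an assumption based on standard Hive role handling and may not apply to every deployment.

```sql
-- Hypothetical names throughout; each statement succeeds only if the matching
-- Ranger permission has been granted to the current user.
-- SET ROLE admin;   -- assumption: may be required first after being bound to the Hive administrator role
CREATE TABLE test_db.test_tbl (id INT, name STRING);            -- creating a database table
ALTER TABLE test_db.test_tbl ADD COLUMNS (age INT);             -- ALTER operation
LOAD DATA INPATH '/tmp/test_data' INTO TABLE test_db.test_tbl;  -- LOAD operation
INSERT INTO test_db.test_tbl VALUES (1, 'a', 20);               -- INSERT operation
DROP TABLE test_db.test_tbl;                                    -- deleting a table
```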
After a Spark SQL access policy is added in Ranger, you also need to add the corresponding path access policies to the HDFS access policies; otherwise, the data files cannot be accessed. For details, see Adding a Ranger Access Permission Policy for HDFS. A quick way to locate the path that must be covered is sketched after these notes.
To disable a policy, click the edit button of the policy and set the policy to Disabled.
If a policy is no longer used, click the delete button to delete it.
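To find out which HDFS path the corresponding HDFS policy has to cover, you can check the table's storage location first. A minimal sketch, using hypothetical names:

```sql
-- Prints table metadata including the Location field, i.e. the HDFS directory
-- holding the table's data files; this is the path the HDFS access policy must cover.
DESC FORMATTED test_db.test_tbl;
```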
Ranger supports data masking for Spark2x data. It can mask sensitive information in the results returned by select operations.
| Parameter | Description |
| --- | --- |
| Policy Name | Policy name, which can be customized and must be unique within the service. |
| Policy Conditions | IP address filtering policy, which can be customized. You can enter one or more IP addresses or IP address segments. An IP address can contain the wildcard character (*), for example, 192.168.1.10, 192.168.1.20, or 192.168.1.*. |
| Policy Label | Label specified for the current policy. You can search for reports and filter policies based on labels. |
| Hive Database | Name of the Spark2x database to which the current policy applies. |
| Hive Table | Name of the Spark2x table to which the current policy applies. |
| Hive Column | Name of the Spark2x column to which the current policy applies. |
| Description | Policy description. |
| Audit Logging | Whether to audit the policy. |
| Mask Conditions | In the Select Group and Select User columns, select the user group or user to which the permissions are to be granted, click Add Conditions to add the IP address range to which the policy applies, click Add Permissions, and select select. Click Select Masking Option and select a data masking policy. To add a multi-column masking policy, click the add button. |
| Deny Conditions | Deny condition of the policy, used to configure the permissions and exceptions to be denied by the policy. The configuration method is similar to that of Allow Conditions. |
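For illustration (the database, table, and column names below are hypothetical), suppose a masking option such as a partial mask is selected for a phone column for some user group. A member of that group then sees masked values when querying that column:

```sql
-- Hypothetical objects: database crm_db, table customers, column phone.
-- If a masking policy applies to the current user, Ranger rewrites the values of
-- phone in the query result (for example, hiding all but the last few digits,
-- depending on the masking option chosen); the data stored in the table is unchanged.
SELECT name, phone FROM crm_db.customers;
```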
Ranger allows you to filter data at the row level when you perform the select operation on Spark2x data tables.
| Parameter | Description |
| --- | --- |
| Policy Name | Policy name, which can be customized and must be unique within the service. |
| Policy Conditions | IP address filtering policy, which can be customized. You can enter one or more IP addresses or IP address segments. An IP address can contain the wildcard character (*), for example, 192.168.1.10, 192.168.1.20, or 192.168.1.*. |
| Policy Label | Label specified for the current policy. You can search for reports and filter policies based on labels. |
| Hive Database | Name of the Spark2x database to which the current policy applies. |
| Hive Table | Name of the Spark2x table to which the current policy applies. |
| Description | Policy description. |
| Audit Logging | Whether to audit the policy. |
| Row Filter Conditions | In the Select Role, Select Group, and Select User columns, select the object to which the permissions are to be granted, click Add Conditions to add the IP address range to which the policy applies, click Add Permissions, and select select. Click Row Level Filter and enter the data filtering rule. For example, to hide the rows whose name column has the value zhangsan in table A, set the filter rule to name <> 'zhangsan'. For more information, see the official Ranger documentation. To add more rules, click the add button. |
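Continuing the example above (table A and its name column are hypothetical), Ranger applies the filter transparently to the user's queries, which is roughly equivalent to appending the filter as a WHERE condition:

```sql
-- With the row filter name <> 'zhangsan' attached to the current user, the query
SELECT * FROM A;
-- behaves roughly as if it were written as
SELECT * FROM A WHERE name <> 'zhangsan';
-- so rows whose name is 'zhangsan' are never returned to this user.
```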