
Ranger KMS Encryption

Ranger KMS is used to encrypt HDFS data at rest. Encryption at rest is very important for your cluster and your customers because it adds another layer of security to your data.

Ranger provides centralized administration of the key management server through the Ranger admin portal.

Ranger KMS provides three main functions.

1.    Key Management: lets you create, update, and delete keys through the UI, using the keyadmin username and password.

2.    Access Control Policies: lets you manage the permissions on your keys.

3.    Audit: helps you track activity on your Ranger KMS.

Ranger KMS with HDFS encryption is recommended in all environments, because it secures the key storage in a database.

KMS is also scalable: you can run multiple KMS instances behind a load balancer.

This blog post walks through the process of creating an encryption zone.

Process:

Step 1: Create the directory structure on HDFS that you want to encrypt.
Note: you cannot create an encryption zone on an existing HDFS path that already contains data.
[techzone@node01 ~]$ hdfs dfs -mkdir <hdfs path>

Step 2: Assign 000 permissions on the directory.
[techzone@node01 ~]$ hdfs dfs -chmod 000 <hdfs path>
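For example, with a hypothetical path /data/secure-zone (replace it with your own path), steps 1 and 2 would look like this:

[techzone@node01 ~]$ hdfs dfs -mkdir -p /data/secure-zone
[techzone@node01 ~]$ hdfs dfs -chmod 000 /data/secure-zone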

Step 3: Create the KMS key. Log in to the Ranger admin portal as the keyadmin user.

Step 4: Go to Encryption > Key Manager, select the service name under Select Service, and click Add New Key.

Step 5: Enter the key name for the directory, using (-) as a separator, set the key length to 256, and save the key.

Step 6: Once the key is created, verify that it is listed on the dashboard.
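If you prefer the command line, and assuming your Hadoop client is configured to use Ranger KMS as its key provider, you can also create and list keys with the hadoop key command (the key name secure-zone-key below is just an example):

[techzone@node01 ~]$ hadoop key create secure-zone-key -size 256
[techzone@node01 ~]$ hadoop key list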

Step 7: Go to Access Manager, select the service name, and click Add New Policy.

Policy Creation Details
1. Policy Name: name of the directory.
2. Key Name: name of the created KMS key.
3. Select Group: select the appropriate group to grant access.
4. Permissions: grant the Decrypt EEK and Encrypt EEK permissions.

Grant Delegate Admin access to the privileged user group.

Step 8: Log out of the keyadmin user and log in with your EID to create the Ranger policy for the HDFS directory. Under the HDFS service, click Add New Policy.
Policy Creation Info
1. Policy Name: <name of the directory>-Encryption-Zone
2. Resource Path: path of the directory
3. Select Group: group name that should have access to the path
4. Permissions: select All (Read, Write, Execute)

Save the Policy.

Step 9: Log in to the command line, destroy the current user credentials, and kinit with your user.
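A minimal sketch of step 9, assuming a hypothetical principal hdfsadmin@EXAMPLE.COM (use your own principal and realm):

[techzone@node01 ~]$ kdestroy
[techzone@node01 ~]$ kinit hdfsadmin@EXAMPLE.COM
[techzone@node01 ~]$ klist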

Step 10: Use the below command to create the encryption zone.

[techzone@node-01 ~]$ hdfs crypto -createZone -keyName <keynameyoucreated> -path <pathonwhichEncryptiontoapply>
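For example, with the hypothetical key and path used earlier, the command would look like this:

[techzone@node-01 ~]$ hdfs crypto -createZone -keyName secure-zone-key -path /data/secure-zone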

Step 11: After the encryption zone is created successfully, the command prints a confirmation. Verify the newly created zone using the command below.
[techzone@node01 ~]$ hdfs crypto -listZones (it will list all the encryption zones that have been created)
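To confirm that data written into the zone is actually encrypted, you can put a test file into it and inspect its encryption info, assuming your user is covered by the Ranger HDFS policy created in step 8 (the path below is the hypothetical one used earlier):

[techzone@node01 ~]$ hdfs dfs -put /etc/hosts /data/secure-zone/
[techzone@node01 ~]$ hdfs crypto -getFileEncryptionInfo -path /data/secure-zone/hosts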

Please let me know in the comments section if you face any issues while performing the above steps. I will do my best to help you.

Thank you !!! Please provide your valuable feedback.
