Enable Hive LLAP Via Knox


What is Hive LLAP: Live Long And Process (LLAP)

LLAP provides a hybrid execution model. It consists of a long-lived daemon, which replaces direct interactions with the HDFS DataNode, and a tightly integrated DAG-based framework.

For more details on Hive LLAP, follow the URL below:

https://cwiki.apache.org/confluence/pages/viewpage.action?pageId=62689557

What is Knox:

The Apache Knox™ Gateway is an application gateway for interacting with the REST APIs and UIs of Apache Hadoop deployments. The Knox Gateway provides a single access point for all REST and HTTP interactions with Apache Hadoop clusters.

For more details on Knox, please follow the URL below:

https://knox.apache.org/


Purpose:

This page explains the process of enabling Hive LLAP via Knox.

Process:

Step 1: Log in to Ambari and add the below properties in the Hive configs (hive-site):

hive.server2.thrift.http.path = cliservice

hive.server2.thrift.http.port = 10001

hive.server2.transport.mode = http
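
If you prefer to script this instead of clicking through the Ambari UI, the configs.sh helper that ships with the Ambari server on HDP 2.6 can set the same properties. This is a minimal sketch, assuming default admin/admin credentials; "ambari-host" and "mycluster" are placeholders for your Ambari host and cluster name:

# placeholders: admin/admin credentials, ambari-host, mycluster
cd /var/lib/ambari-server/resources/scripts
./configs.sh -u admin -p admin set ambari-host mycluster hive-site "hive.server2.thrift.http.path" "cliservice"
./configs.sh -u admin -p admin set ambari-host mycluster hive-site "hive.server2.thrift.http.port" "10001"
./configs.sh -u admin -p admin set ambari-host mycluster hive-site "hive.server2.transport.mode" "http"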

Step 2: Log in to the Knox server and perform the below actions.

Go to /var/lib/knox/data-2.6.5.1108-1/services and copy the hive service definition to create a new llap service (assuming the standard Knox <service>/<version> layout; the chown matches the knox:knox ownership shown in the listing below):

mkdir -p llap
cp -r hive/0.13.0 llap/0.13.0
chown -R knox:knox llap

Go to the llap/0.13.0 folder and edit the XML files as below.

[root@techzone-n1 0.13.0]# ls -ltr
total 8
-rw-r----- 1 knox knox 1052 Apr 16 08:36 service.xml
-rw-r----- 1 knox knox  950 Apr 16 08:37 rewrite.xml

[root@techzone-n1 0.13.0]# cat service.xml
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<service role="LLAP" name="llap" version="0.13.0">
    <routes>
        <route path="/llap"/>
    </routes>
    <dispatch classname="org.apache.hadoop.gateway.hive.HiveDispatch" ha-classname="org.apache.hadoop.gateway.hive.HiveHaDispatch"/>
</service>

[root@techzone-n1 0.13.0]# cat rewrite.xml
<!--
   Licensed to the Apache Software Foundation (ASF) under one or more
   contributor license agreements.  See the NOTICE file distributed with
   this work for additional information regarding copyright ownership.
   The ASF licenses this file to You under the Apache License, Version 2.0
   (the "License"); you may not use this file except in compliance with
   the License.  You may obtain a copy of the License at

       http://www.apache.org/licenses/LICENSE-2.0

   Unless required by applicable law or agreed to in writing, software
   distributed under the License is distributed on an "AS IS" BASIS,
   WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
   See the License for the specific language governing permissions and
   limitations under the License.
-->
<rules>
    <rule dir="IN" name="LLAP/llap/inbound" pattern="*://*:*/**/llap">
        <rewrite template="{$serviceUrl[LLAP]}"/>
    </rule>
</rules>
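
One detail worth noting before restarting: Knox only exposes services that are declared in the active topology. If your topology does not already carry an LLAP entry, add one pointing at the HiveServer2 HTTP endpoint configured in Step 1. This is a minimal sketch; the topology file path, HiveServer2 host name, and port are assumptions based on a default HDP 2.6 install, and on Ambari-managed clusters the same entry can be added through the Knox "Advanced topology" config instead:

<!-- added inside /etc/knox/conf/topologies/default.xml, alongside the existing <service> entries;
     hiveserver2-host.connected.xyz.com:10001 is a placeholder for your HiveServer2 HTTP endpoint -->
<service>
    <role>LLAP</role>
    <url>http://hiveserver2-host.connected.xyz.com:10001/cliservice</url>
</service>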

 

Step 3: Restart the Hive and Knox services in Ambari.

Step 4: Go to the Knox server, log in to the Beeline shell, and give the below connection string. You will be able to connect with your username and password.

!connect jdbc:hive2://nodefqdn-n1.connected.xyz.com:8443/;ssl=true;transportMode=http;httpPath=gateway/default/llap
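
Equivalently, the connection can be made in one shot from the command line. Because the Knox gateway listens over SSL, the client may also need to trust the gateway certificate; in this sketch the truststore path is the default Knox gateway keystore location for this data directory and is an assumption, as are the placeholder credentials:

# placeholders: <truststore-password>, <username>, <password>; truststore path assumed from the Knox data dir above
beeline -u "jdbc:hive2://nodefqdn-n1.connected.xyz.com:8443/;ssl=true;sslTrustStore=/var/lib/knox/data-2.6.5.1108-1/security/keystores/gateway.jks;trustStorePassword=<truststore-password>;transportMode=http;httpPath=gateway/default/llap" -n <username> -p <password>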

The above steps were tested on HDP 2.6. If you face any error while configuring Hive LLAP via Knox, please post it in the comments section and I will have a look.

 

Thank you! Please provide your valuable feedback.
