[root@172 ~]# su hadoop
[hadoop@172 root]$ cd /usr/local/service/kyuubi
[hadoop@172 kyuubi]$ bin/beeline -u "jdbc:hive2://${zkserverip1}:${zkport},${zkserverip2}:${zkport},${zkserverip3}:${zkport}/default;serviceDiscoveryMode=zooKeeper;zooKeeperNamespace=kyuubi" -n hadoop
[hadoop@172 kyuubi]$ bin/beeline -u "jdbc:hive2://${kyuubiserverip}:${kyuubiserverport}" -n hadoop
For more information on ${zkserverip}:${zkport}, see the kyuubi.ha.zookeeper.quorum configuration item in kyuubi-defaults.conf.
For more information on ${kyuubiserverport}, see the kyuubi.frontend.bind.port configuration item in kyuubi-defaults.conf.
0: jdbc:hive2://ip:port> create database sparksql;
+---------+
| Result  |
+---------+
+---------+
No rows selected (0.326 seconds)
0: jdbc:hive2://ip:port> use sparksql;
+---------+
| Result  |
+---------+
+---------+
No rows selected (0.077 seconds)
0: jdbc:hive2://ip:port> create table sparksql_test(a int,b string);
+---------+
| Result  |
+---------+
+---------+
No rows selected (0.402 seconds)
0: jdbc:hive2://ip:port> show tables;
+-----------+----------------+--------------+
| database  | tableName      | isTemporary  |
+-----------+----------------+--------------+
| sparksql  | sparksql_test  | false        |
+-----------+----------------+--------------+
1 row selected (0.108 seconds)
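If you are unsure what values to substitute for ${zkserverip}, ${zkport}, or ${kyuubiserverport}, one way to look them up is to grep kyuubi-defaults.conf on the cluster node. The command below is a sketch that assumes the configuration file sits under the conf subdirectory of the Kyuubi installation path used above; adjust the path if your layout differs.
[hadoop@172 kyuubi]$ grep -E 'kyuubi.ha.zookeeper.quorum|kyuubi.frontend.bind.port' conf/kyuubi-defaults.conf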
Add the hadoop.proxyuser.hue.groups and hadoop.proxyuser.hue.hosts configuration items to core-site.xml, and set their values to *.
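For reference, a minimal sketch of what these two items could look like in core-site.xml (the property names and the * values come from the note above; where they are placed within your existing core-site.xml depends on your deployment):
<property>
    <name>hadoop.proxyuser.hue.groups</name>
    <value>*</value>
</property>
<property>
    <name>hadoop.proxyuser.hue.hosts</name>
    <value>*</value>
</property>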

We recommend using Maven to manage project dependencies: you only need to declare them in the pom.xml file, eliminating the need to add JAR packages manually. Download and install Maven locally first and then configure its environment variables. If you are using an IDE, set the Maven-related configuration items in the IDE.
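Once Maven is installed and its environment variables are configured, you can verify the setup from the local shell (the version printed will depend on your installation):
mvn -v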
In the local shell environment, enter the directory where you want to create the Maven project, such as D://mavenWorkplace, and enter the following command to create it:
mvn archetype:generate -DgroupId=$yourgroupID -DartifactId=$yourartifactID -DarchetypeArtifactId=maven-archetype-quickstart
Here, $yourgroupID is your package name, $yourartifactID is your project name, and maven-archetype-quickstart indicates that a Maven Java project is to be created. Some files need to be downloaded during project creation, so stay connected to the internet. After the project is created successfully, you will see a folder named $yourartifactID in the D://mavenWorkplace directory, with the following structure:
simple
---pom.xml       Core configuration, under the project root directory
---src
   ---main
      ---java        Java source code directory
      ---resources   Java configuration file directory
   ---test
      ---java        Test source code directory
      ---resources   Test configuration directory
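For instance, with hypothetical values filled in (com.example.demo and kyuubi-jdbc-demo are placeholders chosen for illustration, not required names), the creation command would look like this and would produce D://mavenWorkplace/kyuubi-jdbc-demo with the layout above:
mvn archetype:generate -DgroupId=com.example.demo -DartifactId=kyuubi-jdbc-demo -DarchetypeArtifactId=maven-archetype-quickstart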
Pay attention to the pom.xml file and the java folder under the main directory: the pom.xml file is primarily used to declare dependencies and packaging configuration, while the java folder stores your source code. First, add the Maven dependencies to pom.xml:
<dependencies>
    <dependency>
        <groupId>org.apache.kyuubi</groupId>
        <artifactId>kyuubi-hive-jdbc-shaded</artifactId>
        <version>1.4.1-incubating</version>
    </dependency>
    <dependency>
        <groupId>org.apache.hadoop</groupId>
        <artifactId>hadoop-common</artifactId>
        <!-- keep consistent with the build hadoop version -->
        <version>2.8.5</version>
    </dependency>
</dependencies>
Then, add the compilation and packaging plugins to pom.xml:
<build>
    <plugins>
        <plugin>
            <groupId>org.apache.maven.plugins</groupId>
            <artifactId>maven-compiler-plugin</artifactId>
            <configuration>
                <source>1.8</source>
                <target>1.8</target>
                <encoding>utf-8</encoding>
            </configuration>
        </plugin>
        <plugin>
            <artifactId>maven-assembly-plugin</artifactId>
            <configuration>
                <descriptorRefs>
                    <descriptorRef>jar-with-dependencies</descriptorRef>
                </descriptorRefs>
            </configuration>
            <executions>
                <execution>
                    <id>make-assembly</id>
                    <phase>package</phase>
                    <goals>
                        <goal>single</goal>
                    </goals>
                </execution>
            </executions>
        </plugin>
    </plugins>
</build>
In the java folder under the main directory, create a Java Class (named KyuubiJDBCTest.java here) and add the sample code to the Class:
import java.sql.*;

public class KyuubiJDBCTest {
    private static String driverName = "org.apache.kyuubi.jdbc.KyuubiHiveDriver";

    public static void main(String[] args) throws SQLException {
        try {
            Class.forName(driverName);
        } catch (ClassNotFoundException e) {
            e.printStackTrace();
            System.exit(1);
        }
        Connection con = DriverManager.getConnection(
                "jdbc:hive2://$kyuubiserverhost:$kyuubiserverport/default", "hadoop", "");
        Statement stmt = con.createStatement();
        String tableName = "KyuubiTestByJava";
        stmt.execute("drop table if exists " + tableName);
        stmt.execute("create table " + tableName + " (key int, value string)");
        System.out.println("Create table success!");
        // show tables
        String sql = "show tables '" + tableName + "'";
        System.out.println("Running: " + sql);
        ResultSet res = stmt.executeQuery(sql);
        if (res.next()) {
            System.out.println(res.getString(1));
        }
        // describe table
        sql = "describe " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1) + "\t" + res.getString(2));
        }
        // insert and query data
        sql = "insert into " + tableName + " values (42,\"hello\"),(48,\"world\")";
        stmt.execute(sql);
        sql = "select * from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t" + res.getString(2));
        }
        // count rows
        sql = "select count(1) from " + tableName;
        System.out.println("Running: " + sql);
        res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(res.getString(1));
        }
    }
}
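The sample above leaves the Statement and Connection open when main exits. If you prefer explicit cleanup, a minimal sketch using try-with-resources is shown below; KyuubiConnectionExample is a hypothetical class name chosen for illustration, and the driver class and URL placeholders are the same as in the sample above.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class KyuubiConnectionExample {
    public static void main(String[] args) throws Exception {
        Class.forName("org.apache.kyuubi.jdbc.KyuubiHiveDriver");
        // try-with-resources closes the ResultSet, Statement, and Connection automatically
        try (Connection con = DriverManager.getConnection(
                     "jdbc:hive2://$kyuubiserverhost:$kyuubiserverport/default", "hadoop", "");
             Statement stmt = con.createStatement();
             ResultSet res = stmt.executeQuery("show databases")) {
            while (res.next()) {
                System.out.println(res.getString(1));
            }
        }
    }
}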
The $kyuubiserverhost and $kyuubiserverport parameters in the program should be replaced with the IP address and port number of the KyuubiServer you queried. After modifying the parameters, enter the project directory in the local shell and run the following command to compile and package the project:
mvn package
After packaging succeeds, upload the generated JAR package to the EMR cluster, for example with scp:
scp $localfile root@public IP address:/usr/local/service/kyuubi
Here, $localfile is the path plus name of your local JAR package; the file is uploaded to the /usr/local/service/kyuubi directory of the EMR cluster. Then, you can run the following command:
[hadoop@172 kyuubi]$ yarn jar $package.jar KyuubiJDBCTest
$package.jar is the path plus name of your JAR package, and KyuubiJDBCTest is the name of the previously created Java Class. The result is as follows:
Create table success!
Running: show tables 'KyuubiTestByJava'
default
Running: describe KyuubiTestByJava
key	int
value	string
Running: select * from KyuubiTestByJava
42	hello
48	world
Running: select count(1) from KyuubiTestByJava
2
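If you want to remove the test table created by the sample program afterwards (an optional cleanup step, not part of the walkthrough above), you can drop it from a beeline session:
0: jdbc:hive2://ip:port> drop table if exists KyuubiTestByJava;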