
Operating Hive via the Java API

Original | Hadoop | Author: hz_ganwei | Posted: 2018-07-29 13:09:40

Environment:

   IDEA 2017.3 + Maven 3.3.9 + Hive 1.1.0


1. Dependency configuration in pom.xml


<properties>
  <hive.version>1.1.0</hive.version>
</properties>

<dependencies>
  <!-- Add the hive-jdbc dependency -->
  <dependency>
    <groupId>org.apache.hive</groupId>
    <artifactId>hive-jdbc</artifactId>
    <version>${hive.version}</version>
  </dependency>
</dependencies>
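
Before writing the full client, a quick way to confirm that the hive-jdbc dependency resolved correctly is to load the driver class on its own. This is just a sanity-check sketch (the class name is the same driver used below), not part of the original steps:

// DriverCheck.java: succeeds only if hive-jdbc (and its HiveDriver class) is on the classpath.
public class DriverCheck {
    public static void main(String[] args) throws ClassNotFoundException {
        Class<?> driver = Class.forName("org.apache.hive.jdbc.HiveDriver");
        System.out.println("Driver resolved: " + driver.getName());
    }
}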

2. Create the HiveJdbcClient.java file

package com.ruozedata.day6;

import java.sql.*;

public class HiveJdbcClient {
    /**
     * Note: the driver name from the official docs, org.apache.hadoop.hive.jdbc.HiveDriver,
     * must be changed to: org.apache.hive.jdbc.HiveDriver
     */
   private static String driverName = "org.apache.hive.jdbc.HiveDriver";

   public static void main(String[] args) throws SQLException {
       try {
           Class.forName(driverName);
       } catch (ClassNotFoundException e) {
           // TODO Auto-generated catch block
           e.printStackTrace();
           System.exit(1);
       }
        /**
         * Note: the connection URL from the official docs, jdbc:hive://localhost:10000/default,
         * must be changed to: jdbc:hive2://192.168.1.108:10000/ruozedata_test
         */
       Connection con = DriverManager.getConnection("jdbc:hive2://192.168.1.108:10000/ruozedata_test", "root", "root");
       Statement stmt = con.createStatement();
       String tableName = "a";
       String sql = "" ;
        /**
         * Structure of table a:
         * # col_name              data_type
         * id                      int
         * name                    string
         * age                     int
         * Query: select id,name,age
         */

       sql = "select id,name,age  from " + tableName;
       System.out.println("Running: " + sql);
       ResultSet  res = stmt.executeQuery(sql);
        while (res.next()) {
            System.out.println(String.valueOf(res.getInt(1)) + "\t"
                    + res.getString(2) + "\t"
                    + res.getInt(3));
        }
        // Close JDBC resources once the query results have been printed.
        res.close();
        stmt.close();
        con.close();
    }
}
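
For reference, the same query can also be written with try-with-resources so the Connection, Statement, and ResultSet are closed automatically even if the query throws. The class below (HiveJdbcClient2 is just a placeholder name) is a minimal sketch reusing the same driver, URL, and credentials as above; it is not part of the original post:

package com.ruozedata.day6;

import java.sql.*;

public class HiveJdbcClient2 {
    public static void main(String[] args) throws Exception {
        // Load the Hive JDBC driver (same driver name as above).
        Class.forName("org.apache.hive.jdbc.HiveDriver");
        String url = "jdbc:hive2://192.168.1.108:10000/ruozedata_test";
        String sql = "select id,name,age from a";
        // try-with-resources releases all three JDBC resources when the block exits.
        try (Connection con = DriverManager.getConnection(url, "root", "root");
             Statement stmt = con.createStatement();
             ResultSet res = stmt.executeQuery(sql)) {
            while (res.next()) {
                System.out.println(res.getInt(1) + "\t" + res.getString(2) + "\t" + res.getInt(3));
            }
        }
    }
}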

Two points to note:
1. The driverName given on the official site, org.apache.hadoop.hive.jdbc.HiveDriver,
must be changed to: org.apache.hive.jdbc.HiveDriver

2. The connection URL given on the official site, jdbc:hive://localhost:10000/default,
must be changed to: jdbc:hive2://localhost:10000/default,
where localhost is replaced by the database server's IP and default by the name of the database to connect to (see the sketch just below).
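
To keep these two settings easy to change, the URL can be assembled from its parts. A small sketch with placeholder host, port, and database values (replace them with your own environment; the class name is hypothetical):

public class HiveUrlBuilder {
    public static void main(String[] args) {
        // Placeholders for your own environment.
        String host = "192.168.1.108";        // Hive server IP
        int port = 10000;                     // default hiveserver2 port
        String database = "ruozedata_test";   // database to connect to
        String url = "jdbc:hive2://" + host + ":" + port + "/" + database;
        System.out.println(url);              // jdbc:hive2://192.168.1.108:10000/ruozedata_test
    }
}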

3. Start Hive's hiveserver2 service.
  If the hiveserver2 service is not running, the program fails at run time with:
  Exception in thread "main" java.sql.SQLException: Could not open client transport with JDBC Uri: jdbc:hive2://192.168.1.108:10000/ruozedata_test: java.net.SocketException: Connection reset
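
A quick way to tell whether this error comes from hiveserver2 not running, rather than from a wrong URL or credentials, is to probe the port with a plain TCP connection first. The class below is a sketch with placeholder host and port values, not part of the original post:

import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class HiveServerProbe {
    public static void main(String[] args) {
        try (Socket socket = new Socket()) {
            // Plain TCP connect with a 3-second timeout; host and port are placeholders.
            socket.connect(new InetSocketAddress("192.168.1.108", 10000), 3000);
            System.out.println("hiveserver2 port is reachable");
        } catch (IOException e) {
            System.out.println("hiveserver2 port is not reachable: " + e.getMessage());
        }
    }
}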


4. Program output:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/D:/software/apache-maven-3.3.9/repository/org/slf4j/slf4j-log4j12/1.7.5/slf4j-log4j12-1.7.5.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/D:/software/apache-maven-3.3.9/repository/ch/qos/logback/logback-classic/1.2.3/logback-classic-1.2.3.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]
log4j:WARN No appenders could be found for logger (org.apache.hive.jdbc.Utils).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Running: select id,name,age  from a
1 zhangsan 15

Process finished with exit code 0


