Download apache-hive-2.1.0-bin.tar.gz and unpack it; the steps below refer to the unpacked directory as hive.
Edit /etc/profile (e.g. with vim) and add Hive to the environment:

```shell
export HIVE_HOME=xxxx            # replace xxxx with your Hive install path
export PATH=$PATH:$HIVE_HOME/bin
```

Then reload the configuration with `source /etc/profile`.
First, download mysql-connector-java-5.1.17.jar and copy it into the hive/lib directory; Hive needs it as the JDBC driver for MySQL.
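The copy step can be sketched as follows. The download location and Hive install path are assumptions (adjust them to your layout), and the script only copies when both paths actually exist:

```shell
# Assumed locations; adjust to your own layout.
HIVE_HOME=${HIVE_HOME:-$HOME/hive}
JAR=$HOME/download/mysql-connector-java-5.1.17.jar

if [ -f "$JAR" ] && [ -d "$HIVE_HOME/lib" ]; then
  cp "$JAR" "$HIVE_HOME/lib/" && status="installed"
else
  status="paths missing; check HIVE_HOME and the jar location"
fi
echo "driver: $status"
```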
The configuration files all live in the hive/conf directory.
Rename the template: `mv hive-default.xml.template hive-site.xml`
Set the following properties in hive-site.xml:

```xml
<property>
  <name>javax.jdo.option.ConnectionURL</name>
  <value>jdbc:mysql://192.168.27.166:3306/hive?createDatabaseIfNotExist=true</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionDriverName</name>
  <value>com.mysql.jdbc.Driver</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionUserName</name>
  <value>hive</value>
</property>
<property>
  <name>javax.jdo.option.ConnectionPassword</name>
  <value>hive</value>
</property>
<property>
  <name>hive.metastore.warehouse.dir</name>
  <value>/user/hive/warehouse</value>
</property>
```
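These connection settings assume a MySQL database named hive and a user hive (password hive) already exist on 192.168.27.166. If they don't, a minimal bootstrap sketch (MySQL 5.x `GRANT ... IDENTIFIED BY` syntax; the names and password simply mirror the values used in this walkthrough, review before applying):

```shell
# Write the bootstrap statements to a file for review.
cat > /tmp/create-hive-metastore.sql <<'EOF'
CREATE DATABASE IF NOT EXISTS hive;
GRANT ALL PRIVILEGES ON hive.* TO 'hive'@'%' IDENTIFIED BY 'hive';
FLUSH PRIVILEGES;
EOF
cat /tmp/create-hive-metastore.sql
# To apply them:  mysql -u root -p < /tmp/create-hive-metastore.sql
```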
Then, throughout the file, replace every occurrence of `${system:java.io.tmpdir}` with `/home/fantj/hive/fantj`, and every occurrence of `${system:user.name}` with `root`.
Finally, create the resulting directory: `mkdir -p /home/fantj/hive/fantj/root`
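The two global replacements can be done with sed instead of hand-editing. The sketch below demonstrates the substitutions on a scratch file; for the real config, point the same two expressions at conf/hive-site.xml:

```shell
# Create a scratch file containing the placeholders, then substitute.
printf '<value>${system:java.io.tmpdir}/${system:user.name}</value>\n' > /tmp/hive-site-demo.xml
sed -i \
  -e 's|\${system:java.io.tmpdir}|/home/fantj/hive/fantj|g' \
  -e 's|\${system:user.name}|root|g' \
  /tmp/hive-site-demo.xml
cat /tmp/hive-site-demo.xml   # <value>/home/fantj/hive/fantj/root</value>
```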
Rename the template: `mv hive-env.sh.template hive-env.sh`, then set:

```shell
export JAVA_HOME=/soft/jdk
export HIVE_HOME=/soft/hive
export HADOOP_HOME=/soft/hadoop
```
Initialize the metastore schema:

```shell
schematool -initSchema -dbType mysql
```
```
[root@s166 conf]# schematool -initSchema -dbType mysql
which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/root/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/root/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/home/fantj/hive/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/fantj/download/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/fantj/download/hadoop-2.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Metastore connection URL:        jdbc:mysql://192.168.27.166:3306/hive?createDatabaseIfNotExist=true
Metastore Connection Driver :    com.mysql.jdbc.Driver
Metastore connection User:       hive
Starting metastore schema initialization to 2.1.0
Initialization script hive-schema-2.1.0.mysql.sql
Initialization script completed
schemaTool completed
```
Note: Hadoop must be running before you start Hive.
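A quick way to check whether HDFS is up is jps, which ships with the JDK; this sketch assumes the standard HDFS daemon names:

```shell
# Look for HDFS daemons in the JVM process list.
if command -v jps >/dev/null 2>&1 && jps | grep -qE 'NameNode|DataNode'; then
  hdfs_status="running"
else
  hdfs_status="not detected (start HDFS first, e.g. with start-dfs.sh)"
fi
echo "HDFS: $hdfs_status"
```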
Start the Hive CLI:

```
[root@s166 bin]# hive
which: no hbase in (/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/root/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/root/bin:/home/fantj/jdk/bin:/home/fantj/hadoop/sbin:/home/fantj/hive/bin)
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/home/fantj/download/apache-hive-2.1.0-bin/lib/log4j-slf4j-impl-2.4.1.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/home/fantj/download/hadoop-2.7.0/share/hadoop/common/lib/slf4j-log4j12-1.7.10.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.apache.logging.slf4j.Log4jLoggerFactory]
Logging initialized using configuration in jar:file:/home/fantj/download/apache-hive-2.1.0-bin/lib/hive-common-2.1.0.jar!/hive-log4j2.properties Async: true
Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
hive>
```
```
hive> show databases;
OK
default
mydb2
Time taken: 1.55 seconds, Fetched: 2 row(s)
hive> create database fantj;
OK
Time taken: 0.801 seconds
hive> use fantj;
OK
Time taken: 0.035 seconds
hive> create table test(id int,name string,age int);
OK
Time taken: 0.833 seconds
hive> insert into test values(1,'fantj',18);
WARNING: Hive-on-MR is deprecated in Hive 2 and may not be available in the future versions. Consider using a different execution engine (i.e. spark, tez) or using Hive 1.X releases.
Query ID = root_20180727115808_c39d95f3-9bbd-4a60-b627-d5f0016ff6c3
Total jobs = 3
Launching Job 1 out of 3
Number of reduce tasks is set to 0 since there's no reduce operator
Job running in-process (local Hadoop)
07-27 11:58:19,477 Stage-1 map = 0%, reduce = 0%
07-27 11:58:20,487 Stage-1 map = 100%, reduce = 0%
Ended Job = job_local1311590634_0001
Stage-4 is selected by condition resolver.
Stage-3 is filtered out by condition resolver.
Stage-5 is filtered out by condition resolver.
Moving data to directory hdfs://s166/user/hive/warehouse/fantj.db/test/.hive-staging_hive_07-27_11-58-08_359_7490410987943534015-1/-ext-10000
Loading data to table fantj.test
[Warning] could not update stats.
MapReduce Jobs Launched:
Stage-Stage-1: HDFS Read: 11 HDFS Write: 88 SUCCESS
Total MapReduce CPU Time Spent: 0 msec
OK
Time taken: 39.138 seconds
hive> select * from test;
OK
1	fantj	18
Time taken: 3.26 seconds, Fetched: 1 row(s)
```
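The interactive session above can also be run as a batch job. This sketch writes equivalent statements (made idempotent with IF NOT EXISTS) to a script file; the file path is arbitrary:

```shell
# Collect the session's statements into a HiveQL script.
cat > /tmp/demo.hql <<'EOF'
CREATE DATABASE IF NOT EXISTS fantj;
USE fantj;
CREATE TABLE IF NOT EXISTS test (id INT, name STRING, age INT);
INSERT INTO test VALUES (1, 'fantj', 18);
SELECT * FROM test;
EOF
cat /tmp/demo.hql
# Run it non-interactively:  hive -f /tmp/demo.hql
```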
Reprinted from: http://ruiym.baihongyu.com/