
HDP Zeppelin data visualization and analysis configuration

Original post | Category: Visualization | Author: 破棉袄 | 2017-01-19 14:54:04

HDP version

Run: hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'
2.4.0.0-169

Zeppelin installation (excerpted from the official HDP documentation):

(Optional) Installing Zeppelin Manually

The Zeppelin Technical Preview is available as an HDP package compiled against Spark 1.6.

To install the Zeppelin Technical Preview manually (instead of using Ambari), complete the following steps as user root.

  1. Install the Zeppelin service:
    yum install zeppelin
  2. Make a copy of zeppelin-env.sh:
    cd /usr/hdp/current/zeppelin-server/lib
    cp conf/zeppelin-env.sh.template conf/zeppelin-env.sh
  3. In the zeppelin-env.sh file, export the following three values.
    Note: you will use the ZEPPELIN_PORT value to access the Zeppelin Web UI, and the hdp.version value corresponds to the version of HDP where you are installing Zeppelin; for example, 2.4.0.0-169.
    export HADOOP_CONF_DIR=/etc/hadoop/conf
    export ZEPPELIN_PORT=9995
    export ZEPPELIN_JAVA_OPTS="-Dhdp.version=<HDP version>"
  4. To obtain the HDP version for your HDP cluster, run the following command:
    hdp-select status hadoop-client | sed 's/hadoop-client - \(.*\)/\1/'
  5. Copy hive-site.xml to Zeppelin’s conf directory:
    cd /usr/hdp/current/zeppelin-server/lib
    cp /etc/hive/conf/hive-site.xml conf/
  6. Remove the trailing “s” from the values of hive.metastore.client.connect.retry.delay and hive.metastore.client.socket.timeout in the hive-site.xml file in the zeppelin/conf directory. (This avoids a number format exception; see the sketch after this list.)
  7. Create a root user in HDFS:
    su hdfs
    hdfs dfs -mkdir /user/root
    hdfs dfs -chown root /user/root
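
For step 6, a minimal sketch of stripping the trailing “s”, assuming the copied hive-site.xml still carries the stock values 5s and 1800s (check the file and adjust if your values differ):

cd /usr/hdp/current/zeppelin-server/lib/conf
# hive.metastore.client.connect.retry.delay: 5s -> 5 (assumed stock value)
sed -i 's|<value>5s</value>|<value>5</value>|' hive-site.xml
# hive.metastore.client.socket.timeout: 1800s -> 1800 (assumed stock value)
sed -i 's|<value>1800s</value>|<value>1800</value>|' hive-site.xml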

To launch Zeppelin, run the following commands:

cd /usr/hdp/current/zeppelin-server/lib
bin/zeppelin-daemon.sh start
Access the web UI at: http://ZEPPELIN_HOST:9995
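
To confirm the daemon actually came up before opening the browser, a quick check on the Zeppelin host (the URL assumes the default port 9995 configured above):

bin/zeppelin-daemon.sh status
# should print 200 once the web UI is serving
curl -s -o /dev/null -w "%{http_code}\n" http://localhost:9995/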
Zeppelin Spark interpreter configuration:
1) Click Interpreters and set the following Spark properties:
master           yarn-client
spark.home       /usr/hdp/current/spark-client
spark.yarn.jar   /usr/hdp/current/spark-client/lib/spark-assembly-1.6.0.2.4.0.0-169-hadoop2.7.1.2.4.0.0-169.jar
spark.driver.extraJavaOptions -Dhdp.version=2.4.0.0-169
spark.yarn.am.extraJavaOptions -Dhdp.version=2.4.0.0-169
spark.executor.instances   number of executors (default: 2)
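
Before saving, it is worth a quick sanity check that the spark.yarn.jar path set above actually exists on the Zeppelin host (the jar name is the one from this HDP 2.4.0.0-169 install):

ls -l /usr/hdp/current/spark-client/lib/spark-assembly-*.jar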
2) Configure conf/zeppelin-env.sh:
export HADOOP_CONF_DIR=/etc/hadoop/conf
export SPARK_HOME=/usr/hdp/2.4.0.0-169/spark
export ZEPPELIN_PORT=9995
export ZEPPELIN_JAVA_OPTS="-Dhdp.version=2.4.0.0-169"
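
After editing zeppelin-env.sh, restart the daemon so the new settings take effect:

cd /usr/hdp/current/zeppelin-server/lib
bin/zeppelin-daemon.sh restart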


Finally, create a new note in the Zeppelin web UI to start your analysis.
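
As a first smoke test in the new note, you can run a shell paragraph with Zeppelin's built-in %sh interpreter; a minimal sketch (the HDFS path is simply the /user/root directory created in step 7):

%sh
hdfs dfs -ls /user/root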

 
