Spark 2.1.0 standalone mode configuration && packaging a jar and submitting it with spark-submit

Original | Data Analysis | Author: hgs19921112 | 2018-10-08 22:31:10
Configuration
spark-env.sh
	export JAVA_HOME=/apps/jdk1.8.0_181
	export SPARK_MASTER_HOST=bigdata00
	export SPARK_MASTER_PORT=7077
slaves
	bigdata01
	bigdata02
	bigdata03
Start the Spark shell
./spark-shell  --master spark://bigdata00:7077 --executor-memory 512M 
Run a word count in the Spark shell
scala> sc.textFile("hdfs://bigdata00:9000/words").flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).collect
Result:
res3: Array[(String, Int)] = Array((this,1), (is,4), (girl,3), (love,1), (will,1), (day,1), (boreing,1), (my,1), (miss,2), (test,2), (forget,1), (spark,2), (soon,1), (most,1), (that,1), (a,2), (afternonn,1), (i,3), (might,1), (of,1), (today,2), (good,1), (for,1), (beautiful,1), (time,1), (and,1), (the,5))
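For readers without a cluster at hand, the semantics of the flatMap/map/reduceByKey chain above can be sketched with plain Scala collections (no Spark needed) — here `groupBy` plus a per-group sum stands in for `reduceByKey(_+_)`, which does the same aggregation but distributed across partitions:

```scala
// Plain-Scala sketch of the wordcount pipeline's semantics.
object WordCountSketch {
  def wordCount(lines: Seq[String]): Map[String, Int] =
    lines
      .flatMap(_.split(" "))   // split each line into words
      .map((_, 1))             // pair every word with a count of 1
      .groupBy(_._1)           // group pairs by word (reduceByKey's shuffle step)
      .map { case (w, ps) => (w, ps.map(_._2).sum) } // sum the 1s per word

  def main(args: Array[String]): Unit = {
    val counts = wordCount(Seq("this is a test", "a good test"))
    println(counts) // key order of the resulting Map is unspecified
  }
}
```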
// Main class
package hgs.sparkwc
import org.apache.spark.SparkContext
import org.apache.spark.SparkConf
object WordCount {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("WordCount")
    val context = new SparkContext(conf)
    context.textFile(args(0),1).flatMap(_.split(" ")).map((_,1)).reduceByKey(_+_).sortBy(_._2).saveAsTextFile(args(1))
    context.stop
  }
}
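Note that `sortBy(_._2)` in the job above sorts the counts in ascending order; `RDD.sortBy` takes an additional `ascending` flag to reverse this. The default ordering mirrors the standard-library `sortBy`, which can be demonstrated without Spark:

```scala
// Plain-Scala illustration of sortBy on (word, count) pairs.
// Seq.sortBy is ascending by default, like RDD.sortBy.
object SortByDemo {
  def main(args: Array[String]): Unit = {
    val counts = Seq(("the", 5), ("is", 4), ("girl", 3), ("miss", 2))
    val asc  = counts.sortBy(_._2)   // least frequent first (the job's behavior)
    val desc = counts.sortBy(-_._2)  // most frequent first
    println(asc)   // List((miss,2), (girl,3), (is,4), (the,5))
    println(desc)  // List((the,5), (is,4), (girl,3), (miss,2))
  }
}
```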
//------------------------------------------------------------------------------------------
// The pom.xml file:
<project xmlns="http://maven.apache.org/POM/4.0.0"
  xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
  xsi:schemaLocation="http://maven.apache.org/POM/4.0.0 http://maven.apache.org/xsd/maven-4.0.0.xsd">
  <modelVersion>4.0.0</modelVersion>
  <groupId>hgs</groupId>
  <artifactId>sparkwc</artifactId>
  <version>1.0.0</version>
  <packaging>jar</packaging>
  <name>sparkwc</name>
  <properties>
    <project.build.sourceEncoding>UTF-8</project.build.sourceEncoding>
  </properties>
<dependencies>
        <dependency>
            <groupId>org.scala-lang</groupId>
            <artifactId>scala-library</artifactId>
            <version>2.11.8</version>
        </dependency>
        <dependency>
            <groupId>org.apache.spark</groupId>
            <artifactId>spark-core_2.11</artifactId>
            <version>2.1.0</version>
        </dependency>
        <dependency>
            <groupId>org.apache.hadoop</groupId>
            <artifactId>hadoop-client</artifactId>
            <version>2.6.1</version>
        </dependency>
    </dependencies>
    
    
    <build>
        <plugins>
            <plugin>
                <artifactId>maven-assembly-plugin</artifactId>
                <version>2.6</version>
                <configuration>
           
                  <archive>
                        <manifest>
                            <!-- Main class that the jar runs -->
                            <mainClass>hgs.sparkwc.WordCount</mainClass>
                        </manifest>
                    </archive> 
                    
                    <descriptorRefs>
                        <descriptorRef>
                            <!-- Must be written exactly like this -->
                            jar-with-dependencies
                        </descriptorRef>
                    </descriptorRefs>
                </configuration>
                
                <executions>
                    <execution>
                        <id>make-assembly</id>
                        <phase>package</phase>
                        <goals>
                            <goal>single</goal>
                        </goals>
                    </execution>
                </executions>
            </plugin>
            
              <plugin>
                <groupId>org.apache.maven.plugins</groupId>
                <artifactId>maven-compiler-plugin</artifactId>
                <configuration>
                    <source>1.8</source>
                    <target>1.8</target>
                </configuration>
            </plugin> 
             <plugin>
				<groupId>net.alchim31.maven</groupId>
				<artifactId>scala-maven-plugin</artifactId>
				<version>3.2.0</version>
				<executions>
					<execution>
						<goals>
							<goal>compile</goal>
							<goal>testCompile</goal>
					    </goals>
						<configuration>
							<args>
							<!-- <arg>-make:transitive</arg> -->
                			<arg>-dependencyfile</arg>
                			<arg>${project.build.directory}/.scala_dependencies</arg>
              				</args>
						</configuration>
					</execution>
				</executions>
			</plugin>
			
			<plugin>
				<groupId>org.apache.maven.plugins</groupId>
				<artifactId>maven-surefire-plugin</artifactId>
				<version>2.18.1</version>
				<configuration>
				<useFile>false</useFile>
				<disableXmlReport>true</disableXmlReport>
				<!-- If you have classpath issue like NoDefClassError,... -->
				<!-- useManifestOnlyJar>false</useManifestOnlyJar -->
				<includes>
					<include>**/*Test.*</include>
					<include>**/*Suite.*</include>
				</includes>
				</configuration>
			</plugin>
          
        </plugins>
    </build>
</project>
Building with assembly:assembly then failed with the following error:
      scalac error: bad option: '-make:transitive'
The cause is the <arg>-make:transitive</arg> option in the scala-maven-plugin configuration; commenting out that line fixes it.

Answers found online suggest either:
deleting <arg>-make:transitive</arg>
or adding this dependency:
<dependency>
<groupId>org.specs2</groupId>
<artifactId>specs2-junit_${scala.compat.version}</artifactId>
<version>2.4.16</version>
<scope>test</scope>
</dependency>
Finally, submit the job on the server:
./spark-submit --master spark://bigdata00:7077  --executor-memory 512M --total-executor-cores 3  /home/sparkwc.jar   hdfs://bigdata00:9000/words  hdfs://bigdata00:9000/wordsout2


From "ITPUB Blog", link: http://blog.itpub.net/31506529/viewspace-2215620/. Please cite the source when reprinting.
