Spark 1.4 Windows local debugging environment setup: summary
1. Scala version: scala-2.10.4 (officially recommended). scala-2.11.7 is not recommended: in a non-sbt project it must be loaded after the Spark assembly jar.
2. Spark version: spark-1.4.0-bin-hadoop2.6.tgz
3. Hadoop
3.1 Version: hadoop-2.6.0.tar.gz
3.2 Environment variable: set HADOOP_HOME=E:/ysg.tools/spark/hadoop-2.6.0, or set it in code with System.setProperty("hadoop.home.dir", "E:/ysg.tools/spark/hadoop-2.6.0");
3.3 winutils.exe
Copy winutils.exe into spark/hadoop-2.6.0/bin.
Download: http://files.cnblogs.com/files/yjmyzz/hadoop2.6%28x64%29V0.2.zip
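The steps in section 3 can be sketched in code: set hadoop.home.dir for the current JVM (equivalent to the HADOOP_HOME environment variable) and check that winutils.exe is actually in place before starting Spark. The path is the one used in this guide; `HadoopHomeCheck` is a hypothetical object name, adjust the path to your own layout.

```scala
import java.io.File

object HadoopHomeCheck {
  def main(args: Array[String]): Unit = {
    val hadoopHome = "E:/ysg.tools/spark/hadoop-2.6.0"
    // Same effect as the HADOOP_HOME environment variable,
    // but scoped to this JVM only (section 3.2).
    System.setProperty("hadoop.home.dir", hadoopHome)

    // Without winutils.exe here, Spark on Windows typically fails
    // while resolving Hadoop's native helpers (section 3.3).
    val winutils = new File(hadoopHome, "bin/winutils.exe")
    if (winutils.exists())
      println("winutils.exe found; hadoop.home.dir is set")
    else
      println(s"winutils.exe missing at ${winutils.getPath}")
  }
}
```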
4. Create a new non-SBT project in IDEA
Under Libraries, add the Scala SDK and spark-1.4.0-bin-hadoop2.6/lib/spark-assembly-1.4.0-hadoop2.6.0.jar.
In spark.test.iml, make sure spark-assembly-1.4.0-hadoop2.6.0 is loaded before scala-sdk-2.11.7:
<?xml version="1.0" encoding="UTF-8"?>
<module type="JAVA_MODULE" version="4">
  <component name="NewModuleRootManager" inherit-compiler-output="true">
    <exclude-output />
    <content url="file://$MODULE_DIR$">
      <sourceFolder url="file://$MODULE_DIR$/src" isTestSource="false" />
    </content>
    <orderEntry type="inheritedJdk" />
    <orderEntry type="sourceFolder" forTests="false" />
    <orderEntry type="library" name="spark-assembly-1.4.0-hadoop2.6.0" level="project" />
    <orderEntry type="library" name="scala-sdk-2.11.7" level="project" />
  </component>
</module>
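To verify the whole setup, a minimal local Spark job can be run from IDEA. This is a sketch assuming spark-assembly-1.4.0-hadoop2.6.0.jar and the Scala SDK are on the classpath as configured above; `LocalSparkTest` is a hypothetical object name.

```scala
import org.apache.spark.{SparkConf, SparkContext}

object LocalSparkTest {
  def main(args: Array[String]): Unit = {
    // Point Hadoop at the local winutils.exe directory (section 3.2).
    System.setProperty("hadoop.home.dir", "E:/ysg.tools/spark/hadoop-2.6.0")

    // local[2]: run Spark in-process with 2 worker threads, no cluster needed.
    val conf = new SparkConf().setAppName("LocalSparkTest").setMaster("local[2]")
    val sc = new SparkContext(conf)
    try {
      val sum = sc.parallelize(1 to 100).reduce(_ + _)
      println(s"sum = $sum") // a working environment prints sum = 5050
    } finally {
      sc.stop()
    }
  }
}
```

If the job prints the sum instead of throwing a NullPointerException from Hadoop's Shell utilities, winutils.exe and the library order are configured correctly.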