Spark install

.bashrc

export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.4
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.6.4/etc/hadoop
export SCALA_HOME=/usr/local/scala/scala-2.10.6
export SPARK_HOME=/usr/local/spark/spark-2.0.0-bin-hadoop2.6
export JAVA_HOME=/usr/java/jdk1.8.0
export JRE_HOME=${JAVA_HOME}/jre
export CLASS_PATH=.:${JAVA_HOME}/lib:${JRE_HOME}/lib
export PATH=${JAVA_HOME}/bin:${SPARK_HOME}/bin:${SPARK_HOME}/sbin:${SCALA_HOME}/bin:${HADOOP_HOME}/bin:${HADOOP_HOME}/sbin:$PATH
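Edits to .bashrc only take effect in new shells (or after `source ~/.bashrc`). A minimal sketch of checking that a bin directory actually landed on PATH, using the paths from this guide as examples:

```shell
# Assumed path from the .bashrc above; adjust to your install.
SPARK_HOME=/usr/local/spark/spark-2.0.0-bin-hadoop2.6
PATH=${SPARK_HOME}/bin:${SPARK_HOME}/sbin:$PATH

# POSIX-portable membership test: wrap PATH in colons and match the entry.
case ":$PATH:" in
  *":${SPARK_HOME}/bin:"*) echo "spark bin on PATH" ;;
  *)                       echo "spark bin missing"  ;;
esac
```

After `source ~/.bashrc`, `which spark-submit` confirms the lookup on the real environment.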

spark-env.sh

export JAVA_HOME=/usr/java/jdk1.8.0
export SCALA_HOME=/usr/local/scala/scala-2.10.6
export HADOOP_HOME=/usr/local/hadoop/hadoop-2.6.4
export HADOOP_CONF_DIR=/usr/local/hadoop/hadoop-2.6.4/etc/hadoop
export SPARK_MASTER_IP=Master
export SPARK_WORKER_MEMORY=1g
export SPARK_EXECUTOR_MEMORY=1g
export SPARK_DRIVER_MEMORY=1g
export SPARK_WORKER_CORES=8
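A fresh Spark download ships only a template for this file; spark-env.sh itself has to be created in `$SPARK_HOME/conf`. A sketch, run inside that conf directory (the appended lines are a subset of the settings above):

```shell
# Create spark-env.sh from the shipped template (fall back to an empty
# file if the template is absent), then append this guide's settings.
cp spark-env.sh.template spark-env.sh 2>/dev/null || : > spark-env.sh
cat >> spark-env.sh <<'EOF'
export SPARK_MASTER_IP=Master
export SPARK_WORKER_CORES=8
EOF
grep -c '^export' spark-env.sh
```

The same file must then be distributed to every worker node, e.g. with `scp`.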

slaves

Worker1
Worker2
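The start scripts in `sbin/` ssh into every host listed in the slaves file, so the master needs passwordless SSH to each worker. A sketch, assuming the same user account exists on all nodes:

```shell
# One-time key setup on the Master (skip if a key already exists).
ssh-keygen -t rsa -N "" -f ~/.ssh/id_rsa
# Copy the public key to each worker named in the slaves file.
ssh-copy-id Worker1
ssh-copy-id Worker2
```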

spark-defaults.conf

spark.executor.extraJavaOptions  -XX:+PrintGCDetails -Dkey=value -Dnumbers="one two three"
spark.eventLog.enabled           true
spark.eventLog.dir               hdfs://Master:9000/historyserverforSpark
spark.yarn.historyServer.address Master:18080
spark.history.fs.logDirectory    hdfs://Master:9000/historyserverforSpark

hdfs dfs -rm -r /historyserverforSpark   # remove any stale log directory (ignore "No such file" on first run)
hdfs dfs -mkdir /historyserverforSpark
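With the HDFS log directory in place, the cluster and the history server can be started from the master. The script names below are the ones shipped in `$SPARK_HOME/sbin`; hostnames and paths are this guide's examples:

```shell
# Start the master plus every worker listed in conf/slaves.
$SPARK_HOME/sbin/start-all.sh
# Start the history server; it reads spark.history.fs.logDirectory
# from spark-defaults.conf and serves the UI on port 18080.
$SPARK_HOME/sbin/start-history-server.sh
# jps should now show Master / Worker / HistoryServer JVMs.
jps
```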

spark-submit \
  --class org.apache.spark.examples.SparkPi \
  --master spark://Master:7077 \
  ../examples/jars/spark-examples_2.11-2.0.0.jar 1000
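SparkPi estimates π by Monte Carlo sampling and prints a line like `Pi is roughly 3.14...` in the driver output. The same estimate in plain awk, as a sanity check on the value the job should print (the sample count and seed here are arbitrary):

```shell
awk 'BEGIN {
  srand(1); n = 100000; hits = 0
  # Sample points in the unit square; count those that fall
  # inside the quarter circle of radius 1.
  for (i = 0; i < n; i++) {
    x = rand(); y = rand()
    if (x*x + y*y <= 1) hits++
  }
  # Area ratio (quarter circle / square) = pi/4, so pi ~ 4 * hits/n.
  printf "Pi is roughly %f\n", 4 * hits / n
}'
```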

Date: 2024-10-06 16:42:56
