liukaiyi/spark-lucenerdd

spark-shell.sh 1.42 KB
Anastasios Zouzias, committed 2021-11-09 11:30: fix sbt assembly
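The launcher script below hard-codes a Spark distribution unpacked under the home directory. A hedged pre-flight check, using the same version and path the script assumes (not verified against any particular install), can warn early when that layout is missing:

```shell
#!/usr/bin/env bash
# Sketch: check for the Spark layout the launcher expects before starting it.
# SPARK_VERSION and the hadoop suffix mirror the script below; adjust to your install.
SPARK_VERSION="3.2.0"
SPARK_HOME="${HOME}/spark-${SPARK_VERSION}-bin-hadoop3.2"

if [ ! -x "${SPARK_HOME}/bin/spark-shell" ]; then
  # Warn rather than abort, since the path is only an assumption.
  echo "warning: spark-shell not found at ${SPARK_HOME}/bin/spark-shell" >&2
else
  echo "Found Spark at ${SPARK_HOME}"
fi
```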
#!/usr/bin/env bash
CURRENT_DIR=$(pwd)
# Read the version from version.sbt
SPARK_LUCENERDD_VERSION=$(awk '{print $5}' version.sbt | xargs)
# You should have downloaded this spark version under your ${HOME}
SPARK_VERSION="3.2.0"
echo "==============================================="
echo "Loading LuceneRDD with version ${SPARK_LUCENERDD_VERSION}"
echo "==============================================="
echo "==============================================="
echo "SPARK version: ${SPARK_VERSION}"
echo "==============================================="
# Assumes that Spark is installed under the home directory
#export SPARK_LOCAL_IP=localhost
SPARK_HOME="${HOME}/spark-${SPARK_VERSION}-bin-hadoop3.2"
# spark-lucenerdd assembly JAR
MAIN_JAR="${CURRENT_DIR}/target/scala-2.12/spark-lucenerdd-assembly-${SPARK_LUCENERDD_VERSION}.jar"
# Run spark shell locally
${SPARK_HOME}/bin/spark-shell --jars "${MAIN_JAR}" \
--conf "spark.executor.memory=1g" \
--conf "spark.driver.memory=1g" \
--conf "spark.rdd.compress=true" \
--conf "spark.serializer=org.apache.spark.serializer.KryoSerializer" \
--conf "spark.kryo.registrator=org.zouzias.spark.lucenerdd.LuceneRDDKryoRegistrator" \
--conf spark.executor.extraJavaOptions="-Dlucenerdd.index.store.mode=disk" \
--conf spark.driver.extraJavaOptions="-Dlucenerdd.index.store.mode=disk" \
--conf "spark.kryoserializer.buffer=24mb" \
--master local[*]
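The `awk '{print $5}' | xargs` pipeline near the top of the script assumes `version.sbt` uses the common sbt layout `version in ThisBuild := "x.y.z"`, where the quoted version string is the fifth whitespace-separated field and `xargs` strips the surrounding quotes. A minimal sketch of that extraction (the version value here is a made-up example, not the repo's actual version):

```shell
#!/usr/bin/env bash
# Hypothetical version.sbt line; the exact field layout is an assumption.
line='version in ThisBuild := "0.4.0"'

# Field 5 is the quoted version string; xargs strips the quotes.
version=$(echo "$line" | awk '{print $5}' | xargs)
echo "$version"   # prints 0.4.0
```

If `version.sbt` used a different layout (e.g. `ThisBuild / version := "x.y.z"`), the field index would change, so the extraction is tied to this one format.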