Yes, you can submit an app to a Spark standalone cluster with the spark-submit command, e.g.:
spark-submit --master spark://gzsw-02:7077 --class org.apache.spark.examples.JavaWordCount --verbose --deploy-mode client ~/spark/spark-1.4.1-bin-hadoop2.4/lib/spark-examples-1.4.1-hadoop2.4.0.jar spark/spark-1.4.1-bin-hadoop2.4/RELEASE,spark/spark-1.4.1-bin-hadoop2.4/README.md
Alternatively, you can run an app from inside spark-shell. First start the shell:
spark-shell --jars lib/spark-examples-1.4.1-hadoop2.4.0-my.jar --master spark://gzsw-02:7077 --executor-memory 600m --total-executor-cores 16
Note: the --total-executor-cores parameter is required here. Without it, you will get a java.lang.OutOfMemoryError: Java heap space right after this log line:
15/11/25 12:08:17 INFO SparkDeploySchedulerBackend: Registered executor: AkkaRpcEndpointRef(Actor[akka.tcp://sparkExecutor@192.168.100.13:56889/user/Executor#-1124709965]) with ID 0
After that happened, I checked the master UI and found the app's core count shown as the maximum integer value (2147483647), i.e. the app had requested an unbounded number of cores. Setting this parameter caps the cores actually used.
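As an alternative to passing --total-executor-cores on every launch, the same cap can be set once in spark-defaults.conf via the standalone-mode property spark.cores.max. A minimal sketch; the values are examples matching the command line above, not recommendations:

```
# spark-defaults.conf -- sketch for a standalone master
# Cap the total cores an application may claim across the cluster
# (the command-line equivalent is --total-executor-cores).
spark.cores.max          16
# Per-executor memory (the command-line equivalent is --executor-memory).
spark.executor.memory    600m
```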
Second, if you want to control how executors are spread across the workers, you can set this property in spark-defaults.conf:
spark.deploy.spreadOut=false
(But when launching with the spark-class command this has a side effect: it may cause the same "Java heap space" exception as above, so in that case it is better to comment it out.)
Then, inside the shell, import the entry class of your app:
import class.to.your.appentry
and invoke its entry method:
val arr = ... // the arguments for running this app
JavaWordCount.main(arr)
But in this demo the app creates a new SparkContext, which differs from the default one already created by spark-shell, so the launch fails with:
WARN SparkContext: Multiple running SparkContexts detected in the same JVM!
To allow this, add the property below to spark-defaults.conf:
spark.driver.allowMultipleContexts=true
After all that, you will see the app launch just as it does with the spark-submit command.
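Putting the shell steps together, here is a minimal sketch of the whole spark-shell session. It assumes the shell was started with the --jars and --total-executor-cores flags shown above; JavaWordCount and the file path are the examples from this post, and your own entry class and arguments will differ:

```scala
// Inside spark-shell (which already provides a SparkContext as `sc`).
// Import the entry class bundled in the jar passed via --jars:
import org.apache.spark.examples.JavaWordCount

// Build the argument array that the app's main() expects --
// for the word-count example, the input file(s):
val arr = Array("spark/spark-1.4.1-bin-hadoop2.4/README.md")

// Invoke the entry method. Because the app creates its own
// SparkContext next to the shell's, this only works with
// spark.driver.allowMultipleContexts=true set in spark-defaults.conf.
JavaWordCount.main(arr)
```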