1. About local mode
I once came across a post on a foreign website that I found very useful, so I am sharing it here.
It explained that Spark's local mode simply means the whole application runs in a single JVM on the local machine. Because the driver JVM is already running when your code executes, and there are no separate executor processes to launch, the spark.executor.memory and spark.driver.memory values you set in code do not change Spark's actual memory. After testing this myself, I confirmed that setting these two parameters in local mode has no effect at all.
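You can see this for yourself with the sketch below (my own example, not from the original post): it sets both parameters in code under local[*] and then prints the JVM's actual maximum heap, which stays at whatever -Xmx the JVM was launched with.

```scala
import org.apache.spark.sql.SparkSession

object LocalModeMemoryCheck {
  def main(args: Array[String]): Unit = {
    // In local mode the driver and "executor" share one already-running JVM,
    // so these settings cannot resize the heap at this point.
    val spark = SparkSession.builder()
      .master("local[*]")
      .appName("local-mode-memory-check")
      .config("spark.executor.memory", "8g") // ignored: no separate executor JVM exists
      .config("spark.driver.memory", "8g")   // too late: the driver JVM is already up
      .getOrCreate()

    // The real limit is whatever the JVM was started with (-Xmx / --driver-memory).
    val maxHeapMb = Runtime.getRuntime.maxMemory() / (1024 * 1024)
    println(s"Actual JVM max heap: $maxHeapMb MB")

    spark.stop()
  }
}
```

If you do need more memory in local mode, it has to be supplied before the JVM starts, for example via spark-submit --driver-memory 8g or an equivalent -Xmx setting, rather than through SparkConf in code.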
2. spark.memory.fraction and spark.memory.storageFraction
These two memory fractions are used when tuning Spark performance, and the official documentation describes them as follows:
spark.memory.fraction expresses the size of M as a fraction of the (JVM heap space - 300MB) (default 0.6). The rest of the space (40%) is reserved for user data structures, internal metadata in Spark, and safeguarding against OOM errors in the case of sparse and unusually large records.

spark.memory.storageFraction expresses the size of R as a fraction of M (default 0.5). R is the storage space within M where cached blocks immune to being evicted by execution.