hadoop: Why must HDFS be started before starting Spark/PySpark?
Also, running usr/spark/sbin/start-all.sh fails to start Spark. Could I try setting the following in spark-env.sh: export SPARK_MASTER_IP=127.0.0.1 and export SPARK_LOCAL_IP=127.0.0.1?
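
For reference, a minimal spark-env.sh sketch with the two settings from the question (assuming a single-node setup bound to localhost; the conf path is inferred from the install location mentioned above, and whether these values fix the startup failure depends on the actual error in the master/worker logs):

    # usr/spark/conf/spark-env.sh (path assumed from the question's install location)
    export SPARK_MASTER_IP=127.0.0.1   # address the standalone master binds to
    export SPARK_LOCAL_IP=127.0.0.1    # address this node binds to locally

Note that in Spark 2.0 and later, SPARK_MASTER_IP was deprecated in favor of SPARK_MASTER_HOST, so the variable name should match the Spark version in use.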