I am trying to run Spark 1.6.0 (spark-1.6.0-bin-hadoop2.4) on Mac OS Yosemite 10.10.5 using
"./bin/spark-shell".
I get the error below. I also tried installing different versions of Spark, but they all fail with the same error. This is the second time I have run Spark; my previous run worked fine.
log4j:WARN No appenders could be found for logger (org.apache.hadoop.metrics2.lib.MutableMetricsFactory).
log4j:WARN Please initialize the log4j system properly.
log4j:WARN See http://logging.apache.org/log4j/1.2/faq.html#noconfig for more info.
Using Spark's repl log4j profile: org/apache/spark/log4j-defaults-repl.properties
To adjust logging level use sc.setLogLevel("INFO")
Welcome to
____ __
/ __/__ ___ _____/ /__
_\ \/ _ \/ _ `/ __/ '_/
/___/ .__/\_,_/_/ /_/\_\ version 1.6.0
/_/
Using Scala version 2.10.5 (Java HotSpot(TM) 64-Bit Server VM, Java 1.7.0_79)
Type in expressions to have them evaluated.
Type :help for more information.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 WARN Utils: Service 'sparkDriver' could not bind on port 0. Attempting port 1.
16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
at sun.nio.ch.Net.bind0(Native Method)
at sun.nio.ch.Net.bind(Net.java:444)
at sun.nio.ch.Net.bind(Net.java:436)
at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
at java.lang.Thread.run(Thread.java:745)
java.lang.NullPointerException
at org.apache.spark.sql.SQLContext$.createListenerAndUI(SQLContext.scala:1367)
at org.apache.spark.sql.hive.HiveContext.<init>(HiveContext.scala:101)
at sun.reflect.NativeConstructorAccessorImpl.newInstance0(Native Method)
at sun.reflect.NativeConstructorAccessorImpl.newInstance(NativeConstructorAccessorImpl.java:57)
at sun.reflect.DelegatingConstructorAccessorImpl.newInstance(DelegatingConstructorAccessorImpl.java:45)
at java.lang.reflect.Constructor.newInstance(Constructor.java:526)
at org.apache.spark.repl.SparkILoop.createSQLContext(SparkILoop.scala:1028)
at $iwC$$iwC.<init>(<console>:15)
at $iwC.<init>(<console>:24)
at <init>(<console>:26)
at .<init>(<console>:30)
at .<clinit>(<console>)
at .<init>(<console>:7)
at .<clinit>(<console>)
at $print(<console>)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.repl.SparkIMain$ReadEvalPrint.call(SparkIMain.scala:1065)
at org.apache.spark.repl.SparkIMain$Request.loadAndRun(SparkIMain.scala:1346)
at org.apache.spark.repl.SparkIMain.loadAndRunReq$1(SparkIMain.scala:840)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:871)
at org.apache.spark.repl.SparkIMain.interpret(SparkIMain.scala:819)
at org.apache.spark.repl.SparkILoop.reallyInterpret$1(SparkILoop.scala:857)
at org.apache.spark.repl.SparkILoop.interpretStartingWith(SparkILoop.scala:902)
at org.apache.spark.repl.SparkILoop.command(SparkILoop.scala:814)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:132)
at org.apache.spark.repl.SparkILoopInit$$anonfun$initializeSpark$1.apply(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkIMain.beQuietDuring(SparkIMain.scala:324)
at org.apache.spark.repl.SparkILoopInit$class.initializeSpark(SparkILoopInit.scala:124)
at org.apache.spark.repl.SparkILoop.initializeSpark(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1$$anonfun$apply$mcZ$sp$5.apply$mcV$sp(SparkILoop.scala:974)
at org.apache.spark.repl.SparkILoopInit$class.runThunks(SparkILoopInit.scala:159)
at org.apache.spark.repl.SparkILoop.runThunks(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoopInit$class.postInitialization(SparkILoopInit.scala:108)
at org.apache.spark.repl.SparkILoop.postInitialization(SparkILoop.scala:64)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply$mcZ$sp(SparkILoop.scala:991)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop$$anonfun$org$apache$spark$repl$SparkILoop$$process$1.apply(SparkILoop.scala:945)
at scala.tools.nsc.util.ScalaClassLoader$.savingContextLoader(ScalaClassLoader.scala:135)
at org.apache.spark.repl.SparkILoop.org$apache$spark$repl$SparkILoop$$process(SparkILoop.scala:945)
at org.apache.spark.repl.SparkILoop.process(SparkILoop.scala:1059)
at org.apache.spark.repl.Main$.main(Main.scala:31)
at org.apache.spark.repl.Main.main(Main.scala)
at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:57)
at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
at java.lang.reflect.Method.invoke(Method.java:606)
at org.apache.spark.deploy.SparkSubmit$.org$apache$spark$deploy$SparkSubmit$$runMain(SparkSubmit.scala:731)
at org.apache.spark.deploy.SparkSubmit$.doRunMain$1(SparkSubmit.scala:181)
at org.apache.spark.deploy.SparkSubmit$.submit(SparkSubmit.scala:206)
at org.apache.spark.deploy.SparkSubmit$.main(SparkSubmit.scala:121)
at org.apache.spark.deploy.SparkSubmit.main(SparkSubmit.scala)
<console>:16: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:16: error: not found: value sqlContext
import sqlContext.sql
Then I added
export SPARK_LOCAL_IP="127.0.0.1"
to spark-env.sh, and the error changed to:
ERROR : No route to host
java.net.ConnectException: No route to host
at java.net.Inet6AddressImpl.isReachable0(Native Method)
at java.net.Inet6AddressImpl.isReachable(Inet6AddressImpl.java:77)
at java.net.InetAddress.isReachable(InetAddress.java:475)
...
<console>:10: error: not found: value sqlContext
import sqlContext.implicits._
^
<console>:10: error: not found: value sqlContext
import sqlContext.sql
I also tried:
bin/spark-shell --driver-java-options "-Djava.net.preferIPv4Stack=true"
which produced:
16/01/05 10:29:39 ERROR : No route to host
16/01/05 10:29:39 ERROR : No route to host
java.net.ConnectException: No route to host
at java.net.Inet4AddressImpl.isReachable0(Native Method)
at java.net.Inet4AddressImpl.isReachable(Inet4AddressImpl.java:70)
at java.net.InetAddress.isReachable(InetAddress.java:475)
at java.net.InetAddress.isReachable(InetAddress.java:434)
at tachyon.util.NetworkUtils.getLocalIpAddress(NetworkUtils.java:122)
at tachyon.util.NetworkUtils.getLocalHostName(NetworkUtils.java:78)
...
ping 127.0.0.1 works, for what it is worth.
Answers:
The following steps might help:
1. Get your hostname with the hostname command.
2. If there is no entry for your hostname in /etc/hosts, add one like this:
   127.0.0.1    <your-hostname>
Hope this helps!
To verify, run ping $(hostname); it should resolve to 127.0.0.1.

I always get this error when switching between networks. This solves it:
$ sudo hostname -s 127.0.0.1
Or:
$ sudo hostname -b 127.0.0.1
I built Spark from the current master branch, version 2.0.0-SNAPSHOT. After adding
export SPARK_LOCAL_IP="127.0.0.1"
to load-spark-env.sh, it worked for me. I am on Mac OS 10.10.5, so could it be a version issue?
Setting the variable for a single command also works:
$ SPARK_LOCAL_IP="127.0.0.1" gradle my-gradle-task-using-local-spark
The problem shows up when using a VPN. I am on Mac OS 10.11.1.

Just set spark.driver.host to your localhost if you use an IDE.

You can set spark.driver.host directly in code, as opposed to tinkering with $SPARK_HOME/conf/spark-defaults.conf.
There are two errors here, I think: the driver cannot bind to an address (the repeated BindException), and, as a consequence, sqlContext is never created.
For the first, I tried several things, but none of them worked. Then I tried the following and it worked:
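Something like this, in Scala (a sketch; the master setting and app name are illustrative, not taken from the original answer):

import org.apache.spark.{SparkConf, SparkContext}

// Pin the driver host to loopback so Spark does not try to resolve the
// machine's (possibly unroutable) hostname.
val conf = new SparkConf()
  .setMaster("local[2]")
  .setAppName("LocalSparkApp") // illustrative name
  .set("spark.driver.host", "localhost")
val sc = new SparkContext(conf)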
For the second, you can create the sqlContext yourself and then import sqlContext.implicits._, as sketched below.
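A sketch, assuming a Spark version with SparkSession (2.x); the master setting is illustrative:

import org.apache.spark.sql.SparkSession

// Build (or reuse) a session on a local master and expose its SQLContext
// so the usual implicits import compiles.
val sqlContext = SparkSession.builder
  .master("local[*]")
  .getOrCreate()
  .sqlContext

import sqlContext.implicits._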
The SparkSession builder automatically reuses the existing SparkContext if there is one; otherwise it creates one. You can also create the two explicitly if you need to.
If you do not want to change your Mac's hostname, you can do the following:
1. Find spark-env.sh.template on your machine (it is probably in /usr/local/Cellar/apache-spark/2.1.0/libexec/conf/).
2. cp spark-env.sh.template spark-env.sh
3. Add export SPARK_LOCAL_IP=127.0.0.1 under the comment for the local IP.
4. Start spark-shell and enjoy.
If you use Scala to run your code in an IDE, face the same issue, and are not using SparkConf() as pointed out above but SparkSession(), then you can bind the localhost address as follows, because set only works on SparkConf(). Use .config() to set the Spark configuration, as shown below:
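A sketch, assuming Spark 2.1+ (where spark.driver.bindAddress is available); the app name is illustrative:

import org.apache.spark.sql.SparkSession

// Bind the driver explicitly to the loopback address via .config().
val spark = SparkSession.builder()
  .appName("LocalSparkApp") // illustrative name
  .master("local[*]")
  .config("spark.driver.bindAddress", "127.0.0.1")
  .getOrCreate()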
export SPARK_LOCAL_IP=127.0.0.1 (for Mac, in .bash_profile)
Sometimes a firewall prevents a socket from being created and bound. Make sure your firewall is turned off, check your machine's IP in /etc/hosts, make sure it looks right, and then try again.
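On macOS, one way to check the application firewall state from a shell:

# Prints whether the macOS application firewall is enabled or disabled.
sudo /usr/libexec/ApplicationFirewall/socketfilterfw --getglobalstate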
This happens when you switch between different networks (VPN - PROD, CI, based on your company's networks, to access different environments).

I ran into the same issue every time I switched VPNs.

Update /etc/hosts (with sudo) with your Mac's current hostname value.
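For example, one way to add such an entry (inspect /etc/hosts first to avoid duplicates):

# Map the current hostname to loopback; rerun after the hostname changes.
sudo sh -c 'echo "127.0.0.1 $(hostname)" >> /etc/hosts'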
Just doing the above worked for me.
On a Mac, check the IP under System Preferences -> Network -> click the Wifi you are connected to (it will show a green icon) -> check the IP right above your network name.
Create the following entry in ../conf/spark-env.sh:
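For example (the IP below is a placeholder for the address found above):

export SPARK_LOCAL_IP=192.168.0.101   # replace with the IP from System Preferences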
and then try spark-shell. Making the above changes worked for me.