Error initializing SparkContext with java.net.BindException


java.net.BindException is a common exception when Spark tries to initialize the SparkContext. It is especially common when you run Spark locally.

16/01/04 13:49:40 ERROR SparkContext: Error initializing SparkContext.

java.net.BindException: Can't assign requested address: Service 'sparkDriver' failed after 16 retries!
    at sun.nio.ch.Net.bind0(Native Method)
    at sun.nio.ch.Net.bind(Net.java:444)
    at sun.nio.ch.Net.bind(Net.java:436)
    at sun.nio.ch.ServerSocketChannelImpl.bind(ServerSocketChannelImpl.java:214)
    at sun.nio.ch.ServerSocketAdaptor.bind(ServerSocketAdaptor.java:74)
    at io.netty.channel.socket.nio.NioServerSocketChannel.doBind(NioServerSocketChannel.java:125)
    at io.netty.channel.AbstractChannel$AbstractUnsafe.bind(AbstractChannel.java:485)
    at io.netty.channel.DefaultChannelPipeline$HeadContext.bind(DefaultChannelPipeline.java:1089)
    at io.netty.channel.AbstractChannelHandlerContext.invokeBind(AbstractChannelHandlerContext.java:430)
    at io.netty.channel.AbstractChannelHandlerContext.bind(AbstractChannelHandlerContext.java:415)
    at io.netty.channel.DefaultChannelPipeline.bind(DefaultChannelPipeline.java:903)
    at io.netty.channel.AbstractChannel.bind(AbstractChannel.java:198)
    at io.netty.bootstrap.AbstractBootstrap$2.run(AbstractBootstrap.java:348)
    at io.netty.util.concurrent.SingleThreadEventExecutor.runAllTasks(SingleThreadEventExecutor.java:357)
    at io.netty.channel.nio.NioEventLoop.run(NioEventLoop.java:357)
    at io.netty.util.concurrent.SingleThreadEventExecutor$2.run(SingleThreadEventExecutor.java:111)
    at java.lang.Thread.run(Thread.java:745)


Reason

The most common reason is that Spark cannot bind the sparkDriver service to your computer's address: it looks up your machine's hostname, and when that name does not resolve to an address, the bind fails.
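You can check whether this is the problem before changing anything. The sketch below, assuming a Linux machine with the getent utility available, reports whether your current hostname resolves to an address:

```shell
# Diagnostic sketch: does this machine's hostname resolve to an IP?
# Assumes a Linux/Unix host with getent available.
host="$(hostname)"
if getent hosts "$host" > /dev/null; then
  echo "$host resolves - the bind failure is likely caused by something else"
else
  echo "$host does not resolve - add it to /etc/hosts"
fi
```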

Solution

Find the hostname of your computer and add it to /etc/hosts.

Find hostname

The hostname command prints your computer's hostname:

[osboxes@wk1 ~]$ hostname
wk1.hirw.com

Add hostname to hosts file

Add an entry to your /etc/hosts file like the one below:

[osboxes@wk1 ~]$ cat /etc/hosts

127.0.0.1   wk1.hirw.com
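After saving the file, you can confirm the new entry is picked up. This is a sketch assuming a Linux box with getent; wk1.hirw.com is the example hostname used above:

```shell
# Verify the /etc/hosts entry (wk1.hirw.com is the example hostname from this post).
# getent consults /etc/hosts first under the default nsswitch configuration.
getent hosts wk1.hirw.com || echo "wk1.hirw.com does not resolve yet"
```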

If you are using Windows, the hosts file is located under C:\Windows\System32\drivers\etc

With this entry in place, when Spark looks up your hostname it resolves to 127.0.0.1, and the driver is able to bind to that address.
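If editing /etc/hosts is not an option, Spark can also be told explicitly which address to bind to instead of relying on hostname lookup. A sketch of the two common routes (SPARK_LOCAL_IP is honored by Spark's launch scripts; spark.driver.bindAddress is available in Spark 2.1 and later):

```shell
# Option 1: environment variable picked up by Spark's launch scripts
export SPARK_LOCAL_IP=127.0.0.1

# Option 2: per-job configuration (Spark 2.1+), shown here as a comment
# spark-submit --conf spark.driver.bindAddress=127.0.0.1 your_app.py

echo "SPARK_LOCAL_IP=$SPARK_LOCAL_IP"
```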
