
SAC doesn't run with livy on zeppelin but works fine on pyspark-shell on master node #308

Open
pulkitkalia1994 opened this issue Oct 28, 2020 · 1 comment


@pulkitkalia1994

Hi Team,
I have installed and configured SAC on the master node, and it works fine with the pyspark shell. However, when I run the same code through Livy on Zeppelin, I get the error below. (I have also checked running via Livy without SAC configured, and that works fine.)

20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/livy/rsc-jars/netty-all-4.1.17.Final.jar -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/netty-all-4.1.17.Final.jar
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/livy/rsc-jars/livy-rsc-0.6.0-incubating.jar -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/livy-rsc-0.6.0-incubating.jar
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/livy/repl_2.11-jars/commons-codec-1.9.jar -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/commons-codec-1.9.jar
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/livy/repl_2.11-jars/livy-core_2.11-0.6.0-incubating.jar -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/livy-core_2.11-0.6.0-incubating.jar
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/livy/repl_2.11-jars/livy-repl_2.11-0.6.0-incubating.jar -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/livy-repl_2.11-0.6.0-incubating.jar
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/spark/conf/atlas-application.properties -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/atlas-application.properties
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/spark/R/lib/sparkr.zip#sparkr -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/sparkr.zip
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/spark/python/lib/pyspark.zip -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/pyspark.zip
20/10/28 02:57:27 INFO Client: Uploading resource file:/usr/lib/spark/python/lib/py4j-0.10.7-src.zip -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/py4j-0.10.7-src.zip
20/10/28 02:57:27 WARN Client: Same name resource file:///usr/lib/spark/python/lib/pyspark.zip added multiple times to distributed cache
20/10/28 02:57:27 WARN Client: Same name resource file:///usr/lib/spark/python/lib/py4j-0.10.7-src.zip added multiple times to distributed cache
20/10/28 02:57:27 INFO Client: Uploading resource file:/mnt/tmp/spark-7b32ab91-1a6d-4470-9038-631a08b155a4/__spark_conf__3554264488014972258.zip -> hdfs://ip-172-26-51-231.us-west-2.compute.internal:8020/user/ff08/.sparkStaging/application_1603849413644_0050/spark_conf.zip
20/10/28 02:57:27 INFO SecurityManager: Changing view acls to: livy,ff08
20/10/28 02:57:27 INFO SecurityManager: Changing modify acls to: livy,ff08
20/10/28 02:57:27 INFO SecurityManager: Changing view acls groups to:
20/10/28 02:57:27 INFO SecurityManager: Changing modify acls groups to:
20/10/28 02:57:27 INFO SecurityManager: SecurityManager: authentication disabled; ui acls disabled; users with view permissions: Set(livy, ff08); groups with view permissions: Set(); users with modify permissions: Set(livy, ff08); groups with modify permissions: Set()
20/10/28 02:57:28 INFO Client: Submitting application application_1603849413644_0050 to ResourceManager
20/10/28 02:57:28 INFO YarnClientImpl: Submitted application application_1603849413644_0050
20/10/28 02:57:28 INFO Client: Application report for application_1603849413644_0050 (state: ACCEPTED)
20/10/28 02:57:28 INFO Client:
client token: N/A
diagnostics: [Wed Oct 28 02:57:28 +0000 2020] Application is Activated, waiting for resources to be assigned for AM. Details : AM Partition = CORE ; Partition Resource = <memory:491520, vCores:64> ; Queue's Absolute capacity = 100.0 % ; Queue's Absolute used capacity = 0.0 % ; Queue's Absolute max capacity = 100.0 % ;
ApplicationMaster host: N/A
ApplicationMaster RPC port: -1
queue: default
start time: 1603853848513
final status: UNDEFINED
tracking URL: http://ip-172-26-51-231.us-west-2.compute.internal:20888/proxy/application_1603849413644_0050/
user: ff08
20/10/28 02:57:28 INFO ShutdownHookManager: Shutdown hook called
20/10/28 02:57:28 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-7b32ab91-1a6d-4470-9038-631a08b155a4
20/10/28 02:57:28 INFO ShutdownHookManager: Deleting directory /mnt/tmp/spark-817fe12d-4410-4781-a29f-affd97ce07f1

stderr:
SLF4J: Class path contains multiple SLF4J bindings.
SLF4J: Found binding in [jar:file:/usr/lib/spark/spark-atlas-connector-assembly-0.1.0-SNAPSHOT.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: Found binding in [jar:file:/usr/lib/spark/jars/slf4j-log4j12-1.7.16.jar!/org/slf4j/impl/StaticLoggerBinder.class]
SLF4J: See http://www.slf4j.org/codes.html#multiple_bindings for an explanation.
SLF4J: Actual binding is of type [org.slf4j.impl.Log4jLoggerFactory]

YARN Diagnostics:
Application application_1603849413644_0050 was killed by user livy at 172.26.51.231
at org.apache.zeppelin.livy.BaseLivyInterpreter.createSession(BaseLivyInterpreter.java:360)
at org.apache.zeppelin.livy.BaseLivyInterpreter.initLivySession(BaseLivyInterpreter.java:210)
at org.apache.zeppelin.livy.LivySharedInterpreter.open(LivySharedInterpreter.java:59)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.livy.BaseLivyInterpreter.getLivySharedInterpreter(BaseLivyInterpreter.java:191)
at org.apache.zeppelin.livy.BaseLivyInterpreter.open(BaseLivyInterpreter.java:164)
at org.apache.zeppelin.interpreter.LazyOpenInterpreter.open(LazyOpenInterpreter.java:69)
at org.apache.zeppelin.interpreter.remote.RemoteInterpreterServer$InterpretJob.jobRun(RemoteInterpreterServer.java:616)
at org.apache.zeppelin.scheduler.Job.run(Job.java:188)
at org.apache.zeppelin.scheduler.FIFOScheduler$1.run(FIFOScheduler.java:140)
at java.util.concurrent.Executors$RunnableAdapter.call(Executors.java:511)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.access$201(ScheduledThreadPoolExecutor.java:180)
at java.util.concurrent.ScheduledThreadPoolExecutor$ScheduledFutureTask.run(ScheduledThreadPoolExecutor.java:293)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1149)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:624)
at java.lang.Thread.run(Thread.java:748)

Is there anything that I am missing?
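(Editor's note: for context, registering SAC on the master node typically means adding its listener classes to the Spark configuration. The fragment below is a hedged sketch following the spark-atlas-connector README, not the reporter's verified configuration; class names may differ per build.)

```properties
# Sketch of a typical SAC registration in /usr/lib/spark/conf/spark-defaults.conf
# (class names per the spark-atlas-connector README; verify against your build)
spark.extraListeners                        com.hortonworks.spark.atlas.SparkAtlasEventTracker
spark.sql.queryExecutionListeners           com.hortonworks.spark.atlas.SparkAtlasEventTracker
spark.sql.streaming.streamingQueryListeners com.hortonworks.spark.atlas.SparkAtlasStreamingQueryEventTracker
```

Settings like these take effect only where the referenced classes and atlas-application.properties are actually present on the classpath, which is relevant to the resolution below.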

@pulkitkalia1994 (Author)

I was able to resolve it by copying the SAC uber jar and atlas-application.properties to all the core nodes.
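(Editor's note: an alternative to copying files onto every core node by hand is to let YARN distribute them with each application. A hedged sketch, using the paths that appear in the log above; this could go in livy.conf defaults or the session's Spark conf, and is not something the reporter verified.)

```properties
# Sketch: ship the SAC assembly jar and the Atlas client config with the
# application, so YARN localizes them on whichever core node runs the
# driver/executors. Paths taken from the log above; adjust for your cluster.
spark.jars   /usr/lib/spark/spark-atlas-connector-assembly-0.1.0-SNAPSHOT.jar
spark.files  /usr/lib/spark/conf/atlas-application.properties
```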
