
env: jupyter: No such file or directory #11

Open
tongokongo opened this issue Oct 7, 2019 · 6 comments

Comments

@tongokongo

Working on the Cloudera VM for Windows, course 3, week 1, trying to set up Spark with Jupyter notebooks.

After installing the scripts and, supposedly, Anaconda with Jupyter support (./setup.sh), and after sourcing .bashrc, I get this error when trying to run pyspark:
env: jupyter: No such file or directory

I tried installing some components manually, but it didn't help. Does anyone know how to get through this step?
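A quick way to see what the error means: pyspark launches jupyter as its driver (typically via a PYSPARK_DRIVER_PYTHON setting in ~/.bashrc; that variable name is an assumption about the course setup), and `env` cannot find a jupyter executable on the PATH. A minimal diagnostic:

```shell
# "env: jupyter: No such file or directory" means `env` was asked to run
# `jupyter` but no such executable exists anywhere on PATH. Check explicitly:
if command -v jupyter >/dev/null 2>&1; then
  echo "jupyter found at: $(command -v jupyter)"
else
  echo "jupyter not found on PATH"
fi
```

If the second branch prints, the Anaconda install from setup.sh did not actually complete.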

@TheInterpreter

I had the same issue. I managed to get the installation to the point where pyspark actually runs, but not Anaconda. There's nothing on localhost...

@cchnoe

cchnoe commented Jan 29, 2020

Did you find any solution?

@iam01000

It seems the setup.sh script has certificate errors, so Anaconda does not install properly.
I manually installed Anaconda, starting with this command:

wget http://repo.continuum.io/archive/Anaconda3-4.0.0-Linux-x86_64.sh --no-check-certificate

After that, open a new terminal and then run pyspark.
Voilà!
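A sketch of the full manual sequence that comment implies. The wget line is as given above; the `-b`/`-p` batch-install flags, the install prefix, and the PATH line are my assumptions, not part of the original comment:

```shell
# One-time install steps (shown as comments, since they download a large
# installer and modify the home directory):
#
#   wget http://repo.continuum.io/archive/Anaconda3-4.0.0-Linux-x86_64.sh --no-check-certificate
#   bash Anaconda3-4.0.0-Linux-x86_64.sh -b -p "$HOME/anaconda3"
#   export PATH="$HOME/anaconda3/bin:$PATH"   # or: source ~/.bashrc
#
# A new terminal should then be able to resolve jupyter:
command -v jupyter || echo "jupyter still not on PATH"
```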

@arkadipb

arkadipb commented May 13, 2020

No solution even after passing --no-check-certificate and going through the installation steps; there seem to be errors with the software versions in the VM.

Instead, you can install Spark locally on your system, download the .csv.gz files used in the hands-on exercises from GitHub, unzip them, and read them as dataframes from CSV. Then carry on with the work.

Hope this helps.
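The local route above can be sketched as follows. `sample-data.csv.gz` is a stand-in name, not one of the actual course files; a tiny gzipped CSV is created first so the steps are runnable end-to-end:

```shell
# Create a small gzipped CSV standing in for a downloaded course file:
printf 'id,value\n1,10\n2,20\n' | gzip > sample-data.csv.gz

# Unzip it (zcat leaves the original .gz in place):
zcat sample-data.csv.gz > sample-data.csv
head -n 1 sample-data.csv

# With a local Spark install, the dataframe read would look roughly like:
#   >>> df = spark.read.csv("sample-data.csv", header=True, inferSchema=True)
#   >>> df.show()
```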

@enaurt

enaurt commented Jun 24, 2020

If it doesn't work after passing --no-check-certificate, then you should delete your Cloudera VM, download it again, and start over. Make sure to manually install Anaconda with the command above before running ./setup.sh.
It should run this time.

@Nishtha1007

Nishtha1007 commented Feb 27, 2022

Seems the setup.sh script has certificate errors, and anaconda does not install properly. I manually installed anaconda by this command

wget http://repo.continuum.io/archive/Anaconda3-4.0.0-Linux-x86_64.sh --no-check-certificate

After that, open a new terminal and then run pyspark Voila~~

--Hey, thanks, this approach worked for me.
I was getting an OpenSSL error when calling the wget command, so
I downloaded the Anaconda .sh file from the repo URL you mentioned, placed it in the big-data-3 folder, changed its permissions to make it executable, and ran it.
https://www.cyberciti.biz/faq/run-execute-sh-shell-script/

Now everything works fine. Thanks a ton!!
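The chmod-and-run step described above can be demonstrated with a stand-in script (the real target is the downloaded Anaconda .sh installer; `installer-demo.sh` here is just an illustration so the permission steps are runnable):

```shell
# Stand-in for the downloaded installer script:
printf '#!/bin/sh\necho "installer would run here"\n' > installer-demo.sh

# Make it executable, then run it from the current folder:
chmod +x installer-demo.sh
./installer-demo.sh
```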
