
Exploring Data with Apache Pig from the Grunt shell

Contributor

Hello friends -

I seem to be stuck once again as I work my way through the Hortonworks Tutorials. I am currently on "Exploring Data with Apache Pig from the Grunt shell" and stuck at the step "DUMP Movies;"

I seem to be stuck in a loop that is waiting on "sandbox.hortonworks.com/10.0.2.15:10020. Already tried 38 times ............"

I have attached a screenshot for clarification: 1967-virtualbox-hortonworks-sandbox-with-hdp-232-09-02.png

There was one funky twist in the instructions: the tutorial calls for loading movies.txt into /user/hadoop, but several comments suggested this was a typo and that the location /user/hue should be used instead. That is the only variation on the instructions I have knowingly made.
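For reference, the statements I am running look roughly like this (the LOAD path follows the comments, and the schema is reproduced from memory, so treat the field names as approximate):

Movies = LOAD '/user/hue/movies.txt' USING PigStorage(',') AS (id:int, name:chararray, year:int, rating:double, duration:int);

DUMP Movies;

It is the DUMP step that sits in the retry loop.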

Any assistance, as always, is greatly appreciated.

Mike

1 ACCEPTED SOLUTION

Master Mentor

@Mike Vogt How much memory have you allocated to the Sandbox? Make sure that you shut down HBase and any other components that you don't need.

Can you share a screenshot of Ambari?
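If you prefer the command line to the Ambari UI, services can also be stopped through Ambari's REST API. A minimal sketch, assuming the sandbox's default admin/admin credentials and the default cluster name "Sandbox":

curl -u admin:admin -H 'X-Requested-By: ambari' -X PUT -d '{"RequestInfo":{"context":"Stop HBase"},"Body":{"ServiceInfo":{"state":"INSTALLED"}}}' http://sandbox.hortonworks.com:8080/api/v1/clusters/Sandbox/services/HBASE

Setting the desired state to INSTALLED tells Ambari to stop the service; setting it back to STARTED starts it again.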


6 REPLIES


Master Mentor

@Mike Vogt Make sure HDFS, MapReduce, and YARN are running.

Port 10020 belongs to the MapReduce Job History Server.
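If you have shell access to the sandbox, a quick sanity check (sketched here, assuming netstat is available) is to see whether anything is listening on that port:

netstat -tlnp | grep 10020

If nothing is listening, restart MapReduce2 (which includes the Job History Server) from Ambari.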

Master Mentor

@Mike Vogt The port seems odd. What user are you running the tutorial as? I suggest creating an HDFS home directory for the user you are running as. So, if you are root:

sudo -u hdfs hdfs dfs -mkdir /user/root

sudo -u hdfs hdfs dfs -chown -R root /user/root

Then

hdfs dfs -put movies.txt ./
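As a quick sanity check (not part of the original steps), you can then confirm the home directory and the upload:

hdfs dfs -ls /user/root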

Contributor

Hi Artem -

I appreciate you looking into this. I checked the directories and permissions and they were all in order.

The issue turned out to be YARN and MapReduce needing restarts. So I restarted them, and got through the immediate crisis.

But I do appreciate you taking a look nonetheless.

Cheers,

Mike

Contributor

Aha! I checked MapReduce and YARN, and they both required restarts. Completed the restarts, and I am back on track.

Thanks again for your patience and guidance.

Master Mentor

@Mike Vogt Thanks for following up!!! 🙂