Support Questions


How to import Pandas and NumPy in Livy2

Explorer

Hi,

 

Can someone help me with importing pandas and NumPy in Livy2?

 

I'm using Zeppelin with the Livy2 interpreter. In a Livy notebook I run the following:

 

%pyspark

import pandas as pd

 

I get the following error:

"No module found"

 

If I run the same code with the pyspark interpreter it works fine and I don't have any issues. I need some help with this.

 

Thanks

Sambasivam.


1 ACCEPTED SOLUTION

Rising Star

Hi @Sambavi ,

You can install the required dependencies on all nodes and use them, but keep in mind that Pandas and NumPy don't provide distributed computing, so they won't work with big data sets.
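For example, once the packages are installed on every node, a quick sanity check from the notebook can confirm the Livy session actually sees them (a minimal sketch, assuming the install went into the same Python that the Livy/YARN containers run):

%pyspark

import sys
print(sys.executable)  # which Python the Livy session is running

import pandas as pd    # conventional aliases
import numpy as np
print(pd.__version__, np.__version__)

If sys.executable points at a different Python than the one you installed into, that mismatch is what produces the "No module found" error.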

If your Zeppelin is configured to use YARN cluster mode, it will pull all the data to the Spark driver, on whichever data node the driver is located, and try to process it there. (If it's not a big data set, you can increase the driver resources and it will work, but that doesn't look like a real solution.)

If you use client mode, it will pull everything onto the Zeppelin node.
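To make the driver-memory point concrete: any toPandas() call collects the whole distributed DataFrame into the driver's memory, wherever that driver happens to run (a worker node in cluster mode, the Zeppelin host in client mode). A minimal sketch:

%pyspark

# spark is the SparkSession that Zeppelin provides to the session
spark_df = spark.range(0, 1000)

# toPandas() pulls every row to the driver; fine for this small demo,
# but this is exactly what exhausts driver memory on a big data set
local_pdf = spark_df.toPandas()
print(type(local_pdf), len(local_pdf))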

 

I recommend trying HandySpark: https://github.com/dvgodoy/handyspark
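Based on the examples in the HandySpark README, usage looks roughly like this (a sketch; the file path and column name are hypothetical placeholders):

%pyspark

from handyspark import *  # adds a toHandy() method to Spark DataFrames

sdf = spark.read.csv("/some/path.csv", header=True, inferSchema=True)  # hypothetical path
hdf = sdf.toHandy()       # pandas-like wrapper, the data stays distributed

# cols gives a pandas-style view; only the requested rows reach the driver
print(hdf.cols["some_column"][:5])  # hypothetical column name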


2 REPLIES


Explorer

Thanks,

 

I did try this and it worked out fine.

 

Thanks!