Is it possible to create a PySpark DataFrame from an external data source?
DataFrames in PySpark are distributed collections of data organized into named columns, and they can be processed across different machines. They can be created from external databases, structured data files, or existing resilient distributed datasets (RDDs). A sketch of these three creation paths is shown below.
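A minimal sketch of the three creation paths mentioned above; the file path, table name, JDBC URL, and credentials are placeholders, not real resources.

```python
from pyspark.sql import Row, SparkSession

spark = SparkSession.builder.appName("DataFrameSources").getOrCreate()

# 1. From a structured data file (CSV shown here; JSON/Parquet work the same way).
csv_df = spark.read.csv("path/to/data.csv", header=True, inferSchema=True)

# 2. From an existing RDD of Row objects.
rdd = spark.sparkContext.parallelize(
    [Row(name="Alice", age=30), Row(name="Bob", age=25)]
)
rdd_df = spark.createDataFrame(rdd)

# 3. From an external database over JDBC (driver, URL, and credentials are assumed).
jdbc_df = (
    spark.read.format("jdbc")
    .option("url", "jdbc:postgresql://localhost:5432/mydb")
    .option("dbtable", "public.customers")
    .option("user", "user")
    .option("password", "password")
    .load()
)

csv_df.printSchema()
rdd_df.show()
```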
Yes, it is possible to create a PySpark DataFrame from an external data source.
Yes