
PySpark - Convert a Python Array or List to a Spark DataFrame

In Spark, the function SparkContext.parallelize can be used to convert a Python list into an RDD, which can then be converted into a DataFrame. The sample code below is based on Spark 2.x. In this article, I will show how to convert the following list into a DataFrame: data = [('Category A', 100, "This is category A"), ('Category B', 120, "This is category B"), ('Category C', 150, "This is category C")] ...
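
A minimal sketch of that two-step flow, assuming a local Spark 2.x session; the column names ("category", "count", "description") are illustrative choices, not taken from the article:

from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("list-to-dataframe").getOrCreate()

data = [('Category A', 100, "This is category A"),
        ('Category B', 120, "This is category B"),
        ('Category C', 150, "This is category C")]

# Convert the Python list to an RDD, then to a DataFrame with named columns.
rdd = spark.sparkContext.parallelize(data)
df = spark.createDataFrame(rdd, ["category", "count", "description"])
df.show()

Note that spark.createDataFrame can also accept the Python list directly, skipping the explicit parallelize call; the RDD route above simply mirrors the two steps described in the text.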