
Pickle files are commonly used in Python data-related projects. This article shows how to create and load pickle files using Pandas.

Create pickle file

import pandas as pd
import numpy as np

file_name = "data/test.pkl"
# Generate 1000 rows of random sample data with two columns.
data = np.random.randn(1000, 2)
# Uncomment to print all rows instead of a truncated preview.
# pd.set_option('display.max_rows', None)
df = pd.DataFrame(data=data, columns=['foo', 'bar'])
print(df)
# Serialize the DataFrame to a pickle file.
df.to_pickle(file_name)
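
For larger DataFrames, to_pickle can also compress the output. The snippet below is a minimal sketch; the path data/test.pkl.gz is an assumed example file name.

import pandas as pd
import numpy as np

df = pd.DataFrame(np.random.randn(1000, 2), columns=['foo', 'bar'])
# Compression is inferred from the file extension by default,
# but can also be set explicitly via the compression argument.
df.to_pickle("data/test.pkl.gz", compression="gzip")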

Read pickle file

import pandas as pd

file_name = "data/test.pkl"
# Deserialize the pickle file back into a DataFrame.
df2 = pd.read_pickle(file_name)
print(df2)
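
To confirm the round trip preserved the data, you can compare the reloaded DataFrame with the original. The sketch below combines both steps for illustration; data/test.pkl is the same example path used above.

import pandas as pd
import numpy as np

file_name = "data/test.pkl"
df = pd.DataFrame(np.random.randn(1000, 2), columns=['foo', 'bar'])
df.to_pickle(file_name)
df2 = pd.read_pickle(file_name)

# equals() returns True when values, dtypes and index all match.
print(df.equals(df2))
# assert_frame_equal raises an AssertionError if anything differs,
# which is handy in automated tests.
pd.testing.assert_frame_equal(df, df2)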