
How to drop all columns with null values in a PySpark DataFrame?

The pyspark.sql.DataFrameNaFunctions class in PySpark provides several methods for dealing with NULL/None values, one of which is drop(), which removes rows containing NULL values from DataFrame columns; df.dropna() is an equivalent shorthand, as shown in this article. With drop() you can remove rows based on any, all, single, multiple, or chosen columns. This function is quite useful when you need to sanitize data before processing it. When a file is read into the PySpark DataFrame API, any empty value in a column becomes NULL in the DataFrame. In RDBMS SQL you would have to check each column for null values yourself, whereas PySpark's drop() method is more convenient: it examines all columns for null values and drops the matching rows in a single call.

PySpark drop() Syntax 

The drop() method in PySpark takes three optional arguments that control how NULL values are removed — based on any column, all columns, or a chosen subset. Because drop() is a transformation, it returns a new DataFrame with the offending rows removed rather than modifying the current DataFrame in place.

drop(how='any', thresh=None, subset=None)

All of these settings are optional.

  • how – Accepts ‘any’ or ‘all’. With ‘any’, a row is dropped if it contains a NULL in any column; with ‘all’, a row is dropped only if all of its columns are NULL. The default is ‘any’.
  • thresh – An int; rows with fewer than thresh non-null values are dropped. When set, it overrides how. The default is None.
  • subset – Selects which columns to check for NULL values. The default is None, meaning all columns are checked.
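The way these three settings interact can be sketched in plain Python, with rows modelled as dicts and None standing in for NULL. This is a simplified illustration of the semantics described above, not PySpark's actual implementation:

```python
def drop_rows(rows, how="any", thresh=None, subset=None):
    """Mimic the semantics of DataFrame.dropna(how, thresh, subset) on a list of dicts."""
    kept = []
    for row in rows:
        # Only the columns in `subset` are checked; default is all columns
        cols = subset if subset is not None else list(row)
        non_null = sum(row[c] is not None for c in cols)
        if thresh is not None:
            keep = non_null >= thresh       # thresh overrides `how`
        elif how == "any":
            keep = non_null == len(cols)    # drop if ANY checked column is null
        else:  # how == "all"
            keep = non_null > 0             # drop only if ALL checked columns are null
        if keep:
            kept.append(row)
    return kept

rows = [{"id": 1, "name": "Ann"},
        {"id": 2, "name": None},
        {"id": None, "name": None}]

drop_rows(rows, how="any")       # keeps only the fully populated first row
drop_rows(rows, how="all")       # drops only the all-null third row
drop_rows(rows, thresh=1)        # keeps rows with at least 1 non-null value
drop_rows(rows, subset=["id"])   # checks nulls only in the 'id' column
```

In real PySpark code the same calls would be written as df.na.drop(how="any"), df.na.drop(thresh=1), and so on.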

Implementation

Before we begin, let’s read a CSV file into a DataFrame. PySpark assigns NULL to String and Integer columns wherever a row has no value for them.

CSV Used:

 

Python3
import findspark
findspark.init()  # locate the local Spark installation before importing pyspark

import pyspark.sql.functions as sqlf
from pyspark.sql import SparkSession

spark: SparkSession = SparkSession.builder \
    .master("local[1]") \
    .appName("SparkByExamples.com") \
    .getOrCreate()
  
filePath = "example1.csv"
df = spark.read.options(header='true', inferSchema='true') \
          .csv(filePath)
  
df.printSchema()
df.show(truncate=False)


This prints the schema and the DataFrame contents; as you can see in the output, the name and city columns contain null values.

 

Drop Columns with NULL Values

Python3
def dropNullColumns(df):
    """
    Drop every column that contains at least one null value.
    :param df: A PySpark DataFrame
    """
    null_counts = df.select([sqlf.count(sqlf.when(sqlf.col(c).isNull(), c)).alias(c)
                             for c in df.columns]).collect()[0].asDict()  # 1
    col_to_drop = [k for k, v in null_counts.items() if v > 0]  # 2
    df = df.drop(*col_to_drop)  # 3

    return df


 

In the first line, we use PySpark’s select method, which projects a set of expressions and returns a new DataFrame. Each expression in the list counts the null values in one column. We then call collect to pull the single result row back to the driver and convert it with asDict into a dict mapping each column name to its number of nulls.

In the second line, we keep only the names of columns whose null count is greater than 0 — that is, any column containing at least one null value.

After identifying the columns that contain null values, we pass them to the drop function in the third line and return the resulting DataFrame.
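The second and third steps can be traced in plain Python on a hypothetical null-count dict (the column names and counts here are illustrative, not taken from the CSV above):

```python
# Step 1 result: column -> number of nulls, as collect()[0].asDict() would return it
null_counts = {"id": 0, "name": 2, "city": 1}

# Step 2: keep the names of columns that contain at least one null
col_to_drop = [k for k, v in null_counts.items() if v > 0]
print(col_to_drop)  # ['name', 'city']

# Step 3 would then be df.drop(*col_to_drop), leaving only the 'id' column
```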

Example:

CSV Used:

 

Python3
import findspark
findspark.init()  # locate the local Spark installation before importing pyspark

import pyspark.sql.functions as sqlf
from pyspark.sql import SparkSession

spark: SparkSession = SparkSession.builder \
    .master("local[1]") \
    .appName("SparkByExamples.com") \
    .getOrCreate()
  
filePath = "/content/swimming_pool.csv"
df = spark.read.options(header='true', inferSchema='true') \
          .csv(filePath)
  
df.printSchema()
df.show(truncate=False)

df = dropNullColumns(df)
df.show(truncate=False)


 

After applying the dropNullColumns function, the columns that contained null values no longer appear in the output.

 
