Programming Essentials Python - Overview of Pandas Libraries - Performing Total Aggregations

Let us understand how to perform total or global aggregations using Pandas.

Key Concepts Explanation

Getting Number of Records in the Data Frame

To obtain the number of records in a Data Frame:
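A minimal sketch, using a small hypothetical DataFrame in place of the actual data set:

```python
import pandas as pd

# Hypothetical sample standing in for the orders data set
orders = pd.DataFrame({
    'order_id': [1, 2, 3, 4],
    'order_status': ['CLOSED', 'PENDING', 'COMPLETE', 'CLOSED'],
})

# shape is a (rows, columns) tuple; shape[0] gives the number of records
record_count = orders.shape[0]
print(record_count)  # 4
```

`len(orders)` returns the same row count and is equally idiomatic.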


Getting Number of Non-NaN Values

To get the number of non-NaN values in each attribute in a Data Frame:
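A minimal sketch, assuming a small sample DataFrame with one missing value:

```python
import pandas as pd

# Hypothetical sample with a None (treated as NaN) in order_status
df = pd.DataFrame({
    'order_id': [1, 2, 3],
    'order_status': ['CLOSED', None, 'COMPLETE'],
})

# count() returns the number of non-NaN values for each column
non_nan_counts = df.count()
print(non_nan_counts['order_id'])      # 3
print(non_nan_counts['order_status'])  # 2
```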


Getting Basic Statistics of Numeric Fields

To retrieve basic statistics of numeric fields in a Data Frame:
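A minimal sketch using `describe()`, which reports count, mean, standard deviation, min, quartiles, and max for numeric columns (the sample values here are hypothetical):

```python
import pandas as pd

df = pd.DataFrame({'order_item_subtotal': [199.99, 250.0, 129.99]})

# describe() summarizes all numeric columns in one call
stats = df.describe()
print(stats.loc['count', 'order_item_subtotal'])  # 3.0
```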


Getting Revenue for a Specific Order ID

To calculate the revenue for a specific order ID (e.g., order ID 2) from order_items:

order_items[order_items.order_item_order_id == 2]['order_item_subtotal'].sum()
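The filter-then-sum pattern above can be tried end to end with a few hypothetical rows mimicking the order_items data set:

```python
import pandas as pd

# Hypothetical order_items rows: order IDs and their line-item subtotals
order_items = pd.DataFrame({
    'order_item_order_id': [1, 2, 2, 3],
    'order_item_subtotal': [299.98, 199.99, 250.0, 129.99],
})

# Boolean mask keeps only rows for order ID 2, then sum() totals the subtotals
revenue = order_items[order_items.order_item_order_id == 2]['order_item_subtotal'].sum()
print(round(revenue, 2))  # 449.99
```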

Hands-On Tasks

  1. Task 1: Get the total number of records for a given month (201401) using the orders Data Frame.
orders[orders['order_date'].str.slice(0, 7).str.replace('-', '').astype('int64') == 201401]['order_id'].count()
  2. Task 2: Compute the total revenue generated for a given product ID (e.g., 502) using the order_items data set.
order_items.query('order_item_product_id == 502').order_item_subtotal.sum()
  3. Task 3: Determine the total number of items sold and the total revenue generated for a specific product ID (e.g., 502) using the order_items data set.
order_items_for_product_id = order_items.query('order_item_product_id == 502')
dict(order_items_for_product_id[['order_item_quantity', 'order_item_subtotal']].sum())
  4. Task 4: Create a collection with sales and commission percentages, then compute the total commission amount (treat a missing commission percentage as 0).
transactions = [(376.0, 8), (548.23, 14), (107.93, 8), (838.22, 14), (846.85, 21), (234.84,), (850.2, 21), (992.2, 21), (267.01,), (958.91, 21), (412.59,), (283.14,), (350.01, 14), (226.95,), (132.7, 14)]
sales = pd.DataFrame(transactions, columns=['sale_amount', 'commission_pct'])
sales_filled = sales.fillna(0.0)
(sales_filled['sale_amount'] * (sales_filled['commission_pct'] / 100)).sum().round(2)


In this article, we explored how to perform total or global aggregations using Pandas: counting records, counting non-NaN values, summarizing numeric fields, and summing values over filtered subsets. Keep practicing and engaging with the community for further learning and growth. Enjoy your data manipulation journey!

Watch the video tutorial here