Apache Spark Python - Spark Metastore - Using Spark SQL

In this article, we will explore how to use Spark SQL to process data in Metastore Tables and Temporary Views. We will guide you through creating Metastore tables, executing queries, and joining tables to perform data transformations efficiently.

Creating Metastore Tables

We will demonstrate how to create Metastore tables for datasets such as orders and order_items using SQL statements. Here is an example for the orders table:

CREATE TABLE orders (
    order_id INT,
    order_date STRING,
    order_customer_id INT,
    order_status STRING
)
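The join query later in this article also reads from an order_items table, so a matching definition is needed. A sketch is shown below; the column names follow the common retail_db layout implied by the query, and the exact types are an assumption:

```sql
-- Sketch of a companion table for order_items; the column list and types
-- are assumptions based on the retail_db dataset referenced in this article
CREATE TABLE order_items (
    order_item_id INT,
    order_item_order_id INT,
    order_item_product_id INT,
    order_item_quantity INT,
    order_item_subtotal FLOAT,
    order_item_product_price FLOAT
)
```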

Loading Data into Tables

After creating Metastore tables, we will load data into them using the LOAD DATA command. Here is an example code snippet:

LOAD DATA LOCAL INPATH '/data/retail_db/orders' INTO TABLE orders
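The same command can load the order_items data. The path below mirrors the orders path above and is an assumption about where the files live; a quick row count afterwards helps confirm the load succeeded:

```sql
-- Assumed path, mirroring the orders location shown above
LOAD DATA LOCAL INPATH '/data/retail_db/order_items' INTO TABLE order_items

-- Sanity check: confirm rows were loaded
SELECT count(*) FROM order_items
```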

Creating Temporary Views

We will also create a temporary view for a dataset, such as products, so it can be queried alongside the Metastore tables. Given a DataFrame named products, a temporary view can be registered like this:

products.createOrReplaceTempView('products_v')

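Alternatively, if the products data is available as files, the view can be defined directly in Spark SQL. The sketch below assumes the retail_db file location, CSV format, and column list; adjust these to match your data:

```sql
-- Sketch: define products_v directly over files; the path, csv format,
-- and column list are assumptions based on the retail_db dataset
CREATE TEMPORARY VIEW products_v (
    product_id INT,
    product_category_id INT,
    product_name STRING,
    product_description STRING,
    product_price FLOAT,
    product_image STRING
)
USING csv
OPTIONS (path '/data/retail_db/products')
```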
Joining Tables

Lastly, we will join the Metastore tables and temporary view to calculate daily product revenue. We will perform joins based on common keys and apply transformations. Here is a sample SQL query:

SELECT o.order_date,
    p.product_id,
    p.product_name,
    round(sum(oi.order_item_subtotal), 2) AS revenue
FROM orders AS o
JOIN order_items AS oi ON o.order_id = oi.order_item_order_id
JOIN products_v AS p ON p.product_id = oi.order_item_product_id
WHERE o.order_status IN ('COMPLETE', 'CLOSED')
GROUP BY o.order_date, p.product_id, p.product_name
ORDER BY o.order_date, revenue DESC
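If you want to persist the result for downstream use, the aggregation can populate a table with CREATE TABLE AS. The target table name daily_product_revenue below is a hypothetical choice for illustration:

```sql
-- Hypothetical target table; persists the daily product revenue result
CREATE TABLE daily_product_revenue AS
SELECT o.order_date,
    p.product_id,
    p.product_name,
    round(sum(oi.order_item_subtotal), 2) AS revenue
FROM orders AS o
JOIN order_items AS oi ON o.order_id = oi.order_item_order_id
JOIN products_v AS p ON p.product_id = oi.order_item_product_id
WHERE o.order_status IN ('COMPLETE', 'CLOSED')
GROUP BY o.order_date, p.product_id, p.product_name
```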


Hands-On Tasks

  1. Create Metastore tables for the orders and order_items datasets.
  2. Load data into the created tables.
  3. Create a temporary view for the products dataset.
  4. Perform a join operation to calculate daily product revenue.


In conclusion, by using Spark SQL, you can efficiently process data in Metastore Tables and Temporary Views. We encourage you to practice the concepts discussed in this article and explore further by engaging with the community. Happy learning!

Remember to join our community to ask questions and discuss further!