How Can Price Scraping Be Used in E-commerce?
Posted: Aug 22, 2021
We have often been told that of Malaysia's two big e-commerce platforms (Shopee and Lazada), one is usually cheaper and attracts bargain hunters, while the other tends to serve less price-sensitive shoppers.
So, we decided to find out for ourselves… in the battle of these e-commerce platforms!
To do that, we wrote a Python script that uses Selenium and a Chrome driver to automate the scraping procedure and build a dataset. Here is what we will be extracting:
- Product’s Name
- Product’s Price
Then we will do some basic analysis with Pandas on the dataset we have extracted. Some data cleaning will be needed, and at the end we will present the price comparison in an easy visual chart built with Seaborn and Matplotlib.
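As a rough preview of that final step, here is a minimal sketch of the kind of cleaning and plotting we have in mind. The lists titles_list and prices_list come from the scraping code later in this post, and the cleaning assumes Lazada prices come through as strings with an 'RM' prefix (e.g. 'RM12.50'); adjust it to whatever you actually see on the page:

```python
import pandas as pd
import seaborn as sns
import matplotlib.pyplot as plt

# Build a dataframe from the scraped lists (see the scraping steps below)
df = pd.DataFrame({'title': titles_list, 'price': prices_list})

# Basic cleaning: strip the assumed 'RM' prefix and thousands separators,
# then convert prices to numbers, dropping anything that fails to parse
df['price'] = (df['price'].str.replace('RM', '', regex=False)
                          .str.replace(',', '', regex=False))
df['price'] = pd.to_numeric(df['price'], errors='coerce')
df = df.dropna(subset=['price'])
df['platform'] = 'Lazada'

# Simple price comparison chart: a box plot of prices per platform
sns.boxplot(x='platform', y='price', data=df)
plt.title('Price distribution')
plt.show()
```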
Between the two platforms, we found Shopee harder to scrape, for a couple of reasons: (1) it has frustrating popup boxes that appear when you enter its pages; and (2) its class elements are not well defined (a few elements have several different classes).
For that reason, we will start by scraping Lazada first. We will tackle Shopee in Part 2!
First, we import the required packages:
```python
# Web Scraping
from selenium import webdriver
from selenium.common.exceptions import *

# Data manipulation
import pandas as pd

# Visualization
import matplotlib.pyplot as plt
import seaborn as sns
```
Next, we specify some information the script needs (a sketch with placeholder values follows this list):
- Path of the Chrome web driver
- Website URL
- Item we wish to search for
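For illustration, the setup might look something like the sketch below. The variable names webdriver_path, Lazada_url and search_item match how they are used in the code later in this post; the values are placeholders to replace with your own:

```python
# Placeholder values -- replace with your own
webdriver_path = '/path/to/chromedriver'   # path of the Chrome web driver
Lazada_url = 'https://www.lazada.com.my'   # website URL (Lazada Malaysia assumed)
search_item = 'instant coffee'             # item we wish to search for
```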
After that, we launch the Chrome browser. We do so with a few customised options:
```python
# Select custom Chrome options
options = webdriver.ChromeOptions()
options.add_argument('--headless')
options.add_argument('start-maximized')
options.add_argument('disable-infobars')
options.add_argument('--disable-extensions')

# Open the Chrome browser
browser = webdriver.Chrome(webdriver_path, options=options)
browser.get(Lazada_url)
```
The other arguments, 'start-maximized', 'disable-infobars' and '--disable-extensions', are included to make the browser run more smoothly (extensions in particular can interfere with webpages and disrupt the automation procedure).
Running this short block of code will start up the browser. (Note that with the '--headless' argument set, Chrome runs in the background without opening a visible window.)
Once the browser is open, we need to automate the item search. Selenium lets you locate HTML elements in several ways, including by class, by id, by CSS selector, and by XPath (the XML path expression).
So how do you know which elements to look for? An easy way is to use Chrome's Inspect tool:
```python
# Locate the search bar by its id, type the search term and submit
search_bar = browser.find_element_by_id('q')
search_bar.send_keys(search_item)
search_bar.submit()
```
That was the easy part. Now comes a part that would be even more challenging if you tried to scrape the Shopee website!
To work out how to scrape item names and prices from Lazada, think about how you would do it manually. What would you do? Let's see:
- Copy all the item names and their prices into a spreadsheet;
- Go to the next page and repeat the first step until you reach the last page.
That is exactly what we will do in the automated procedure! To do it, we need to get the elements containing the item names and prices, along with the next-page button.
With Chrome's Inspect tool, it is easy to see that the product titles and prices have the class names 'c16H9d' and 'c13VH6' respectively. It is worth checking that the same class names apply to every item on the page, to make sure all of them are extracted.
```python
# Grab every title and price element on the page by class name
item_titles = browser.find_elements_by_class_name('c16H9d')
item_prices = browser.find_elements_by_class_name('c13VH6')
```
After that, we unpack the item_titles and item_prices variables into lists:
```python
# Initialize empty lists
titles_list = []
prices_list = []

# Loop over the item_titles and item_prices
for title in item_titles:
    titles_list.append(title.text)
for price in item_prices:
    prices_list.append(price.text)
```
Once we have finished scraping the page, it is time to move on to the next one. For that we again use a find_element method, this time with XPath. XPath matters here because the next-page button has two classes, and the find_element_by_class_name method only finds elements by a single class. A sketch of the full paging loop is shown below.
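To show how this fits together, here is a minimal sketch that folds the per-page scraping above into a loop. The XPath string is only an assumed example (inspect the page to confirm the actual classes of the next-page button), and the loop simply stops when no clickable next-page button is found:

```python
import time
from selenium.common.exceptions import NoSuchElementException, ElementClickInterceptedException

titles_list = []
prices_list = []

while True:
    # Repeat the per-page scraping step shown above
    for title in browser.find_elements_by_class_name('c16H9d'):
        titles_list.append(title.text)
    for price in browser.find_elements_by_class_name('c13VH6'):
        prices_list.append(price.text)

    # Try to click the next-page button; the XPath below is an assumed
    # example -- inspect the page to confirm the button's actual classes
    try:
        next_button = browser.find_element_by_xpath(
            "//li[contains(@class, 'ant-pagination-next')]/button")
        next_button.click()
        time.sleep(2)  # give the next page a moment to load
    except (NoSuchElementException, ElementClickInterceptedException):
        # No clickable next-page button: assume this was the last page
        break
```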
That's it for the Lazada website! In Part 2, we will go through the particular challenges of scraping the Shopee website, and we will plot another box plot for Shopee prices to complete the comparison!
Looking to scrape price data from e-commerce websites? Contact Retailgators for eCommerce Data Scraping Services.
Source: https://medium.com/@Retailgators_32/how-price-scraping-can-be-used-in-e-commerce-6e7173e2378c
ECommerce Web Scraping Tools & Services | Retailgators USA, UK, Australia, UAE, Germany. https://www.retailgators.com/index.