
How to Extract Weather Data from Google in Python?

In this article, we will see how to extract weather data from Google. Google does not have its own weather API; it fetches the data from weather.com and shows it when you search for weather on Google. So, we will scrape that data from Google, and we will also see another method to fetch a schematic depiction of a location’s weather for the next two days in Python without using an API.

Method 1:

Modules needed:

Requests: The Requests library allows you to send HTTP/1.1 requests extremely easily. This module does not come built-in with Python. To install it, type the below command in the terminal.

pip install requests

bs4: Beautiful Soup is a library that makes it easy to scrape information from web pages. It parses an HTML or XML page into a tree that can then be used for iterating, searching, and modifying the data within it. This module also does not come built-in with Python. To install it, type the below command in the terminal.

pip install bs4
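For reference, here is a minimal sketch of how Beautiful Soup pulls text out of markup; the HTML snippet and class names below are made up purely for illustration.

Python3

# a small, hypothetical HTML snippet just for illustration
from bs4 import BeautifulSoup

html = '<div class="temp">13°C</div><div class="desc">Partly cloudy</div>'
soup = BeautifulSoup(html, 'html.parser')

# find() returns the first tag matching the given tag name and attributes
print(soup.find('div', attrs={'class': 'temp'}).text)   # 13°C
print(soup.find('div', attrs={'class': 'desc'}).text)   # Partly cloudy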

Approach:

  • Import the modules
  • Build the URL with the entered city name:
"https://www.google.com/search?q=" + "weather" + cityname
  • Make a requests instance and pass the URL to it
  • Get the raw HTML data
  • Extract the required data from the soup
  • Finally, print the required data

Step-wise implementation of code:

Step 1: Import the requests and bs4 library

Python3
# importing the library
import requests
from bs4 import BeautifulSoup


Step 2: Create a URL with the entered city name in it and pass it to the get function.

Python3
# enter city name
city = "lucknow"
 
# create the search URL
url = "https://www.google.com/search?q=" + "weather" + city

# send the GET request and get the raw HTML
html = requests.get(url).content

# parse the raw HTML with Beautiful Soup
soup = BeautifulSoup(html, 'html.parser')


Step 3: Soup returns the whole page as a heap of HTML tags. A chunk of that data is shown below; from it we will get all the necessary values with the help of the find() function, passing the tag name and class name.

<div class="kvKEAb"><div><div><div class="BNeawe iBp4i AP7Wnd"><div><div class="BNeawe
iBp4i AP7Wnd">13°C</div></div></div></div></div><div><div><div class="BNeawe tAd8D AP7Wnd">
<div><div class="BNeawe tAd8D AP7Wnd">Saturday 11:10 am

Python3
# get the temperature
temp = soup.find('div', attrs={'class': 'BNeawe iBp4i AP7Wnd'}).text

# this div contains the time and the sky description
# (avoid naming the variable str, which would shadow the built-in)
time_sky = soup.find('div', attrs={'class': 'BNeawe tAd8D AP7Wnd'}).text

# split the string into time and sky description
data = time_sky.split('\n')
time = data[0]
sky = data[1]


Step 4: Here listdiv contains all the div tags with a particular class name, and index 5 of this list has all the other required data.

Python3
# list having all div tags having particular class name
listdiv = soup.findAll('div', attrs={'class': 'BNeawe s3v9rd AP7Wnd'})
 
# particular list with required data
strd = listdiv[5].text
 
# formatting the string
pos = strd.find('Wind')
other_data = strd[pos:]
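Note that index 5 depends on the exact page layout Google serves and can shift. Below is a small sketch, reusing listdiv from above, that scans for the entry mentioning 'Wind' instead of hard-coding the position:

Python3

# a more defensive variant: pick the div whose text mentions 'Wind'
# instead of assuming it is always at index 5
other_data = ''
for div in listdiv:
    text = div.text
    if 'Wind' in text:
        other_data = text[text.find('Wind'):]
        break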


Step 5: Printing all the data

Python3
# printing all the data
print("Temperature is", temp)
print("Time: ", time)
print("Sky Description: ", sky)
print(other_data)


Output:

Below is the full implementation:

Python3
# importing library
import requests
from bs4 import BeautifulSoup
 
# enter city name
city = "lucknow"
 
# creating the search URL and sending the request
url = "https://www.google.com/search?q=" + "weather" + city
html = requests.get(url).content

# getting and parsing the raw data
soup = BeautifulSoup(html, 'html.parser')
temp = soup.find('div', attrs={'class': 'BNeawe iBp4i AP7Wnd'}).text
time_sky = soup.find('div', attrs={'class': 'BNeawe tAd8D AP7Wnd'}).text

# formatting data
data = time_sky.split('\n')
time = data[0]
sky = data[1]
 
# getting all div tag
listdiv = soup.findAll('div', attrs={'class': 'BNeawe s3v9rd AP7Wnd'})
strd = listdiv[5].text
 
# getting other required data
pos = strd.find('Wind')
other_data = strd[pos:]
 
# printing all data
print("Temperature is", temp)
print("Time: ", time)
print("Sky Description: ", sky)
print(other_data)


Output:
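Since Google can change these class names or serve a different page layout at any time, the scraper above can break without warning. Here is a minimal sketch that wraps the same lookup in a function and checks for a missing tag, so a layout change fails with a clear message instead of an AttributeError; the function name is illustrative.

Python3

# defensive wrapper around the same scraping logic (illustrative sketch)
import requests
from bs4 import BeautifulSoup

def get_temperature(city):
    url = "https://www.google.com/search?q=" + "weather" + city
    html = requests.get(url).content
    soup = BeautifulSoup(html, 'html.parser')
    temp_div = soup.find('div', attrs={'class': 'BNeawe iBp4i AP7Wnd'})
    if temp_div is None:
        # class names changed or a different page layout was served
        return None
    return temp_div.text

temperature = get_temperature("lucknow")
if temperature is None:
    print("Could not find the temperature on the page")
else:
    print("Temperature is", temperature)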

Method 2:

Module needed:

Requests: The Requests library allows you to send HTTP/1.1 requests extremely easily. The HTTP request returns a response object with all of the required response data. This module does not come built-in with Python. To install it, type the below command in the terminal.

pip install requests
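As a quick illustration of that response object, here is a minimal sketch using the same ipinfo.io endpoint that the code below relies on; only the fields this article actually uses are printed.

Python3

# a quick look at the response object returned by requests.get()
import requests

res = requests.get('https://ipinfo.io/')

print(res.status_code)   # HTTP status code, e.g. 200
data = res.json()        # parse the JSON body into a dictionary
print(data['city'])      # the detected city, used in the code below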

Approach:

  • Import the requests module
  • Send a request to get the IP location information
  • Extract the location from the JSON response
  • Print the extracted location
  • Pass the city name to wttr.in and retrieve the weather data of that city
  • Print the output

Below is the implementation:

Python3
# Python code to display schematic weather details
import requests

# Sending a request to get the IP location information
res = requests.get('https://ipinfo.io/')

# Parsing the response as JSON
data = res.json()

# Extracting the city from the response
citydata = data['city']

# Printing the current location
print(citydata)

# Passing the city name to the wttr.in URL
url = 'https://wttr.in/{}'.format(citydata)

# Getting the weather data of the city
res = requests.get(url)

# Printing the results
print(res.text)

# This code is contributed by PL VISHNUPPRIYAN


Output:

Displays the weather of the current location as a text-based (schematic) report from wttr.in.
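wttr.in also accepts query parameters that change the output. For example, format=3 asks for a compact one-line summary instead of the full multi-day report, which can be handy for status bars or logs; a small sketch with a hard-coded city:

Python3

# one-line weather summary from wttr.in using its format option
import requests

city = "lucknow"
res = requests.get('https://wttr.in/{}?format=3'.format(city))
print(res.text)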
