OpenWeatherMap is a service that provides weather data, including current conditions, forecasts, and historical data, to developers of web services and mobile applications.
It provides an API with JSON, XML, and HTML endpoints and a limited free usage tier. Making more than 60 calls per minute requires a paid subscription starting at USD 40 per month, and access to historical data requires a subscription starting at USD 150 per month. Users can request current weather information, extended forecasts, and graphical maps (showing cloud cover, wind speed, pressure, and precipitation).
To use the current weather data API, you need an API key, which can be obtained from the OpenWeatherMap website.
Note: You need to create an account on openweathermap.org before you can use the APIs.
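Once an API key is available, any of the endpoints above can be called over plain HTTP. As a rough sketch of the forecast data mentioned earlier (the data/2.5/forecast endpoint path, the appid and q parameters, and the shape of the JSON response are taken from OpenWeatherMap's public documentation and may differ from your plan), a 5-day / 3-hour forecast request looks roughly like this:
Python3
# sketch: fetch a 5-day / 3-hour forecast for a city
# (endpoint path and response fields assume OpenWeatherMap's public documentation)
import requests

api_key = "Your_API_Key"   # replace with your own key
city_name = "Delhi"        # example city

forecast_url = ("http://api.openweathermap.org/data/2.5/forecast?"
                "appid=" + api_key + "&q=" + city_name)

data = requests.get(forecast_url).json()

# the forecast response carries one entry per 3-hour slot in data["list"]
if data.get("cod") == "200":
    for entry in data["list"][:3]:   # show the first three slots only
        print(entry["dt_txt"], entry["main"]["temp"], "K")
else:
    print("Request failed:", data.get("message"))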
Current weather of any city using OpenWeatherMap API in Python
Modules Needed :
- requests
- json
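The json module ships with the Python standard library; requests is a third-party package and, if it is not already installed, can usually be added from the terminal with:
pip install requests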
Method 1: Using the json and requests modules
Python3
# import required modules
import requests, json

# Enter your API key here
api_key = "Your_API_Key"

# base_url variable to store the
# current weather API endpoint url
base_url = "http://api.openweathermap.org/data/2.5/weather?"

# Give city name
city_name = input("Enter city name : ")

# complete_url variable to store
# complete url address
complete_url = base_url + "appid=" + api_key + "&q=" + city_name

# get method of requests module
# returns a response object
response = requests.get(complete_url)

# json method of response object
# converts json format data into
# python format data
x = response.json()

# Now x contains a nested dictionary.
# If the value of the "cod" key is "404",
# the city was not found; otherwise proceed
if x["cod"] != "404":

    # store the value of "main"
    # key in variable y
    y = x["main"]

    # store the value corresponding
    # to the "temp" key of y
    current_temperature = y["temp"]

    # store the value corresponding
    # to the "pressure" key of y
    current_pressure = y["pressure"]

    # store the value corresponding
    # to the "humidity" key of y
    current_humidity = y["humidity"]

    # store the value of "weather"
    # key in variable z
    z = x["weather"]

    # store the value corresponding
    # to the "description" key at
    # the 0th index of z
    weather_description = z[0]["description"]

    # print the following values
    print(" Temperature (in kelvin unit) = " +
          str(current_temperature) +
          "\n atmospheric pressure (in hPa unit) = " +
          str(current_pressure) +
          "\n humidity (in percentage) = " +
          str(current_humidity) +
          "\n description = " +
          str(weather_description))

else:
    print(" City Not Found ")
Output:
Enter city name : Delhi
 Temperature (in kelvin unit) = 312.15
 atmospheric pressure (in hPa unit) = 996
 humidity (in percentage) = 40
 description = haze
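The API returns the temperature in Kelvin by default. The short sketch below converts the sample value manually; the commented line at the end assumes OpenWeatherMap's documented units=metric query parameter, which tells the API to return Celsius directly.
Python3
# sketch: convert the Kelvin reading above to Celsius
current_temperature_kelvin = 312.15  # value taken from the sample output above

# Kelvin to Celsius conversion
current_temperature_celsius = current_temperature_kelvin - 273.15
print("Temperature (in celsius unit) = " +
      str(round(current_temperature_celsius, 2)))

# alternatively, the API's documented "units" parameter can be appended to
# the request so that "temp" arrives in Celsius directly:
# complete_url = base_url + "appid=" + api_key + "&q=" + city_name + "&units=metric"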
Method 2: Using the BeautifulSoup and requests modules
Python3
# import required modules
from bs4 import BeautifulSoup
import requests

# User-Agent header so Google serves the full HTML result page
headers = {
    'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/58.0.3029.110 Safari/537.3'}


def weather(city):
    city = city.replace(" ", "+")
    res = requests.get(
        f'https://www.google.com/search?q={city}&oq={city}&aqs=chrome.0.35i39l2j0l4j46j69i60.6128j1j7&sourceid=chrome&ie=UTF-8',
        headers=headers)
    print("Searching...\n")
    soup = BeautifulSoup(res.text, 'html.parser')

    # scrape location, date/time, description and temperature
    # from Google's weather widget
    location = soup.select('#wob_loc')[0].getText().strip()
    time = soup.select('#wob_dts')[0].getText().strip()
    info = soup.select('#wob_dc')[0].getText().strip()
    weather = soup.select('#wob_tm')[0].getText().strip()

    print(location)
    print(time)
    print(info)
    print(weather + "°C")


city = input("Enter the Name of City -> ")
city = city + " weather"
weather(city)
print("Have a Nice Day:)")

# This code is contributed by adityatri
Sample Input:
Mahoba
Sample Output:
Enter the Name of City -> Mahoba
Searching...

Mahoba, Uttar Pradesh
Monday, 12:00 am
Cloudy
27°C
Have a Nice Day:)
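Note that this approach scrapes Google's result page, so it depends on element ids such as #wob_loc and #wob_tm that Google may change at any time; when an id is missing, select() returns an empty list and indexing [0] raises an IndexError. The following sketch is a defensive variation, not part of the original script (the get_text helper name is made up here), showing how to read a field safely:
Python3
# sketch: read a scraped field defensively instead of indexing [0] directly
from bs4 import BeautifulSoup


def get_text(soup, selector):
    # return the text of the first match, or None when the element is absent
    matches = soup.select(selector)
    return matches[0].getText().strip() if matches else None


# tiny self-contained demonstration with a hand-written snippet of HTML
soup = BeautifulSoup("<span id='wob_tm'>27</span>", 'html.parser')
print(get_text(soup, '#wob_tm'))   # 27
print(get_text(soup, '#wob_loc'))  # None - element missing, no IndexError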
Explanation:
In this second approach, we use the following modules and functions:
- BeautifulSoup: a Python library for extracting data from HTML and XML files, i.e. for web scraping. It builds a parse tree from the page source, which can then be traversed in a readable, hierarchical way. To install it, run the command below in the terminal (the package is published as beautifulsoup4):
pip install beautifulsoup4
- Requests: Python's requests module is used to make the HTTP request. To install it, run the command below in the terminal:
pip install requests
- Headers are passed with the request: they carry protocol-specific information that precedes the message body, and here the User-Agent header in particular makes the request look like it comes from a regular browser so that Google returns the full result page.
- After that, requests.get() is called with a Google search URL that includes the city name, retrieving the result page from Google.
- The returned HTML is then parsed with BeautifulSoup.
- Finally, the select() function is used with CSS id selectors to pick out particular pieces of information (location, time, description, and temperature), store them in variables, and print them; see the short sketch after this list.
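To make the last two steps concrete, here is a small self-contained sketch that parses a hand-written HTML snippet (standing in for Google's real result page, whose markup is an assumption here) and uses select() with the same CSS id selectors as the script above:
Python3
# sketch: how select() with CSS id selectors pulls values out of parsed HTML,
# using a small hand-written snippet instead of Google's real result page
from bs4 import BeautifulSoup

html = """
<div id="wob_loc">Mahoba, Uttar Pradesh</div>
<div id="wob_dc">Cloudy</div>
<span id="wob_tm">27</span>
"""

soup = BeautifulSoup(html, 'html.parser')

location = soup.select('#wob_loc')[0].getText().strip()
info = soup.select('#wob_dc')[0].getText().strip()
temperature = soup.select('#wob_tm')[0].getText().strip()

print(location)              # Mahoba, Uttar Pradesh
print(info)                  # Cloudy
print(temperature + "°C")    # 27°C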