Monday, January 31, 2022

LeafletJS Vs Python Folium Web map

 In this post, I will show how various components are made in both LeafletJS and Python Folium.


First map: Initialize a map with center, zoom and openstreetmap background



LeafletJS
<!DOCTYPE html>
<html>
<head>
	<script src="https://cdnjs.cloudflare.com/ajax/libs/leaflet/1.0.0-beta.2.rc.2/leaflet.js"></script>
	<link href="https://cdnjs.cloudflare.com/ajax/libs/leaflet/1.0.0-beta.2.rc.2/leaflet.css" rel="stylesheet" />
	<script src="https://cdnjs.cloudflare.com/ajax/libs/leaflet.draw/0.2.3/leaflet.draw.js"></script>

	<link href="https://cdnjs.cloudflare.com/ajax/libs/leaflet.draw/0.2.3/leaflet.draw.css" rel="stylesheet" />

	<meta charset="utf-8">
	<title>Web map....</title>
	<style type="text/css">
		html, body, #map { margin: 0; height: 100%; width: 100%; }
	</style>
</head>


<body>



  <div id='map'></div>



  <script>
  	// center of the map
	var center = [8.242, 7.671];

	// Create the map
	var map = L.map('map').setView(center, 7);

	// Set up the OSM layer
	L.tileLayer(
	  'https://{s}.tile.openstreetmap.org/{z}/{x}/{y}.png', {
	    attribution: 'Data © <a href="http://osm.org/copyright">OpenStreetMap</a>',
	    maxZoom: 18
	  }).addTo(map);





  </script>

</body>
</html>


Folium

import folium

# initialize a map with center, zoom and openstreetmap background...
mapObj = folium.Map(location=[8.242, 7.671],
                     zoom_start=7, tiles='openstreetmap')


mapObj






Draw point



LeafletJS

In LeafletJS, a point is drawn with L.marker([lat, lng]).addTo(map), optionally with .bindPopup() to attach a label.


Folium
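A minimal Folium sketch, reusing the center coordinates from the first map (the popup text is just an illustration):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# draw a point marker with a popup label
folium.Marker(location=[8.242, 7.671], popup='My point').add_to(mapObj)

mapObj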

Draw line



LeafletJS

In LeafletJS, a line is drawn with L.polyline([[lat1, lng1], [lat2, lng2], ...], {color: 'red'}).addTo(map).


Folium
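A minimal Folium sketch (the vertex coordinates are purely illustrative):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# draw a line through a list of [lat, lng] vertices
folium.PolyLine(locations=[[8.0, 7.5], [8.3, 7.7], [8.6, 7.6]],
                color='red', weight=3).add_to(mapObj)

mapObj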

Draw polygon



LeafletJS

In LeafletJS, a polygon is drawn with L.polygon([[lat1, lng1], [lat2, lng2], ...], {color: 'blue'}).addTo(map).


Folium
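A minimal Folium sketch (the vertex coordinates are purely illustrative):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# draw a filled polygon from a list of [lat, lng] vertices
folium.Polygon(locations=[[8.0, 7.4], [8.4, 7.9], [8.7, 7.3]],
               color='blue', fill=True, fill_opacity=0.3).add_to(mapObj)

mapObj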

Plot geojson data



LeafletJS

In LeafletJS, GeoJSON data is plotted with L.geoJSON(geojsonData).addTo(map).


Folium
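A minimal Folium sketch (the file name 'data.geojson' is just a placeholder for your own file):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# load a GeoJSON file and add it as a named layer
folium.GeoJson('data.geojson', name='My GeoJSON layer').add_to(mapObj)

mapObj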

Add layer control



LeafletJS

In LeafletJS, a layer control is added with L.control.layers(baseLayers, overlayMaps).addTo(map).


Folium
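A minimal Folium sketch (the extra 'cartodbpositron' base layer is only there so the control has something to switch between):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# add a second base layer and a control to toggle the layers
folium.TileLayer('cartodbpositron', name='CartoDB Positron').add_to(mapObj)
folium.LayerControl(collapsed=False).add_to(mapObj)

mapObj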

Add HTML



LeafletJS

In LeafletJS, HTML can be attached to map features with bindPopup(), e.g. marker.bindPopup('<b>Hello</b>').


Folium
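One common way to add HTML in Folium is through a popup; here is a minimal sketch (the HTML string is just an illustration):

import folium

mapObj = folium.Map(location=[8.242, 7.671], zoom_start=7, tiles='openstreetmap')

# attach an HTML popup to a marker
html = '<h4>Hello</h4><p>This popup is built from <b>HTML</b>.</p>'
folium.Marker(location=[8.242, 7.671],
              popup=folium.Popup(html, max_width=250)).add_to(mapObj)

mapObj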









Tuesday, January 18, 2022

Geopandas Vs Folium - Generate Web map from data

 The code snippets below demonstrate how to create an interactive choropleth web map using the GeoPandas and Folium libraries.


The latest version of GeoPandas has the explore() method, which can create a LeafletJS map as seen above. 


import geopandas as gpd

# Read shp...
gdf = gpd.read_file(r"NGA_adm1.shp")

# Create web map obj...
mymap = gdf.explore(column='geographic')

# Save to file...
mymap.save('map.html')







import folium
import geopandas as gpd

zones = {'NEZ':1, 'SEZ':2, 'SSZ':3, 'SWZ':4, 'NCZ':5, 'NWZ':6}

# Read shp...
gdf = gpd.read_file(r"NGA_adm1.shp")

gdf.reset_index(level=0, inplace=True)
gdf['Weight'] = gdf['geographic'].map(zones)
gdf['index'] = gdf['index'].apply( lambda x: str(x) )

# Create folium map obj...
mymap = folium.Map(location=[8.67, 7.22], zoom_start=6)

# GeoJSON string for the geodataframe; feature ids come from the dataframe index
geo_json_str = gdf.to_json()

folium.Choropleth(
    geo_data=geo_json_str,
    data=gdf,
    name='Choropleth Map',
    columns=['index', 'Weight'],   # [key column, value column]
    key_on='feature.id',
    fill_color='YlGnBu', # RdYlGn
    legend_name='Name of Legend...',
    smooth_factor=0
    ).add_to(mymap)

mymap










Sunday, January 9, 2022

Keeping track of some favorite developers' websites

 There are many developer-authors who publish useful content on their blogs on a regular basis.

As a fan of learning, it is a great idea to use the skills you have learnt from them to keep track of what is new on their blogs.

The two most common ways of achieving this are using an API and web scraping. So, you first check whether the author's blog has an API service, and where one doesn't exist, you turn to web scraping.

The authors I want to look up in this post are: Renan Moura, William Vincent and Flavio Copes.

As at the time of writing, the above authors don't have an API implemented on their respective websites, so we will use web scraping to keep track of the latest posts on their blogs. Basically, we will write a scraper to store the data in a file, then compare it with future scraped data to get the latest or newest entries on the blogs.

There are several libraries for scraping websites; here I will use Python's requests/selenium, BeautifulSoup and pandas to get the job done.


Let's get started...


1- Renan Moura


From Renan Moura's blog, I would like to keep track of the following post variables: category, title, title URL, published date and updated date.

Using the requests library alone, I got a "406 Not Acceptable" client error response, which means there is a bot manager on the server where the website is hosted that prevents bots from accessing the site. To overcome this, we can either use requests with a User-Agent header or use selenium to access the website.

import requests
import pandas as pd
from bs4 import BeautifulSoup

url = 'https://renanmf.com'
# Get user-agent from: http://www.useragentstring.com/
headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

response = requests.get(url, headers=headers)
html = response.text

soup = BeautifulSoup(html, 'html.parser')
article = soup.find_all("div", {'class':'card-content'})

print(len(article))


import requests
import pandas as pd
from bs4 import BeautifulSoup
from selenium import webdriver


url = 'https://renanmf.com'

driver = webdriver.Chrome('chromedriver.exe')
driver.get(url)

html = driver.page_source


soup = BeautifulSoup(html, 'html.parser')
article = soup.find_all("div", {'class':'card-content'})

print(len(article))


From either of the methods above, we can now loop through the articles and extract the columns we want, as seen below:-

data_list = []
for art in article:
    category = art.find("li", {'class':'meta-categories'}).text
    title_txt = art.find("h2", {'class':'entry-title'}).text
    title_link = art.find("h2", {'class':'entry-title'}).find('a')['href']
    pub_date = art.find("li", {'class':'meta-date'}).text
    updated_date = art.find("li", {'class':'meta-updated-date'}).text
    
    data = category, title_txt, title_link, pub_date, updated_date
    
    data_list.append(data)

# ------------------------
data_list_df = pd.DataFrame(data_list, columns=['Category', 'Title', 'Title URL', 'Published Date', 'Updated Date'])




2- William Vincent


Here, we will get the following post variables: title, title URL and published date.

import requests
import pandas as pd
from bs4 import BeautifulSoup


url = 'https://wsvincent.com/'

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36'}

response = requests.get(url, headers=headers)
html = response.text

soup = BeautifulSoup(html, 'html.parser')
article = soup.find_all("li")

# ------------------------


data_list = []

for art in article:
    # skip list items that are not blog posts (they have no h2 title)
    if art.find('h2') is None:
        continue

    title = art.find('h2').text
    title_link = art.find('h2').find('a')['href']
    pub_date = art.find('span', {'class':'post-meta'}).text
    
    data = title, title_link, pub_date
    
    data_list.append(data)
    
# ------------------------

    
data_list_df = pd.DataFrame(data_list, columns=['Title', 'Title URL', 'Published Date'])




3- Flavio Copes


Flavio's blog is similar to William Vincent's above; we will get the following post variables: title, title URL and published date.


import requests
import pandas as pd
from bs4 import BeautifulSoup


url = 'https://flaviocopes.com'

headers = {'User-Agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/70.0.3538.77 Safari/537.36'}

response = requests.get(url, headers=headers)
html = response.text

soup = BeautifulSoup(html, 'html.parser')
article = soup.find_all("li", {'class':'post-stub'})
# ---------------

data_list = []

for art in article:
    title = art.find('h4').text
    title_link = art.find('a')['href']
    pub_date = art.find("time", {'class':'post-stub-date'}).text
    
    data = title, title_link, pub_date
    
    data_list.append(data)
    

    
data_list_df = pd.DataFrame(data_list, columns=['Title', 'Title URL', 'Published Date'])

data_list_df    
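
To actually detect new posts, one simple approach (a minimal sketch, assuming the data_list_df built above and a hypothetical CSV file name from a previous run) is to compare the fresh scrape against the stored copy and keep only the unseen titles:

import os
import pandas as pd

csv_file = 'flavio_posts.csv'  # hypothetical file saved by an earlier run

if os.path.exists(csv_file):
    old_df = pd.read_csv(csv_file)
    # rows whose 'Title URL' was not seen in the previous scrape are new posts
    new_posts = data_list_df[~data_list_df['Title URL'].isin(old_df['Title URL'])]
    print(new_posts)

# overwrite the stored copy with the latest scrape
data_list_df.to_csv(csv_file, index=False)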




Happy scraping!

Saturday, January 1, 2022

Make a WordCloud in Python

 Here is how to make something like the image below in Python with less than ten lines of code. It is called a "WordCloud" and it is a visual representation of words that gives greater prominence to words that appear more frequently.


You need to install the WordCloud and Matplotlib libraries to run the code below.

Make a list of text you want to use for the word cloud and generate it as seen below.

# Libraries
%matplotlib notebook
from wordcloud import WordCloud
import matplotlib.pyplot as plt
 
# Create a list of word
text=("Umar Umar Umar Matplotlib Matplotlib Seaborn Network Plot Violin Chart Pandas Datascience Wordcloud Spider Radar Parrallel Alpha Color Brewer Density Scatter Barplot Barplot Boxplot Violinplot Treemap Stacked Area Chart Chart Visualization Dataviz Donut Pie Time-Series Wordcloud Wordcloud Sankey Bubble")
 
# Create the wordcloud object
wordcloud = WordCloud(width=480, height=480, margin=0).generate(text)
 
# Display the generated image:
plt.imshow(wordcloud, interpolation='bilinear')
plt.axis("off")
plt.margins(x=0, y=0)
plt.show()

That is it!

Wednesday, December 1, 2021

Get IP address from domain name

 Given a domain name like "Google.com", the Python script below will return its server IP address, for example "216.58.223.238".

import socket
import pandas as pd

# read the table of most visited websites from Wikipedia
websites_df = pd.read_html('https://en.wikipedia.org/wiki/List_of_most_visited_websites')

for d in websites_df[0]['Domain Name']:
    try:
        ip_address = socket.gethostbyname(d)
        print(d, ' - ', ip_address)
    except socket.gaierror:
        # some domains may not resolve from your network
        print(d, ' - ', 'could not resolve')
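
For a single domain, as in the example above, the same call is simply (the printed address may differ, since Google serves many IPs):

import socket

# resolve one domain name to an IPv4 address
print(socket.gethostbyname('google.com'))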



That is it!

Friday, November 26, 2021

Several ways of doing the same thing in programming - mapping two lists into one dictionary

 In programming, there is always more than one way to solve the same problem. These variations depend on individual skills and ways of thinking.

In this article I will demonstrate different ways to solve the same problem using python scripting.


The problem:

Python program to map two lists into a dictionary.

countries = ['Nigeria', 'Germany', 'Italy', 'USA', 'Japan', 'Ghana']
score = [39, 23, 12, 67, 45, 11]



The Solution:

1) Using zip() function

# Using zip() function
countries = ['Nigeria', 'Germany', 'Italy', 'USA', 'Japan', 'Ghana']
score = [39, 23, 12, 67, 45, 11]

data = dict(zip(countries, score))
print(data)


2) Using Dictionary Comprehension

# Using Dictionary Comprehension
countries = ['Nigeria', 'Germany', 'Italy', 'USA', 'Japan', 'Ghana']
score = [39, 23, 12, 67, 45, 11]

data  = {key:value for key, value in zip(countries, score)}
print(data)


3) Using For loop

# Using For loop
countries = ['Nigeria', 'Germany', 'Italy', 'USA', 'Japan', 'Ghana']
score = [39, 23, 12, 67, 45, 11]

countries_score = zip(countries, score)

data_dict = {}

for key, value in countries_score:
    if key in data_dict:
        # handling duplicate keys
        pass 
    else:
        data_dict[key] = value
        
print(data_dict)


4) Using For and Range

# Using For and Range
countries = ['Nigeria', 'Germany', 'Italy', 'USA', 'Japan', 'Ghana']
score = [39, 23, 12, 67, 45, 11]

data = {countries[i]: score[i] for i in range(len(countries))}
print(data)


All four of the solutions above will give the same output, as seen below:-

{'Nigeria': 39, 'Germany': 23, 'Italy': 12, 'USA': 67, 'Japan': 45, 'Ghana': 11}


That is it!

Tuesday, November 16, 2021

Making contour map using QuickGrid Software

 QuickGrid is free software for making contour maps or 3D meshes from an XYZ dataset. It is a good free alternative to Surfer.

Download and install QuickGrid, and let's see how quickly it can generate a contour map.


First we have to prepare our dataset like this:-


The first column is X (Easting or Longitude), the second column is Y (Northing or Latitude) and the last column is Z (Height or Altitude). Note that there are no column names in the data, and it is saved as a .CSV file.
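
A few rows of such a file (with made-up coordinates) would look like this:

342100.5,912340.2,145.3
342210.1,912410.8,147.9
342305.7,912500.4,150.2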

To load in the dataset, go to: File >> Input scattered data points >> Input metric data points


This will display a gridded contour map immediately as seen above.

Now we can style the contour interval and labels like this:-


If you want to use the contour in AutoCAD, set the output option to use polyline for DXF output.


Then you can export to AutoCAD from the 'File' menu.


There is a lot more you can do with QuickGrid; however, this is a good start from which to explore the software further.


Thank you for following.

Wednesday, November 10, 2021

Overlay old image map on leafletjs web map

 In this post, I will show how to overlay an image map on the LeafletJS web map interface.

The image map is as seen below:-


We need to get the coordinates of the opposite corners in this format "[[y1, x1], [y2, x2]]".


To overlay the image, the code is thus:-

<!DOCTYPE html>
<html>
<head>
	
	<title>Leaflet Map...</title>

	<meta charset="utf-8" />
	<meta name="viewport" content="width=device-width, initial-scale=1.0">
	
    <link rel="stylesheet" href="https://unpkg.com/leaflet@1.7.1/dist/leaflet.css" integrity="sha512-xodZBNTC5n17Xt2atTPuE1HxjVMSvLVW9ocqUKLsCC5CXdbqCmblAshOMAS6/keqq/sMZMZ19scR4PsZChSR7A==" crossorigin=""/>

    <script src="https://unpkg.com/leaflet@1.7.1/dist/leaflet.js" integrity="sha512-XQoYMqMTK8LvdxXYG3nZ448hOEQiglfqkJs1NOQV44cWnUrBc8PkAOcXy20w0vlaXaVUearIOBhiXZ5V3ynxwA==" crossorigin=""></script>

    <!-- JQuery -->
    <script src="https://ajax.googleapis.com/ajax/libs/jquery/3.5.1/jquery.min.js"></script>    


<style type="text/css">

	#slider{
		position: fixed;
		z-index: 900;
		border: 2px solid gray;
		top: 100px;
		left: 20px;
	}
</style>
	
</head>
<body>


<div id="slider">
	<h4>Image Opacity: <span id="image-opacity">0.5</span></h4>
	<input type='range' id='sldOpacity' min='0' max="1" step='0.1' value='0.5' >
</div>


<div id="mapid" style="width: 100%; height: 600px;"></div>



<script>
	// Create the map obj...
	var mymap = L.map('mapid', {minZoom: 2, maxZoom: 20})
				 .setView([8.54090, 7.71428], 13);


	// Set a default base map to...
    L.tileLayer('http://{s}.google.com/vt/lyrs=m&x={x}&y={y}&z={z}',{
        // minZoom: 0,
     	maxZoom: 18,
        subdomains:['mt0','mt1','mt2','mt3']
    }).addTo(mymap);


  	var imageUrl = "Group_Assignment.jpg",
  	imageBounds = [ [8.58061, 7.68495], [8.50206, 7.75181] ];
	var Old_Imge = L.imageOverlay(imageUrl, imageBounds, {
		opacity:0.4,
	}).addTo(mymap);

	Old_Imge.bringToFront();





var overlayMaps = {
    'Old Image' : Old_Imge
};



var baseLayers = {
	// Basemaps go here...
};

// Adding baseMaps and overlayMaps
L.control.layers(baseLayers, overlayMaps, {collapsed: false}).addTo(mymap);


$(document).ready(function(){
	  // jQuery methods go here...

	  $('#sldOpacity').on('change', function () {
	  		$('#image-opacity').html(this.value);

	  		Old_Imge.setOpacity(this.value);
	  });


}); // end Jquery doc ready.





</script>



</body>
</html>

Note that the code above includes adding the image to a layer control and a slider to give some control over the overlaid image. See the live demo below:-



Live Demo


That is it!

Thursday, November 4, 2021

Python GIS data wrangling - Mapping Super Eagles head coaches since 1949

 The Nigerian senior national football team (the Super Eagles) has had several coaches from 1949 till date. Let's prepare some data I found online about these coaches for use in any GIS platform.

The dataset for this exercise was collected from this hashtag: #Born2RichSports #Deliveringthebestinsports.

We will use Python to wrangle this data into a GIS-friendly format. Let's get started...

See all Super Eagles coach list from 1949 till date
------------------------------------
England: Jack Finch (1949)
Nigeria: Daniel Anyiam (1954–1956)
England: Les Courtier (1956–1960)
Israel: Moshe “Jerry” Beit haLevi (1960–1961)
Hungary: George Vardar (1961–1963)
England: Joey Blackwell (1963–1964)
Nigeria: Daniel Anyiam (1964–1965)
Hungary: József Ember (1965–1968)
Spain: Sabino Barinaga (1968–1969)
Nigeria: Peter ‘Eto’ Amaechina (1969–1970)
West Germany: Karl-Heinz Marotzke (1970–1971)
Brazil: Jorge Penna (1972–1973)
West Germany: Karl-Heinz Marotzke (1974)
Socialist Federal Republic of Yugoslavia: Tihomir Jelisavčić (1974–1978)
Brazil: Otto Glória (1979–1982)
West Germany: Gottlieb Göller (1981)
Nigeria: Festus Onigbinde (1983–1984)
Nigeria: Chris Udemezue (1984–1986)
Nigeria: Patrick Ekeji (1985)
Nigeria: Paul Hamilton (1987–1989)
West Germany: Manfred Höner (fr) (1988–1989)
Netherlands: Clemens Westerhof (1989–1994) as Technical Adviser
Nigeria: Shaibu Amodu (1994–1995)
Netherlands: Jo Bonfrere (1995–1996)
Nigeria: Shaibu Amodu (1996–1997)
France: Philippe Troussier (1997)
Nigeria: Monday Sinclair (1997–1998)
Federal Republic of Yugoslavia: Bora Milutinović (1998)
Netherlands: Thijs Libregts (1999)
Netherlands: Jo Bonfrere (1999–2001)
Nigeria: Shaibu Amodu (2001–2002)
Nigeria: Festus Onigbinde (2002)
Nigeria: Christian Chukwu (2002–2005)
Nigeria: Augustine Eguavoen (2005–2007)
Germany: Berti Vogts (2007–2008)
Nigeria: James Peters (2008)
Nigeria: Shaibu Amodu (2008–2010)
Sweden: Lars Lagerbäck (2010)
Nigeria: Augustine Eguavoen (2010)
Nigeria: Samson Siasia (2010–2011)
Nigeria: Stephen Keshi (2011–2014)
Nigeria: Shaibu Amodu (2014)
Nigeria: Stephen Keshi (2014)
Nigeria: Daniel Amokachi (2014–2015)
Nigeria: Stephen Keshi (2015)
Nigeria: Sunday Oliseh (2015-2016)
Germany: Gernot Rohr (2016–present)

#Born2RichSports #Deliveringthebestinsports
COPIED

Each row consists of the coach's country, the coach's name and the year/period he served. We need to separate each detail into its own column (that is, three columns in this case).

There are several ways to prepare this data; here I saved the text above in a text file and read it into a Python object like this...


Then read each row/line into a list item for a pandas dataframe as seen below...

with open(r"C:\Users\Yusuf_08039508010\Desktop\SuperEagle Coaches.txt", encoding='utf-8') as f:
    data = f.read()

# keep only the rows that actually describe a coach (they contain ': ')
coaches_list = [c for c in data.split('\n') if ': ' in c]
print(coaches_list)

Now read the list into a dataframe. Next, we can split the entries into separate columns for use in GIS software.

import pandas as pd

coaches_df = pd.DataFrame(coaches_list, columns=['Coaches'])
coaches_df


coaches_df['Country'] = coaches_df['Coaches'].apply( lambda x: x.split(': ')[0] )
coaches_df['Coach Name'] = coaches_df['Coaches'].apply( lambda x: x.split(': ')[1].split(' (')[0] )
# use the last '(' group so rows like "Manfred Höner (fr) (1988–1989)" still yield the period
coaches_df['Period'] = coaches_df['Coaches'].apply( lambda x: x.split(': ')[1].split(' (')[-1].replace(')', '') )

coaches_df

Now we have a beautiful table like this that we can integrate into GIS for further analysis.


For example, a quick look at the Country column shows that the coaches came from 13 unique countries.
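
A quick way to check this with pandas, using the coaches_df built above:

# count and list the unique countries the coaches came from
print(coaches_df['Country'].nunique())
print(coaches_df['Country'].unique())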


That is it!

Wednesday, November 3, 2021

Google base Maps in LeafletJS

 The LeafletJS web map library supports not just open-source basemaps but also proprietary basemaps such as Google Maps and ESRI maps.

In this post, we shall see how to add varying flavors of Google basemaps.



The code below was inspired by this stackoverflow question.

<!DOCTYPE html>
<html>
<head>
	
	<title>Leaflet Map...</title>

	<meta charset="utf-8" />
	<meta name="viewport" content="width=device-width, initial-scale=1.0">
	
	<!-- <link rel="shortcut icon" type="image/x-icon" href="docs/images/favicon.ico" /> -->

    <link rel="stylesheet" href="https://unpkg.com/leaflet@1.7.1/dist/leaflet.css" integrity="sha512-xodZBNTC5n17Xt2atTPuE1HxjVMSvLVW9ocqUKLsCC5CXdbqCmblAshOMAS6/keqq/sMZMZ19scR4PsZChSR7A==" crossorigin=""/>

    <script src="https://unpkg.com/leaflet@1.7.1/dist/leaflet.js" integrity="sha512-XQoYMqMTK8LvdxXYG3nZ448hOEQiglfqkJs1NOQV44cWnUrBc8PkAOcXy20w0vlaXaVUearIOBhiXZ5V3ynxwA==" crossorigin=""></script>

	
</head>
<body>


<div id="mapid" style="width: 100%; height: 600px;"></div>



<script>
	// Create the map obj...
	var mymap = L.map('mapid', {minZoom: 2, maxZoom: 20})
				 .setView([0, 0], 2);


	// Restrict panning to this bounds
	var southWest = L.latLng(-90, -180),
		northEast = L.latLng(90, 180);
	var bounds = L.latLngBounds(southWest, northEast);

	mymap.setMaxBounds(bounds);


	// Set a default base map to...
    L.tileLayer('http://{s}.google.com/vt/lyrs=m&x={x}&y={y}&z={z}',{
        // minZoom: 0,
     	maxZoom: 18,
        subdomains:['mt0','mt1','mt2','mt3']
    }).addTo(mymap);



// CREATE GOOGLE MAP LAYER
	// 1- Streets...
	googleStreets = L.tileLayer('http://{s}.google.com/vt/lyrs=m&x={x}&y={y}&z={z}',{
	    maxZoom: 20,
	    subdomains:['mt0','mt1','mt2','mt3']
	});


	// 2- Hybrid...
	googleHybrid = L.tileLayer('http://{s}.google.com/vt/lyrs=s,h&x={x}&y={y}&z={z}',{
	    maxZoom: 20,
	    subdomains:['mt0','mt1','mt2','mt3']
	});


	// 3- Satellite...
	googleSat = L.tileLayer('http://{s}.google.com/vt/lyrs=s&x={x}&y={y}&z={z}',{
	    maxZoom: 20,
	    subdomains:['mt0','mt1','mt2','mt3']
	});



	// 4- Terrain...
	googleTerrain = L.tileLayer('http://{s}.google.com/vt/lyrs=p&x={x}&y={y}&z={z}',{
	    maxZoom: 20,
	    subdomains:['mt0','mt1','mt2','mt3']
	});



var overlayMaps = {
    // Other layers will go here....
};



var baseLayers = {
	'Google Street Map':googleStreets,
	'Google Hybrid Map':googleHybrid,
	'Google Satellite Map':googleSat,
	'Google Terrain Map':googleTerrain,
};

// Adding baseMaps and overlayMaps
L.control.layers(baseLayers, overlayMaps, {collapsed: false}).addTo(mymap);


</script>



</body>
</html>

That is it!