Saturday, November 19, 2022

Scrape online academic materials using python

Manually collecting academic material you find online can be a boring task. In this blog post, I will demonstrate how I use Python to collect academic theses, journals, and other materials for my profession.


Online Scientific Research Journals: 

Here my professor wants to have all the journals and their details published by "Scientific Research and Community Publishers" onlinescientificresearch.com neatly arranged in a spreadsheet table.

The specific details required are the journal name/title, the page URL, the description, the cover image, and the ISSN number.

All the details should be organized in a spreadsheet as seen below.


The code:

import json
import requests
import pandas as pd
from bs4 import BeautifulSoup



# Section 1: Scrape journals page URLs and thumbnail images

url = 'https://www.onlinescientificresearch.com/journals.php'

# Get user-agent from: http://www.useragentstring.com/
headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

response = requests.get(url, headers=headers)
html = response.text

soup = BeautifulSoup(html, 'html.parser')
journals = soup.find_all("div", {'class':'col-12 col-sm-6 col-lg-3'})

print(len(journals))
# ---------------------------------------------------------------

# Section 2: Extract journal page URLs and thumbnail image paths...

url_list = []
image_list = []

for j in journals:
    url = j.find('a')['href']
    img = j.find('img')['src']
    
    url_list.append(url)
    image_list.append(img)
    
print('Done...')


# ---------------------------------------------------------------
# Section 3: Create dataframe and construct other details...

df = pd.DataFrame([url_list, image_list]).T
df.columns = ['Journal URL', 'Journal IMAGE URL']
# -------------------------------------
####### Construct Journal Name #######
df['Journal Name'] = df['Journal URL'].apply(lambda row: row.split('/')[-1].replace('.php', '').replace('-', ' ').title())


####### Construct Journal Description #######
def get_journal_descr(url):
    # Get user-agent from: http://www.useragentstring.com/
    headers = {'User-Agent': 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_10_1) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/39.0.2171.95 Safari/537.36'}

    response = requests.get(url, headers=headers)
    html = response.text
    
    soup = BeautifulSoup(html, 'html.parser')
    journal_descr = soup.find("div", {'class':'card-body'})
    
    return journal_descr.text
# -------------------------------------
# Scrape Journal description into a list 
j_descr_list = []
i = 1

for url in df['Journal URL']:
    print(i, 'Processing...', url)
    j_descr = get_journal_descr(url)
    
    j_descr_list.append((url, j_descr))
    i = i+1

desc_df = pd.DataFrame(j_descr_list)
# -------------------------------------

# We have to access each journal url page to get its description...
# df['Journal description'] = df['Journal URL'].apply(lambda url: get_journal_descr(url))
df['Journal description'] = desc_df[1]


####### Construct Journal ISSN #######
# We have to use OCR on the journal thumbnail to get its ISSN...
# Using the OCR API at: https://ocr.space/ocrapi....

headers = {
    'apikey': 'helloworld', # free test API key
    'content-type': 'application/x-www-form-urlencoded',
}

issn_list = []

for thumbnail in df['Journal IMAGE URL']:
    print('Processing....', thumbnail)
    
    data = f'isOverlayRequired=true&url={thumbnail}&language=eng'

    response = requests.post('https://api.ocr.space/parse/image', headers=headers, data=data, verify=False)

    result = json.loads(response.content.decode()) # Convert the result to dictionary using json.loads() function
    # type(result)

    # Check the dict keys, the ISSN is in: ParsedResults >> 0 >> ParsedText
    issn = result['ParsedResults'][0]['ParsedText'].strip().split('\r\n')[-1]

    issn_list.append(issn)

df['Journal ISSN'] = issn_list

df

Extracting the journal ISSN was definitely the trickiest part, as it requires working with an OCR API.
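Taking the last line of the OCR text works here, but it breaks silently if the layout changes. A more robust sketch is to match the ISSN pattern anywhere in the parsed text with a regular expression (the sample string below is made up for illustration):

```python
import re

def extract_issn(parsed_text):
    """Find an ISSN pattern (four digits, hyphen, three digits plus a digit or X) in OCR text."""
    match = re.search(r'\b(\d{4}-\d{3}[\dXx])\b', parsed_text)
    return match.group(1) if match else None

# Made-up sample of OCR output for illustration...
sample = 'Journal of Medical & Clinical Research\r\nISSN: 2754-4958'
print(extract_issn(sample))  # → 2754-4958
```

This returns None instead of a wrong value when no ISSN is present, which is easier to spot in the final spreadsheet.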



M.Sc. in GIST Theses

Master of Science (Geographic Information Science and Technology) Theses by the University of Southern California.


Here our professor wants the thesis details arranged in a table as seen above.

Let's start by inspecting the HTML tags on the web page.

Here I copied the parent div tag that contains the needed data into a local HTML file. With this, we don't need to send a request to the website.

import pandas as pd
from bs4 import BeautifulSoup

# Copy the parent div tag into a html/txt file...
html_file = r"C:\Users\Yusuf_08039508010\Documents\Jupyter_Notebook\2022\M.S. IN GIST THESES\M.S. IN GIST THESES.HTML"

# Use BeautifulSoup to read the html div tag....
with open(html_file, encoding='utf-8') as f:
    div_data = f.read()

soup = BeautifulSoup(div_data, 'html.parser')

thesis_years = soup.find_all("h3")

thesis_authors = soup.find_all("strong")
thesis_authors = [ a.text for a in thesis_authors ]

thesis_topics = soup.find_all("em")
thesis_topics = [ t.text for t in thesis_topics ]

thesis_advisor = soup.find_all("p")
thesis_advisor = [ a.text for a in thesis_advisor if 'Advisor:' in a.text ]

thesis_pdf = soup.find_all("a")
thesis_pdf = [ link.get('href') for link in thesis_pdf if 'Abstract Text' not in link.text ]

# --------------------------------------------
df = pd.DataFrame(thesis_authors, columns=['Author'])
df['Topic'] = thesis_topics
df['Advisor'] = thesis_advisor
df['PDF Link'] = thesis_pdf

df

The code below downloads the PDF files to the local disk using the requests library.

import requests

i = 1
for indx, row in df.iterrows():
    link = row['PDF Link']
    print('Processing...', link)

    pdf_name = str(i) +'_'+ link.split('/')[-1]
    pdf_file = requests.get(link, timeout=10).content

    with open( f'Thesis PDF\\{pdf_name}', 'wb' ) as f:
        f.write(pdf_file)
        
    i += 1
    # break


print('Finished...')





Journal - Nigerian Institution of Surveyors



This was a little bit tricky because the web page had inconsistent HTML tags.

import requests
import pandas as pd
from bs4 import BeautifulSoup


url = 'https://nisngr.net/journal/'
response = requests.get(url, verify=False)
html = response.text
# ----------------------------


soup = BeautifulSoup(html, 'html.parser')
div_boxes = soup.find_all("div", {'class':'wpb_text_column wpb_content_element'})
# ----------------------------


papers_dict = {}
for div in div_boxes:
    papers = div.find_all('a')
    
    for link in papers:
        papers_dict[link.text] = link['href']
# ----------------------------

df = pd.DataFrame([papers_dict]).T
df




Thank you for reading.

Thursday, November 10, 2022

Automate boring tasks in QGIS with PyQGIS

In this post, I will use PyQGIS to automate some boring tasks I often encounter in QGIS. Hope you will find something useful for your workflow. Let's get started...

If you don't know what pyqgis is, then read this definition by hatarilabs.com: "PyQGIS is the Python environment inside QGIS with a set of QGIS libraries plus the Python tools with the potential of running other powerful libraries as Pandas, Numpy or Scikit-learn".

PyQGIS allows users to automate workflows and extend QGIS with Python libraries; the documentation can be accessed here.

This means some knowledge of Python programming is required to understand the code below.



  Task 1~ Count number of opened/loaded layers in the layer panel

I often find myself trying to count the layers in my QGIS project layer panel, so a simple PyQGIS script to automate the process is ideal, especially when there are many layers on the layer panel to count.

# This will return all the layers on the layer panel
all_layers = QgsProject.instance().mapLayers().values()
print('There are', len(all_layers), 'layers on the layer panel.')


  Task 2~ Count features in loaded vector layer

In this task, I want to get the number of features in each layer I am working on. This is similar to the 'Show Feature Count' function when you right-click on a vector layer.

# Get all layers into a list....
all_layers = list(QgsProject.instance().mapLayers().values())

# Get all displayed names of layer and corresponding number of features ...
ftCounts = [ (l.name(), l.featureCount()) for l in all_layers ]
print(ftCounts)


  Task 3~ Switch on/off all layers

Turning all layers ON or OFF can be frustrating when you have many layers to click through. So why not automate it in just a click?

# Get list of layers from the layer's panel...
qgis_prjt_lyrs = QgsProject.instance().layerTreeRoot().findLayers()

# Use index to Set layer on or off....
qgis_prjt_lyrs[20].setItemVisibilityChecked(True) # True=On, False=Off

# Do for all...
for l in qgis_prjt_lyrs:
    l.setItemVisibilityChecked(False)


  Task 4~ Identify layers that are on/off

Let's extend Task 3 above, so we know which layers are on (visible) and which layers are off (hidden).

# Get list of layers from the layer's panel...
qgis_prjt_lyrs = QgsProject.instance().layerTreeRoot().findLayers()

# Check if a layer is visible or not...
layer_visibility_check = [ (l.name(), l.isVisible()) for l in qgis_prjt_lyrs ]
print(layer_visibility_check)

visibility_true = [ l.name() for l in qgis_prjt_lyrs if l.isVisible() == True ]
print('Number of visible layers:', len(visibility_true))

visibility_false = [ l.name() for l in qgis_prjt_lyrs if l.isVisible() == False ]
print('Number of hidden layers:', len(visibility_false))


  Task 5~ Read file path of layers

This is useful when you have many layers and don't know where they are located on your machine. You will also see interesting paths to other remote layers, such as WMS, etc.

# Returns path to every layer...
layer_paths = [layer.source() for layer in QgsProject.instance().mapLayers().values()]
print(layer_paths)


  Task 6~ Read layer type of layers

We can check the 'type' of a layer.

# Get dict of layers from the layer's panel...
layersDict = QgsProject.instance().mapLayers()


for lyr_id, lyr in layersDict.items():
    print(lyr.name(), '>>', lyr.type())


  Task 7~ Create multiple attribute fields/columns

Let's say we want to add multiple integer fields/columns to a vector layer. The code below will create attribute fields for the years 2000 to 2022, that is, twenty-three (23) attribute columns/fields on the selected vector layer.

# Get Layer by name...
layer = QgsProject.instance().mapLayersByName("NIG LGA")[0]

# Define dataProvider for layer
layer_provider = layer.dataProvider()

# Add an Integer attribute field and update fields...
layer_provider.addAttributes([QgsField("2000", QVariant.Int)])
layer.updateFields()

# Add bulk attribute fields...
for x in range(2001, 2023):
    layer_provider.addAttributes([QgsField(str(x), QVariant.Int)])
    layer.updateFields()

print('Done...')


  Task 8~ Read/List all names of layers on layer panel

Here we just want to return the displayed names of the layers.

# Get all layers into a list....
all_layers = list(QgsProject.instance().mapLayers().values())

# Get all displayed names of layer
all_layers_names = [ l.name() for l in all_layers ]
print(all_layers_names)


  Task 9~ Save attribute table to dataframe

# Save attribute table into Dataframe...

import pandas as pd

# Get Layer by name...
layer = QgsProject.instance().mapLayersByName("NIG LGA")[0]

# get attribute columns names
col_names = [ field.name() for field in layer.fields() ]

lga_list = []
state_list = []
apc_list = []
pdp_list = []
lp_list = []
nnpp_list = []
winner_list = []


for feature in layer.getFeatures():
    lga_list.append(feature['lga_name'])
    state_list.append(feature['state_name'])
    apc_list.append(feature['APC'])
    pdp_list.append(feature['PDP'])
    lp_list.append(feature['LP'])
    nnpp_list.append(feature['NNPP'])
    winner_list.append(feature['Winner'])

df = pd.DataFrame([state_list, lga_list, apc_list, pdp_list, lp_list, nnpp_list, winner_list]).T
df.columns = ['State', 'LGA', 'APC', 'PDP', 'LP', 'NNPP', 'Winner']

df.to_csv(r'C:\Users\Yusuf_08039508010\Desktop\...\test.csv')

print('Done....')


  Task 10~ Select from multiple layers and attribute fields

Here we want to select features that match any of a given set of keywords, across all listed layers and all attribute fields.

# Query to select from all listed layers and all attribute fields
search_for = {'Bauchi', 'SSZ', 'Edo', 'Yobe'}

for lyr in QgsProject.instance().mapLayers().values():
    if isinstance(lyr, QgsVectorLayer):
        to_select = []
        # fieldlist = [f.name() for f in lyr.fields()]
        for f in lyr.getFeatures():
            # Check if any of the search keyword intersects to
            # feature's row attribute. If true, get the feature ID for selection...
            if len(search_for.intersection(f.attributes())) > 0:
                to_select.append(f.id())
        if len(to_select) > 0:
            lyr.select(to_select)




  Task 11~ Convert multiple GeoJSON files to shapefiles

import glob

input_files = glob.glob(r'C:\Users\Yusuf_08039508010\Desktop\Working_Files\GIS Data\US Zip Codes\*.json')
for f in input_files:
    out_filename = f.split('\\')[-1].split('.')[0]
    input_file = QgsVectorLayer(f, "polygon", "ogr")
    
    if input_file.isValid() == True:
        QgsVectorFileWriter.writeAsVectorFormat(input_file, rf"C:\Users\Yusuf_08039508010\Desktop\Working_Files\Fiverr\2021\05-May\Division_Region_Area Map\SHP\US ZipCode\{out_filename}.shp", "UTF-8", input_file.crs(), "ESRI Shapefile")
    else:
        print(f, 'is not a valid input file')
        
print('Done Processing...')



Thank you for reading.

Friday, November 4, 2022

Search nearby places - Comparing three API (Google Places API, Geoapify API and HERE API)

In this post, I will compare APIs from three different providers to search nearby places. The three APIs to compare are: Google Places API, Geoapify API, and HERE API.

For each of the platforms, you need to register and get a developer API key. All the platforms offer a limited free API quota to start with.



Google Places API


import json
import requests
import pandas as pd
from datetime import datetime

df = pd.read_csv('datafile.csv')


YOUR_API_KEY = 'AIza......'
state_folder = 'Florida' # example name for the output subfolder (define to suit your batch)

i = 1
for row, col in df.iterrows():
    lat = col['Latitude']
    long = col['Longitude']
    print(i, 'Processing...', lat, long)
    
    url = f'https://maps.googleapis.com/maps/api/place/nearbysearch/json?location={lat}%2C{long}&radius=4850&type=laundry&keyword=laundromats&key={YOUR_API_KEY}'

    payload={}
    headers = {}

    response = requests.request("GET", url, headers=headers, data=payload)

    # Get current time...
    now = datetime.now()
    current_time = now.strftime("%Y%m%d__%H%M%S")

    # Write to file....
    with open(fr'JSON folder\GoogleAPI\{state_folder}\{current_time}.json', 'w') as outfile:
        json.dump(response.json(), outfile)

    i = i+1
    
    
print('Done...')


Geoapify API


GeoApify_API_KEY = '378122b08....'

url = 'https://api.geoapify.com/v2/places'

params = dict(
    categories='commercial',
    filter='rect:7.735282,48.586797,7.756289,48.574457',
    limit=2000,
    apiKey=GeoApify_API_KEY
)

resp = requests.get(url=url, params=params)
data = resp.json()

print(data)




HERE API


HERE_API_KEY = 'WEYn....'
coord = '27.95034271398129,-82.45670935632066' # lat, long
url = f'https://places.ls.hereapi.com/places/v1/discover/search?apiKey={HERE_API_KEY}&at={coord}&q=laundry'

response = requests.get(url).json()
# print(response.text)

# Get current time...
now = datetime.now()
current_time = now.strftime("%Y%m%d__%H%M%S")


# Write to file....
with open(fr'JSON folder\{current_time}.json', 'w') as outfile:
    json.dump(response, outfile)
    
print('Done...')



Tuesday, November 1, 2022

Thursday, October 27, 2022

Convert Coordinates in United Nations Code for Trade and Transport Locations (UN/LOCODE) Code List to GIS friendly format

 The table code list at 'UN/LOCODE Code List by Country and Territory' has a column named coordinate. This column contains the geographical coordinates (latitude/longitude) in a format that is not suitable for use in GIS software. The reason is explained on this page by UN.

So, basically it says that in order to avoid unnecessary use of non-standard characters and spaces, the following standard presentation is used: 0000lat 00000long

(lat - Latitude: N or S ; long – Longitude: W or E, only one digit, capital letter)

Where the last two rightmost digits refer to minutes and the first two or three digits refer to the degrees for latitude and longitude respectively. In addition, you must specify N or S for latitude and W or E for longitude, as appropriate.


While this may be a good format for them, it is not a good format for most GIS platforms. Hence, there is a need to convert it into a form that GIS can easily utilize.

This means we will convert a coordinate that looks like this, '0507N 00722E', into decimal degrees or degrees and minutes.


unlocode_coord = '0507S 00722E'

unlat, unlong = unlocode_coord.split(' ')


# Handling Latitude...
# --------------------------
# The last character is the hemisphere, which will always be either: N or S...
lat_hem = unlat[-1]
unlat = unlat.rstrip('NS')
lat_deg = unlat[:2]
lat_min = unlat[-2:]

# Result in DM... Degrees Minutes
print(f"The result in Degree Minute is: {lat_deg}°{lat_min}'{lat_hem}")

# Result in DD.... Decimal Degrees - Since 1° = 60'
lat_dd = round(float(lat_deg) + float(lat_min)/60, 3)
if lat_hem == 'S':
    lat_dd = -lat_dd # Southern latitudes are negative
print(f"The result in Decimal Degree is: {lat_dd}°")




# Handling Longitude...
# --------------------------
# The last character is the hemisphere, which will always be either: E or W...
long_hem = unlong[-1]
unlong = unlong.rstrip('EW')
long_deg = unlong[:3]
long_min = unlong[-2:]

print()
# Result in DM... Degrees Minutes
print(f"The result in Degree Minute is: {long_deg}°{long_min}'{long_hem}")

# Result in DD.... Decimal Degrees - Since 1° = 60'
long_dd = round(float(long_deg) + float(long_min)/60, 3)
if long_hem == 'W':
    long_dd = -long_dd # Western longitudes are negative
print(f"The result in Decimal Degree is: {long_dd}°")
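The two snippets above can be wrapped into one small helper that parses a full UN/LOCODE coordinate string into signed decimal degrees (a sketch; it assumes the '0000lat 00000long' layout described earlier):

```python
def unlocode_to_dd(coord):
    """Convert a UN/LOCODE coordinate, e.g. '0507N 00722E', to signed decimal degrees."""
    unlat, unlong = coord.split(' ')

    # Degrees plus minutes/60, negated for the S and W hemispheres...
    lat = int(unlat[:2]) + int(unlat[2:4]) / 60
    if unlat[-1] == 'S':
        lat = -lat

    long = int(unlong[:3]) + int(unlong[3:5]) / 60
    if unlong[-1] == 'W':
        long = -long

    return round(lat, 3), round(long, 3)

print(unlocode_to_dd('0507S 00722E'))  # → (-5.117, 7.367)
```

With this helper, the whole coordinate column can be converted in one pass, e.g. with pandas' .apply().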

You may use the tool on this website to learn more as seen below.





The reverse - from decimal degrees to UN/LOCODE coordinates

# ------------------- For Latitude ---------------------------------

lat = 13.893937

# Hemisphere letter comes from the sign...
lat_hem = 'N' if lat >= 0 else 'S'
lat = abs(lat)

# Degree and minute parts...
lat_degree = int(lat)
lat_minute = int((lat - lat_degree) * 60)

# Zero-pad to two-digit degrees and two-digit minutes: 0000lat...
lat_result = f'{lat_degree:02d}{lat_minute:02d}{lat_hem}'
print(f'Latitude in UN/LOCODE is: {lat_result}')


# ------------------- For Longitude ---------------------------------

long = -123.893937

# Hemisphere letter comes from the sign...
long_hem = 'E' if long >= 0 else 'W'
long = abs(long)

# Degree and minute parts...
long_degree = int(long)
long_minute = int((long - long_degree) * 60)

# Zero-pad to three-digit degrees and two-digit minutes: 00000long...
long_result = f'{long_degree:03d}{long_minute:02d}{long_hem}'
print(f'Longitude in UN/LOCODE is: {long_result}')



That is it!

Saturday, October 22, 2022

Mathematics of successful life in Python

There is this text that trends on social media: the twenty-six letters of the alphabet are assigned numbers from one to twenty-six, which are then used to calculate the 'percentage' of some words, as quoted below;-

I found this to be very interesting and meaningful message to share:-
IF:
A = 1 
B = 2 
C = 3  
D = 4
E = 5  
F = 6
G = 7  
H = 8
I = 9  
J = 10  
K = 11  
L = 12
M = 13  
N = 14 
O = 15  
P = 16
Q = 17
R = 18 
S = 19
T = 20
U = 21
V = 22 
W = 23  
X = 24
Y = 25 
Z = 26

THEN,
H+A+R+D+W+O+R+K
8+1+18+4+23+15+18+11 = 98%

K+N+O+W+L+E+D+G+E
11+14+15+23+12+5+4+7+5 = 96%

L+O+V+E
12+15+22+5 = 54%

L+U+C+K
12+21+3+11 = 47%

None of them makes 100%.
Then what makes 100%?
Is it Money? NO!

M+O+N+E+Y
13+15+14+5+25 = 72%

E+D+U+C+A+T+I+O+N
5+4+21+3+1+20+9+15+14 = 92%

Leadership? NO!

L+E+A+D+E+R+S+H+I+P
12+5+1+4+5+18+19+8+9+16 = 97%

Every problem has a solution, only if we perhaps change our ATTITUDE...
A+T+T+I+T+U+D+E = 1+20+20+9+20+21+4+5  = 100%
It is therefore OUR ATTITUDE towards Life and Work that makes OUR Life 100% Successful.

Amazing mathematics
Let's change our Attitude of doing things in life.
Because it's our attitude that is our problem
Not the Devil.
Tusaai Piadin Gideon copied


Let's see how we can transform this into a Python script.

alphabets = {'A' : 1, 'B' : 2, 'C' : 3, 'D' : 4, 'E' : 5, 'F' : 6, 'G' : 7, 'H' : 8, 'I' : 9, 'J' : 10, 'K' : 11, 'L' : 12, 'M' : 13, 'N' : 14, 'O' : 15, 'P' : 16, 'Q' : 17, 'R' : 18, 'S' : 19, 'T' : 20, 'U' : 21, 'V' : 22, 'W' : 23, 'X' : 24, 'Y' : 25, 'Z' : 26}

solve = 'M+O+N+E+Y'
solve1 = solve.split('+')
solve2 = [ alphabets[a] for a in solve1 ]
solve3 = str(sum(solve2)) + '%'

print(solve3)
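The same calculation can be wrapped in a small reusable function. Using ord() means the lookup dictionary is not needed: 'A' is code point 65, so ord(letter) - 64 gives its alphabet position.

```python
def word_percent(word):
    """Sum the alphabet positions of the letters in a word ('A' = 1 ... 'Z' = 26)."""
    return sum(ord(letter) - 64 for letter in word.upper() if letter.isalpha())

for w in ['HARDWORK', 'KNOWLEDGE', 'LOVE', 'LUCK', 'ATTITUDE']:
    print(w, '=', word_percent(w), '%')
```

This reproduces the figures quoted above, e.g. HARDWORK = 98% and ATTITUDE = 100%.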

That is it!

Wednesday, October 19, 2022

Working with Dropbox API in python

 In this post, I will explore Dropbox API. If you don't know what Dropbox is, according to its Wikipedia page: "Dropbox is a file hosting service operated by the American company Dropbox, Inc., headquartered in San Francisco, California, U.S. that offers cloud storage, file synchronization, personal cloud, and client software".

Now that you know what Dropbox is, let's see how we can do some basic operations (upload, rename, create, copy, move, delete, download, and list files and folders) using the Python API.

First, create an app on the app developer console page to generate an API TOKEN, after setting up the right permissions for your use case.


There are two common ways/methods of connecting to dropbox via python:-

The first is using a third party module named "dropbox" which you can install using pip install dropbox. The documentation is found on Dropbox for Python web page.



The second way is using requests module based on their official Dropbox API Explorer.


Whichever method you decide to adopt is just a matter of preference. In my case, I usually use a combination of the two, depending on what is easier for what I want to implement.

Both have excellent documentation. In fact, the Dropbox API Explorer is fantastic for visually constructing the API calls you want. For example, to get the list of contents in a folder, I click on the list_folder tab on the left and configure the options as seen. Then I click on 'show code' to grab the code for use in a Python environment.


Just copy the resulting code into your python environment to execute the API.

import requests
import json

url = "https://api.dropboxapi.com/2/files/list_folder"

headers = {
    "Authorization": "Bearer <access-token>",
    "Content-Type": "application/json"
}

data = {
    "path": "/Images"
}

r = requests.post(url, headers=headers, data=json.dumps(data))
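Other operations follow the same pattern against the Dropbox HTTP API. As one more sketch, here is a file upload via the content endpoint (the token and paths are placeholders you would supply yourself):

```python
import json
import requests

def upload_file(token, local_path, dropbox_path):
    """Upload a local file to Dropbox via the /2/files/upload content endpoint."""
    headers = {
        "Authorization": f"Bearer {token}",
        # Dropbox-API-Arg carries the JSON parameters for content endpoints...
        "Dropbox-API-Arg": json.dumps({"path": dropbox_path, "mode": "add", "autorename": True}),
        "Content-Type": "application/octet-stream",
    }
    with open(local_path, "rb") as f:
        return requests.post("https://content.dropboxapi.com/2/files/upload",
                             headers=headers, data=f)

# Usage (requires a valid access token):
# upload_file("<access-token>", "photo.jpg", "/Images/photo.jpg")
```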

Friday, October 7, 2022

The cat API

The cat API is a public service API all about cats. This means it is a service where people who like cats share pictures and other details about cats, and developers use it in their applications.


The Python code below uses the free Cat API to search for random cats, fetch their details, organize them into a dataframe table, and then download the pictures onto the local disk.

import json
import shutil
import requests
import pandas as pd

url = 'https://api.thecatapi.com/v1/images/search'

data_list = []
for i in range(101):
    print('Getting random cat image...', i)
    
    # Send requests...
    response = requests.get(url)
    
    # Get values from response...
    data = list(response.json()[0].values())
    
    # Append data to list...
    data_list.append(data)


# Get columns names from response...
cols = list(response.json()[0].keys())

# Create df...
data_list_df = pd.DataFrame(data_list, columns=cols)
# ===========================================



# Download images....

i = 1
for url in data_list_df['url']:
    # Get image extension...
    img_ext = url.split('.')[-1]
    imgfile_name = f'cat_{i}.{img_ext}'
    print('Processing...', imgfile_name)
    
    # Send requests...
    res = requests.get(url, stream=True)

    # Write image to disc...
    with open(f'cat_image/{imgfile_name}','wb') as f:
        shutil.copyfileobj(res.raw, f)
        
    # Alternative: write image to disc...
    # with open(f'cat_image/{imgfile_name}','wb') as f:
    #     f.write(res.content)
    
    i = i+1
    # break


Similarly, the code above can be adapted for the Dog API or Dog CEO API.


Thank you for following.

Monday, September 26, 2022

Spatial Distribution of Federal Polytechnics in Nigeria

In this post, I will map the National Board for Technical Education (NBTE) approved Federal Polytechnics in Nigeria to get a sense of how they are spatially distributed across the country.

There are 40 polytechnics listed on the NBTE web page above; I copied the list into a spreadsheet and geocoded the addresses. With the geocoded result, we can prepare the spatial distribution of the federal polytechnic schools as seen below:-


QGIS software was then used to prepare the map.
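For the geocoding step mentioned above, a minimal sketch using the HERE Geocoding & Search v1 API could look like the following (the endpoint and the items[0]['position'] response layout follow the HERE v1 geocoding service; the address and key in the usage note are placeholders):

```python
import requests

def geocode_address(address, api_key):
    """Return (lat, lng) for an address via the HERE Geocoding API, or None if not found."""
    url = 'https://geocode.search.hereapi.com/v1/geocode'
    resp = requests.get(url, params={'q': address, 'apiKey': api_key})
    items = resp.json().get('items', [])
    if items:
        pos = items[0]['position']
        return pos['lat'], pos['lng']
    return None

# Usage (requires a valid key):
# geocode_address('Federal Polytechnic, Bida, Niger State, Nigeria', 'YOUR_API_KEY')
```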



From the spatial distribution, it can be seen that the boundary between Kebbi and Niger states is a potential location for a proposed new federal polytechnic. Similarly, we can recommend proposed new federal polytechnics between the Edo - Ondo and the Adamawa - Taraba state boundaries, as seen in red below.


Thank you for reading.

Friday, September 16, 2022

Batch Geocoding and Reverse Geocoding using HERE API

A python script for batch geocoding and reverse geocoding using HERE API

HERE API service allows you to submit batch geocoding and reverse geocoding requests.  Your submission must conform with the Input Data guidelines. The Batch Geocoder API handles the geocoding and reverse geocoding asynchronously.

As stated on the API doc page linked above, to retrieve the output of a successful batch request, you must follow the steps below:-

  1. Upload your data with a POST request to the resource jobs.
  2. Using the RequestId value contained in the response to your data upload request, check the status of the job with a GET request. You can only download the results when the job status is completed.
  3. Using the RequestId value contained in the response to your data upload request, download the results by sending a GET request.
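The three steps above can be sketched with plain requests calls (a simplified outline; the parameter names follow the Batch Geocoder API, and the key and file names are placeholders):

```python
import requests

BASE_URL = 'https://batch.geocoder.ls.hereapi.com/6.2/jobs'

def upload_job(api_key, filename):
    """Step 1: POST the input data to the jobs resource; the response holds the RequestId."""
    with open(filename, 'rb') as f:
        return requests.post(BASE_URL, params={'action': 'run', 'apiKey': api_key}, data=f)

def job_status(api_key, request_id):
    """Step 2: check the job status; download only when it is 'completed'."""
    return requests.get(f'{BASE_URL}/{request_id}', params={'action': 'status', 'apiKey': api_key})

def job_result(api_key, request_id):
    """Step 3: download the zipped result for a completed job."""
    return requests.get(f'{BASE_URL}/{request_id}/result', params={'apiKey': api_key}, stream=True)
```

The full script below wraps these same calls in a Batch class.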


Batch Geocoding Python Script

Input Data:


Python Script:

import requests
import json
import time
import zipfile
import io
from bs4 import BeautifulSoup
import glob

import pandas as pd


mykey = 'xxxxxxxxxxxxxx' ## Register a HERE MAPS DEVELOPER API

class Batch:
    SERVICE_URL = "https://batch.geocoder.ls.hereapi.com/6.2/jobs"
    jobId = None
    
    
    def __init__(self, apikey=""): ## use a HERE MAPS DEVELOPER API
        self.apikey = apikey
        
            
    def start(self, filename='testfile.csv', indelim=",", outdelim=","):
        
        file = open(filename, 'rb')

        params = {
            "action": "run",
            "apiKey": self.apikey,
            "politicalview":"RUS",
            "gen": 9,
            "maxresults": "1",
            "header": "true",
            "indelim": indelim,
            "outdelim": outdelim,
            "outcols": "displayLatitude,displayLongitude,locationLabel,houseNumber,street,district,city,postalCode,county,state,country",
            "outputcombined": "true",
        }

        response = requests.post(self.SERVICE_URL, params=params, data=file)
        self.__stats (response)
        file.close()
    

    def status (self, jobId = None):

        if jobId is not None:
            self.jobId = jobId
        
        statusUrl = self.SERVICE_URL + "/" + self.jobId
        
        params = {
            "action": "status",
            "apiKey": self.apikey,
        }
        
        response = requests.get(statusUrl, params=params)
        self.__stats (response)
        

    def result (self, jobId = None):

        if jobId is not None:
            self.jobId = jobId
        
        print("Requesting result data ...")
        
        resultUrl = self.SERVICE_URL + "/" + self.jobId + "/result"
        
        params = {
            "apiKey": self.apikey
        }
        
        response = requests.get(resultUrl, params=params, stream=True)
        
        if (response.ok):    
            zipResult = zipfile.ZipFile(io.BytesIO(response.content))
            zipResult.extractall()
            print("File saved successfully")
        
        else:
            print("Error")
            print(response.text)
    

    
    def __stats (self, response):
        if (response.ok):
            parsedXMLResponse = BeautifulSoup(response.text, "lxml")

            self.jobId = parsedXMLResponse.find('requestid').get_text()
            
            for stat in parsedXMLResponse.find('response').findChildren():
                if(len(stat.findChildren()) == 0):
                    print("{name}: {data}".format(name=stat.name, data=stat.get_text()))
            
            # Construct the zipfile url...
            self.zip_result = f'https://batch.geocoder.ls.hereapi.com/6.2/jobs/{self.jobId}/result?apiKey={mykey}'
            print("Zipfile URL: ", self.zip_result)
            print('-'*30)
            
            # Set delay for the zipfile to be ready for download on HERE server...
            time.sleep(50)
            
            # Download and extract zip file...
            self.r = requests.get(self.zip_result, stream=True)
            self.z = zipfile.ZipFile(io.BytesIO(self.r.content))
            self.z.extractall(r"HERE\\Output files")

        else:
            print(response.text)

if __name__=="__main__":
    
    service = Batch(apikey=mykey) ## use a HERE MAPS DEVELOPER API
    
    spreadsheet_folder = glob.glob(r'HERE\\Input files\\csv_file\\df_newPU\\*.csv')
    for csv_file in spreadsheet_folder:
        print('Geocoding file...', csv_file)
        service.start (csv_file, indelim = ",", outdelim = ",")



Batch Reverse Geocoding Python Script

Input Data:


Python Script:

# Reverse Geocode....
class Batch:
    SERVICE_URL = "https://batch.geocoder.ls.hereapi.com/6.2/jobs"
    jobId = None
    
    
    def __init__(self, apikey=""): ## use a HERE MAPS DEVELOPER API
        self.apikey = apikey
        
            
    def start(self, filename='togeocode619.csv', indelim=",", outdelim=","):
        
        file = open(filename, 'rb')

        params = {
            "action": "run",
            "apiKey": self.apikey,
            "politicalview":"RUS",
            "gen": 9,
            "maxresults": "1",
            "header": "true",
            "indelim": indelim,
            "outdelim": outdelim,
            "outCols": "recId,latitude,longitude,locationLabel",
            "outputcombined": "true",
            "mode":"retrieveAddresses",
        }

        
        response = requests.post(self.SERVICE_URL, params=params, data=file)
        self.__stats (response)
        file.close()
    

    def status(self, jobId=None):

        if jobId is not None:
            self.jobId = jobId
        
        statusUrl = self.SERVICE_URL + "/" + self.jobId
        
        params = {
            "action": "status",
            "apiKey": self.apikey,
        }
        
        response = requests.get(statusUrl, params=params)
        self.__stats(response)
        

    def result(self, jobId=None):

        if jobId is not None:
            self.jobId = jobId
        
        print("Requesting result data ...")
        
        resultUrl = self.SERVICE_URL + "/" + self.jobId + "/result"
        
        params = {
            "apiKey": self.apikey
        }
        
        response = requests.get(resultUrl, params=params, stream=True)
        
        if (response.ok):    
            zipResult = zipfile.ZipFile(io.BytesIO(response.content))
            zipResult.extractall()
            print("File saved successfully")
        
        else:
            print("Error")
            print(response.text)
    

    
    def __stats(self, response):
        if (response.ok):
            parsedXMLResponse = BeautifulSoup(response.text, "lxml")

            self.jobId = parsedXMLResponse.find('requestid').get_text()
            
            for stat in parsedXMLResponse.find('response').findChildren():
                if(len(stat.findChildren()) == 0):
                    print("{name}: {data}".format(name=stat.name, data=stat.get_text()))
            
            # Construct the zipfile url...
            self.zip_result = f'https://batch.geocoder.ls.hereapi.com/6.2/jobs/{self.jobId}/result?apiKey={self.apikey}'
            print("Zipfile URL: ", self.zip_result)
            print('-'*30)
            
            # Set delay for the zipfile to be ready for download on HERE server...
            time.sleep(50)
            
            # Download and extract zip file...
            self.r = requests.get(self.zip_result, stream=True)
            self.z = zipfile.ZipFile(io.BytesIO(self.r.content))
            self.z.extractall(r"HERE\Output files")

        else:
            print(response.text)

if __name__=="__main__":
    
    service = Batch(apikey=mykey) ## use a HERE MAPS DEVELOPER API
    csv_file = r"C:\Users\Yusuf_08039508010\Documents\Jupyter_Notebook\2022\Geocode Addresses\19k LatLong in Fr\reverse_geocode_input.csv"
    
    print('Geocoding file...', csv_file)
    service.start(csv_file, indelim=",", outdelim=",")
    
#     spreadsheet_folder = glob.glob(r'HERE\Input files\csv_file\df_newPU\*.csv')
#     for csv_file in spreadsheet_folder:
#         print('Geocoding file...', csv_file)
#         service.start(csv_file, indelim=",", outdelim=",")
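Note that both scripts rely on a fixed time.sleep(50) before downloading the result, which may be too short for large files. A more robust pattern is to poll the job's status until it completes. Below is a minimal sketch of that pattern; fetch_status is a stand-in for the requests.get call against the HERE status endpoint, and the canned status strings are only illustrative:

```python
import time

def wait_for_job(fetch_status, poll_every=10, max_polls=60):
    """Poll fetch_status() until the job reports 'completed' or 'failed',
    giving up after max_polls attempts."""
    for _ in range(max_polls):
        status = fetch_status()
        print('Job status:', status)
        if status == 'completed':
            return True
        if status == 'failed':
            return False
        time.sleep(poll_every)
    return False

# Demo with a canned status sequence standing in for the HTTP call...
states = iter(['accepted', 'running', 'running', 'completed'])
print(wait_for_job(lambda: next(states), poll_every=0))  # True
```

In the real script, fetch_status would issue the `action=status` request shown in the Batch class and parse the status field out of the XML response, and the zip download would only run once this returns True.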


Happy geocoding!

Monday, August 22, 2022

Read point click coordinates on image in pixel and inches

 Listed below is a Python script that uses the OpenCV library to display an image and read the coordinates at each mouse click. A left click displays the coordinates in both pixels and inches, while a right click clears the coordinates by reloading the image.


# importing the module
import cv2


# Get list of all events in cv2... the one we want to use is the 'EVENT_LBUTTONDOWN' (left click)
all_events = [i for i in dir(cv2) if 'EVENT' in i]
# print(all_events)
# print(dir(cv2.EVENT_LBUTTONDBLCLK))


# reading the image
img_file = r"IMG_20180426_235051_042.jpg"
img = cv2.imread(img_file, 1)

# displaying the image
cv2.imshow('Title image window...', img)


# --------------------------------------------------------
# define the callback function...
def click_event(event, x, y, flags, params):
	global img
	
	if event == cv2.EVENT_LBUTTONDOWN:
		print(x, '---', y)

		# convert pixels to inches
		# Assuming PixelsPerInch resolution (PPI) is 96, therefore: PPI = 96 px / inch
		# 1 pixel = 1 inch / 96 >>>> 1 pixel = 0.010417 inch
		x_inch = round(x * 0.010417, 2)
		y_inch = round(y * 0.010417, 2)

		font = cv2.FONT_HERSHEY_SIMPLEX
		cv2.putText(img, f'{str(x)}, {str(y)}px ({x_inch}, {y_inch}inches)', (x, y), font, 1, (255, 0, 0), 2)
		cv2.imshow('Title image window...', img)


	# Clear screen text by right click...
	if event == cv2.EVENT_RBUTTONDOWN:
		print('Right Click...')

		img = cv2.imread(img_file, 1)
		cv2.imshow('Title image window...', img)
		cv2.setMouseCallback('Title image window...', click_event)



# use the callback function by setting the Mouse callback....
cv2.setMouseCallback('Title image window...', click_event)
# --------------------------------------------------------


# wait for a key to be pressed to exit
cv2.waitKey(0)

# close the window
cv2.destroyAllWindows()
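The conversion above hard-codes 96 PPI, which is just a common screen default; images scanned or exported at other resolutions need a different divisor. Here is a small helper that keeps the script's rounding but takes the resolution as a parameter (the 300 PPI value is only an illustrative print resolution):

```python
def px_to_inches(x, y, ppi=96):
    """Convert pixel coordinates to inches at the given resolution (pixels per inch)."""
    return round(x / ppi, 2), round(y / ppi, 2)

print(px_to_inches(960, 480))            # (10.0, 5.0) at the 96 PPI default
print(px_to_inches(960, 480, ppi=300))   # (3.2, 1.6) at a print resolution
```

Inside the click_event callback this would replace the two hard-coded `* 0.010417` lines.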



That is it!

Friday, August 19, 2022

Copy a file to multiple directories

 Let's say we have nested folders as seen below, and we want to copy a file (a text file in this case) into all of the folders.


The Python script below does just that.


import os
import glob
import shutil
txt_file = r"C:\Users\Yusuf_08039508010\Documents\...\PDF Others\textfile.txt"
fldr = r'C:\Users\Yusuf_08039508010\Documents\Jupyter_Notebook\2022\...\PDF Others'

# Change dir to parent directory and get all subfolder...
os.chdir(fldr)
list_of_folder = glob.glob('**/', recursive=True)

# Copy text file from source (src) to destination (dst)
for f in list_of_folder:
    print('Processing...', f)
    shutil.copy( txt_file, f ) # shutil.copy( src, dst )
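For illustration, the same idea can be written with pathlib, wrapping the walk in a reusable function; the throwaway temp directory below exists only for the demo and is not part of the original script:

```python
import shutil
import tempfile
from pathlib import Path

def copy_to_all_subfolders(src_file, parent_folder):
    """Copy src_file into every subfolder (recursively) of parent_folder."""
    src = Path(src_file)
    copied = []
    for folder in Path(parent_folder).rglob('*'):
        if folder.is_dir():
            shutil.copy(src, folder)
            copied.append(folder / src.name)
    return copied

# Demo on a throwaway directory tree with two nested subfolders...
root = Path(tempfile.mkdtemp())
(root / 'a' / 'b').mkdir(parents=True)
note = root / 'note.txt'
note.write_text('hello')
print(len(copy_to_all_subfolders(note, root)))  # 2
```

Unlike the glob version, this avoids changing the working directory with os.chdir.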


Enjoy!

Sunday, July 31, 2022

Getting Started with ArcGIS and QGIS Python Scripting API - ArcPy and PyQGIS


On this blog, I intend to document some common GIS operation via the Python Scripting API for both ArcGIS (ArcPy) and QGIS (PyQGIS).

Let's begin...


1) The documentation

When you decide to work with a new API, its documentation should always be the first place you go to learn about the API's capabilities. Below is where you will find the respective docs as of the time of writing.

The ArcGIS (ArcPy) documentation is available at: http://desktop.arcgis.com/en/documentation/
Note that there are two versions of the ArcGIS desktop software: ArcMap (Including ArcCatalog, ArcScene, & ArcGlobe) and ArcGIS Pro. The focus of this blog is on: ArcMap.




The QGIS (PyQGIS) documentation is available at: QGIS Python API documentation project



You can complement these docs with good books written by different authors:

ArcPy related Books:
~ ArcPy and ArcGIS – Geospatial Analysis with Python by Silas Toms
~ Python Scripting for ArcGIS by Paul A. Zandbergen
~ Python For ArcGIS by Laura Tateosian


PyQGIS related Books:
~ PyQGIS developer cookbook by QGIS Project Team
~ The PyQGIS Programmer's Guide: Extending QGIS 3 with Python 3 by Gary Sherman
~ Mastering Geospatial Development with QGIS 3.x: An in-depth guide to becoming proficient in spatial data analysis using QGIS 3.4 and 3.6 with Python by Shammunul Islam, Simon Miles, et al.
~ QGIS Python Programming Cookbook by Joel Lawhead
~ Building Mapping Applications with QGIS by Erik Westra


2) Launch and customize python window/console

In ArcGIS it is located under the 'Geoprocessing' menu, while in QGIS it is under the 'Plugins' menu. Access the respective menu and launch the python window/console. You can also launch the window/console from the related icon on the toolbar.





To customize the ArcGIS python window, right click on the environment and select what you want to customize.



To customize the QGIS python console, click on the 'options' button as seen below. Then select the settings you want to customize accordingly.



3) The built-in documentation

Let's try to see the list of valid methods and attributes for the APIs. The primary object in the ArcGIS API is 'arcpy', so let's call the built-in dir() function on it like this: dir(arcpy)
>>> dir(arcpy)
['ASCII3DToFeatureClass_3d', 'ASCIIToRaster_conversion', 'AcceptConnections', 'AddAttachments_management', 'AddCADFields_conversion', 'AddCodedValueToDomain_management', 'AddColormap_management', 'AddDataStoreItem', 'AddEdgeEdgeConnectivityRuleToGeometricNetwork_management', 'AddEdgeJunctionConnectivityRuleToGeometricNetwork_management', 'AddError', 'AddFeatureClassToTerrain_3d', 'AddFeatureClassToTopology_management', 'AddFieldConflictFilter_management', 'AddFieldDelimiters', 'AddFieldToAnalysisLayer_na', 'AddField_management', 'AddFilesToLasDataset_management', 'AddGeometryAttributes_management', 'AddGlobalIDs_management', 'AddIDMessage', 'AddIncrementingIDField_management', 'AddIndex_management', 'AddItem_arc', 'AddJoin_management', 'AddLocations_na', 'AddMessage', 'AddRasterToGeoPackage_conversion', 'AddRastersToMosaicDataset_management', 'AddRepresentation_cartography', 'AddReturnMessage', 'AddRuleToTopology_management', 'AddSpatialIndex_management', 'AddSubtype_management', 'AddSurfaceInformation_3d', 'AddTerrainPyramidLevel_3d', 'AddToolbox', 'AddWarning', 'AddXY_arc', 'AddXY_management', 'AddZInformation_3d', 'Adjust3DZ_management', 'AggregatePoints_cartography', 'AggregatePolygons_arc', 'AggregatePolygons_cartography', 'AlignFeatures_edit', 'AlignMarkerToStrokeOrFill_cartography', 'AlterAliasName', 'AlterField_management', 'AlterMosaicDatasetSchema_management', 'AlterVersion_management', 'AnalyzeControlPoints_management', 'AnalyzeDatasets_management', 'AnalyzeMosaicDataset_management', 'AnalyzeToolsForPro_management', 'Analyze_management', 'Annotation', 'AppendAnnotation_management', 'AppendControlPoints_management', 'AppendParcelFabric_fabric', 'AppendTerrainPoints_3d', 'Append_arc', 'Append_management', 'ApplyBlockAdjustment_management', 'ApplySymbologyFromLayer_management', 'ArcDLG_arc', 'ArcRoute_arc', 'ArcS57_arc', 'ArcSDESQLExecute', 'ArealInterpolationLayerToPolygons_ga', 'Array', 'AsShape', 'Aspect_3d', 'AssignDefaultToField_management', 
'AssignDomainToField_management', 'AverageNearestNeighbor_stats', 'BatchBuildPyramids_management', 'BatchCalculateStatistics_management', 'BatchProject_management', 'BearingDistanceToLine_management', 'Buffer3D_3d', 'Buffer_analysis', 'Buffer_arc', 'BuildBoundary_management', 'BuildFootprints_management', 'BuildMosaicDatasetItemCache_management', 'BuildNetwork_na', 'BuildOverviews_management', 'BuildPyramids_management', 'BuildPyramidsandStatistics_management', 'BuildRasterAttributeTable_management', 'BuildSeamlines_management', 'BuildStereoModel_management', 'BuildTerrain_3d', 'Build_arc', 'CADToGeodatabase_conversion', 'CalculateAdjacentFields_cartography', 'CalculateAreas_stats', 'CalculateCellSizeRanges_management', 'CalculateCentralMeridianAndParallels_cartography', 'CalculateDefaultClusterTolerance_management', 'CalculateDefaultGridIndex_management', 'CalculateDistanceBand_stats', 'CalculateEndTime_management', 'CalculateField_management', 'CalculateGridConvergenceAngle_cartography', 'CalculateLineCaps_cartography', 'CalculateLocations_na', 'CalculatePolygonMainAngle_cartography', 'CalculateRepresentationRule_cartography', 'CalculateStatistics_management', 'CalculateTransformationErrors_edit', 'CalculateUTMZone_cartography', 'CalculateValue_management', 'CalibrateRoutes_lr', 'CentralFeature_stats', 'ChangeLasClassCodes_3d', 'ChangePrivileges_management', 'ChangeTerrainReferenceScale_3d', 'ChangeTerrainResolutionBounds_3d', 'ChangeVersion_management', 'CheckExtension', 'CheckGeometry_management', 'CheckInExtension', 'CheckOutExtension', 'CheckProduct', 'ClassifyLasBuilding_3d', 'ClassifyLasByHeight_3d', 'ClassifyLasGround_3d', 'Clean_arc', 'ClearEnvironment', 'ClearWorkspaceCache_management', 'Clip_analysis', 'Clip_arc', 'Clip_management', 'ClustersOutliersRendered_stats', 'ClustersOutliers_stats', 'CollapseDualLinesToCenterline_arc', 'CollapseDualLinesToCenterline_cartography', 'CollapseRoadDetail_cartography', 'CollectEventsRendered_stats', 
'CollectEvents_stats', 'ColorBalanceMosaicDataset_management', 'Command', 'Compact_management', 'CompareReplicaSchema_management', 'CompositeBands_management', 'CompressFileGeodatabaseData_management', 'Compress_management', 'ComputeBlockAdjustment_management', 'ComputeCameraModel_management', 'ComputeControlPoints_management', 'ComputeDirtyArea_management', 'ComputeMosaicCandidates_management', 'ComputePansharpenWeights_management', 'ComputeTiePoints_management', 'ConcatenateDateAndTimeFields_ta', 'ConfigureGeodatabaseLogFileTables_management', 'ConsolidateLayer_management', 'ConsolidateLocator_geocoding', 'ConsolidateLocator_management', 'ConsolidateMap_management', 'ConsolidateResult_management', 'ConstructSightLines_3d', 'ContourAnnotation_cartography', 'ContourList_3d', 'ContourWithBarriers_3d', 'Contour_3d', 'ConvertCoordinateNotation_management', 'ConvertDiagram_schematics', 'ConvertMapServerCacheStorageFormat_server', 'ConvertSpatialWeightsMatrixtoTable_stats', 'ConvertTimeField_management', 'ConvertTimeZone_management', 'CopyFeatures_management', 'CopyParameter', 'CopyParcelFabric_fabric', 'CopyRasterCatalogItems_management', 'CopyRaster_management', 'CopyRows_management', 'CopyRuntimeGdbToFileGdb_conversion', 'CopyTin_3d', 'CopyTraversedSourceFeatures_na', 'Copy_management', 'CountRenderer_stats', 'CreateAddressLocator_geocoding', 'CreateArcInfoWorkspace_management', 'CreateArcSDEConnectionFile_management', 'CreateCartographicPartitions_cartography', 'CreateCompositeAddressLocator_geocoding', 'CreateCustomGeoTransformation_management', 'CreateDatabaseConnection_management', 'CreateDatabaseUser_management', 'CreateDatabaseView_management', 'CreateDiagram_schematics', 'CreateDomain_management', 'CreateEnterpriseGeodatabase_management', 'CreateFeatureDataset_management', 'CreateFeatureclass_management', 'CreateFileGDB_management', 'CreateFishnet_management', 'CreateFolder_management', 'CreateGPSDDraft', 'CreateGeocodeSDDraft', 
'CreateGeometricNetwork_management', 'CreateImageSDDraft', 'CreateLabels_arc', 'CreateLasDataset_management', 'CreateMapServerCache_server', 'CreateMapTilePackage_management', 'CreateMosaicDataset_management', 'CreateObject', 'CreateOrthoCorrectedRasterDataset_management', 'CreateOverpass_cartography', 'CreatePansharpenedRasterDataset_management', 'CreatePersonalGDB_management', 'CreateRandomPoints_management', 'CreateRandomRaster_management', 'CreateRandomValueGenerator', 'CreateRasterCatalog_management', 'CreateRasterDataset_management', 'CreateRasterType_management', 'CreateReferencedMosaicDataset_management', 'CreateRelationshipClass_management', 'CreateReplicaFootPrints_management', 'CreateReplicaFromServer_management', 'CreateReplica_management', 'CreateRole_management', 'CreateRoutes_lr', 'CreateRuntimeContent_management', 'CreateSQLiteDatabase_management', 'CreateSchematicFolder_schematics', 'CreateScratchName', 'CreateSpaceTimeCube_stpm', 'CreateSpatialReference_management', 'CreateSpatialType_management', 'CreateSpatiallyBalancedPoints_ga', 'CreateTable_management', 'CreateTerrain_3d', 'CreateThiessenPolygons_analysis', 'CreateTin_3d', 'CreateTopology_management', 'CreateTurnFeatureClass_na', 'CreateUnRegisteredFeatureclass_management', 'CreateUnRegisteredTable_management', 'CreateUnderpass_cartography', 'CreateUniqueName', 'CreateVersion_management', 'CreateVersionedView_management', 'Create_arc', 'CrossValidationResult', 'CrossValidation_ga', 'CulDeSacMasks_cartography', 'Cursor', 'Curvature_3d', 'CutFill_3d', 'DEMToRaster_conversion', 'DLGArc_arc', 'DecimateTinNodes_3d', 'DecryptPYT', 'DefineMosaicDatasetNoData_management', 'DefineOverviews_management', 'DefineProjection_arc', 'DefineProjection_management', 'DeleteCodedValueFromDomain_management', 'DeleteColormap_management', 'DeleteDomain_management', 'DeleteFeatures_management', 'DeleteField_management', 'DeleteGlobeServerCache_server', 'DeleteGridsAndGraticules_cartography', 
'DeleteIdentical_management', 'DeleteMapServerCache_server', 'DeleteMosaicDataset_management', 'DeleteRasterAttributeTable_management', 'DeleteRasterCatalogItems_management', 'DeleteRows_management', 'DeleteSchemaGeodatabase_management', 'DeleteTerrainPoints_3d', 'DeleteVersion_management', 'Delete_management', 'DelineateBuiltUpAreas_cartography', 'DelineateTinDataArea_3d', 'DensifySamplingNetwork_ga', 'Densify_edit', 'Describe', 'DetectFeatureChanges_management', 'DetectGraphicConflict_cartography', 'DiagnoseVersionMetadata_management', 'DiagnoseVersionTables_management', 'Dice_management', 'Difference3D_3d', 'DiffusionInterpolationWithBarriers_ga', 'Dimension', 'DirectionalDistribution_stats', 'DirectionalMean_stats', 'Directions_na', 'DisableArchiving_management', 'DisableAttachments_management', 'DisableEditorTracking_management', 'DisconnectUser', 'DisperseMarkers_cartography', 'DissolveNetwork_na', 'DissolveRouteEvents_lr', 'Dissolve_arc', 'Dissolve_management', 'Divide_3d', 'DomainToTable_management', 'DownloadRasters_management', 'DropIndex_arc', 'DropItem_arc', 'DropRepresentation_cartography', 'ESRITranslator_conversion', 'EdgematchFeatures_edit', 'EditRasterFunction_management', 'EditTin_3d', 'EliminatePolygonPart_management', 'Eliminate_arc', 'Eliminate_management', 'EmergingHotSpotAnalysis_stpm', 'EmpiricalBayesianKriging_ga', 'EnableArchiving_management', 'EnableAttachments_management', 'EnableEditorTracking_management', 'EnableEnterpriseGeodatabase_management', 'EncloseMultiPatch_3d', 'EncryptPYT', 'ErasePoint_edit', 'Erase_analysis', 'Erase_arc', 'ExcelToTable_conversion', 'ExecuteError', 'ExecuteWarning', 'Exists', 'ExploratoryRegression_stats', 'ExportAcknowledgementMessage_management', 'ExportCAD_conversion', 'ExportDataChangeMessage_management', 'ExportGeodatabaseConfigurationKeywords_management', 'ExportMapServerCache_server', 'ExportMetadataMultiple_conversion', 'ExportMetadata_conversion', 'ExportMosaicDatasetGeometry_management', 
'ExportMosaicDatasetItems_management', 'ExportMosaicDatasetPaths_management', 'ExportRasterCatalogPaths_management', 'ExportRasterWorldFile_management', 'ExportReplicaSchema_management', 'ExportTileCache_management', 'ExportTo3DWebScene_3d', 'ExportTopologyErrors_management', 'ExportWebMap_server', 'ExportXMLWorkspaceDocument_management', 'ExportXYv_stats', 'Export_arc', 'ExtendLine_edit', 'Extent', 'ExtractDataAndEmailTask_server', 'ExtractDataTask_server', 'ExtractData_server', 'ExtractLas_3d', 'ExtractPackage_management', 'ExtractSubDataset_management', 'ExtractValuesToTable_ga', 'ExtrudeBetween_3d', 'FeatureClassToFeatureClass_conversion', 'FeatureClassToGeodatabase_conversion', 'FeatureClassToShapefile_conversion', 'FeatureClassZToASCII_3d', 'FeatureCompare_management', 'FeatureEnvelopeToPolygon_management', 'FeatureOutlineMasks_cartography', 'FeatureSet', 'FeatureTo3DByAttribute_3d', 'FeatureToLine_management', 'FeatureToNetCDF_md', 'FeatureToPoint_management', 'FeatureToPolygon_management', 'FeatureToRaster_conversion', 'FeatureVerticesToPoints_management', 'FeatureclassToCoverage_conversion', 'FeaturesFromCityEngineRules_3d', 'FeaturesToJSON_conversion', 'Field', 'FieldInfo', 'FieldMap', 'FieldMappings', 'FileCompare_management', 'Filter', 'FindClosestFacilities_na', 'FindConflicts_arc', 'FindDisconnectedFeaturesInGeometricNetwork_management', 'FindIdentical_management', 'FindRoutes_na', 'FlipLine_edit', 'Flip_management', 'FloatToRaster_conversion', 'Float_3d', 'Frequency_analysis', 'FromScriptingArcObject', 'FromWKB', 'FromWKT', 'GACalculateZValue_ga', 'GACreateGeostatisticalLayer_ga', 'GAGetModelParameter_ga', 'GALayerToContour_ga', 'GALayerToGrid_ga', 'GALayerToPoints_ga', 'GAMovingWindowKriging_ga', 'GANeighborhoodSelection_ga', 'GASemivariogramSensitivity_ga', 'GASetModelParameter_ga', 'GPXtoFeatures_conversion', 'GaussianGeostatisticalSimulations_ga', 'Generalize_edit', 'GenerateAttachmentMatchTable_management', 'GenerateEdgematchLinks_edit', 
'GenerateExcludeArea_management', 'GenerateFgdbLicense_management', 'GenerateLicensedFgdb_management', 'GenerateMapServerCacheTilingScheme_server', 'GenerateNearTable_analysis', 'GenerateNetworkSpatialWeights_stats', 'GenerateOriginDestinationCostMatrix_na', 'GeneratePointCloud_management', 'GeneratePointsAlongLines_management', 'GenerateRasterFromRasterFunction_management', 'GenerateRubbersheetLinks_edit', 'GenerateServiceAreas_na', 'GenerateSpatialWeightsMatrix_stats', 'GenerateTessellation_management', 'GenerateTileCacheTilingScheme_management', 'Generate_arc', 'GeoProcessor', 'GeoTaggedPhotosToPoints_management', 'GeocodeAddresses_geocoding', 'GeodeticDensify_management', 'GeographicallyWeightedRegression_stats', 'Geometry', 'GeostatisticalDatasets', 'GetActivePortalURL', 'GetArgumentCount', 'GetCellValue_management', 'GetCount_management', 'GetIDMessage', 'GetImageEXIFProperties', 'GetInstallInfo', 'GetLayoutTemplatesInfo_server', 'GetLogHistory', 'GetMaxSeverity', 'GetMessage', 'GetMessageCount', 'GetMessages', 'GetPackageInfo', 'GetParameter', 'GetParameterAsText', 'GetParameterCount', 'GetParameterInfo', 'GetParameterValue', 'GetRasterProperties_management', 'GetReturnCode', 'GetSeverity', 'GetSeverityLevel', 'GetSigninToken', 'GetSystemEnvironment', 'GetUTMFromLocation', 'GlobalPolynomialInterpolation_ga', 'Graph', 'GraphTemplate', 'GraphicBuffer_analysis', 'GridIndexFeatures_cartography', 'GroupingAnalysis_stats', 'HighLowClustering_stats', 'HillShade_3d', 'HotSpotsRendered_stats', 'HotSpots_stats', 'IDEdit_arc', 'IDW_ga', 'Identity_analysis', 'Identity_arc', 'Idw_3d', 'Import3DFiles_3d', 'ImportCADAnnotation_conversion', 'ImportCoverageAnnotation_conversion', 'ImportFromE00_conversion', 'ImportGeodatabaseConfigurationKeywords_management', 'ImportMapServerCache_server', 'ImportMessage_management', 'ImportMetadata_conversion', 'ImportMosaicDatasetGeometry_management', 'ImportReplicaSchema_management', 'ImportTileCache_management', 'ImportToolbox', 
'ImportXMLWorkspaceDocument_management', 'Import_arc', 'IncreaseMaximumEdges_na', 'IncrementalSpatialAutocorrelation_stats', 'Index', 'IndexItem_arc', 'InsertCursor', 'Inside3D_3d', 'Int_3d', 'Integrate_management', 'InterpolateFromPointCloud_management', 'InterpolatePolyToPatch_3d', 'InterpolateShape_3d', 'Intersect3DLineWithMultiPatch_3d', 'Intersect3DLineWithSurface_3d', 'Intersect3D_3d', 'Intersect_analysis', 'Intersect_arc', 'IntersectingLayersMasks_cartography', 'Intervisibility_3d', 'IsClosed3D_3d', 'IsSynchronous', 'JSONToFeatures_conversion', 'JoinField_management', 'JoinItem_arc', 'KMLToLayer_conversion', 'KernelInterpolationWithBarriers_ga', 'Kriging_3d', 'LASToMultipoint_3d', 'LandXMLToTin_3d', 'LasDatasetStatistics_management', 'LasDatasetToRaster_conversion', 'LasDatasetToTin_3d', 'LasPointStatsAsRaster_management', 'LasPointStatsByArea_3d', 'Layer3DToFeatureClass_3d', 'LayerToKML_conversion', 'LineOfSight_3d', 'ListDataStoreItems', 'ListDatasets', 'ListEnvironments', 'ListFeatureClasses', 'ListFields', 'ListFiles', 'ListIndexes', 'ListInstallations', 'ListPortalURLs', 'ListPrinterNames', 'ListRasters', 'ListSpatialReferences', 'ListTables', 'ListToolboxes', 'ListTools', 'ListTransformations', 'ListUsers', 'ListVersions', 'ListWorkspaces', 'LoadSettings', 'LoadTopologyToParcelFabric_fabric', 'LocalOutlierAnalysis_stpm', 'LocalPolynomialInterpolation_ga', 'LocateFeaturesAlongRoutes_lr', 'LocateLasPointsByProximity_3d', 'LocateOutliers_3d', 'LogUsageMetering', 'Lookup_3d', 'MDPublisher_conversion', 'MXDToWebMap_server', 'MakeClosestFacilityLayer_na', 'MakeFeatureLayer_management', 'MakeGraph_management', 'MakeGridsAndGraticulesLayer_cartography', 'MakeImageServerLayer_management', 'MakeLasDatasetLayer_management', 'MakeLocationAllocationLayer_na', 'MakeMosaicLayer_management', 'MakeNetCDFFeatureLayer_md', 'MakeNetCDFRasterLayer_md', 'MakeNetCDFTableView_md', 'MakeODCostMatrixLayer_na', 'MakeOPeNDAPRasterLayer_md', 'MakeParcelFabricLayer_fabric', 
'MakeParcelFabricTableView_fabric', 'MakeQueryLayer_management', 'MakeQueryTable_management', 'MakeRasterCatalogLayer_management', 'MakeRasterLayer_management', 'MakeRouteEventLayer_lr', 'MakeRouteLayer_na', 'MakeServiceAreaLayer_na', 'MakeTableView_management', 'MakeTrackingLayer_ta', 'MakeVehicleRoutingProblemLayer_na', 'MakeWCSLayer_management', 'MakeXYEventLayer_management', 'ManageGlobeServerCacheTiles_server', 'ManageMapServerCacheScales_server', 'ManageMapServerCacheStatus_server', 'ManageMapServerCacheTiles_server', 'ManageTileCache_management', 'MapServerCacheTilingSchemeToPolygons_cartography', 'MapToKML_conversion', 'MatchPhotosToRowsByTime_management', 'MeanCenter_stats', 'MedianCenter_stats', 'MergeDividedRoads_cartography', 'MergeMosaicDatasetItems_management', 'Merge_management', 'MetadataImporter_conversion', 'MigrateRelationshipClass_management', 'MigrateStorage_management', 'MinimumBoundingGeometry_management', 'MinimumBoundingVolume_3d', 'Minus_3d', 'Mirror_management', 'MosaicToNewRaster_management', 'Mosaic_management', 'MultiDistanceSpatialClustering_stats', 'MultiPatchFootprint_3d', 'MultipartToSinglepart_management', 'Multipatch', 'MultipatchToCollada_conversion', 'MultipatchToRaster_conversion', 'MultipleRingBuffer_analysis', 'Multipoint', 'NaturalNeighbor_3d', 'Near3D_3d', 'Near_analysis', 'Near_arc', 'NetCDFFileProperties', 'NumPyArrayToRaster', 'ObserverPoints_3d', 'OptimizedHotSpotAnalysis_stats', 'OptimizedOutlierAnalysis_stats', 'OrdinaryLeastSquares_stats', 'OverlayRouteEvents_lr', 'PDFToTIFF_conversion', 'PackageLayer_management', 'PackageLocator_geocoding', 'PackageLocator_management', 'PackageMap_management', 'PackageResult_management', 'Parameter', 'ParseFieldName', 'ParseTableName', 'PivotTable_management', 'Plus_3d', 'Point', 'PointDistance_analysis', 'PointDistance_arc', 'PointFileInformation_3d', 'PointGeometry', 'PointNode_arc', 'PointToRaster_conversion', 'PointsToLine_management', 'PolyRegion_arc', 'Polygon', 
'PolygonNeighbors_analysis', 'PolygonToLine_management', 'PolygonToRaster_conversion', 'PolygonVolume_3d', 'Polyline', 'PolylineToRaster_conversion', 'PopulateAlternateIDFields_na', 'ProductInfo', 'ProjectRaster_management', 'Project_arc', 'Project_management', 'PropagateDisplacement_cartography', 'QuickExport_interop', 'QuickImport_interop', 'RadialBasisFunctions_ga', 'RandomNumberGenerator', 'Raster', 'RasterCatalogToRasterDataset_management', 'RasterCompare_management', 'RasterDomain_3d', 'RasterTin_3d', 'RasterToASCII_conversion', 'RasterToDTED_management', 'RasterToFloat_conversion', 'RasterToGeodatabase_conversion', 'RasterToMultipoint_3d', 'RasterToNetCDF_md', 'RasterToNumPyArray', 'RasterToOtherFormat_conversion', 'RasterToPoint_conversion', 'RasterToPolygon_conversion', 'RasterToPolyline_conversion', 'RasterToVideo_conversion', 'ReExportUnacknowledgedMessages_management', 'RebuildAddressLocator_geocoding', 'RebuildGeometricNetwork_management', 'RebuildIndexes_management', 'RecalculateFeatureClassExtent_management', 'ReclassByASCIIFile_3d', 'ReclassByTable_3d', 'Reclassify_3d', 'ReconcileVersion_management', 'ReconcileVersions_management', 'RecordSet', 'RecoverFileGDB_management', 'RefreshActiveView', 'RefreshCatalog', 'RefreshTOC', 'RegionClass_arc', 'RegionPoly_arc', 'RegisterAsVersioned_management', 'RegisterRaster_management', 'RegisterWithGeodatabase_management', 'RegularizeBuildingFootprint_3d', 'RematchAddresses_geocoding', 'RemoveAttachments_management', 'RemoveConnectivityRuleFromGeometricNetwork_management', 'RemoveDataStoreItem', 'RemoveDomainFromField_management', 'RemoveEmptyFeatureClassFromGeometricNetwork_management', 'RemoveFeatureClassFromTerrain_3d', 'RemoveFeatureClassFromTopology_management', 'RemoveFieldConflictFilter_management', 'RemoveFilesFromLasDataset_management', 'RemoveIndex_management', 'RemoveJoin_management', 'RemoveOverride_cartography', 'RemoveRastersFromMosaicDataset_management', 'RemoveRuleFromTopology_management', 
'RemoveSpatialIndex_management', 'RemoveSubtype_management', 'RemoveTerrainPyramidLevel_3d', 'RemoveToolbox', 'Rename_management', 'Renode_arc', 'RepairGeometry_management', 'RepairMosaicDatasetPaths_management', 'RepairRasterCatalogPaths_management', 'RepairVersionMetadata_management', 'RepairVersionTables_management', 'ReplaceTerrainPoints_3d', 'Resample_management', 'Rescale_management', 'Reselect_arc', 'ResetEnvironments', 'ResetProgressor', 'ResolveBuildingConflicts_cartography', 'ResolveRoadConflicts_cartography', 'Result', 'ReverseGeocode_geocoding', 'Rotate_management', 'Row', 'RubbersheetFeatures_edit', 'S57Arc_arc', 'SDTSExport_arc', 'SDTSImport_arc', 'SaveGraph_management', 'SaveSettings', 'SaveToLayerFile_management', 'Schema', 'SearchCursor', 'SearchNeighborhoodSmooth', 'SearchNeighborhoodSmoothCircular', 'SearchNeighborhoodStandard', 'SearchNeighborhoodStandardCircular', 'SelectByDimension_md', 'SelectData_management', 'SelectFeatureByOverride_cartography', 'SelectLayerByAttribute_management', 'SelectLayerByLocation_management', 'Select_analysis', 'SendEmailWithZipFileAttachment_server', 'SetClusterTolerance_management', 'SetDefaultSubtype_management', 'SetFlowDirection_management', 'SetLasClassCodesUsingFeatures_3d', 'SetLayerRepresentation_cartography', 'SetLogHistory', 'SetMosaicDatasetProperties_management', 'SetParameter', 'SetParameterAsText', 'SetProduct', 'SetProgressor', 'SetProgressorLabel', 'SetProgressorPosition', 'SetRasterProperties_management', 'SetRepresentationControlPointAtIntersect_cartography', 'SetRepresentationControlPointByAngle_cartography', 'SetSeverityLevel', 'SetSubtypeField_management', 'SetValueForRangeDomain_management', 'SharePackage_management', 'Shift_management', 'SignInToPortal_server', 'SignOutFromPortal_server', 'SimilaritySearch_stats', 'SimplifyBuilding_arc', 'SimplifyBuilding_cartography', 'SimplifyLineOrPolygon_arc', 'SimplifyLine_cartography', 'SimplifyPolygon_cartography', 'SkylineBarrier_3d', 
'SkylineGraph_3d', 'Skyline_3d', 'Slice_3d', 'Slope_3d', 'SmoothLine_cartography', 'SmoothPolygon_cartography', 'Snap_edit', 'SolveLocationAllocation_na', 'SolveVehicleRoutingProblem_na', 'Solve_na', 'SortCodedValueDomain_management', 'Sort_management', 'SpatialAutocorrelation_stats', 'SpatialJoin_analysis', 'SpatialReference', 'SplineWithBarriers_3d', 'Spline_3d', 'SplitByAttributes_analysis', 'SplitLineAtPoint_management', 'SplitLine_management', 'SplitMosaicDatasetItems_management', 'SplitRaster_management', 'Split_analysis', 'Split_arc', 'StackProfile_3d', 'StageService_server', 'StandardDistance_stats', 'StandardizeAddresses_geocoding', 'Statistics_analysis', 'StripMapIndexFeatures_cartography', 'SubsetFeatures_ga', 'SunShadowVolume_3d', 'SurfaceAspect_3d', 'SurfaceContour_3d', 'SurfaceDifference_3d', 'SurfaceSlope_3d', 'SurfaceVolume_3d', 'SymDiff_analysis', 'SynchronizeChanges_management', 'SynchronizeMetadata_conversion', 'SynchronizeMosaicDataset_management', 'TINCompare_management', 'TableCompare_management', 'TableSelect_analysis', 'TableToDBASE_conversion', 'TableToDomain_management', 'TableToEllipse_management', 'TableToExcel_conversion', 'TableToGeodatabase_conversion', 'TableToNetCDF_md', 'TableToRelationshipClass_management', 'TableToTable_conversion', 'TabulateIntersection_analysis', 'TerrainToPoints_3d', 'TerrainToRaster_3d', 'TerrainToTin_3d', 'TestSchemaLock', 'Thiessen_arc', 'ThinRoadNetwork_cartography', 'TigerArc_arc', 'TigerTool_arc', 'TileLas_3d', 'TiledLabelsToAnnotation_cartography', 'Times_3d', 'TinDomain_3d', 'TinEdge_3d', 'TinLine_3d', 'TinNode_3d', 'TinPolygonTag_3d', 'TinRaster_3d', 'TinTriangle_3d', 'Tolerance_arc', 'TopoToRasterByFile_3d', 'TopoToRaster_3d', 'TraceGeometricNetwork_management', 'TrackIntervalsToFeature_ta', 'TrackIntervalsToLine_ta', 'TransferAttributes_edit', 'TransformFeatures_edit', 'TransformRouteEvents_lr', 'Transform_arc', 'TransposeFields_management', 'Trend_3d', 'TrimLine_edit', 'TruncateTable_management', 
'TurnTableToTurnFeatureClass_na', 'USGSMPTranslator_conversion', 'UncompressFileGeodatabaseData_management', 'Ungenerate_arc', 'Union3D_3d', 'Union_analysis', 'Union_arc', 'UnregisterAsVersioned_management', 'UnsplitLine_management', 'UpdateAnalysisLayerAttributeParameter_na', 'UpdateAnnotation_management', 'UpdateByAlternateIDFields_na', 'UpdateByGeometry_na', 'UpdateCursor', 'UpdateDiagram_schematics', 'UpdateDiagrams_schematics', 'UpdateEnterpriseGeodatabaseLicense_management', 'UpdateOverride_cartography', 'UpdateTrafficData_na', 'UpdateTrafficIncidents_na', 'Update_analysis', 'Update_arc', 'UpgradeDataset_management', 'UpgradeGDB_management', 'UpgradeMapServerCacheStorageFormat_server', 'UpgradeMetadata_conversion', 'UpgradeNetwork_na', 'UpgradeParcelFabric_fabric', 'UpgradeSpatialReference_management', 'UploadServiceDefinition_server', 'Usage', 'VPFExport_arc', 'VPFImport_arc', 'VPFTile_arc', 'ValidateDataStoreItem', 'ValidateFieldName', 'ValidateMetadataMultiple_conversion', 'ValidateMetadata_conversion', 'ValidateTableName', 'ValidateTopology_management', 'Value', 'ValueTable', 'VerifyAndRepairGeometricNetworkConnectivity_management', 'Viewshed2_3d', 'Viewshed_3d', 'Visibility_3d', 'VisualizeSpaceTimeCube2D_stpm', 'VisualizeSpaceTimeCube3D_stpm', 'WFSToFeatureClass_conversion', 'WarpFromFile_management', 'Warp_management', 'WorkspaceToRasterCatalog_management', 'WorkspaceToRasterDataset_management', 'XMLSchemaValidator_conversion', 'XSLTransform_conversion', 'XYToLine_management', 'ZRenderer_stats', '_NumPyArrayToRaster', '_RasterToNumPyArray', '__builtins__', '__doc__', '__file__', '__name__', '__package__', '__path__', '_base', '_ga', '_gptooldoc', '_graph', '_management', '_mapping', '_na', 'analysis', 'arc', 'arcobjectconversion', 'arcobjects', 'arcpy', 'cartography', 'conversion', 'convertArcObjectToPythonObject', 'da', 'ddd', 'edit', 'env', 'f', 'fabric', 'ga', 'geocoding', 'geometries', 'geoprocessing', 'glob', 'gp', 'imp', 'import_local', 'interop', 
'lr', 'management', 'mapping', 'md', 'mixins', 'na', 'numpy', 'os', 'passthrough_attr', 'sa', 'schematics', 'server', 'stats', 'stpm', 'sys', 'ta', 'time', 'toolbox', 'utils', 'warnings']


For QGIS, the equivalent entry point is the 'iface' object, so we run dir(iface).
>>> dir(iface)
['__class__', '__delattr__', '__dict__', '__dir__', '__doc__', '__eq__', '__format__', '__ge__', '__getattr__', '__getattribute__', '__gt__', '__hash__', '__init__', '__init_subclass__', '__le__', '__lt__', '__module__', '__ne__', '__new__', '__reduce__', '__reduce_ex__', '__repr__', '__setattr__', '__sizeof__', '__str__', '__subclasshook__', '__weakref__', 'actionAbout', 'actionAddAfsLayer', 'actionAddAllToOverview', 'actionAddAmsLayer', 'actionAddFeature', 'actionAddOgrLayer', 'actionAddPart', 'actionAddPgLayer', 'actionAddRasterLayer', 'actionAddRing', 'actionAddToOverview', 'actionAddWmsLayer', 'actionAllEdits', 'actionCancelAllEdits', 'actionCancelEdits', 'actionCheckQgisVersion', 'actionCopyFeatures', 'actionCopyLayerStyle', 'actionCreatePrintLayout', 'actionCustomProjection', 'actionCutFeatures', 'actionDeletePart', 'actionDeleteRing', 'actionDeleteSelected', 'actionDraw', 'actionDuplicateLayer', 'actionExit', 'actionFeatureAction', 'actionHelpContents', 'actionHideAllLayers', 'actionHideDeselectedLayers', 'actionHideSelectedLayers', 'actionIdentify', 'actionLayerProperties', 'actionLayerSaveAs', 'actionManagePlugins', 'actionMapTips', 'actionMeasure', 'actionMeasureArea', 'actionMoveFeature', 'actionNewBookmark', 'actionNewProject', 'actionNewVectorLayer', 'actionOpenFieldCalculator', 'actionOpenProject', 'actionOpenStatisticalSummary', 'actionOpenTable', 'actionOptions', 'actionPan', 'actionPanToSelected', 'actionPasteFeatures', 'actionPasteLayerStyle', 'actionPluginListSeparator', 'actionProjectProperties', 'actionQgisHomePage', 'actionRemoveAllFromOverview', 'actionRollbackAllEdits', 'actionRollbackEdits', 'actionSaveActiveLayerEdits', 'actionSaveAllEdits', 'actionSaveEdits', 'actionSaveMapAsImage', 'actionSaveProject', 'actionSaveProjectAs', 'actionSelect', 'actionSelectFreehand', 'actionSelectPolygon', 'actionSelectRadius', 'actionSelectRectangle', 'actionShowAllLayers', 'actionShowBookmarks', 'actionShowLayoutManager', 'actionShowPythonDialog', 
'actionShowSelectedLayers', 'actionSimplifyFeature', 'actionSplitFeatures', 'actionSplitParts', 'actionToggleEditing', 'actionToggleFullScreen', 'actionVertexTool', 'actionVertexToolActiveLayer', 'actionZoomActualSize', 'actionZoomFullExtent', 'actionZoomIn', 'actionZoomLast', 'actionZoomNext', 'actionZoomOut', 'actionZoomToLayer', 'actionZoomToSelected', 'activeLayer', 'addCustomActionForLayer', 'addCustomActionForLayerType', 'addDatabaseToolBarIcon', 'addDatabaseToolBarWidget', 'addDockWidget', 'addLayerMenu', 'addMeshLayer', 'addPluginToDatabaseMenu', 'addPluginToMenu', 'addPluginToRasterMenu', 'addPluginToVectorMenu', 'addPluginToWebMenu', 'addProject', 'addRasterLayer', 'addRasterToolBarIcon', 'addRasterToolBarWidget', 'addToolBar', 'addToolBarIcon', 'addToolBarWidget', 'addUserInputWidget', 'addVectorLayer', 'addVectorToolBarIcon', 'addVectorToolBarWidget', 'addWebToolBarIcon', 'addWebToolBarWidget', 'addWindow', 'advancedDigitizeToolBar', 'askForDatumTransform', 'attributesToolBar', 'blockSignals', 'browserModel', 'buildStyleSheet', 'cadDockWidget', 'childEvent', 'children', 'closeMapCanvas', 'connectNotify', 'copySelectionToClipboard', 'createNewMapCanvas', 'currentLayerChanged', 'currentThemeChanged', 'customEvent', 'dataSourceManagerToolBar', 'databaseMenu', 'databaseToolBar', 'defaultStyleSheetFont', 'defaultStyleSheetOptions', 'deleteLater', 'deregisterLocatorFilter', 'destroyed', 'digitizeToolBar', 'disconnect', 'disconnectNotify', 'dumpObjectInfo', 'dumpObjectTree', 'dynamicPropertyNames', 'editMenu', 'editableLayers', 'event', 'eventFilter', 'fileToolBar', 'findChild', 'findChildren', 'firstRightStandardMenu', 'getFeatureForm', 'helpMenu', 'helpToolBar', 'iconSize', 'inherits', 'initializationCompleted', 'insertAddLayerAction', 'installEventFilter', 'invalidateLocatorResults', 'isSignalConnected', 'isWidgetType', 'isWindowType', 'killTimer', 'layerMenu', 'layerSavedAs', 'layerToolBar', 'layerTreeCanvasBridge', 'layerTreeView', 'layoutDesignerClosed', 
'layoutDesignerOpened', 'layoutDesignerWillBeClosed', 'mainWindow', 'mapCanvas', 'mapCanvases', 'mapNavToolToolBar', 'messageBar', 'messageTimeout', 'metaObject', 'moveToThread', 'newLayerMenu', 'newProject', 'newProjectCreated', 'objectName', 'objectNameChanged', 'openFeatureForm', 'openLayoutDesigner', 'openLayoutDesigners', 'openMessageLog', 'openURL', 'parent', 'pasteFromClipboard', 'pluginManagerInterface', 'pluginMenu', 'pluginToolBar', 'preloadForm', 'projectMenu', 'projectRead', 'property', 'pyqtConfigure', 'rasterMenu', 'rasterToolBar', 'receivers', 'registerCustomDropHandler', 'registerCustomLayoutDropHandler', 'registerLocatorFilter', 'registerMainWindowAction', 'registerMapLayerConfigWidgetFactory', 'registerOptionsWidgetFactory', 'reloadConnections', 'removeAddLayerAction', 'removeCustomActionForLayerType', 'removeDatabaseToolBarIcon', 'removeDockWidget', 'removeEventFilter', 'removePluginDatabaseMenu', 'removePluginMenu', 'removePluginRasterMenu', 'removePluginVectorMenu', 'removePluginWebMenu', 'removeRasterToolBarIcon', 'removeToolBarIcon', 'removeVectorToolBarIcon', 'removeWebToolBarIcon', 'removeWindow', 'saveStyleSheetOptions', 'sender', 'senderSignalIndex', 'setActiveLayer', 'setObjectName', 'setParent', 'setProperty', 'settingsMenu', 'shapeDigitizeToolBar', 'showAttributeTable', 'showLayerProperties', 'showLayoutManager', 'showOptionsDialog', 'signalsBlocked', 'startTimer', 'staticMetaObject', 'statusBarIface', 'takeAppScreenShots', 'thread', 'timerEvent', 'tr', 'unregisterCustomDropHandler', 'unregisterCustomLayoutDropHandler', 'unregisterMainWindowAction', 'unregisterMapLayerConfigWidgetFactory', 'unregisterOptionsWidgetFactory', 'vectorLayerTools', 'vectorMenu', 'vectorToolBar', 'viewMenu', 'webMenu', 'webToolBar', 'windowMenu', 'zoomFull', 'zoomToActiveLayer', 'zoomToNext', 'zoomToPrevious']
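Listings like the two above are too long to scan by eye. A quick trick is to filter the dir() output by substring. The sketch below demonstrates it on Python's built-in os module so it runs anywhere; the same one-liner works on arcpy or iface inside their respective Python consoles (grep_dir is just an illustrative helper name, not part of either API).

```python
import os

# Filter a long dir() listing down to names containing a keyword.
# Demonstrated on the stdlib 'os' module; swap in arcpy or iface
# inside the ArcGIS/QGIS Python console.
def grep_dir(obj, keyword):
    return [name for name in dir(obj) if keyword.lower() in name.lower()]

print(grep_dir(os, 'env'))  # names in os containing 'env'
```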


4) Get details from document/project file

An ArcGIS map document is saved with the extension .mxd, while a QGIS project uses .qgz or .qgs. Both can be manipulated via their respective Python APIs as seen below:-
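Since the two packages use distinct file extensions, a small helper (hypothetical, not part of either API) can tell which GIS a given project file belongs to:

```python
from pathlib import Path

# Hypothetical helper: classify a project file by its extension.
# .mxd -> ArcGIS map document; .qgz/.qgs -> QGIS project.
def project_kind(path):
    ext = Path(path).suffix.lower()
    return {'.mxd': 'ArcGIS', '.qgz': 'QGIS', '.qgs': 'QGIS'}.get(ext, 'unknown')

print(project_kind('site_map.qgz'))    # QGIS
print(project_kind('old_layout.mxd'))  # ArcGIS
```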

ArcGIS: 
# Create project instance...
prjt = arcpy.mapping.MapDocument('CURRENT')

# project file path and name..
prjt.filePath

# project title
prjt.title

# project date saved
prjt.dateSaved

# project author
prjt.author

# project layer count
len(arcpy.mapping.ListLayers(prjt))

# map units of the active data frame
prjt.activeDataFrame.mapUnits

# coordinate reference system of the active data frame
prjt.activeDataFrame.spatialReference.name

# get/set the working space/folder
arcpy.env.workspace                 # read the current workspace
# arcpy.env.workspace = r'C:\data'  # assign a path to change it (example path)

# list shp in working space/folder
arcpy.ListFeatureClasses('*.shp')

QGIS:
# Create project instance...
prjt = QgsProject.instance()

# project file path and name..
prjt.fileName() 

# project folder
prjt.homePath()

# project layer count
prjt.count()

# project coordinate reference system
prjt.crs()
prjt.crs().geographicCrsAuthId()
prjt.crs().authid()

# project bounds
prjt.crs().bounds().asWktCoordinates()
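The attribute-by-attribute listings above can be wrapped into one reusable helper. The sketch below is plain Python: DummyProject is a stand-in used only to illustrate the idea, not a real arcpy or QGIS class. In practice you would pass arcpy.mapping.MapDocument('CURRENT') or QgsProject.instance() together with the attribute names from the snippets above.

```python
# Generic sketch: gather project details from either API into a dict.
# Handles plain attributes (ArcGIS style: prjt.title) as well as
# methods (QGIS style: prjt.fileName()); missing names come back None.
def collect_details(prjt, names):
    details = {}
    for name in names:
        value = getattr(prjt, name, None)
        details[name] = value() if callable(value) else value
    return details

# Stand-in object for demonstration only (hypothetical, not a GIS class).
class DummyProject:
    title = 'Demo map'
    def fileName(self):
        return '/tmp/demo.qgz'

print(collect_details(DummyProject(), ['title', 'fileName', 'author']))
```

Because getattr() defaults to None, the same call works unchanged against either API even when an attribute exists in one package but not the other.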