Sunday, August 7, 2016

Analyzing Level Field Table/Book with Python

Hello,

Let me try something new with level survey tabular data, beyond the traditional reduction of reduced levels from the observed back-sight, intermediate-sight and fore-sight readings.

I know this is not a common task among surveyors. Nevertheless, I'm going to talk about it today and perform some unusual statistical analysis on a levelling table using the Python packages numpy, pandas and matplotlib.

Levelling is the measurement of geodetic height using an optical levelling instrument and a level staff or rod having a numbered scale.

To determine the difference in level between points on the surface of the ground, a series of levels needs to be carried out.


There are two methods of booking level readings (see the sketch just after this list):
1) the Rise & Fall method, and
2) the Height of Collimation (height of instrument) method.
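
As an aside, here is a minimal sketch of how a Rise & Fall reduction could be computed with pandas. The staff readings, the column names ("Back Sight", "Inter Sight", "Fore Sight") and the starting reduced level are all hypothetical; the idea is simply that each rise or fall is the previous staff reading minus the current one, and the reduced levels accumulate from the starting level:

import pandas as pd

# Hypothetical staff readings in metres; NaN where no reading was taken
readings = pd.DataFrame({
    "Back Sight":  [1.250, None, 1.875, None],
    "Inter Sight": [None, 1.310, None, None],
    "Fore Sight":  [None, None, 1.420, 1.650],
})

# Previous reading from the current instrument set-up (the BS on a change
# point, otherwise the previous row's IS), compared with the current IS or FS
previous = readings["Back Sight"].combine_first(readings["Inter Sight"]).shift()
current = readings["Inter Sight"].combine_first(readings["Fore Sight"])

diff = previous - current          # positive = rise, negative = fall
rise = diff.clip(lower=0)
fall = (-diff).clip(lower=0)

# Reduced levels accumulate from an assumed starting reduced level
start_rl = 10.000
reduced_level = start_rl + diff.fillna(0).cumsum()
print(reduced_level.round(3).tolist())   # expected: [10.0, 9.94, 9.83, 10.055]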

The objectives of a levelling survey are:
1) To find the elevation of a given point with respect to a given or assumed datum.
2) To establish a point at a given elevation with respect to a given or assumed datum.


Given this level table (saved in .csv file format), I'm going to import it into the Python programming environment (a Jupyter Notebook) and perform some analyses on it using the Python packages numpy, pandas and matplotlib.

The easiest way to install all of these packages is to install the Python distribution called Anaconda. This will install all the required packages and their dependencies on the fly!

Now that our Python environment is set up, let's start analyzing our level book/table.

Here are some important links:-
1) You can download the NoteBook from here.
2) You can view it online here.
3) You can view it on GitHub here.

Try to read the above Jupyter notebook; it contains every detail you need to know about this post. The complete Python script is below:

import numpy as np
import pandas as pd
import matplotlib.pyplot as plt

# Let's enable our plots to display inline within the notebook
get_ipython().magic('matplotlib inline')


# ### Loading the Levelling table in .csv format

level_table = pd.read_csv("Level_Book.csv")
level_table



# Let's replace the NaN values. But first, let's see the titles of the columns

level_table.columns



# Replace NaN with an empty string for all columns

level_table_cleaned = level_table.fillna('')
# level_table_cleaned = level_table.replace(np.nan, '')
level_table_cleaned


# Replace NaN with 0.000 for all columns

level_table_zeros = level_table.fillna(0.000)
level_table_zeros


# Let's see descriptive statistics for the table
level_table_zeros.describe()


# # Checking Level Error
# Sum each column and store the result in a variable named after the column
BS = level_table["Back Sight"].sum()
FS = level_table["Fore Sight"].sum()
Rise = level_table["Rise"].sum()
Fall = level_table["Fall"].sum()

# Use round() method to round Rise to 3 decimals
Rise = round(Rise, 3)

# Getting the values of the FRL (first reduced level) and LRL (last reduced level)
FRL = level_table["Reduce Level"][0]
LRL = level_table["Reduce Level"][18]
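# An index-independent alternative (assuming the closing reduced level sits
# on the last row of the table) would be:
# LRL = level_table["Reduce Level"].iloc[-1]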

BS, FS, Rise, Fall, FRL, LRL

# Checking the arithmetic for error:
# the result should be a tuple of three equal numbers,
# since sum(BS) - sum(FS) = sum(Rise) - sum(Fall) = LRL - FRL.
BS - FS, Rise - Fall, LRL - FRL
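
# As a sketch of an automated closure check (using the sums above), all three
# differences should agree within rounding tolerance if the book closes:
print(np.allclose([BS - FS, Rise - Fall], LRL - FRL, atol=0.001))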


# Visualise the profile (Distance vs Reduced Level)

x = level_table["Distance"]
y = level_table["Reduce Level"]

plt.figure(figsize=(20, 7), facecolor='orange')

plt.plot(x, y, linestyle="dashed", marker="o", color="blue")

plt.title("Profile Drawing", size=30, color="white")
plt.xlabel("Distance (m)", size=20, color="white")
plt.ylabel("Reduce Level (m)", size=20, color="white")

plt.xticks([5, 15, 25, 35, 45, 55, 65, 75, 85, 95, 105, 115, 125, 135, 145, 155, 165, 175], size=15)
plt.yticks([10, 10.5, 11], size=15)

plt.grid()
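
# Optionally, the figure could be saved to an image file before plt.show()
# is called (hypothetical filename):
# plt.savefig("profile_drawing.png", dpi=150, bbox_inches="tight")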

plt.show()



Thanks for reading.
