- Wiki
- Collaborations
- Editing shapefiles in Autocad
- Shapefiles
- Survey data
- Data analysis
- CSR server
- Database
- Ambient Weather weather station
- Notes on Console
- Plan
- GDAL (OGR) tools
- Links
- Data (measurements auxiliary tables)
- Access to data
- How to create a new projection in QGis
- Elimination of Duplicate points – General criteria
- Linework for the Survey Area
- 1. Creation of Initial Linework in QGIS using Survey points import - (Ram, System 4)
- Note: The line shapefiles / Geopackages shall be in CRS: TM AUSPOS
- 2. Creation of final working drawing Shapefiles / Geopackages - (Selvarani, System 1)
- The CRS for the export shall be EPSG:4326 - WGS 84
- The CRS for the export shall be EPSG:4326 - WGS 84
- Linework for the whole Survey Zone
- 1. Merging Shapefiles / Geopackages - (Selvarani, System 1)
- 2. Storing Shapefiles / Geopackages - (Selvarani, System 1)
- 3. Topology check of merged shapefiles
- 4. Archive and replace the Master Shapefiles / Geopackages (Ram, System 4)
- 5. Note about Shapefiles and Geopackages
Wiki¶
About the Auroville CSR Geomatics Studio¶
We are a team working on geographical and related information with an engineering and scientific approach. The concept of geomatics is explained here: https://en.wikipedia.org/wiki/Geomatics.
We develop, maintain and publish data on this web site: https://gis.auroville.org.in.
Team¶
Currently, the team consists of:
- Bala
- Giulio
- Philippe
- Raj
- Ram
- Selvarani
Collaborations¶
Quick report of the collaborations of the Geomatics Team and its areas of work.
Collaborations¶
Collaborations/coordination with other groups, etc.
CSR Geomatics Description of Activity / Research / Project | AA | AVF | LB | FAMC | WG | CSR | Talam | DST |
---|---|---|---|---|---|---|---|---|
Topographic Survey of Auroville City Area in coordination with L’avenir d’Auroville: Matrimandir entire compound and Residential Zone Sector 1 completed, Sector 2 half completed | yes | yes | ||||||
Topographic Survey of specific projects area: Vibrance, Wasteless, Cultural Zone along the Crown Road, Baraka, Gardens of Unexpected | ||||||||
Collective wastewater treatment systems health check-up survey: 68 plants evaluated | yes | yes | yes | yes | ||||
Manual weekly monitoring of water level in selected wells on Auroville land on approximately 50 wells (number fluctuates depending on local conditions) | yes | yes | yes | |||||
Collection of rainfall data through manual raingauges distributed to Aurovilians: data received regularly from at least 7 raingauges | yes | yes | ||||||
Collection of weather data through automatic weather station installed at CSR: data collected every minute, stored in the database, and published online in real time | yes | yes | yes | |||||
Collaboration with Land Board for survey of identified land boundary stones: collection of coordinates of Government boundary stones for georefering of cadastral maps | yes | yes | ||||||
Collaboration with AV Foundation for compilation of land ownership map: geographic land records as provided by AV Foundation, protected by login access | yes | |||||||
Collaboration with L’avenir d’Auroville for data sharing and coordinated system set-up: organisation of geographic data for unique online platform to enhance planning processes (initial stage) | yes | |||||||
Collaboration with L’avenir d’Auroville and FAMC for database links: structural work on procedures for linking databases already existing with various agencies | yes | yes | ||||||
Study of application of drone topographic survey in Auroville context: pilot test conducted on Residential Zone with outside drone survey agency | yes | yes | yes | |||||
Study of open source software for drone topographic survey: testing of softwares like OpenDroneMap, MeshLab, CloudeCompare to evaluate work flow, processes and limitations | yes | yes | ||||||
Surface water modelling (ongoing): research on evaluation of surface water collection potential through automatic processing of field data | yes | yes | ||||||
Collaboration with TALAM on research on radio-transmission for automatic wells level monitoring (ongoing): evaluation of application of radio-transmitted signals for water-related monitoring and metering (selected borewells, flow meters) | yes | yes | ||||||
Organisation of data collected in Auroville in the past (ongoing): standardisation of data in geospatial database | yes | |||||||
Provision for basic statistical analyses of geographic-related data (indicators/dashboard) | yes | yes | ||||||
Publication of collected data in standardised format on web portal gis.auroville.org.in | yes | yes |
- AA : L'avenir d'Auroville
- AVF : Auroville Foundation
- LB : Auroville Land Board
- FAMC : Auroville Funds and Assets Management Committee
- WG : Auroville Water Group
- CSR : Auroville Centre for Scientific Research
- TALAM : a project under CSR
- DST : Department of Science and Technology, Ministry of Science and Technology, New Delhi
Our workflow¶
Surveys¶
Most field surveys are executed by our team of surveyors, using advanced DGPS equipment.
Other surveyors might also be contracted.
CAD¶
The survey data are imported into CAD software (AutoCAD/Civil 3D).
Editing shapefiles in Autocad¶
FEATURES CREATION¶
1. Assign a CRS to the drawing (TM-AUSPOS) (MAPCSLIBRARY command)
2. Create features in CAD (Points, lines, polygons)
3. Export shapefile (a) from CAD (Output > DWG to SDF) (Convert to LL84 – 3D)
FEATURES IMPORT INTO DB¶
4. Create zip file of the shapefile
5. Upload into the GISAF Shapefiles Basket
6. Import the shapefile into DB
7. Delete the shapefile from Local Machine
FEATURES EDITING¶
8. Open the table in QGis
9. Save as a shapefile (b) in TM AUSPOS CRS
10. In CAD, open a new drawing and assign the AUSPOS CRS
11. Import the shapefile (b) (MapImport) with all Object Data
12. Edit features
13. Export shapefile (a) from CAD (Output > DWG to SDF) with ONLY the id selected (Data Tab > Select Attributes > Object Data > Filename > id) (Convert to LL84 – 3D)
FEATURES IMPORT INTO DB¶
14. Create zip file of the shapefile
15. Upload into the GISAF Shapefiles Basket
16. Import the shapefile into DB
17. Delete the shapefile from Local Machine
QGis¶
Conventions¶
Shapefiles¶
We work with QGis, a widely used free and open source software for working on maps and geographical data.
Shapefile is a standard file format for mapping that Gisaf can import. QGis can open and save shapefiles.
We have defined some simple norms for these shapefiles for integration with Gisaf.
Coordinate system¶
We use the CRS with SRID 32644 (EPSG:32644, WGS 84 / UTM zone 44N).
Columns (attributes)¶
- All objects in a shapefile (layer) have a unique identifier named "id": a numerical value.
Field (attribute) names¶
- All fields are lower case (not UpperCase: use lowercase)
- They do not contain spaces, but underscores (not "not this": use "but_that")
- The field names cannot have more than 8 characters (not this_is_too_long: use this_short)
Attribute types¶
- The dates are written in ISO format: YYYY-MM-DD (eg. 1968-02-25)
Fields to remove¶
- If present, remove the fields containing the coordinates: northing, easting, elevation, latitude, longitude, area, length, etc. (these might be present when the data is imported from a spreadsheet)
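As a rough illustration of the conventions above, here is a minimal Python sketch, assuming GeoPandas is installed (the file names stones.shp and stones_clean.shp are hypothetical): it lower-cases and truncates the field names, drops the coordinate fields, adds a numerical id and reprojects to EPSG:32644.
# Minimal sketch (hypothetical file names): normalise a shapefile to the conventions above
import geopandas as gpd

gdf = gpd.read_file("stones.shp")

# Lower case, underscores instead of spaces, maximum 8 characters
gdf.columns = [c if c == "geometry" else c.lower().replace(" ", "_")[:8]
               for c in gdf.columns]

# Remove redundant coordinate fields, if present
drop = ["northing", "easting", "elevation", "latitude", "longitude", "area", "length"]
gdf = gdf.drop(columns=[c for c in drop if c in gdf.columns])

# Unique numerical identifier named "id"
gdf["id"] = range(1, len(gdf) + 1)

# Reproject to the CRS used for the shapefiles (EPSG:32644)
gdf = gdf.to_crs(epsg=32644)
gdf.to_file("stones_clean.shp")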
Foreign keys¶
We often deal with labels or categories of objects in a layer.
A common use case to explain: Alice creates a layer of stones, and wants to tag each stone with a rating: these are picked from a list of choices, like: Beautiful, Interesting, Pretty, etc.
For this kind of attribute, define a column like rating_id (something ending with _id) as a numerical value, and assign values 1, 2, 3, etc. The text is defined in another table (typically a CSV file) that looks like:
id | name |
---|---|
1 | Beautiful |
2 | Interesting |
3 | Pretty |
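As an illustration (a sketch only; the file names stones.shp and ratings.csv are hypothetical), the numerical rating_id of each feature can be resolved to its text by joining the layer with the lookup table in pandas:
# Sketch: resolve rating_id against the lookup table (hypothetical file names)
import pandas as pd
import geopandas as gpd

stones = gpd.read_file("stones.shp")    # layer with a rating_id column
ratings = pd.read_csv("ratings.csv")    # lookup table with columns: id, name

stones = stones.merge(ratings, left_on="rating_id", right_on="id",
                      suffixes=("", "_rating"))
print(stones[["id", "rating_id", "name"]].head())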
Code¶
We have defined a standard set of codes that define the type of data. They can be found here (TODO: add link).
Add a column code_name, matching one of the codes, eg. V25 for TDEF.
Surveyor¶
We keep a record of the people who carried out the surveys (the surveyors).
The shapefiles must contain an attribute srvyr_id, which refers to this table (TODO: add link).
Accuracy¶
We keep a record of the accuracy of the surveys.
The shapefiles must contain an attribute accur_id, which refers to this table (TODO: add link).
Date of survey¶
As nothing is absolutely permanent, it's also important to keep track of the date of the surveys: the shapefiles must contain an attribute date.
Working with Gisaf¶
Survey data¶
Raw survey data are contained in CSV files, typically downloaded from surveying instruments.
See more information on the process for the survey data (including a flow diagram): Survey data
Survey data¶
Workflow summary¶
Import basket¶
Gisaf provides an "import basket" dedicated to raw survey data, which is generated by land survey equipment (Leica Total Station and RTK).
These are CSV files, like:
100081,370633.969,1327742.157,51.187,,,
100083,370628.876,1327702.913,51.565,T52,,
100082,370628.729,1327720.019,51.261,T52,,
100081,370633.969,1327742.154,51.179,,,
100083,370628.876,1327702.913,51.565,T52,,
20800,370633.969,1327742.154,51.180,,,
20801,370618.795,1327713.172,52.817,E30,,
20802,370623.674,1327711.436,51.283,B35,,
20803,370619.314,1327713.407,51.383,B35,,
Each category (5th column) must be defined in the Category table (see Categories).
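For a quick check, such a file can be loaded with pandas (a sketch; the column names are descriptive guesses matching the order shown above, and the file name is hypothetical):
# Sketch: load a raw survey CSV for inspection.
# Column names are guesses following the order above (point number, easting,
# northing, elevation, category, plus two unused fields); hypothetical file name.
import pandas as pd

cols = ["orig_id", "easting", "northing", "elevation", "category", "extra1", "extra2"]
df = pd.read_csv("Our_project-Some_comment-2018-02-23.txt", header=None, names=cols)

print(df["category"].dropna().unique())   # categories used in this file
print(df[df["category"].isna()])          # points surveyed without a category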
Organization of the raw survey data basket¶
The basket should be organized in a directory structure:
- Project name (these can be themselves put in a hierarchy of (sub)directories)
- Surveyor's organization
- Equipment (eg. TS, RTK)
- Survey files (eg. Our_project-Some_comment-2018-02-23.txt)
Format of the survey file names¶
Our_project-Some_comment-2018-02-23.txt
The date of the survey follows the ISO date standard: YYYY-MM-DD.
Import to the database¶
When importing raw survey data files to the database, Gisaf performs the 2 steps described below.
It's worth noting that, in this process, no reprojection is done.
Feed the raw_survey table¶
Each point of the imported raw survey data file is inserted into the raw_survey table:
- Creation of a Point geometry: the raw_survey table has a geometry column for a single point (geom) with x, y and z coordinates
- Save the id of the original point to the orig_id column
- A unique id is computed from the following fields: id, project, equipment, date
- The project is saved in the project_id column
- The surveyor identification in srvyr_id
- The date of survey is saved in the date column
- The accuracy is tagged in accur_id, according to a mapping defined in the accuracyequimentsurveyormapping table, which depends on the surveyor and equipment
- The category of the point
Feed the RAW_V_* tables¶
From the raw_survey table, each point is then copied to its respective RAW_V_* table, with basically the same information.
These tables (which should be created manually or with the admin notebook called create_tables, as of this writing) only contain points.
The project is saved along: see below.
Import the points¶
For categories that define points (as opposed to lines and polygons, which require line work carried over in CAD or in a GIS software, see Line work), the points can be imported automatically to their final destination: the V_* tables.
Note: in this process, the geometries are reprojected.
Auto import of the points¶
The points found in the RAW_V_* tables can be imported automatically, project by project, from the project page of the admin interface.
Import of the line work (lines and polygons)¶
See Line work
The shapefiles generated manually (line work) should be put in the project's basket, and imported from it.
Categories¶
The categories define the types of the geographical features and they are mapped according to ISO standard layer naming conventions: see https://www.nationalcadstandard.org/ncs5/pdfs/ncs5_clg_lnf.pdf
Gisaf uses:
- a table category where the layers are defined
- a table per category
Fields for the categories¶
TODO
Creation of the RAW_* tables¶
This step must be done manually (as of this writing).
QGis: work on shapefiles¶
Go to shapefiles.
Data analysis¶
We use Jupyter, Pandas and GeoPandas, accessible at http://gis.auroville.org.in/notebooks.
For integration in the processes (execution of notebooks), there's papermill. Systemd timers are used to automatically schedule the notebooks on the server, e.g. for the dashboards.
There's a dedicated virtual machine for Jupyter, accessible from our local network at jupyter.csr.av.
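For illustration, the following sketch shows how one of these notebooks could be executed non-interactively with papermill (the notebook path and the parameters are hypothetical); a systemd timer then only needs to call such a script:
# Sketch: run a dashboard notebook with papermill, as a scheduled job would do.
# The notebook path and the parameters are hypothetical.
import papermill as pm

pm.execute_notebook(
    "dashboards/wells_dashboard.ipynb",     # source notebook
    "/tmp/wells_dashboard_out.ipynb",       # executed copy, kept for inspection
    parameters={"days": 30},                # injected into the notebook's "parameters" cell
)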
Organization of notebooks¶
The setup is organized in 2 parts, that are run with 2 instances of Jupyter for security reasons.
Admin¶
The notebooks in the admin are mostly for maintenance: operations on the database, etc.
Users¶
The notebooks are organized in folders, all under Gisaf's source code git repository, except the "Sandbox" one.
This notebook server connects to the database with a specific user (jupyter), which has been set up on the database server with permissions to read all data (readonly) plus write access to some tables dedicated to storing analysis results.
Integration with Gisaf¶
The notebook in Templates demonstrates the usage of notebooks in relation with Gisaf: mostly, how to use the gisaf.ipynb_tools module to access Gisaf models and the data from the database.
This module is part of gisaf: https://redmine.auroville.org.in/projects/gisaf/repository/revisions/master/entry/gisaf/ipynb_tools.py
References¶
Geopandas¶
Some nice examples of processing, using watershed and rain data: https://geohackweek.github.io/vector/06-geopandas-advanced/
Integration¶
A good example of how a company has integrated the same tools: https://medium.com/netflix-techblog/scheduling-notebooks-348e6c14cfd6
Other docs¶
From Swathi¶
Hosting¶
The team is located in the CSR of Auroville.
We have set up a server for hosting the software and database. See CSR_server for technical information about the setup.
CSR server¶
dream.csr.av (192.168.0.12)
- Debian 9
- Xen hypervisor
- libvirt for the orchestration of VMs
- management with ansible
Dom0¶
Installation¶
Found minor issues with the installation (eg. issues with HDDs, Dell EFI boot).
This document starts from a working Debian Xen server.
Installed on a mirror of 2*2TB drives with btrfs.
Update (see #7156): re-installed the OS on /dev/sdc2, ext4, without RAID/LVM.
Storage for domUs¶
An LVM volume group dream.csr has been created for the domUs.
Networking¶
With systemd-networkd: bridge "br0" to the physical ethernet.
Create a VM¶
Adjust the parameters from:
export vm_name=infra.csr.av
export vm_ip_addr=172.16.0.3
export vm_password=foobar
Create the domU:
root@dream:~# xen-create-image --broadcast=172.16.0.255 --dist=stretch --fs=ext4 --gateway=172.16.0.1 --hostname ${vm_name} --ip=${vm_ip_addr} --lvm=dream.csr --maxmem=512M --memory=256M --mirror=http://ftp.de.debian.org/debian/ --netmask=255.255.255.0 --password=${vm_password} --size=10G --swap=1G --vcpus=1 --bridge=br0
Note that the IP address will be set later inside the VM, and vm_ip_addr isn't actually used.
DomUs¶
Migrate XL to libvirt¶
After creation using xen-create-image, migrate the definition of the domU to libvirt:
virsh -c xen:/// domxml-from-native xen-xm /etc/xen/${vm_name}.cfg > /tmp/${vm_name}.xml
virsh define /tmp/${vm_name}.xml
From this point onward, one can log out from the dom0's console and use virsh or "Virtual Machine Manager" from your computer to administer the VM, eg:
- Set "Auto start" on dom0 boot
- Set memory limits, etc
Start the domU¶
Use "Virtual Machine Manager" or the command:
virsh -c xen+ssh://root@dream.csr.av/ start ${vm_name}
To do in a libvirt shell¶
Start a libvirt shell, with "Virtual Machine Manager" or with the command:
virsh -c xen+ssh://root@dream.csr.av/ console ${vm_name}
Log in as root in the libvirt console.
Network config¶
Add /etc/systemd/network/50-wired.network
(adjust the IP):
[Match]
Name=eth0
[Network]
Address=192.168.0.14/24
Gateway=192.168.0.10
DNS=192.168.0.10
Then:
systemctl enable --now systemd-networkd.socket
systemctl enable --now systemd-networkd.service
systemctl disable --now networking.service
Post-install ssh config¶
Allow ssh root login with password in /etc/ssh/sshd_config
:
sed -i -e 's/#PermitRootLogin prohibit-password/PermitRootLogin yes/' /etc/ssh/sshd_config
systemctl restart ssh.service
From this point onwards, one can close the console session opened via libvirt.
DNS¶
Log in to the local DNS server with:
ssh root@infra.csr.av
Update the 2 files in /etc/bind/zones (zones/db.csr.av and zones/db.192.168.0) and reload the DNS with:
rndc reload
Allow yourself to log in easily with your ssh key¶
Copy your ssh key to the domU: run from your own computer:
ssh-copy-id root@${ip_addr}
Ansible¶
Using the Ansible project developed in Blue Light: https://redmine.bluelightav.org/projects/ansible
Prepare the host¶
Install Python
apt install -y python
Database¶
The Postgis database runs on its specific domU (gisdb.csr.av, 192.168.0.18).
Installation¶
After installing the Postgis package (eg. assign the "postgis" Ansible's role), follow up to Db-support
Jupyter¶
The Jupyter server runs on its specific domU (jupyter.csr.av, 192.168.0.19).
Installation¶
See #6990.
Backup and restoration of the database¶
Database¶
Troubleshooting¶
Layers missing in the map's tree¶
Gisaf relies on counting features through the Postgres statistics collector subsystem.
If the server is restarted uncleanly (eg. without a clean shutdown), the count of the tables might be wrong or just 0, leaving the layers apparently empty and thus not even appearing.
The fix is as easy as:
sudo -u postgres psql avgis -c VACUUM
Installation¶
This documentation assumes that the Postgis package has been installed (see CSR_server).
Configure the server¶
Allow connections from other hosts in the local network¶
Set the server to listen on all addresses: set listen_addresses to * in /etc/postgresql/9.6/main/postgresql.conf.
Allow the connections by adding in /etc/postgresql/9.6/main/pg_hba.conf:
host all all 192.168.0.0/24 md5
Creation of the database¶
As the postgres user:
createdb -E utf8 -T template0 avgis
Backups¶
Primary¶
The database is backed up every day at midnight. The dump file is located in /var/backups/postgres/.
Secondary¶
There are other backups (daily, weekly, monthly) thanks to the Debian package autopostgresqlbackup, located by default in /var/lib/autopostgresqlbackup.
Tertiary (dom0)¶
The whole virtual machine is backed up by BackupNinja on the "dom0" controller, using:
- rdiff backups every day
- tar files on Saturdays.
See /etc/backups.d on the dom0 (192.168.0.12).
Remote¶
TODO: remote backup.
Restoration¶
If the VM is not shut down properly, there's a chance that the database is corrupt and needs to be restored from one of the backups.
After the restoration, restart gisaf:
systemctl restart uwsgi.service
From primary backup¶
Note: the roles aren't restored with this method.
With the user postgres:
# Optionally, rename the corrupt database (selecting a name for a database like "avgis_c2")...
psql -c "ALTER DATABASE avgis RENAME TO avgis_c2;"
# ... or drop the existing database
psql -c "drop database avgis;"
# Create a new database:
createdb -E utf8 -T template0 avgis
# Restore the database
pg_restore -d avgis /var/backups/postgres/avgis.pg_dump
From secondary backup¶
autopostgresqlbackup backs up the roles in postgres_globals.
zcat /var/lib/autopostgresqlbackup/daily/postgres_globals/postgres_globals_2018-10-24_06h25m.Wednesday.sql.gz | psql
zcat /var/lib/autopostgresqlbackup/daily/avgis/avgis_2018-10-24_06h25m.Wednesday.sql.gz | psql
Gear¶
Survey equipment¶
See survey equipment
Weather station¶
See Ambient_Weather_weather_station
Ambient Weather weather station¶
We have purchased a WS2902A weather station (https://www.ambientweather.com/amws2902.html).
Firmware version: 4.0.2.
Manual¶
The operating manual of the weather station can be found at https://p10.secure.hostingprod.com/@site.ambientweatherstore.com/ssl/Manuals/WS-2902C.pdf
Connection¶
Wifi¶
Set up the wifi of the console using the "Ambient Tool" phone application. IP address given by DHCP on the router is: 192.168.1.101
Local communication¶
No success so far: the only exposed port is TCP/45000. Telnet doesn't show any activity. Nothing found on the Internet about this protocol.
One interesting project might help, hijacking the connection to the cloud services: https://www.wxforum.net/index.php?topic=35033.0
Cloud connection¶
We'll create an account on AmbientWeather.net (and eventually on WUnderground.net and/or weathercloud.net), and:
- have the console upload data there
- have Gisaf retrieve our WS data from there
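As a sketch of the retrieval side (assuming the requests library; the endpoint and parameter names are assumptions based on the public AmbientWeather.net REST API, and the keys are placeholders generated in the account settings):
# Sketch: retrieve the latest weather station data from AmbientWeather.net.
# The endpoint and parameter names are assumptions based on the AmbientWeather.net
# REST API; the keys are placeholders from the account settings.
import requests

API_KEY = "..."          # placeholder: device API key
APPLICATION_KEY = "..."  # placeholder: application key

resp = requests.get(
    "https://api.ambientweather.net/v1/devices",
    params={"apiKey": API_KEY, "applicationKey": APPLICATION_KEY},
    timeout=10,
)
resp.raise_for_status()
for device in resp.json():
    print(device.get("macAddress"), device.get("lastData"))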
Notes on Console¶
The daily rainfall data displayed in the console resets at 00:30 every night.
Plan for future¶
Besides living well, see plan.
Plan¶
Some interesting projects that might be integrated:
- https://github.com/Oslandia/albion : Build 3D geological model from wells information
Other¶
GDAL (OGR) tools¶
GDAL is a translator library for raster and vector geospatial data formats. It is used by many software packages (including QGIS, many other open source tools, and Gisaf). Some command line utilities are supplied, like:
- ogr2ogr, which can easily convert one data format to another
- ogrinfo, which displays information about files
Using Windows¶
On a computer with Windows and QGIS installed:
1. Open a command line console (eg. <Windows Key> to display the Start menu, then just type cmd and <Enter>)
2. In the console window, type (adjust with the QGIS version and location, this seems to be the standard one):
"c:\Program Files\QGis 3.10\OSGeo4W.bat"
3. GDAL utilities can be used: ogr2ogr, etc.
Example: convert Geopackage to Shapefiles¶
Output the content of the geopackage 9wdoogfr_2019-11-13_12_26_07.gpkg to the folder shapefiles:
ogr2ogr -progress -f "ESRI Shapefile" shapefiles 9wdoogfr_2019-11-13_12_26_07.gpkg
To output the content of the geopackage 9wdoogfr_2019-11-13_12_26_07.gpkg to a folder at the root of the C: drive (c:\shapefiles):
ogr2ogr -progress -f "ESRI Shapefile" c:\shapefiles 9wdoogfr_2019-11-13_12_26_07.gpkg
With reprojection¶
Same as above, reprojecting to UTM44N:
ogr2ogr -progress -f "ESRI Shapefile" -t_srs EPSG:32644 c:\shapefiles 9wdoogfr_2019-11-13_12_26_07.gpkg
Links and references¶
Links¶
Water management¶
Modflow¶
The reference software for groundwater modelling and simulation.
In conjunction with flopy (https://water.usgs.gov/ogw/flopy/) and Jupyter (https://jupyter.org/), it provides a relatively easy to use interface.
Freewat¶
This project is partly based on modflow, and integrates with QGis.
QGIS¶
- Tools for Geology
Construction of geological cross sections in QGIS - http://www.geokincern.com/?p=1452
Autocad¶
- Overview of Converting Geospatial Data to Drawing Objects:
http://docs.autodesk.com/CIV3D/2013/ENU/index.html?url=filesMAPC3D/GUID-C38FD485-3CC2-4B52-8264-0D8C0F45422B.htm,topicNumber=MAPC3Dd30e41809
- CAD-DB connection:
https://knowledge.autodesk.com/support/autocad-civil-3d/learn-explore/caas/video/youtube/watch-v-AQoB--nyUJA.html
Orfeo¶
Remote sensing
Old docs¶
Shapefiles
Data (measurements, auxiliary tables)
Data (measurements auxiliary tables)¶
Besides the importation of shapefiles, Gisaf can import non-geographical information: auxiliary data (typically categories like the list of location names, well types, etc), and temporal information (well levels, etc).
Command line¶
The import_to_db.py script imports files fetched from a set of URLs (typically, in the Redmine Files section of this project), formats and pre-processes them, and imports them to the database.
import_to_db.py is a support tool that is planned to be integrated with the web interface.
Import all with:
phil@phil-mbp:~/BlueLight/gisaf_src/gisaf$ python import_to_db.py
The script currently accepts an argument for filtering the URLs to import.
Pavneet's docs (imported from gisaf's wiki)¶
Rules of Map making - What all Maps should have!
Survey Database
Field to finish
Survey Data Post-Processing
Wells Documentation
Civil 3D useful commands
Online references for Civil 3D
connections in QGIS- using browser panel and Add postGIS
Reconcilation of Raw survey data using pgAdmin
importing RAW data to GISAF
Editing Z value of features in Shapefiles in QGIS
Miscellaneous- Civil 3D
Documentation- Rain Gauge
Documentation- Wells Monitoring (Piezometer by Bala)
Documentation- Flow meter, by Bala
Documentation- DST- Vegetation Indexing
Documentation- DST- Interpolation
Documentation- DST- Survey- Office workflow
From CAD to GIS, by Giulio
QGIS- Miscellaneous
Giulio's documentation¶
Documentation - Reconciliation of points using Gisaf
Documentation - Status and Status Changes
Documentation - Tags retained after re-import of same geometry
Access to data¶
Connection to server directly from CSR¶
To connect to the server directly without going through Aurinoco server, the correct url is
http://gis.csr.av
Connection to Gisaf via QGis through WFS / OGC API¶
This works only on QGis from version 3.14.15 onward.
1. In the Browser, click on WFS / OGC API, then right-click to create a new connection
2. Give a name (e.g. OGC API Qgis Gisaf)
3. Give the URL https://gis.auroville.org.in/ogcapi
4. Under the WFS Options box, in the Version dropdown, the default option "Maximum" works just fine
5. Click on OK
The list of layers will appear in the Browser under WFS/OGC API.
How to create a new projection in QGis¶
To create a new projection in QGis, go to menu "Settings", and click on "Custom Projections".
A pop-up window appears with the list of all the custom projections defined so far by the user.
Click on the green "+" sign on the right top part of the window to create a new projection.
In the "Name" box, type "TM CSRAUSPOS SF1" (which means TM = Transverse Mercator projection; CSRAUSPOS = theparameters for this projection are derived from the processing of DGPS raw data by AUSPOS - Online GPS Processing Service - https://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/auspos; SF1 = Scale Factor is 1).
In the "Format" dropdown list, select "Proj String (legacy - Not Recommended)"
In the "Parameters" box, paste the following "+proj=tmerc +lat_0=12.01605433+lon_0=79.80998934 +k=1 +x_0=370455.630 +y_0=1328608.994 +ellps=WGS84+towgs84=0,0,0,0,0,0,0 +units=m +no_defs".
Finally, click on OK.
In a more explicit way, the parameters mean the following:
Map Projection: Transverse Mercator (TM)
False Easting: 370455.6300
False Northing: 1328608.9940
Latitude of Origin: 12°00'57.79560" (DMS) 12.01605433 (DD)
Central Meridian: 79°48'35.96164" (DMS) 79.80998934 (DD)
Scale Factor: 1.00000000
Zone Width: 6.0°
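As a cross-check of these parameters, the projection can be used directly with pyproj (a minimal sketch, assuming pyproj is installed):
# Sketch: use the custom TM CSRAUSPOS SF1 projection with pyproj
from pyproj import Transformer

TM_AUSPOS = ("+proj=tmerc +lat_0=12.01605433 +lon_0=79.80998934 +k=1 "
             "+x_0=370455.630 +y_0=1328608.994 +ellps=WGS84 "
             "+towgs84=0,0,0,0,0,0,0 +units=m +no_defs")

to_wgs84 = Transformer.from_crs(TM_AUSPOS, "EPSG:4326", always_xy=True)

# The false easting/northing should map back to the latitude/longitude of origin
lon, lat = to_wgs84.transform(370455.630, 1328608.994)
print(lon, lat)   # expected: approximately 79.80998934, 12.01605433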
Elimination of Duplicate points – General criteria¶
It might happen that the same physical feature (e.g. a tree, or a pole) is surveyed more than once: this can happen because there are many physical features in an area, and the survey needs more than one station. So, for example a tree is surveyed from a station, and gets a serial number on that date. When the station is then changed, it might happen that the same tree is resurveyed: another serial number is given, and possibly a different date, if the survey from the second station happened on a different day.
It is clear that the same tree is then represented by two different points, which implies that two different trees exist in the data, while only one tree really exists in physical reality.
It is clear that one of the two points is redundant and needs to be removed. If this is noted by the surveyor directly in the field, then the issue is solved by the surveyor himself during processing time.
If instead, due to various reasons, it was not noted by the surveyor in the field, it will need to be cleaned after the processing, possibly by post-processing staff.
How to identify duplicate points?
The following criteria can be used:
1. The distance between the two points is less than 30 cm (trees are surveyed if their trunk diameter is at least about 20 cm, so two of them cannot exist within 30 cm)
2. The orig_id (serial number) of the points are not in series
3. The survey date is not the same
4. In case of trees, the species of the two trees is the same
5. In case of trees, the tree type is not TDEF (because TDEFs are mapped irrespective of their diameter, so they can actually have a small trunk, and two of them might exist within 30 cm), nor OT (many TDEF species are surveyed as OT if not otherwise indicated by a botanist)
6. The context needs to be evaluated: if one tree is deleted in an area where many trees exist in a limited space, then losing one in the map is not a big error. If instead one tree is deleted where there are very few trees, then it might be a big loss.
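A first automated pass over criteria 1 to 5 can be sketched with GeoPandas (a sketch only: the file name and the fields species and ttype, holding the species and the tree type, are hypothetical placeholders following the conventions above; orig_id is assumed numeric). The evaluation of the context (criterion 6) still needs a visual check:
# Sketch: flag candidate duplicate trees according to criteria 1-5 above.
# The file name and the fields "species" and "ttype" (tree type) are hypothetical
# placeholders; orig_id and date follow the conventions described in this wiki.
import geopandas as gpd

cols = ["orig_id", "date", "species", "ttype", "geometry"]
trees = gpd.read_file("trees.shp").to_crs(epsg=32644)[cols]   # metric CRS for distances

# Pair every tree with the other trees closer than 30 cm (criterion 1),
# by buffering each point by 0.3 m and intersecting with the original points
buffered = trees.copy()
buffered["geometry"] = buffered.geometry.buffer(0.3)
pairs = gpd.sjoin(buffered, trees, how="inner", predicate="intersects")
pairs = pairs[pairs["orig_id_left"] < pairs["orig_id_right"]]   # drop self and mirrored pairs

candidates = pairs[
    (abs(pairs["orig_id_left"] - pairs["orig_id_right"]) > 1)   # serial numbers not in series (2)
    & (pairs["date_left"] != pairs["date_right"])               # different survey dates (3)
    & (pairs["species_left"] == pairs["species_right"])         # same species (4)
    & (~pairs["ttype_left"].isin(["TDEF", "OT"]))               # exclude TDEF and OT (5)
]
print(candidates[["orig_id_left", "orig_id_right", "species_left"]])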
Linework for the Survey Area¶
1. Creation of Initial Linework in QGIS using Survey points import - (Ram, System 4)¶
Initial Linework in QGIS is started by the surveyor with the knowledge from the field. For this step, points are simply imported into QGIS from the field text file (.csv or .txt). The CRS needs to be TM-AUSPOS. The box "First record has field names" shall not be ticked. In Point Coordinates, select the correct field for x, y and z (usually "field_2" for x, "field_3" for y and "field_4" for z). Points can be styled using the "Categorized" style in "Symbology", using "field_5" as value, or using a rule-based symbology with the category (field 5) as filter.
Linework is created by connecting points having the same description and belonging to the same physical feature. All line and polygon features are created as lines.
The Initial Linework for the Survey Area is also stored temporarily in
D: > AVSM > Zone-Survey number (eg RZ-01) > Survey Area (eg J) > Temporary WD
Note: The line shapefiles / Geopackages shall be in CRS: TM AUSPOS¶
2. Creation of final working drawing Shapefiles / Geopackages - (Selvarani, System 1)¶
Final working drawing Shapefiles / Geopackages are created from the Initial Linework of Survey Area.
As the Surveyor draws all features as lines (both for line and polygon features), the following actions shall be done:
1. If features are lines:
• Export the shapefile / geopackage into the final working drawing folder (Final WD), in separate folders according to its type (e.g. BLDG, FENC, ROAD, etc).
The CRS for the export shall be EPSG:4326 - WGS 84¶
2. If features are polygons:
• Lines shall be converted into polygons:
To do it, first click on the layer to be converted to make it active (e.g. WD-CZ-01-F-LL84_V-BLDG-MHOL------E), then go to the "Vector" menu, click on Geometry Tools, then click on Lines to Polygons:
The new window for “Lines to Polygons” conversion will appear:
• Always cross check the input layer, to make sure that the input layer is the active one
• Save the output in a temporary layer
• The temporary layer will be listed in the list of layers; it shall be exported to the saving location as D: > Survey > Zone-Survey Number > Final WD > Survey Area SHP (e.g. D: > Survey > GB-01 > Final WD > A-Shp)
The CRS for the export shall be EPSG:4326 - WGS 84¶
Once all the shapefiles / geopackages are exported in Final WD, for each of the newly exported layers the Topology Checker Tool shall be used.
Linework for the whole Survey Zone¶
1. Merging Shapefiles / Geopackages - (Selvarani, System 1)¶
A copy of the Zone Master shapefiles / geopackages is taken from System 4 and stored in the Temp Folder on the Desktop of System 1.
Master shapefiles / geopackages are merged with the Survey Area shapefiles / geopackages:
• To do it, go to “Vector” Menu, click on Geoprocessing Tools, then click on Union:
The new window for “Union” will appear:
• To make sure that the right geometry is generated by this process ("Line" type, not "Multiline", and similarly "Polygon" type, not "Multipolygon"), we need to always keep the Master shapefile (e.g. Final-CZ-01-2021-02-05-LL84_V-BLDG-MHOL------E) as Input layer, and the Survey Area shapefile as Overlay Layer (e.g. WD-CZ-01-F-LL84_V-BLDG-MHOL------E).
• (The output can be saved to a file, as the CRS should already be EPSG:4326 – WGS 84.)
2. Storing Shapefiles / Geopackages - (Selvarani, System 1)¶
Save the merged shapefile in the correct location in the Final folder as D: > Survey > Zone-Survey Number > Final (e.g. D: > Survey > GB-01 > Final)
The date in the name of the Final Shapefile / Geopackage needs to be updated.
Once the merging operation is completed, the copy of Master shapefile / geopackage is deleted from the Temp folder.
3. Topology check of merged shapefiles¶
The topology checker is applied again on the merged shapefiles / geopackages.
The “id_field” shall be removed from the attribute table.
4. Archive and replace the Master Shapefiles / Geopackages (Ram, System 4)¶
Archive the previous master shapefiles / geopackages on System 4, and copy the new merged shapefiles / geopackages in their place.
Then delete the Merged Shapefile / Geopackage folder from System 1.
5. Note about Shapefiles and Geopackages¶
All the above work is usually done using the shapefile format, in the latest QGIS version (3.16.3).
The Geopackage export is done in QGis versions older than 3.12 (e.g. 3.4, 3.6, 3.8, 3.10) so that the lines are not saved as "Multilines" but as "Lines", and polygons are not saved as "Multipolygons" but as "Polygons". This is very important to note, as the Gisaf database does not accept the Multipolygon and Multiline geometry types.
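If a layer nevertheless ends up with Multiline / Multipolygon geometries, one possible workaround (a sketch, assuming GeoPandas; the file names are hypothetical, and the QGIS tool "Multipart to singleparts" does the equivalent) is to explode the multi-part features into single parts before import:
# Sketch: convert Multiline/Multipolygon geometries to single Lines/Polygons
# before importing into Gisaf (hypothetical file names).
import geopandas as gpd

gdf = gpd.read_file("Final-CZ-01-2021-02-05-LL84_V-BLDG.shp")

# One row per single-part geometry, with a fresh numerical id
gdf = gdf.explode(index_parts=False).reset_index(drop=True)
gdf["id"] = range(1, len(gdf) + 1)

print(gdf.geom_type.unique())   # should show only LineString / Polygon
gdf.to_file("Final-CZ-01-2021-02-05-LL84_V-BLDG-single.shp")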