{{toc}}
h1. %{color:BLUE} Wiki%
h2. About the Auroville CSR Geomatics Studio
We are a team working on geographical and related information with an engineering and scientific approach. The concept of *geomatics* is explained here: https://en.wikipedia.org/wiki/Geomatics.
We develop, maintain and publish data on this web site: https://gis.auroville.org.in.
h3. Team
Currently, the team consists of:
* Bala
* Giulio
* Philippe
* Raj
* Ram
* Selvarani
h3. Collaborations
A quick report of the collaborations of the Geomatics Team and its areas of work.
h1. Collaborations
Collaborations/coordination with other groups, etc.
|_.CSR Geomatics Description of Activity / Research / Project|_.AA|_.AVF|_.LB|_.FAMC|_.WG|_.CSR|_.Talam|_.DST|
|Topographic Survey of Auroville City Area in coordination with L’avenir d’Auroville: Matrimandir entire compound and Residential Zone Sector 1 completed, Sector 2 half completed|yes|||||||yes|
|Topographic Survey of specific project areas: Vibrance, Wasteless, Cultural Zone along the Crown Road, Baraka, Gardens of Unexpected | | | | | | | | |
| Collective wastewater treatment systems health check-up survey: 68 plants evaluated | yes | | | | yes | yes | | yes |
| Manual weekly monitoring of water level in approximately 50 selected wells on Auroville land (the number fluctuates depending on local conditions) | yes | | | | yes | | | yes |
| Collection of rainfall data through manual raingauges distributed to Aurovilians: data received regularly from at least 7 raingauges | | | | | yes | | | yes |
| Collection of weather data through automatic weather station installed at CSR: data collected every minute, stored in the database, and published online in real time | | | | | yes | yes | | yes |
| Collaboration with Land Board for survey of identified land boundary stones: collection of coordinates of Government boundary stones for georeferencing of cadastral maps | | yes | yes | | | | | |
| Collaboration with AV Foundation for compilation of land ownership map: geographic land records as provided by AV Foundation, protected by login access | | yes | | | | | | |
| Collaboration with L’avenir d’Auroville for data sharing and coordinated system set-up: organisation of geographic data for unique online platform to enhance planning processes (initial stage) | yes | | | | | | | |
| Collaboration with L’avenir d’Auroville and FAMC for database links: structural work on procedures for linking databases already existing with various agencies | yes | | | yes | | | | |
| Study of application of drone topographic survey in Auroville context: pilot test conducted on Residential Zone with outside drone survey agency | yes | | | | | yes | | yes |
| Study of open source software for drone topographic survey: testing of software such as OpenDroneMap, MeshLab, CloudCompare to evaluate workflow, processes and limitations | | | | | | yes | | yes |
| Surface water modelling (ongoing): research on evaluation of surface water collection potential through automatic processing of field data | | | | | | yes | | yes |
| Collaboration with TALAM on research on radio-transmission for automatic well level monitoring (ongoing): evaluation of application of radio-transmitted signals for water-related monitoring and metering (selected borewells, flow meters) | | | | | | | yes | yes |
| Organisation of data collected in Auroville in the past (ongoing): standardisation of data in geospatial database | | | | | | yes | | |
| Provision for basic statistical analyses of geographic-related data (indicators/dashboard) | | | | | | yes | | yes |
| Publication of collected data in standardised format on web portal gis.auroville.org.in | | | | | | yes | | yes |
* AA : L'avenir d'Auroville
* AVF : Auroville Foundation
* LB : Auroville Land Board
* FAMC : Auroville Funds and Assets Management Committee
* WG : Auroville Water Group
* CSR : Auroville Centre for Scientific Research
* TALAM : a project under CSR
* DST : Department of Science and Technology, Ministry of Science and Technology, New Delhi
h2. Our workflow
h2. Surveys
Most field surveys are executed by our team of surveyors, using advanced DGPS equipment.
Other surveyors might also be contracted.
h2. CAD
The survey data are imported into CAD software (AutoCAD/Civil 3D).
h1. Editing shapefiles in AutoCAD
h2. FEATURES CREATION
1. Assign a CRS to the drawing (TM-AUSPOS) (MAPCSLIBRARY command)
2. Create features in CAD (Points, lines, polygons)
3. Export shapefile (a) from CAD (Output > DWG to SDF) (Convert to LL84 – 3D)
h2. FEATURES IMPORT INTO DB
4. Create zip file of the shapefile
5. Upload into the GISAF Shapefiles Basket
6. Import the shapefile into DB
7. Delete the shapefile from Local Machine
h2. FEATURES EDITING
8. Open the table in QGis
9. Save as a shapefile (b) in TM AUSPOS CRS
10. In CAD, open a new drawing and assign the AUSPOS CRS
11. Import the shapefile (b) (MapImport) with all Object Data
12. Edit features
13. Export shapefile (a) from CAD (Output > DWG to SDF) with ONLY the id selected (Data Tab > Select Attributes > Object Data > Filename > id) (Convert to LL84 – 3D)
h2. FEATURES IMPORT INTO DB
14. Create zip file of the shapefile
15. Upload into the GISAF Shapefiles Basket
16. Import the shapefile into DB
17. Delete the shapefile from Local Machine
h2. QGis
h3. Conventions
h1. Shapefiles
We work with "QGis":https://en.wikipedia.org/wiki/QGIS , a widely used open source, free software for working on maps and geographical data.
"Shapefile":https://en.wikipedia.org/wiki/Shapefile is a standard file format for mapping, that Gisaf can import. QGis can open and save *shapefiles*.
We have defined some simple norms about these shapefiles for integration with Gisaf.
h2. Coordinate system
We use CRS SRID 32644 (EPSG:32644, WGS 84 / UTM zone 44N).
h2. Columns (attributes)
* All objects in a shapefile (layer) have a unique identifier named "id": numerical value.
h3. Field (attribute) names
* All fields are lower case (-UpperCase-, use: @lowercase@)
* They do not contain spaces, but underscores (-not this-, use: @but_that@)
* The field names cannot have more than 8 characters (-this_is_too_long-, use: @this_short@)
h3. Attribute types
* The dates are written in ISO format: @YYYY-MM-DD@ (eg. @1968-02-25@)
h3. Fields to remove
* If present, remove the fields containing the coordinates: northing, easting, elevation, latitude, longitude, area, length, etc. (these might be present when the data is imported from a spreadsheet)
h2. Foreign keys
We often deal with labels or categories of objects in a layer.
A common use case to explain: Alice creates a layer of stones, and wants to *tag* each stone with a rating, picked from a list of choices like _Beautiful_, _Interesting_, _Pretty_, etc.
For this kind of attribute, define a column like @rating_id@ (_something_ ending with @_id@) as a numerical value, and assign values 1, 2, 3, etc. The text is defined in another table (typically a CSV file) that looks like:
|_. id |_. name |
| 1 | Beautiful |
| 2 | Interesting |
| 3 | Pretty |
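Resolving such a foreign key outside Gisaf amounts to a simple join. A minimal sketch with GeoPandas (file and column names are hypothetical):
<pre>
import geopandas as gpd
import pandas as pd

# Hypothetical inputs for illustration
stones = gpd.read_file("stones.shp")   # has a numeric "rating_id" column
ratings = pd.read_csv("ratings.csv")   # columns: id, name (as in the table above)

# Attach the rating label to each stone via the foreign key
stones = stones.merge(ratings, left_on="rating_id", right_on="id",
                      suffixes=("", "_rating"))
print(stones[["rating_id", "name"]].head())
</pre>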
h2. Code
We have defined a standard set of codes that define the type of data. They can be found here (TODO: add link).
Add a column @code_name@, matching one of the codes, eg. @V25@ for TDEF.
h2. Surveyor
We keep a record of the people who carried out the surveys (the _surveyors_).
The shapefiles must contain an attribute @srvyr_id@, which refers to this table (TODO: add link).
h2. Accuracy
We keep a record of the accuracy of the surveys.
The shapefiles must contain an attribute @accur_id@, which refers to this table (TODO: add link).
h2. Date of survey
As nothing is absolutely permanent, it's also important to keep track of the date of the surveys: the shapefiles must contain an attribute @date@.
h2. Working with Gisaf
h3. Survey data
Raw survey data are contained in CSV files, typically downloaded from surveying instruments.
See more information about the process for the survey data (including a flow diagram): [[Survey data]]
h1. Survey data
h2. Workflow summary
p=. !https://redmine.auroville.org.in/attachments/download/4792/Data_workflow.png!
h2. Import basket
Gisaf provides an "import basket" dedicated for raw survey data, which is generated by land survey equipment (Leica's Total Station and RTK).
These are CSV files, like:
<pre>
100081,370633.969,1327742.157,51.187,,,
100083,370628.876,1327702.913,51.565,T52,,
100082,370628.729,1327720.019,51.261,T52,,
100081,370633.969,1327742.154,51.179,,,
100083,370628.876,1327702.913,51.565,T52,,
20800,370633.969,1327742.154,51.180,,,
20801,370618.795,1327713.172,52.817,E30,,
20802,370623.674,1327711.436,51.283,B35,,
20803,370619.314,1327713.407,51.383,B35,,
</pre>
Each category (5th column) must be defined in the Category table (see [[Categories]]).
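A quick way to inspect such a file is to load it with pandas, naming the columns after the layout above (treating the two trailing columns as unused is an assumption):
<pre>
import pandas as pd

# Column layout as in the sample above
cols = ["orig_id", "easting", "northing", "elevation", "category", "extra1", "extra2"]
df = pd.read_csv("Our_project-Some_comment-2018-02-23.txt", header=None, names=cols)

# Points without a category (empty 5th column) are typically station/control points
print(df["category"].dropna().unique())
</pre>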
h2. Organization of the raw survey data basket
The basket should be organized in a directory structure:
- Project name (these can be themselves put in a hierarchy of (sub)directories)
- Surveyor's organization
- Equipment (eg. TS, RTK)
- Survey files (eg. @Our_project-Some_comment-2018-02-23.txt@)
h3. Format of the survey file names
<pre>
Our_project-Some_comment-2018-02-23.txt
</pre>
The date of the survey follows the ISO date standard: @YYYY-MM-DD@.
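A sketch of parsing this naming convention (the pattern is inferred from the example above):
<pre>
import re
from datetime import date

# Assumed pattern: Project-Comment-YYYY-MM-DD.txt
name = "Our_project-Some_comment-2018-02-23.txt"
m = re.match(r"(?P<project>[^-]+)-(?P<comment>.+)-(?P<date>\d{4}-\d{2}-\d{2})\.txt$", name)
if m:
    survey_date = date.fromisoformat(m.group("date"))
    print(m.group("project"), m.group("comment"), survey_date)
</pre>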
h2. Import to the database
When importing raw survey data files to the database, Gisaf performs the two steps described below.
It's worth noting that, in this process, no reprojection is done.
h3. Feed the raw_survey table
Each point of the imported raw survey data file is inserted to the raw_survey table:
# Creation of a Point geometry: the raw_survey table has a geometry column for a single point (@geom@) with x,y and z coordinates
# Save the @id@ of the original point *to the @orig_id@ column*
# *A unique @id@ is computed* from the following fields: @id@, @project@, @equipment@, @date@
# The project is saved in the @project_id@ column
# The surveyor identification is saved in @srvyr_id@
# The date of survey is saved in the @date@ column
# The accuracy is tagged in the @accur_id@, according to a mapping defined in the @accuracyequimentsurveyormapping@ table, which depends on the surveyor and equipment
# The category of the point is saved
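Gisaf's exact scheme for the computed unique @id@ isn't documented here; the following is only an illustration of deriving a stable key from the four fields:
<pre>
import hashlib

def make_unique_id(orig_id, project, equipment, date):
    """Illustrative only: Gisaf's actual scheme may differ (e.g. a DB sequence)."""
    key = f"{orig_id}|{project}|{equipment}|{date}"
    # Truncated SHA-256 rendered as an integer, one possible stable encoding
    return int(hashlib.sha256(key.encode()).hexdigest()[:12], 16)

print(make_unique_id(20801, "Our_project", "RTK", "2018-02-23"))
</pre>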
h3. Feed the @RAW_V_*@ tables
From the @raw_survey@ table, each point is then copied to its respective @RAW_V_*@ table, with essentially the same information.
These tables only contain points; as of this writing, they should be created manually or with the admin notebook called @create_tables@.
The project is saved along: see below.
h2. Import the points
For categories that define points (as opposed to lines and polygons, which require _line work_ carried out in CAD or in a GIS software, see [[Line work]]), the points can be imported automatically to their final destination: the @V_*@ tables.
Note: in this process, the geometries are reprojected.
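For illustration, such a reprojection can be reproduced with GeoPandas (the layer name is hypothetical; EPSG:32644 is the CRS we use for shapefiles):
<pre>
import geopandas as gpd

gdf = gpd.read_file("raw_points.shp")   # hypothetical source layer
gdf_utm = gdf.to_crs(epsg=32644)        # reproject to WGS 84 / UTM zone 44N
print(gdf_utm.crs)
</pre>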
h3. Auto import of the points
The points found in the @RAW_V_*@ tables can be imported automatically, project by project, from the project page of the admin interface.
h3. Import of the line work (lines and polygons)
See [[Line work]]
The shapefiles generated manually (line work) should be put in the project's basket, and imported from it.
h3. Categories
The categories define the types of the geographical features; they are mapped according to standard layer naming conventions (US National CAD Standard): see https://www.nationalcadstandard.org/ncs5/pdfs/ncs5_clg_lnf.pdf
Gisaf uses:
* a table @category@ where the layers are defined
* a table per category
h2. Fields for the categories
TODO
h2. Creation of the RAW_* tables
This step must be done manually (as of this writing).
h3. QGis: work on shapefiles
Go to [[shapefiles]].
h1. Data analysis
We use "Jupyter":https://jupyter.org , "Pandas":https://pandas.pydata.org/ and "GeoPandas":http://geopandas.org/ , accessible at http://gis.auroville.org.in/notebooks .
For integration in the processes (execution of notebooks), there's "papermill":https://github.com/nteract/papermill . Systemd "timers":https://wiki.archlinux.org/index.php/Systemd/Timers are used to schedule the notebooks automatically on the server, eg. for the dashboards.
There's a dedicated virtual machine for Jupyter, accessible from our local network at @jupyter.csr.av@.
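As a minimal sketch, a scheduled job can run a notebook through papermill's Python API (paths and parameters below are assumptions, not the actual setup):
<pre>
import papermill as pm

pm.execute_notebook(
    "dashboard.ipynb",             # source notebook (hypothetical path)
    "out/dashboard-latest.ipynb",  # executed copy, kept for inspection
    parameters={"days": 7},        # papermill injects these into the notebook
)
</pre>
A systemd timer can then call a small wrapper script with this snippet at the desired schedule.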
h2. Organization of notebooks
The setup is organized in 2 parts, that are run with 2 instances of Jupyter for security reasons.
h3. Admin
The notebooks in the admin are mostly for maintenance: operations on the database, etc.
h3. Users
The notebooks are organized in folders, all under Gisaf's source code git repository, except the "Sandbox" one.
This notebook server connects to the database with a specific user (@jupyter@), which has been set on the database server with permissions to read all data (@readonly@) plus has write access to some tables dedicated to store analysis results.
h2. Integration with Gisaf
The notebook in @Templates@ demonstrates the usage of notebooks with Gisaf: mostly, how to use the @gisaf.ipynb_tools@ module to access Gisaf models and the data from the database.
This module is part of gisaf: https://redmine.auroville.org.in/projects/gisaf/repository/revisions/master/entry/gisaf/ipynb_tools.py
h2. References
h3. Geopandas
Some nice examples of processing, using watershed and rain data: https://geohackweek.github.io/vector/06-geopandas-advanced/
h3. Integration
A good example of how a company has integrated the same tools: https://medium.com/netflix-techblog/scheduling-notebooks-348e6c14cfd6
h2. Other docs
h3. From Swathi
h2. Hosting
The team is located in the CSR of Auroville.
We have set up a server for hosting the software and database. See [[CSR_server]] for technical information about the setup.
h1. CSR server
dream.csr.av (192.168.0.12)
- Debian 9
- Xen hypervisor
- libvirt for the orchestration of VMs
- management with ansible
h2. Dom0
h3. Installation
We found minor issues with the installation (eg. issues with HDDs, Dell EFI boot).
This document starts from a working Debian Xen server.
-Installed on a mirror of 2*2TB drives with btrfs.-
Update (see #7156): re-installed the OS on /dev/sdc2, ext4, without RAID/LVM.
h3. Storage for domUs
An LVM volume group @dream.csr@ has been created for the domUs.
h3. Networking
With systemd-networkd: bridge @br0@ to the physical ethernet.
h3. Create a VM
Adjust the parameters from:
<pre>
export vm_name=infra.csr.av
export vm_ip_addr=172.16.0.3
export vm_password=foobar
</pre>
Create the domU:
<pre>
root@dream:~# xen-create-image --broadcast=172.16.0.255 --dist=stretch --fs=ext4 --gateway=172.16.0.1 --hostname ${vm_name} --ip=${vm_ip_addr} --lvm=dream.csr --maxmem=512M --memory=256M --mirror=http://ftp.de.debian.org/debian/ --netmask=255.255.255.0 --password=${vm_password} --size=10G --swap=1G --vcpus=1 --bridge=br0
</pre>
Note that the IP address will be set inside the VM later (see the network config below), so @vm_ip_addr@ isn't actually used.
h2. DomUs
h3. Migrate XL to libvirt
After creation using xen-create-image, migrate the definition of the domU to libvirt:
<pre>
virsh -c xen:/// domxml-from-native xen-xm /etc/xen/${vm_name}.cfg > /tmp/${vm_name}.xml
virsh define /tmp/${vm_name}.xml
</pre>
From this point onward, one can log out from the dom0's console and use virsh or "Virtual Machine Manager" from your computer to administer the VM, eg:
* Set "Auto start" on dom0 boot
* Set memory limits, etc
h3. Start the domU
Use "Virtual Machine Manager" or the command:
<pre>
virsh -c xen+ssh://root@dream.csr.av/ start ${vm_name}
</pre>
h3. To do in a libvirt shell
Start a libvirt shell, with "Virtual Machine Manager" or with the command:
<pre>
virsh -c xen+ssh://root@dream.csr.av/ console ${vm_name}
</pre>
Log in as root in the libvirt console.
h4. Network config
Add @/etc/systemd/network/50-wired.network@ (adjust the IP):
<pre>
[Match]
Name=eth0
[Network]
Address=192.168.0.14/24
Gateway=192.168.0.10
DNS=192.168.0.10
</pre>
Then:
<pre>
systemctl enable --now systemd-networkd.socket
systemctl enable --now systemd-networkd.service
systemctl disable --now networking.service
</pre>
h4. Post-install ssh config
Allow ssh root login with password in @/etc/ssh/sshd_config@:
<pre>
sed -i -e 's/#PermitRootLogin prohibit-password/PermitRootLogin yes/' /etc/ssh/sshd_config
systemctl restart ssh.service
</pre>
From this point onwards, one can close the console session opened via @libvirt@.
h3. DNS
Log in to the local DNS server with:
<pre>
ssh root@infra.csr.av
</pre>
Update the 2 files in @/etc/bind/zones@ (@zones/db.csr.av@ @zones/db.192.168.0@) and reload the DNS with:
<pre>
rndc reload
</pre>
h3. Allow yourself to log in easily with your ssh key
Copy your ssh key to the domU: run from your own computer:
<pre>
ssh-copy-id root@${ip_addr}
</pre>
h2. Ansible
Using the Ansible project developed by Blue Light: https://redmine.bluelightav.org/projects/ansible
h3. Prepare the host
Install Python:
<pre>
apt install -y python
</pre>
h2. Database
The Postgis database runs on its specific domU (gisdb.csr.av, 192.168.0.18).
h3. Installation
After installing the Postgis package (eg. by assigning the @postgis@ Ansible role), continue with [[Db-support]].
h2. Jupyter
The Jupyter server runs on its specific domU (jupyter.csr.av, 192.168.0.19).
h3. Installation
See #6990 .
h3. Backup and restoration of the database
h1. Database
h2. Troubleshooting
h3. Layers missing in the map's tree
Gisaf relies on counting features through the Postgres statistics collector subsystem.
If the server is restarted *dirty* (eg. without a clean shutdown), the table counts might be wrong or just 0, leaving the layers apparently empty and thus not appearing at all.
The fix is as easy as:
<pre>
sudo -u postgres psql avgis -c VACUUM
</pre>
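To check whether the statistics are stale before vacuuming, the planner's row estimate can be compared with an actual count (the table name here is just an example):
<pre>
# Planner estimate (may be 0 or wrong after a dirty restart)
sudo -u postgres psql avgis -c "SELECT relname, reltuples::bigint FROM pg_class WHERE relname = 'raw_survey';"
# Actual count, for comparison
sudo -u postgres psql avgis -c "SELECT count(*) FROM raw_survey;"
</pre>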
h2. Installation
This documentation assumes that the Postgis package has been installed (see [[CSR_server#Database]]).
h3. Configure the server
h4. Allow connections from other hosts in the local network
Set the server to listen on all addresses: set @listen_addresses@ to @*@ in @/etc/postgresql/9.6/main/postgresql.conf@.
Allow the connections by adding to @/etc/postgresql/9.6/main/pg_hba.conf@:
<pre>
host all all 192.168.0.0/24 md5
</pre>
h2. Creation of the database
As @postgres@ user:
<pre>
createdb -E utf8 -T template0 avgis
</pre>
h2. Backups
h3. Primary
The database is backed up every day at midnight. The dump file is located in @/var/backups/postgres/@.
h3. Secondary
There are other backups (daily, weekly, monthly) thanks to the Debian package @autopostgresqlbackup@, located by default in @/var/lib/autopostgresqlbackup@.
h3. Tertiary (dom0)
The whole virtual machine is backed up by BackupNinja on the "dom0" controller, using:
- rdiff backups every day
- tar files on Saturdays.
See @/etc/backups.d@ on the dom0 (192.168.0.12).
h3. Remote
TODO: remote backup.
h2. Restoration
If the VM is not shut down properly, there's a chance that the database is corrupt and needs to be restored from one of the backups.
After the restoration, restart gisaf:
<pre>
systemctl restart uwsgi.service
</pre>
h3. From primary backup
*Note*: the roles aren't restored with this method.
With user @postgres@:
<pre>
# Optionally, rename the corrupt database (selecting a name for a database like "avgis_c2")...
psql -c "ALTER DATABASE avgis RENAME TO avgis_c2;"
# ... or drop the existing database
psql -c "drop database avgis;"
# Create a new database:
createdb -E utf8 -T template0 avgis
# Restore the database
pg_restore -d avgis /var/backups/postgres/avgis.pg_dump
</pre>
h3. From secondary backup
@autopostgresqlbackup@ backs up the roles in @postgres_globals@.
<pre>
zcat /var/lib/autopostgresqlbackup/daily/postgres_globals/postgres_globals_2018-10-24_06h25m.Wednesday.sql.gz | psql
zcat /var/lib/autopostgresqlbackup/daily/avgis/avgis_2018-10-24_06h25m.Wednesday.sql.gz | psql
</pre>
h2. Gear
h3. Survey equipment
See [[survey equipment]]
h3. Weather station
See [[Ambient_Weather_weather_station]]
h1. Ambient Weather weather station
We have purchased a WS2902A weather station (https://www.ambientweather.com/amws2902.html).
Firmware version: 4.0.2.
h2. Manual
The operating manual of the weather station can be found at https://p10.secure.hostingprod.com/@site.ambientweatherstore.com/ssl/Manuals/WS-2902C.pdf
h2. Connection
h3. Wifi
Set up the wifi of the console using the "Ambient Tool" phone application. The IP address given by DHCP on the router is: 192.168.1.101
h3. Local communication
No success so far: the only exposed port is TCP/45000. Telnet doesn't show any activity, and nothing was found on the Internet about this protocol.
One interesting project may help, hijacking the connection to the cloud services: https://www.wxforum.net/index.php?topic=35033.0
h3. Cloud connection
We'll create an account on AmbientWeather.net (and eventually on WUnderground.net and/or weathercloud.net), and:
* have the console upload data there
* have Gisaf retrieve our weather station data from there
h1. Notes on Console
The daily rainfall data displayed on the console resets at 00:30 every night.
h2. Plan for future
Beside living well, see [[plan]].
h2. Other
* [[GDAL (OGR) tools]]
h2. Links and references
[[links]]
h2. Old docs
[[Shapefiles]]
[[Data (measurements, auxiliary tables)]]
h2. Pavneet's docs (imported from gisaf's wiki)
[[Rules of Map making - What all Maps should have!]]
[[Survey Database]]
[[Field to finish]]
[[Survey Data Post-Processing]]
[[Wells Documentation]]
[[Civil 3D useful commands]]
[[Online references for Civil 3D]]
[[connections in QGIS- using browser panel and Add postGIS]]
[[Reconcilation of Raw survey data using pgAdmin]]
[[importing RAW data to GISAF]]
[[Editing Z value of features in Shapefiles in QGIS]]
[[Miscellaneous- Civil 3D]]
[[Documentation- Rain Gauge]]
[[Documentation- Wells Monitoring (Piezometer by Bala)]]
[[Documentation- Flow meter, by Bala]]
[[Documentation- DST- Vegetation Indexing]]
[[Documentation- DST- Interpolation]]
[[Documentation- DST- Survey- Office workflow]]
[[From CAD to GIS, by Giulio]]
[[QGIS- Miscellaneous]]
h2. Giulio's documentation
[[Documentation - Reconciliation of points using Gisaf]]
[[Documentation - Status and Status Changes]]
[[Documentation - Tags retained after re-import of same geometry]]
h1. Access to data
h2. Connection to server directly from CSR
To connect to the server directly, without going through the Aurinoco server, the correct URL is:
http://gis.csr.av
h2. Connection to Gisaf via QGis through WFS / OGC API
This works only in QGis version 3.14.15 onward.
# In the Browser, click on WFS / OGC API, then right-click to create a new connection.
# Give a name (e.g. OGC API Qgis Gisaf).
# Give the URL https://gis.auroville.org.in/ogcapi
# Under the WFS Options box, in the Version dropdown, the default option "Maximum" works just fine.
# Click on OK.
The list of layers will appear in the Browser under WFS / OGC API.
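The same layer list can also be fetched programmatically from the standard OGC API Features @/collections@ route; a sketch using @requests@ (whether authentication is needed is an assumption to verify):
<pre>
import requests

r = requests.get("https://gis.auroville.org.in/ogcapi/collections",
                 headers={"Accept": "application/json"}, timeout=30)
r.raise_for_status()
for coll in r.json().get("collections", []):
    print(coll["id"], "-", coll.get("title", ""))
</pre>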
h1. How to create a new projection in QGis
To create a new projection in QGis, go to menu "Settings", and click on "Custom Projections".
A pop-up window appears with a list of all projections defined in QGis projects used by the user so far.
Click on the green "+" sign on the right top part of the window to create a new projection.
In the "Name" box, type "TM CSRAUSPOS SF1" (which means TM = Transverse Mercator projection; CSRAUSPOS = theparameters for this projection are derived from the processing of DGPS raw data by AUSPOS - Online GPS Processing Service - https://www.ga.gov.au/scientific-topics/positioning-navigation/geodesy/auspos; SF1 = Scale Factor is 1).
In the "Format" dropdown list, select "Proj String (legacy - Not Recommended)"
In the "Parameters" box, paste the following "+proj=tmerc +lat_0=12.01605433+lon_0=79.80998934 +k=1 +x_0=370455.630 +y_0=1328608.994 +ellps=WGS84+towgs84=0,0,0,0,0,0,0 +units=m +no_defs".
Finally, click on OK.
In a more explicit way, the parameters mean the following:
|_. Parameter |_. Value |
| Map Projection | Transverse Mercator (TM) |
| False Easting | 370455.6300 |
| False Northing | 1328608.9940 |
| Latitude of Origin | 12°00'57.79560" (DMS), 12.01605433 (DD) |
| Central Meridian | 79°48'35.96164" (DMS), 79.80998934 (DD) |
| Scale Factor | 1.00000000 |
| Zone Width | 6.0° |
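As a sanity check, the definition can be loaded with pyproj: transforming the false easting/northing should give back the origin coordinates:
<pre>
from pyproj import CRS, Transformer

# The custom TM CSRAUSPOS SF1 definition from above
tm_auspos = CRS.from_proj4(
    "+proj=tmerc +lat_0=12.01605433 +lon_0=79.80998934 +k=1 "
    "+x_0=370455.630 +y_0=1328608.994 +ellps=WGS84 "
    "+towgs84=0,0,0,0,0,0,0 +units=m +no_defs")

to_wgs84 = Transformer.from_crs(tm_auspos, "EPSG:4326", always_xy=True)
lon, lat = to_wgs84.transform(370455.630, 1328608.994)
print(lon, lat)  # expected: ~79.80998934, ~12.01605433
</pre>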
h1. Elimination of Duplicate points – General criteria
The same physical feature (e.g. a tree or a pole) may be surveyed more than once: this can happen because there are many physical features in an area, and the survey needs more than one station. For example, a tree is surveyed from a station and gets a serial number on that date. When the station is changed, the same tree might be resurveyed: another serial number is given, and possibly a different date, if the survey from the second station happened on a different day.
The same tree is then represented by two different points, which suggests that two different trees exist: but only one tree really exists in physical reality.
One of the two points is redundant and needs to be removed. If this is noted by the surveyor directly in the field, the issue is solved by the surveyor himself at processing time.
If instead, for various reasons, it was not noted by the surveyor in the field, it will need to be cleaned up after processing, possibly by post-processing staff.
How to identify duplicate points?
The following criteria can be used:
1. The distance between the two points is less than 30 cm (trees are surveyed if their trunk diameter is at least about 20 cm, so two of them cannot exist within 30 cm)
2. The orig_id (serial numbers) of the points are not in series
3. The survey date is not the same
4. In case of trees, the species of the trees is the same
5. In case of trees, the tree type is not TDEF (because TDEF are mapped irrespective of their diameter, so they can actually have a small trunk, and two of them might exist within 30 cm) and not OT (many TDEF species are surveyed as OT if not otherwise indicated by a botanist)
6. The context needs to be evaluated: if one tree is deleted in an area where many trees exist in a limited space, then losing one on the map is not a big error. If instead one tree is deleted where there are very few trees, it might be a big loss.
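Criteria 1-4 lend themselves to automation; a hedged GeoPandas sketch (layer and column names are assumptions):
<pre>
import geopandas as gpd

trees = gpd.read_file("trees.shp")     # projected CRS with metre units
buffered = trees.copy()
buffered["geometry"] = buffered.geometry.buffer(0.3)   # criterion 1: < 30 cm

pairs = gpd.sjoin(buffered, trees, predicate="intersects",
                  lsuffix="a", rsuffix="b")
pairs = pairs[pairs.index < pairs["index_b"]]          # drop self/mirrored pairs

candidates = pairs[
    (abs(pairs["orig_id_a"] - pairs["orig_id_b"]) > 1)  # 2: ids not in series
    & (pairs["date_a"] != pairs["date_b"])              # 3: different survey date
    & (pairs["species_a"] == pairs["species_b"])        # 4: same species
]
print(candidates[["orig_id_a", "orig_id_b"]])
</pre>
Criteria 5 and 6 remain a manual judgement on the flagged pairs.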
h1. Linework for the Survey Area
h2. 1. Creation of Initial Linework in QGIS using Survey points import - (Ram, System 4)
Initial linework in QGIS is started by the surveyor, with knowledge from the field. For this step, points are simply imported into QGIS from the field text file (.csv or .txt). The CRS needs to be TM-AUSPOS. The box “First record has field names” shall not be ticked. In Point Coordinates, select the correct field for x, y and z (usually “field_2” for x, “field_3” for y and “field_4” for z). Points can be styled using the “Categorized” style in “Symbology”, using “field_5” as value, or using a rule-based symbology with the category (field 5) as filter.
Linework is created by connecting points having same description and belonging to the same physical feature. *All line and polygon features are created as lines*.
The Initial Linework for the Survey Area is also stored temporarily in
+D: > AVSM > Zone-Survey number (eg RZ-01) > Survey Area (eg J) > Temporary WD+
h2. Note: The line shapefiles / Geopackages shall be in CRS: TM AUSPOS
h2. 2. Creation of final working drawing Shapefiles / Geopackages - (Selvarani, System 1)
Final working drawing Shapefiles / Geopackages are created from the Initial Linework of Survey Area.
As the Surveyor draws all features as lines (both for line and polygon features), the following actions shall be done:
1. *If features are lines:*
• Export the shapefile / geopackage into the final working drawing folder (Final WD), in separate folders according to its type (e.g. BLDG, FENC, ROAD, etc).
h2. The CRS for the export shall be EPSG:4326 - WGS 84
2. *If features are polygons:*
• Lines shall be converted into polygons:
To do it, first click on the layer to be converted to make it active (e.g. WD-CZ-01-F-LL84_V-BLDG-MHOL------E), then go to the “Vector” menu, click on Geometry Tools, then click on Lines to Polygons:
!https://redmine.auroville.org.in/attachments/download/9760/Line%20to%20Polygon%20Menu.png!
The new window for “Lines to Polygons” conversion will appear:
!https://redmine.auroville.org.in/attachments/download/9762/Lines%20to%20Polygon%20Window.png!
• Always cross check the input layer, to make sure that the input layer is the active one
• Save the output in a temporary layer
• The temporary layer will be listed in the list of layers; it shall be exported to the saving location +D: > Survey > Zone-Survey Number > Final WD > Survey Area SHP+ (eg. D: > Survey > GB-01 > Final WD > A-Shp)
h2. The CRS for the export shall be EPSG:4326 - WGS 84
Once all the shapefiles / geopackages are exported in Final WD, for each of the newly exported layers the Topology Checker Tool shall be used.
h2. Linework for the whole Survey Zone
h2. 1. Merging Shapefiles / Geopackages - (Selvarani, System 1)
A copy of the Zone Master shapefiles / geopackages is taken from System 4 and stored in the Temp folder on the Desktop of System 1.
Master shapefiles / geopackages are merged with the Survey Area shapefiles / geopackages:
• To do it, go to “Vector” Menu, click on Geoprocessing Tools, then click on Union:
!https://redmine.auroville.org.in/attachments/download/9763/Union%20Menu.png!
The new window for “Union” will appear:
!https://redmine.auroville.org.in/attachments/download/9764/Union%20Window.png!
• To make sure that the right geometry is generated by this process (“Line” type, not “Multiline”, and similarly “Polygon” type, not “Multipolygon”), we need to always keep the *Master shapefile* (e.g. Final-CZ-01-2021-02-05-LL84_V-BLDG-MHOL------E) *as Input layer*, and the Survey Area shapefile as Overlay Layer (e.g. WD-CZ-01-F-LL84_V-BLDG-MHOL------E).
• The output can be saved to a file, as the CRS should already be EPSG:4326 – WGS 84.
h2. 2. Storing Shapefiles / Geopackages - (Selvarani, System 1)
Save the merged shapefile in the correct location in the Final folder: +D: > Survey > Zone-Survey Number > Final+ (eg. D: > Survey > GB-01 > Final)
The date in the name of the Final Shapefile / Geopackage needs to be updated.
Once the merging operation is completed, the copy of the Master shapefile / geopackage is deleted from the Temp folder.
h2. 3. Topology check of merged shapefiles
The topology checker is applied again on the merged shapefiles / geopackages.
The “id_field” shall be removed from the attribute table.
h2. 4. Archive and replace the Master Shapefiles / Geopackages (Ram, System 4)
Archive the previous master shapefiles / geopackages on System 4, and copy the new merged shapefiles / geopackages in their place.
*Then delete the Merged Shapefile / Geopackage folder from System 1.*
h2. 5. Note about Shapefiles and Geopackages
All the above work is usually done using the shapefile format, in the latest QGIS version (3.16.3).
The Geopackage export is done in QGis versions older than 3.12 (e.g. 3.4, 3.6, 3.8, 3.10) so that lines are not saved as “Multilines” but as “Lines”, and polygons are not saved as “Multipolygons” but as “Polygons”. This is very important to note, as the Gisaf database does not accept the Multipolygon and Multiline geometry types.
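As an alternative to exporting from an older QGis, multipart geometries can be exploded to singlepart with GeoPandas, for example (file names are hypothetical):
<pre>
import geopandas as gpd

gdf = gpd.read_file("merged_layer.shp")
single = gdf.explode(index_parts=False).reset_index(drop=True)
print(single.geom_type.unique())   # should report only LineString / Polygon
single.to_file("merged_layer_single.shp")
</pre>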