In my experience working with datasets from borehole dataloggers, there is a lot of scope for automating the processing and storing the data in more appropriate ways.
Years of data at fixed intervals may be retrieved, then processed and appended to a master dataset. This has previously been carried out in Excel.
With the limitations of the .xlsx file type and the potential for duplication or slow performance on older hardware, the 'single source of truth' alternative provided by a database offers a welcome change.
Producing an easy-to-use interface capable of performing these tasks was the intention of this project, which I hope to continue working on to add support for more logger types.
The datasets I have worked with use In-Situ dataloggers. For these there are two options for retrieving the data. Both methods produce differently formatted export CSV files, which forces slightly different processing techniques.
Processing was first prototyped in Python, interacting with the ArcGIS API. A simple Python script processed the downloaded datasets by skipping a number of header rows, then performing the required calculations using the latest dip reading and barologger compensation for atmospheric pressure.
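A rough sketch of that step is below, using pandas. The number of header rows, the column names and the kPa units are assumptions about the In-Situ export format rather than the exact files, and the dip-based datum calculation is one common way of doing the compensation, not necessarily the only one.

```python
import pandas as pd

# Conversion from pressure in kPa to metres of water column (rho = 1000 kg/m3, g = 9.80665 m/s2).
KPA_TO_M = 1 / 9.80665

def process_export(logger_csv, baro_csv, dip_reading_m, dip_time, header_rows=8):
    """Compensate a logger export against a barologger export and convert to
    depth-to-water using the most recent manual dip.

    header_rows and the column names below are placeholders for the real
    In-Situ export layout; dip_time is a pandas Timestamp near the dip reading.
    """
    logger = pd.read_csv(logger_csv, skiprows=header_rows, parse_dates=["DateTime"])
    baro = pd.read_csv(baro_csv, skiprows=header_rows, parse_dates=["DateTime"])

    # Align the barometric record to the logger timestamps.
    merged = pd.merge_asof(
        logger.sort_values("DateTime"),
        baro.sort_values("DateTime")[["DateTime", "Pressure_kPa"]],
        on="DateTime",
        suffixes=("", "_baro"),
    )

    # Height of water above the sensor after barometric compensation.
    merged["head_m"] = (merged["Pressure_kPa"] - merged["Pressure_kPa_baro"]) * KPA_TO_M

    # Use the dip reading to fix the sensor depth below the datum,
    # then convert every reading to a depth-to-water value.
    head_at_dip = merged.loc[(merged["DateTime"] - dip_time).abs().idxmin(), "head_m"]
    sensor_depth_m = dip_reading_m + head_at_dip
    merged["depth_to_water_m"] = sensor_depth_m - merged["head_m"]

    return merged[["DateTime", "depth_to_water_m"]]
```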
The results are stored both in a processed CSV file for that time step and pushed up to a feature layer on ArcGIS Online, where the data can be used for visualisation on web maps and apps.
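The push uses the ArcGIS API for Python and looks roughly like the following; the item ID and the field names are placeholders that would need to match the hosted layer's actual schema.

```python
from arcgis.gis import GIS

def push_to_feature_layer(df, item_id, username, password):
    """Append processed readings to a hosted feature layer on ArcGIS Online.

    item_id, "datetime" and "depth_to_water_m" are placeholders for the
    hosted layer's schema.
    """
    # The sign-in and item lookup is the slow step mentioned below.
    gis = GIS("https://www.arcgis.com", username, password)
    layer = gis.content.get(item_id).layers[0]

    adds = [
        {
            "attributes": {
                # ArcGIS date fields expect epoch milliseconds.
                "datetime": int(row.DateTime.timestamp() * 1000),
                "depth_to_water_m": float(row.depth_to_water_m),
            }
        }
        for row in df.itertuples(index=False)
    ]
    return layer.edit_features(adds=adds)
```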
This technique works well, aside from a noticeably slow connection and authorisation step when pushing the data. The real issue came when planning a more ambitious project around this processing logic.
I wanted the user to be able to collect data throughout the day using an ArcGIS Collector app. This would simply take the CSV files as attachments and perform the processing on a webhook-style backend.
For this, I set up a small Flask endpoint that polls the feature layer at a set interval for recent changes. Once a change is detected, the interval is reduced and the boreholes associated with the site are processed. This continues until all boreholes have been processed.
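A stripped-down sketch of that backend is below. The two helper functions are hypothetical stand-ins for the real ArcGIS attachment query and the processing routine sketched earlier, and the interval values are purely illustrative.

```python
import threading
import time

from flask import Flask

app = Flask(__name__)

# Illustrative polling intervals: slow while idle, faster once a change is seen.
IDLE_INTERVAL_S = 300
ACTIVE_INTERVAL_S = 30


def layer_has_new_attachments():
    """Hypothetical stand-in for querying the feature layer for new CSV attachments."""
    return False


def process_pending_boreholes():
    """Hypothetical stand-in for the processing routine; returns boreholes still pending."""
    return 0


def poll_feature_layer():
    # Background loop: watch the collector layer and process new boreholes for the site.
    interval = IDLE_INTERVAL_S
    while True:
        if layer_has_new_attachments():
            interval = ACTIVE_INTERVAL_S        # a change was detected: poll faster
        if interval == ACTIVE_INTERVAL_S and process_pending_boreholes() == 0:
            interval = IDLE_INTERVAL_S          # all boreholes processed: slow back down
        time.sleep(interval)


@app.route("/health")
def health():
    # Lets whoever is on site confirm the backend is actually running.
    return {"status": "ok"}


if __name__ == "__main__":
    threading.Thread(target=poll_feature_layer, daemon=True).start()
    app.run(host="0.0.0.0", port=5000)
```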
In theory, this would allow the user to see the resulting graph by the time they leave the site.
A working version was developed using this methodology; however, it felt clunky and required someone to ensure the Flask endpoint was running during a site visit.
It would also require ArcGIS Online layers, which are a hassle both to set up and in the cost of storing the data.
This led me to create a local, database-style collection app with support for PostgreSQL. The same processing logic was kept, rewritten in Dart for use with Flutter to create a visually appealing interface.
The design is simple and user-friendly. Once a site is added along with the associated boreholes, the processing tab just requires the CSV export from the logger, the barologger export and the most recent dip reading.
Having the data stored as a 'single source of truth' was one of the main goals of the project. Without using an ArcGIS feature layer the next option was to use PostgreSQL.
Adding support for this allows the user either to use the device's local database or to connect to a server running PostgreSQL. The application can then create the required tables at the press of a button. Without these tables, the Postgres connection will not work. With Postgres, the application works as it does with a local database, such as adding sites, boreholes and their information; however, the Postgres side has had more development put into it.
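As a rough illustration of what the 'create tables' button does, the sketch below creates a minimal schema with psycopg2. The table and column names here are assumptions for illustration, not the app's actual schema.

```python
import psycopg2

# Assumed schema: sites, their boreholes, and the processed readings per borehole.
SCHEMA_SQL = """
CREATE TABLE IF NOT EXISTS sites (
    site_id   SERIAL PRIMARY KEY,
    name      TEXT NOT NULL
);

CREATE TABLE IF NOT EXISTS boreholes (
    borehole_id SERIAL PRIMARY KEY,
    site_id     INTEGER NOT NULL REFERENCES sites (site_id),
    name        TEXT NOT NULL,
    latitude    DOUBLE PRECISION,
    longitude   DOUBLE PRECISION
);

CREATE TABLE IF NOT EXISTS readings (
    borehole_id      INTEGER NOT NULL REFERENCES boreholes (borehole_id),
    reading_time     TIMESTAMPTZ NOT NULL,
    depth_to_water_m DOUBLE PRECISION NOT NULL,
    PRIMARY KEY (borehole_id, reading_time)
);
"""

def create_tables(dsn):
    """Create the tables the app expects; safe to re-run thanks to IF NOT EXISTS."""
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(SCHEMA_SQL)
```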
For example, with a Postgres connection the user can visualise the borehole locations on a map for any given site to gain a quick overview of their positions.
On this map view, the user can export the positions as a .KML file. This was chosen because, in my experience, site workers use Google Earth with loaded layers for finding objects of interest.
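The KML itself is simple enough to write directly; a minimal sketch, assuming the borehole positions come back from the database as name, latitude and longitude tuples:

```python
def export_kml(boreholes, path):
    """Write borehole positions to a .KML file for loading into Google Earth.

    `boreholes` is assumed to be a list of (name, latitude, longitude) tuples
    pulled from the boreholes table.
    """
    placemarks = "\n".join(
        f"    <Placemark>\n"
        f"      <name>{name}</name>\n"
        f"      <Point><coordinates>{lon},{lat},0</coordinates></Point>\n"
        f"    </Placemark>"
        for name, lat, lon in boreholes
    )
    kml = (
        '<?xml version="1.0" encoding="UTF-8"?>\n'
        '<kml xmlns="http://www.opengis.net/kml/2.2">\n'
        "  <Document>\n"
        f"{placemarks}\n"
        "  </Document>\n"
        "</kml>\n"
    )
    with open(path, "w", encoding="utf-8") as f:
        f.write(kml)
```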
The other option is to click a pin on the map, which shows basic information about the borehole along with a 'Navigate To' option that opens the map service for the current device.
Using Postgres allows for fast graph visualisation of data, which is streamed directly to the device from the database. A time range can be set to limit the data being viewed. This means the data can be viewed directly after being pushed during processing.
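Under the hood this boils down to a time-bounded query against the readings table. Shown here in Python against the assumed schema above; the app itself performs the equivalent query from Dart.

```python
import psycopg2

def fetch_readings(dsn, borehole_id, start, end):
    """Fetch readings for one borehole within a time range, ready to plot."""
    query = """
        SELECT reading_time, depth_to_water_m
        FROM readings
        WHERE borehole_id = %s
          AND reading_time BETWEEN %s AND %s
        ORDER BY reading_time
    """
    with psycopg2.connect(dsn) as conn, conn.cursor() as cur:
        cur.execute(query, (borehole_id, start, end))
        return cur.fetchall()
```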
A reason for shifting away from Python for this project was to develop something that operates as simply as possible for the user. Flutter offers the opportunity to easily produce a multiplatform application.
As a result, the app can be used on both laptops and mobile devices. Testing has been conducted on Android tablets and Windows devices.
With the possibility of use on both desktop and mobile, this application can be used as a direct tool on site to push data to the cloud, where it can be viewed before even setting foot off site.
Other users can view and interrogate the data using desktop devices back in the office and even view the results as they come through live.
The code for this project as well as release version can be found on my GitHub in the Borehole Datalogging App repository.
I would like to continue working on this application to finalise the design as well as add support for more logging devices.