
Comparing Human versus Machine-Driven Cadastral Boundary Feature Extraction

Emmanuel Nyandwi Mila Koeva Divyani Kohli Rohan Bennett

Abstract

The objective of fast-tracking the mapping and registration of large numbers of unrecorded land rights globally has led to experimental applications of Artificial Intelligence (AI) in the domain of land administration, and specifically of automated visual cognition techniques for cadastral mapping tasks. In this research, we applied and compared the ability of rule-based systems within Object-Based Image Analysis (OBIA), as opposed to human analysis, to extract visible cadastral boundaries from very high resolution (VHR) WorldView-2 imagery in both rural and urban settings. In our experiments, machine-based techniques were able to automatically delineate a good proportion of rural parcels with explicit polygons: the correctness of the automatically extracted boundaries was 47.4%, against 74.24% for humans, and their completeness was 45%, against 70.4% for humans. In the urban area, by contrast, the automatic results were counterintuitive: even though urban plots and buildings are clearly marked with visible features such as fences and roads, and are tacitly perceptible to the eye, automation resulted in geometrically and topologically poorly structured data that could be geometrically compared neither with the human-digitised data nor with the actual cadastral data from the field. These results provide an updated snapshot of the performance of contemporary machine-driven feature extraction techniques compared to conventional manual digitising.
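The correctness and completeness figures reported above are precision/recall-style accuracy metrics commonly used for evaluating extracted linear features against reference data. The sketch below illustrates the general idea under simplifying assumptions of my own (boundaries rasterised to cell sets, a Chebyshev-distance tolerance instead of a vector buffer); the paper's actual evaluation procedure may differ.

```python
# Hedged sketch of correctness/completeness metrics for boundary extraction.
# Assumption: boundaries are represented as sets of (x, y) grid cells, and a
# cell "matches" if it lies within a small tolerance of the other line.
# A real cadastral workflow would use vector buffers in a GIS instead.

def within_tolerance(cell, others, tol=1):
    """True if `cell` lies within `tol` (Chebyshev distance) of any cell in `others`."""
    x, y = cell
    return any(abs(x - ox) <= tol and abs(y - oy) <= tol for ox, oy in others)

def correctness(extracted, reference, tol=1):
    """Share of extracted boundary cells that match a reference cell (precision)."""
    matched = sum(1 for c in extracted if within_tolerance(c, reference, tol))
    return matched / len(extracted)

def completeness(extracted, reference, tol=1):
    """Share of reference boundary cells recovered by the extraction (recall)."""
    matched = sum(1 for c in reference if within_tolerance(c, extracted, tol))
    return matched / len(reference)

# Toy example: a straight reference boundary, and an extraction that finds
# half of it correctly while also producing a spurious offset segment.
reference = {(x, 0) for x in range(10)}
extracted = {(x, 0) for x in range(5)} | {(x, 5) for x in range(5)}
print(correctness(extracted, reference))   # 0.5
print(completeness(extracted, reference))  # 0.6
```

In this toy case, half of the extracted cells are spurious (correctness 0.5), while six of the ten reference cells fall within tolerance of the extraction (completeness 0.6), showing how the two metrics penalise over- and under-extraction separately.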

Authors (4)

Emmanuel Nyandwi

Mila Koeva

Divyani Kohli

Rohan Bennett

Citation Format

Nyandwi, E., Koeva, M., Kohli, D., Bennett, R. (2019). Comparing Human versus Machine-Driven Cadastral Boundary Feature Extraction. https://doi.org/10.20944/preprints201905.0342.v1


Journal Information
Publication Year
2019
Language
en
Total Citations
16×
Database Source
CrossRef
DOI
10.20944/preprints201905.0342.v1
Access
Open Access ✓