Journal: ISPRS Journal of Photogrammetry and Remote Sensing

Abbreviation

ISPRS J. Photogramm. Remote Sens.

Publisher

Elsevier

ISSN

0924-2716

Search Results

Publications 1 - 10 of 70
  • Liu, Yujia; Obukhov, Anton; Wegner, Jan Dirk; et al. (2024)
    ISPRS Journal of Photogrammetry and Remote Sensing
    We present a learning-based approach to reconstructing buildings as 3D polygonal meshes from airborne LiDAR point clouds. What makes 3D building reconstruction from airborne LiDAR difficult is the large diversity of building designs, especially roof shapes, the low and varying point density across the scene, and the often incomplete coverage of building facades due to occlusions by vegetation or the sensor's viewing angle. To cope with the diversity of shapes and inhomogeneous and incomplete object coverage, we introduce a generative model that directly predicts 3D polygonal meshes from input point clouds. Our autoregressive model, called Point2Building, iteratively builds up the mesh by generating sequences of vertices and faces. This approach enables our model to adapt flexibly to diverse geometries and building structures. Unlike many existing methods that rely heavily on pre-processing steps like exhaustive plane detection, our model learns directly from the point cloud data, thereby reducing error propagation and increasing the fidelity of the reconstruction. We experimentally validate our method on a collection of airborne LiDAR data from Zurich, Berlin, and Tallinn. Our method shows good generalization to diverse urban styles.
  • Aasen, Helge; Burkart, Andreas; Bolten, Andreas; et al. (2015)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Mason, David C.; Scott, Tania R.; Wang, Hai-Jing (2006)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Baltsavias, E.P. (2004)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Rottensteiner, Franz; Sohn, Gunho; Gerke, Markus; et al. (2014)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Chen, Hongruixuan; Lan, Cuiling; Song, Jian; et al. (2026)
    ISPRS Journal of Photogrammetry and Remote Sensing
    Change Detection (CD) between images of different modalities is a fundamental capability for remote sensing. In this work, we pinpoint the commonalities between Multimodal Change Detection (MCD) and Multimodal Image Matching (MIM). Accordingly, we present a new unsupervised CD framework designed from the perspective of Image Matching (IM), called IM4CD, which unifies the IM and CD tasks into a single, coherent framework. In this framework, we abandon the prevalent MCD strategy of comparing per-pixel image features, since it is in practice quite difficult to design features that are truly invariant across modalities. Instead, we compute similarity by local template matching and use the spatial offset of response peaks to represent change intensity between images of different modalities; this step integrates tightly with the co-registration of the two images, which already includes such a matching step. In this way, the same off-the-shelf descriptors used for MIM also support MCD. Concretely, we first extract modality-independent features, then detect salient points to obtain initial pairs of corresponding Control Points (CPs). When those points are matched to accurately register the images, CP pairs located in unchanged areas show low residuals, whereas those in changed areas show high residuals. The CPs can then be connected into a Conditional Random Field (CRF), leveraging modality-independent structural relationships to estimate dense change maps. Experimental results demonstrate the effectiveness of our method, including robustness to registration errors, compatibility with different image descriptors, and promising potential for challenging real-world disaster response scenarios.
  • Cai, Zhipeng; Chin, Tat-Jun; Parra Bustos, Alvaro; et al. (2019)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Volpi, Michele; Tuia, Devis (2018)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Plank, Simon; Singer, John; Thuro, Kurosch (2013)
    ISPRS Journal of Photogrammetry and Remote Sensing
  • Qiao, Jing; Butt, Jemil Avers (2023)
    ISPRS Journal of Photogrammetry and Remote Sensing
    Target- or intensity-feature-based calibration has been extensively explored for in-situ calibration of terrestrial laser scanners (TLSs). The main drawbacks of those approaches are the necessity of mounting physical targets in the scan area and the need for rich geometrical features that allow a feature-based distinction of different points in the point cloud. We propose an autonomous TLS calibration algorithm using planar patches, which are ubiquitous in urban environments and can be found without manual preparation beforehand. The scanner calibration parameters are estimated and updated by minimizing the M3C2 normal distances between corresponding planar patches. Unlike target- or keypoint-based approaches, only medium-resolution scan data is needed. A comprehensive set of calibration model parameters is estimated using scans acquired from multiple positions, providing a further advantage over current keypoint-based approaches, which estimate only two-face-sensitive parameters. To increase speed and reduce memory consumption, we propose to execute the calibration using only parts of the scan data from a single station. Studies with two high-precision scanners, the Leica RTC360 and the Z+F Imager 5016, yield results consistent with the laborious target-based approach, with improved precision. The proposed algorithm achieves two-face-sensitive calibration performance similar to that of the target- and keypoint-based approaches, but using only subsets of the point cloud.