
PySLAM-D

PySLAM-D is a UAV-tailored SLAM stitching implementation inspired by the original pySLAM-D, which was developed in partnership with GreenSight Agronomics. This implementation significantly improves the robustness and usability of the initial design, includes functional documentation, and adds additional matching options and parameters.

Example Stitching

Technical explanation

The PySLAM-D live stitching algorithm works by generating a factor graph whose nodes are each camera’s pose in georeferenced 3D space and whose edges (factors) are the relative pose transformations from one node to another. During the optimization step, the nodes of the factor graph are optimized for spatial consistency, meaning that even noisy sensor measurements can reinforce one another to produce a consistent estimate of the camera poses.
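
The idea can be illustrated with a minimal sketch (not the repository's actual code, which builds on GTSAM): treat each camera as an unknown 2D position, add hypothetical GPS factors that pin individual nodes and odometry factors that constrain the displacement between neighbors, and solve the whole system jointly by linear least squares so that noisy measurements reinforce one another.

```python
import numpy as np

# Toy factor graph over three unknown 2D camera positions x0, x1, x2.
# GPS factors: noisy absolute position measurements on some nodes.
# Odometry factors: relative displacement measurements between neighbors.
gps_meas = {0: np.array([0.0, 0.0]), 2: np.array([2.1, 0.0])}
odom_meas = {(0, 1): np.array([1.0, 0.1]), (1, 2): np.array([1.0, -0.1])}

n = 3  # number of camera poses
rows, rhs = [], []

# Each GPS factor constrains one node directly: x_i ~= z_i
for i, z in gps_meas.items():
    row = np.zeros((2, 2 * n))
    row[:, 2 * i:2 * i + 2] = np.eye(2)
    rows.append(row)
    rhs.append(z)

# Each odometry factor constrains a pair of nodes: x_j - x_i ~= d_ij
for (i, j), d in odom_meas.items():
    row = np.zeros((2, 2 * n))
    row[:, 2 * j:2 * j + 2] = np.eye(2)
    row[:, 2 * i:2 * i + 2] = -np.eye(2)
    rows.append(row)
    rhs.append(d)

# Stack all factors and solve for the spatially consistent positions.
A = np.vstack(rows)
b = np.concatenate(rhs)
x, *_ = np.linalg.lstsq(A, b, rcond=None)
poses = x.reshape(n, 2)
```

Note how the optimized middle pose lands between the two GPS fixes even though it has no GPS factor of its own; the real system does the same with full 3D poses and robust noise models.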

The first factor type, the odometry factor, is determined by performing visual odometry between neighboring images and using point cloud registration to estimate the pose transformation between them. These factors can be computationally expensive, but they help neighboring images “mesh” with each other independently of instrument noise.
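
The registration step can be sketched with the classic Kabsch algorithm for rigid alignment of corresponding points (the repository itself installs TEASER++ for this job, which additionally handles outliers; `estimate_relative_pose` below is a hypothetical helper, not the project's API):

```python
import numpy as np

def estimate_relative_pose(src, dst):
    """Rigid transform (R, t) aligning src to dst via the Kabsch algorithm.

    src, dst: (N, 3) arrays of corresponding 3D points, e.g. matched
    feature points back-projected from two neighboring images.
    """
    src_c = src - src.mean(axis=0)
    dst_c = dst - dst.mean(axis=0)
    H = src_c.T @ dst_c                      # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))   # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = dst.mean(axis=0) - R @ src.mean(axis=0)
    return R, t
```

The recovered (R, t) is exactly the kind of relative measurement that becomes an odometry factor between two pose nodes.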

The second factor type, GPS factors, helps determine the camera's absolute position. These factors are often noisy, but are valuable for compensating for bad odometry estimates. Unlike relative odometry error, GPS error does not accumulate with flight duration, so these factors prevent images from drifting away from their true, georeferenced positions over time.
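
Before a GPS fix can act as a factor, latitude/longitude must be expressed in a local metric frame. One common way to do this (a sketch under the equirectangular small-area approximation, not necessarily the projection the repository uses) is:

```python
import math

EARTH_RADIUS_M = 6378137.0  # WGS-84 equatorial radius

def gps_to_local(lat, lon, ref_lat, ref_lon):
    """Approximate east/north offsets in meters from a reference fix.

    Equirectangular approximation: accurate enough over a typical UAV
    flight area, where these offsets anchor pose nodes to georeferenced
    positions.
    """
    d_lat = math.radians(lat - ref_lat)
    d_lon = math.radians(lon - ref_lon)
    east = EARTH_RADIUS_M * d_lon * math.cos(math.radians(ref_lat))
    north = EARTH_RADIUS_M * d_lat
    return east, north
```

The first geotagged image typically serves as the reference, so all pose nodes live in a shared local frame while remaining traceable back to true coordinates.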

The third factor type, attitude factors, incorporates the drone's attitude measurements at the moment each picture was taken. These factors improve estimates of image rotation and can serve as a prior for the odometry area of overlap, making odometry matching faster and more robust to datasets with low overlap and large attitude variation.
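
An attitude measurement is typically reported as roll/pitch/yaw and converted to a rotation matrix before it can seed a pose prior. A minimal sketch of that conversion (Z-Y-X convention assumed here; the repository's own convention may differ):

```python
import math
import numpy as np

def attitude_to_rotation(roll, pitch, yaw):
    """Rotation matrix from roll/pitch/yaw angles in radians (Z-Y-X order).

    A matrix like this can seed the rotation component of a pose prior,
    narrowing the search for overlap during odometry matching.
    """
    cr, sr = math.cos(roll), math.sin(roll)
    cp, sp = math.cos(pitch), math.sin(pitch)
    cy, sy = math.cos(yaw), math.sin(yaw)
    Rx = np.array([[1.0, 0.0, 0.0], [0.0, cr, -sr], [0.0, sr, cr]])
    Ry = np.array([[cp, 0.0, sp], [0.0, 1.0, 0.0], [-sp, 0.0, cp]])
    Rz = np.array([[cy, -sy, 0.0], [sy, cy, 0.0], [0.0, 0.0, 1.0]])
    return Rz @ Ry @ Rx
```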

One known limitation is that this implementation projects images straight down from the camera position. Future work will project each image with respect to the orientation of the camera, better supporting non-gimbaled payloads.
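
The planned fix amounts to intersecting the camera's optical axis with the ground plane instead of dropping straight down. A sketch of that geometry (hypothetical helper, flat ground and z-up world frame assumed):

```python
import numpy as np

def ground_point(cam_pos, R_cam, ground_z=0.0):
    """Where the camera's optical axis meets a flat ground plane.

    cam_pos: camera position (x, y, z), z up.
    R_cam: world-from-camera rotation; the camera looks along -z in its
    own frame. A nadir projection would instead return
    (cam_pos[0], cam_pos[1], ground_z) regardless of orientation.
    """
    direction = R_cam @ np.array([0.0, 0.0, -1.0])  # optical axis in world
    if direction[2] >= 0:
        raise ValueError("camera does not look toward the ground")
    s = (ground_z - cam_pos[2]) / direction[2]      # ray parameter
    return cam_pos + s * direction
```

For a gimbaled, nadir-pointing camera the two projections coincide; the difference grows with camera tilt, which is why non-gimbaled payloads benefit from the orientation-aware version.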

Example Stitching

Installation

  1. Install pyslamd

     git clone https://github.com/kylesayrs/pySLAM-D
     cd pySLAM-D
     python3 -m pip install -e .

  2. Install GTSAM
  3. Install TEASER-plusplus

Use cmake .. -DENABLE_DIAGNOSTIC_PRINT=OFF when building TEASER++ to turn off its debug messages.

Usage

Edit src/pyslamd/Settings.py to adjust stitching settings. In the future, these settings will be exposed as script arguments.

pyslamd.stitch path/to/image/directory

Images must be geotagged with GPS coordinates in order to georeference frame poses within a real-world coordinate frame.