
Can someone help in connecting openvins with px4 for publishing the data? #21

Closed
ShreyasKuntnal opened this issue Jul 22, 2024 · 22 comments


ShreyasKuntnal commented Jul 22, 2024

I am a bit new to this. I checked the PX4 documentation but could not find the right file to make changes in. Can someone help, please?

@ranjithlesoko

@ShreyasKuntnal OpenVINS or VINS?

@ShreyasKuntnal
Author

OpenVINS. But if I knew how to connect VINS to PX4, I guess it would be mostly the same. I checked the VINS-Fusion-PX4 repo, but OpenVINS feels a bit more complex to me, so I thought I would ask.

@ranjithlesoko

@ShreyasKuntnal What device are you using (a Jetson or another board), and which camera model (RealSense or other)? Did you build VINS without errors? What is your OpenCV version?

@ShreyasKuntnal
Author

ShreyasKuntnal commented Jul 25, 2024

Yes VINS built with no errors.
I use an Intel RealSense D435i connected to a PC (i7-11800H, RTX 3060 6GB). I have a Pixhawk 6C connected and MAVROS running.
OpenCV version is 4.9.0, CUDA enabled.
rosdistro: noetic
rosversion: 1.16.0
RealSense ROS v2.3.2
Built with LibRealSense v2.50.0
Running with LibRealSense v2.50.0

So, when I start the OpenVINS subscriber launch, there is a lot of drift. I have already tried reducing it to some extent, but if you have any ideas about it, that would be very helpful. I am trying to solve that too.
If connecting OpenVINS to SITL or a simulator is the same as connecting to real PX4 hardware, then please guide me through it the same way, so I can replicate it on my own.

@engcang
Owner

engcang commented Jul 25, 2024

@ShreyasKuntnal Hi! Sorry for the late reply.
Can you clarify your question? What do you want to do, and what problem are you facing?
Do you just want to use OpenVINS as an external pose estimate in MAVROS, or do you want to improve the pose estimation performance of OpenVINS itself?

@ShreyasKuntnal
Author

ShreyasKuntnal commented Jul 25, 2024

Thank you so much for the reply.
First, I am a bit new to VINS and ROS.
I am using OpenVINS for an autonomous indoor drone.
I will divide this into two parts:
OpenVINS drifting issues, and implementation of OpenVINS with PX4.

I have also calibrated my mono color camera with the IMU, and completed the IMU noise calibration with allan_variance_ros.

OpenVINS drifting issue:
I have implemented OpenVINS with ROS Noetic on my RealSense D435i camera.
Initially there was very severe drifting, to the point that as soon as I run OpenVINS, the estimate shoots off in a straight line far away.

So, I researched it and checked multiple issues on the OpenVINS GitHub page but failed to find any useful information. What I was able to figure out is that when I set ZUPT to true, it was at least stable to some extent. If you have any knowledge about this, please ask me for whatever information you need and I will provide it.

Connecting OpenVINS with PX4:
As I understand it, OpenVINS can be used as VIO for obstacle avoidance and path planning for drones, like VINS-Fusion. Correct me if I am wrong.

So how do we connect OpenVINS to PX4? (The obvious answer would be to publish the data through a MAVROS topic as described in the PX4 documentation, but how do we actually do that, since the steps for OpenVINS seem complex?)

Follow-up question: we need to publish the odometry or pose estimate to PX4. How do we do that?

@engcang and @ranjithlesoko, can you please look into this and help me one way or another, if possible?

@engcang
Owner

engcang commented Jul 25, 2024

For the latter (connecting VIO with PX4): check this official documentation page: https://docs.px4.io/main/en/ros/external_position_estimation.html

You should set these parameters properly: EKF2_EV_CTRL and EKF2_HGT_REF.

Then publish the pose estimate on the /mavros/vision_pose/pose topic through MAVROS.
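A minimal relay sketch for that last step, assuming a ROS Noetic setup. The topic name /ov_msckf/odomimu is an assumption (OpenVINS topic names depend on your launch file; verify with `rostopic list`), and this is not an official OpenVINS or MAVROS tool, just one way to bridge the two:

```python
# Sketch: republish OpenVINS odometry as the geometry_msgs/PoseStamped
# that MAVROS expects on /mavros/vision_pose/pose.
# Assumptions: ROS Noetic, OpenVINS publishing nav_msgs/Odometry on
# /ov_msckf/odomimu (verify the topic name on your system).

def odom_to_pose_tuple(position, orientation):
    """Extract (x, y, z, qx, qy, qz, qw) from an odometry-like pose.
    Pure helper so the extraction logic can be checked without ROS."""
    x, y, z = position
    qx, qy, qz, qw = orientation
    return (x, y, z, qx, qy, qz, qw)

def main():
    # ROS-dependent part; call main() on the robot, inside a sourced workspace.
    import rospy
    from nav_msgs.msg import Odometry
    from geometry_msgs.msg import PoseStamped

    rospy.init_node("ov_to_mavros_vision_pose")
    pub = rospy.Publisher("/mavros/vision_pose/pose", PoseStamped, queue_size=10)

    def cb(odom):
        msg = PoseStamped()
        msg.header.stamp = odom.header.stamp  # keep the VIO timestamp
        msg.header.frame_id = "map"
        msg.pose = odom.pose.pose             # copy position + orientation
        pub.publish(msg)

    rospy.Subscriber("/ov_msckf/odomimu", Odometry, cb, queue_size=10)
    rospy.spin()
```

MAVROS handles the ENU-to-NED conversion internally, so the node only needs to forward the pose with its original timestamp.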

@engcang
Owner

engcang commented Jul 25, 2024

@ShreyasKuntnal
For the former: according to your description, I think the configuration (especially the extrinsics between the IMU and camera) is wrong. If it is set properly, it does not drift at the beginning of the estimation.
And I recommend VINS-Fusion over OpenVINS.
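One quick sanity check on whatever extrinsic Kalibr produced (the 4x4 T_cam_imu below is a placeholder, not real calibration data): the rotation block must be orthonormal with determinant +1, and on a D435i the IMU-to-camera translation should be on the order of centimeters. A copy-paste or transpose mistake usually fails one of these:

```python
import numpy as np

# Placeholder 4x4 T_cam_imu -- substitute the matrix from your Kalibr YAML.
T_cam_imu = np.array([
    [0.0, -1.0, 0.0, 0.005],
    [1.0,  0.0, 0.0, 0.010],
    [0.0,  0.0, 1.0, 0.000],
    [0.0,  0.0, 0.0, 1.000],
])

R = T_cam_imu[:3, :3]
t = T_cam_imu[:3, 3]

# A valid rotation satisfies R @ R.T == I and det(R) == +1.
ortho_err = np.linalg.norm(R @ R.T - np.eye(3))
det = np.linalg.det(R)
print(ortho_err < 1e-6, abs(det - 1.0) < 1e-6)

# D435i IMU and cameras sit a few cm apart; a translation of, say,
# 0.5 m almost certainly means wrong units or a swapped convention.
print(np.all(np.abs(t) < 0.1))
```

Also double-check whether your tool reports T_cam_imu or T_imu_cam; feeding the inverse transform to OpenVINS produces exactly the kind of immediate drift described above.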

@engcang engcang closed this as completed Jul 25, 2024
@ranjithlesoko

Initially, try using the camera's built-in IMU first for VINS-Fusion.

@ShreyasKuntnal
Author

@engcang As you suggested, I calibrated the IMU and camera, but I am still facing the drifting issue with OpenVINS on the D435i camera. How can I fix this?

@ranjithlesoko

@ShreyasKuntnal Which Pixhawk model are you using?

@ShreyasKuntnal
Author

I am using a Pixhawk 6C.
By the way, the Pixhawk comes later; right now the algorithm itself drifts so much that I can't yet connect it to the Pixhawk or the firmware.

@ranjithlesoko

@ShreyasKuntnal Proper synchronization between the camera and IMU is crucial. Ensure timestamps are accurately synchronized to minimize drift.
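One rough way to eyeball a constant camera-IMU offset offline is to compare recorded header stamps. This is a simplified sketch with synthetic stamps; Kalibr estimates this jointly as its timeshift parameter, which is the proper approach:

```python
# Sketch: estimate a constant camera-IMU timestamp offset from recorded
# header stamps (e.g. extracted from a rosbag). A single constant offset
# is a simplifying assumption; Kalibr's timeshift estimate is what you
# should actually feed into the OpenVINS config.

def mean_offset(cam_stamps, imu_stamps):
    """For each camera stamp, find the nearest IMU stamp and average the
    signed difference (cam - imu). Stamps are seconds as floats."""
    diffs = []
    for tc in cam_stamps:
        nearest = min(imu_stamps, key=lambda ti: abs(ti - tc))
        diffs.append(tc - nearest)
    return sum(diffs) / len(diffs)

# Synthetic example: the camera clock lags the IMU clock by 5 ms.
cam = [0.105, 0.205, 0.305]
imu = [0.100, 0.150, 0.200, 0.250, 0.300]
print(round(mean_offset(cam, imu), 4))  # 0.005
```

If the measured offset is large or drifts over time, hardware-level sync (or at least enabling time-offset calibration in the estimator) matters much more than tuning noise parameters.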

@ShreyasKuntnal
Author

@ranjithlesoko Is it? OK, as far as I know the time synchronization offset appears as a parameter in the calibration files, right? Can you guide me through what should be done?
You can also check this post, which has all the required information and the calibration reports:
rpng/open_vins#458 (comment)

@ranjithlesoko

@ShreyasKuntnal I have experience with VINS-Fusion, not OpenVINS. Sorry.

@ShreyasKuntnal
Author

Oh, OK! Thank you, @ranjithlesoko.
Hoping @engcang will reply to this.

@engcang
Owner

engcang commented Aug 5, 2024

@ShreyasKuntnal

  1. Are you using the IR stereo images and the IMU of the D435i for OpenVINS?
  2. Did you turn off the IR laser emitter? It disrupts feature extraction and tracking.
  3. What are your IMU-camera extrinsic calibration results?
  4. How did you set up your configuration file for OpenVINS?

@ShreyasKuntnal
Author

ShreyasKuntnal commented Aug 5, 2024

  1. No, I am not using the IR stereo cameras. But I have also tried a stereo setup with the IR laser emitter off.
  2. I set the IR laser emitter to off in the launch file itself, so it never interfered.
  3. You can find the outputs and calibration files attached with this post:
    rpng/open_vins#458 (comment)
    For your kind reference, I am going to reattach the calibration files link here: Google Drive Link for Calibration Files
  4. Initially, I tried the configuration suggestions from the OpenVINS issues section for the D435i camera, but nothing stopped the drifting.
    This is the estimator_config.yaml:
%YAML:1.0 # need to specify the file type at the top!

verbosity: "INFO" # ALL, DEBUG, INFO, WARNING, ERROR, SILENT

use_fej: true # if first-estimate Jacobians should be used (enable for good consistency)
integration: "rk4" # discrete, rk4, analytical (if rk4 or analytical used then analytical covariance propagation is used)
use_stereo: false # if we have more than 1 camera, if we should try to track stereo constraints between pairs
max_cameras: 1 # how many cameras we have 1 = mono, 2 = stereo, >2 = binocular (all mono tracking)

calib_cam_extrinsics: true # if the transform between camera and IMU should be optimized R_ItoC, p_CinI
calib_cam_intrinsics: true # if camera intrinsics should be optimized (focal, center, distortion)
calib_cam_timeoffset: true # if timeoffset between camera and IMU should be optimized
calib_imu_intrinsics: false # if imu intrinsics should be calibrated (rotation and skew-scale matrix)
calib_imu_g_sensitivity: false # if gyroscope gravity sensitivity (Tg) should be calibrated

max_clones: 11 # how many clones in the sliding window
max_slam: 50 # number of features in our state vector
max_slam_in_update: 25 # update can be split into sequential updates of batches, how many in a batch
max_msckf_in_update: 40 # how many MSCKF features to use in the update
dt_slam_delay: 1 # delay before initializing (helps with stability from bad initialization...)

gravity_mag: 9.81 # magnitude of gravity in this location

feat_rep_msckf: "GLOBAL_3D"
feat_rep_slam: "ANCHORED_MSCKF_INVERSE_DEPTH"
feat_rep_aruco: "ANCHORED_MSCKF_INVERSE_DEPTH"

# zero velocity update parameters we can use
# we support either IMU-based or disparity detection.
try_zupt: false
zupt_chi2_multipler: 0.6 # set to 0 for only disp-based
zupt_max_velocity: 0.5
zupt_noise_multiplier: 10
zupt_max_disparity: 0.5 # set to 0 for only imu-based
zupt_only_at_beginning: false

# ==================================================================
# ==================================================================

init_window_time: 1.0 # how many seconds to collect initialization information
init_imu_thresh: 2.5 # threshold for variance of the accelerometer to detect a "jerk" in motion
init_max_disparity: 10.0 # max disparity to consider the platform stationary (dependent on resolution)
init_max_features: 50 # how many features to track during initialization (saves on computation)

init_dyn_use: false # if dynamic initialization should be used
init_dyn_mle_opt_calib: false # if we should optimize calibration during intialization (not recommended)
init_dyn_mle_max_iter: 50 # how many iterations the MLE refinement should use (zero to skip the MLE)
init_dyn_mle_max_time: 0.05 # how many seconds the MLE should be completed in
init_dyn_mle_max_threads: 6 # how many threads the MLE should use
init_dyn_num_pose: 6 # number of poses to use within our window time (evenly spaced)
init_dyn_min_deg: 10.0 # orientation change needed to try to init

init_dyn_inflation_ori: 10 # what to inflate the recovered q_GtoI covariance by
init_dyn_inflation_vel: 100 # what to inflate the recovered v_IinG covariance by
init_dyn_inflation_bg: 10 # what to inflate the recovered bias_g covariance by
init_dyn_inflation_ba: 100 # what to inflate the recovered bias_a covariance by
init_dyn_min_rec_cond: 1e-12 # reciprocal condition number thresh for info inversion

init_dyn_bias_g: [ 0.0, 0.0, 0.0 ] # initial gyroscope bias guess
init_dyn_bias_a: [ 0.0, 0.0, 0.0 ] # initial accelerometer bias guess

# ==================================================================
# ==================================================================

record_timing_information: false # if we want to record timing information of the method
record_timing_filepath: "/tmp/traj_timing.txt" # https://docs.openvins.com/eval-timing.html#eval-ov-timing-flame

# if we want to save the simulation state and its diagional covariance
# use this with rosrun ov_eval error_simulation
save_total_state: false
filepath_est: "/tmp/ov_estimate.txt"
filepath_std: "/tmp/ov_estimate_std.txt"
filepath_gt: "/tmp/ov_groundtruth.txt"

# ==================================================================
# ==================================================================

# our front-end feature tracking parameters
# we have a KLT and descriptor based (KLT is better implemented...)
use_klt: true # if true we will use KLT, otherwise use a ORB descriptor + robust matching
num_pts: 200 # number of points (per camera) we will extract and try to track
fast_threshold: 50 # threshold for fast extraction (warning: lower threshs can be expensive)
grid_x: 5 # extraction sub-grid count for horizontal direction (uniform tracking)
grid_y: 5 # extraction sub-grid count for vertical direction (uniform tracking)
min_px_dist: 15 # distance between features (features near each other provide less information)
knn_ratio: 0.70 # descriptor knn threshold for the top two descriptor matches
track_frequency: 31.0 # frequency we will perform feature tracking at (in frames per second / hertz)
downsample_cameras: false # will downsample image in half if true
num_opencv_threads: 4 # -1: auto, 0-1: serial, >1: number of threads
histogram_method: "HISTOGRAM" # NONE, HISTOGRAM, CLAHE

# aruco tag tracker for the system
# DICT_6X6_1000 from https://chev.me/arucogen/
use_aruco: false
num_aruco: 1024
downsize_aruco: true

# ==================================================================
# ==================================================================

# camera noises and chi-squared threshold multipliers
up_msckf_sigma_px: 1
up_msckf_chi2_multipler: 1
up_slam_sigma_px: 1
up_slam_chi2_multipler: 1
up_aruco_sigma_px: 1
up_aruco_chi2_multipler: 1

# masks for our images
use_mask: false

# imu and camera spacial-temporal
# imu config should also have the correct noise values
relative_config_imu: "kalibr_imu_chain.yaml"
relative_config_imucam: "kalibr_imucam_chain.yaml"

If I set try_zupt to true, the drifting decreases to some extent, but it is still not acceptable for a drone.

@engcang
Owner

engcang commented Aug 5, 2024

@ShreyasKuntnal

  1. No, I am not using IR Stereo Camera. But I have even checked Stereo camera setup with IR Laser emitter off.

The D435i has two global-shutter IR stereo cameras, but a rolling-shutter RGB camera.
So if you use the RGB camera, visual odometry algorithms will not work properly.
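For reference, a sketch of a ROS1 launch fragment that enables the IR streams and turns the emitter off with realsense2_camera. The argument and parameter names vary across realsense-ros versions, so treat these as assumptions to verify against your release:

```xml
<launch>
  <!-- Sketch: start the D435i with both IR streams and a merged IMU stream.
       Argument names are from realsense-ros 2.x for ROS1; verify locally. -->
  <include file="$(find realsense2_camera)/launch/rs_camera.launch">
    <arg name="enable_infra1" value="true"/>
    <arg name="enable_infra2" value="true"/>
    <arg name="unite_imu_method" value="linear_interpolation"/>
  </include>
  <!-- Emitter off: exposed via dynamic_reconfigure in realsense-ros 2.x -->
  <node pkg="dynamic_reconfigure" type="dynparam" name="emitter_off"
        args="set /camera/stereo_module emitter_enabled 0"/>
</launch>
```

With the emitter on, the projected dot pattern is tracked as if it were scene texture, which corrupts feature tracking when the camera moves.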

@ShreyasKuntnal
Author

@engcang OK, I also tried with the stereo cameras, but the drift still happens.

@ShreyasKuntnal
Author

@ranjithlesoko How can I contact you regarding VINS-Fusion? I need some help from you!

@ranjithlesoko

@ShreyasKuntnal My email: [email protected]
