
Question on tf frames and VI Odom (D435i) #3173

Open
JTShuai opened this issue Jul 30, 2024 · 1 comment
@JTShuai

JTShuai commented Jul 30, 2024


Required Info
Camera Model D435i
Firmware Version v2.55.1.0
Operating System & Version Linux (Ubuntu 22)
Kernel Version (Linux Only) 6.5.0-1025-oem
Platform PC
Librealsense SDK Version 2.55.1
Language python
Segment Robot
ROS Distro Humble
RealSense ROS Wrapper Version 4.55.1

Issue Description

I'm trying to use the D435i to build a VIO pipeline with the robot_localization, rtabmap_odom, and imu_filter_madgwick packages. My current pipeline is:

  1. an IMU filter node that produces orientation-filtered IMU data
  2. a visual odometry node that computes odometry from the stereo images
  3. a UKF node that fuses the IMU and visual odometry into the final VIO odometry
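To make the fusion step concrete, here is a toy 1-D sketch of the idea behind step 3 (this is an illustration of the predict/update cycle, not robot_localization itself; all numbers are made up): the IMU acceleration drives the prediction, and the visual-odometry velocity corrects it.

```python
def kf_step(x, p, accel, vo_vel, dt, q=0.1, r=0.05):
    """One predict/update cycle for a scalar velocity state.

    x: current velocity estimate, p: its variance,
    accel: IMU acceleration input, vo_vel: velocity from visual odometry,
    q: process noise variance, r: measurement noise variance.
    """
    # Predict: integrate the IMU acceleration.
    x_pred = x + accel * dt
    p_pred = p + q
    # Update: blend in the visual-odometry velocity measurement.
    k = p_pred / (p_pred + r)          # Kalman gain
    x_new = x_pred + k * (vo_vel - x_pred)
    p_new = (1.0 - k) * p_pred
    return x_new, p_new


# Demo: constant 1 m/s^2 acceleration, VO reporting the true velocity.
x, p = 0.0, 1.0
for step in range(1, 6):
    true_vel = step * 0.1              # dt = 0.1 s
    x, p = kf_step(x, p, accel=1.0, vo_vel=true_vel, dt=0.1)
print(round(x, 3))                     # prints 0.5
```

The real UKF fuses a full 15-dimensional state, but the structure is the same: predict from the high-rate inertial input, correct with the lower-rate odometry measurement.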

If I only launch the realsense ros2 wrapper, the tf tree is:
[screenshot: tf tree published by the camera wrapper alone]

This is my current launch file:

#!/usr/bin/env python3

from launch import LaunchDescription
from launch_ros.actions import Node


def generate_launch_description():
    
    ld = LaunchDescription()


    # realsense launch
    rs_camera_node = Node(
        package="realsense2_camera",
        executable="realsense2_camera_node",
        name="camera",
        namespace="camera",
        parameters=[{
            "align_depth.enable": True,
            "linear_accel_cov": 1.0,
            "unite_imu_method": 2,  # linear_interpolation
            "enable_rgbd": True,
            "enable_sync": True,
            "enable_color": True,
            "enable_depth": True,
            "enable_accel": True,
            "enable_gyro": True,
            "initial_reset": False,
            "publish_tf": True,

            "rgb_camera.color_profile":"640x480x30",
            "depth_module.infra_profile":"640x480x30", 
            }]
        )
    

    # IMU filter
    imu_node = Node(
        package='imu_filter_madgwick',
        executable='imu_filter_madgwick_node',
        name='imu_filter',
        parameters=[{
            "use_mag": False,
            "publish_tf": True,
            "world_frame": "enu",
            "fixed_frame": "camera_link", # The parent frame to be used in publish_tf
            # "fixed_frame": "camera_imu_frame", # The parent frame to be used in publish_tf
            }],
        remappings=[
            ('/imu/data_raw', '/camera/camera/imu'),
            ('/imu/data', '/rtabmap/imu'),
            ]
        )
    


    # static TF: base_camera to camera_link
    # TODO: measure the transformation from base_camera to camera_link
    base_camera_tf_camera_link = Node(
        package="tf2_ros",
        executable="static_transform_publisher",
        arguments=["0", "0", "1", "0", "0", "0", "base_camera", "camera_link"]
    )

    # TODO: measure the robot base_link to base_camera
    robot_base_tf_base_camera = Node(
        package="tf2_ros",
        executable="static_transform_publisher",
        arguments=["0.0", "0.0", "0.0", "0.0", "0.0", "0.0", "base_link", "base_camera"]
    )


    # rtabmap_odom node
    # http://wiki.ros.org/rtabmap_odom#stereo_odometry
    rtabmap_odom_node = Node(
        package='rtabmap_odom', executable='stereo_odometry', name="rtab_stereo_odometry", output='screen',
        parameters=[{
            # 'frame_id':'base_link',
            'frame_id': 'base_camera',  # must match the frame published by the static TF above
            'visual_odometry':True,
            'icp_odometry':False,
            'stereo':True,
            'publish_tf': False,  # let the ekf publish the odom frame, hence set to false for this node
            'subscribe_odom_info':True,
        }],
        remappings=[
            ('odom', '/stereo_vo'), # resulting odom publish to /stereo_vo

            ('left/image_rect', '/camera/camera/infra1/image_rect_raw'),
            ('left/camera_info', '/camera/camera/infra1/camera_info'),

            ('right/image_rect', '/camera/camera/infra2/image_rect_raw'),
            ('right/camera_info', '/camera/camera/infra2/camera_info')
            ]
        )


    # ukf node to get VIO
    # https://docs.ros.org/en/melodic/api/robot_localization/html/state_estimation_nodes.html
    filter_odom_node = Node(
        package="robot_localization",
        executable="ukf_node",
        name="ukf_filter_node",
        parameters=[
            {
                "frequency": 30.0,
                "odom0": "/stereo_vo",
                "odom0_config": [True, True, True,
                                 True, True, True,
                                 True, True, True,
                                 True, True, True,
                                 True, True, True],
                "odom0_relative": True,

                "imu0": "/rtabmap/imu",
                "imu0_config": [False, False, False,
                                True, True, True,
                                True, True, True,
                                True, True, True,
                                True, True, True, ],
                "imu0_differential": True,
                "imu0_relative": False,
                "use_control": False
            }
        ],
        remappings=[
            ('/odometry/filtered', '/odom')
            ]
    )

    
    ld.add_action(rs_camera_node)
    ld.add_action(imu_node)

    ld.add_action(base_camera_tf_camera_link)
    ld.add_action(robot_base_tf_base_camera)
   
    ld.add_action(rtabmap_odom_node)
    ld.add_action(filter_odom_node)
    
    return ld

Questions

  1. In the IMU filter node, should the parameter fixed_frame be camera_link or camera_imu_frame? The documentation says it should be the parent frame used in publish_tf, so my first guess was camera_imu_frame, since we filter the raw IMU data from camera_imu_optical_frame, whose parent is camera_imu_frame. But then I realized the transform between camera_imu_optical_frame and camera_link should always be static, so I set fixed_frame to camera_link. The resulting tf tree is shown below. This raises a new question: the tf between camera_link and camera_imu_optical_frame is now published by the imu_filter_madgwick node at roughly 200 Hz. Does this TF take care of the transformations along the original chain camera_link --> camera_gyro_frame --> camera_imu_frame --> camera_imu_optical_frame?
[screenshot: tf tree with camera_link as the parent of camera_imu_optical_frame]
  2. The VIO results seem to drift a lot. Currently I check the results in RViz2, with the Fixed Frame set to odom and the rviz_imu_plugin visualizing the filtered IMU data. Is this a proper way to check the odometry performance, or is there a better one?
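On the chain question in point 1: a chain of static transforms composes into a single transform, so one directly published TF is equivalent to the chain only if it equals the composed product. A minimal numerical sketch (the offsets and rotation below are made up; the real camera_link --> ... --> camera_imu_optical_frame extrinsics come from the driver):

```python
import numpy as np


def make_tf(R, t):
    """Build a 4x4 homogeneous transform from rotation R and translation t."""
    T = np.eye(4)
    T[:3, :3] = R
    T[:3, 3] = t
    return T


# Hypothetical static links in the chain (small offsets, one 90-degree turn).
Rz90 = np.array([[0, -1, 0], [1, 0, 0], [0, 0, 1]], dtype=float)
link_to_gyro = make_tf(np.eye(3), [0.01, 0.0, 0.0])
gyro_to_imu = make_tf(np.eye(3), [0.0, 0.005, 0.0])
imu_to_optical = make_tf(Rz90, [0.0, 0.0, 0.0])

# A single direct transform that matches the chained product is
# interchangeable with the chain itself.
chained = link_to_gyro @ gyro_to_imu @ imu_to_optical
direct = make_tf(Rz90, [0.01, 0.005, 0.0])
print(np.allclose(chained, direct))    # prints True
```

So the filter's 200 Hz camera_link --> camera_imu_optical_frame TF only "takes care of" the original chain if it reproduces that composed static transform; otherwise lookups through the two paths disagree.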

Thanks!

@RealSenseCustomerSupport
Collaborator


Hi,
Sorry for the slow response.

  1. Based on your script setup, it should be camera_imu_frame. The coordinate systems are described in the "ROS2(Robot) vs Optical(Camera) Coordination Systems" section at https://github.com/IntelRealSense/realsense-ros/tree/ros2-development, and the transformations can be checked with TF in rviz2 as well as on the /tf_static topic.
  2. What kind of drift are you seeing? To isolate the issue, also check the raw accel and gyro data coming out of the RealSense SDK. Tools like the RealSense Viewer and the motion example at https://github.com/IntelRealSense/librealsense/tree/development/examples/motion are helpful, as is the raw data on the accel and gyro ROS topics.
  3. The IMU should also be calibrated so that the data out of the RealSense SDK is compensated; see https://www.intelrealsense.com/wp-content/uploads/2019/07/Intel_RealSense_Depth_D435i_IMU_Calibration.pdf and https://github.com/IntelRealSense/librealsense/blob/development/tools/rs-imu-calibration/rs-imu-calibration.py.
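A quick way to act on point 2 is a stationary sanity check: with the camera at rest, the mean gyro reading estimates its bias, and the mean accel norm should sit near gravity (~9.81 m/s^2). The sketch below uses synthetic samples standing in for the real accel/gyro topic data:

```python
import math
import random

random.seed(0)
G = 9.81

# Fake stationary samples: small gyro bias plus noise, gravity on accel z.
gyro = [(0.002 + random.gauss(0, 0.001),
         -0.001 + random.gauss(0, 0.001),
         0.0005 + random.gauss(0, 0.001)) for _ in range(500)]
accel = [(random.gauss(0, 0.02), random.gauss(0, 0.02),
          G + random.gauss(0, 0.02)) for _ in range(500)]

n = len(gyro)
# Mean gyro over a stationary window approximates the per-axis bias.
gyro_bias = tuple(sum(s[i] for s in gyro) / n for i in range(3))
# Mean accel magnitude should be close to g when the camera is not moving.
accel_norm = sum(math.sqrt(x * x + y * y + z * z) for x, y, z in accel) / n

print("gyro bias estimate:", [round(b, 4) for b in gyro_bias])
print("mean accel norm:", round(accel_norm, 3))
```

A large gyro bias or an accel norm far from g on real stationary data suggests the IMU calibration in point 3 is needed before blaming the fusion pipeline for the drift.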

Best Regards!
