@小鱼 Thanks a lot for the reply. After making the changes you suggested, the node still cannot read the yaml file. Over the past few days I have also tried several versions of the launch.py file, and none of them manage to read the configuration file. My current guess is that the problem comes from reusing the ROS1-style yaml file directly in ROS2. Is that the right direction to look in?
Below is my rewritten readParameter function. The part that differs between ROS1 and ROS2 is that getParam() becomes get_parameter(); I am not sure whether that is the cause.
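To make the change concrete, this is roughly what I mean (a simplified sketch, not my full readParameter function; "config_file" is the parameter name used by the original VINS-Mono launch files):

```cpp
// Simplified sketch of the getParam() -> get_parameter() change only.
#include <string>
#include <rclcpp/rclcpp.hpp>

std::string readConfigPath(const rclcpp::Node::SharedPtr &node)
{
    // ROS1 version:
    //   std::string config_file;
    //   n.getParam("config_file", config_file);

    // ROS2 version: the parameter has to be declared before it can be read.
    node->declare_parameter<std::string>("config_file", "");
    std::string config_file = node->get_parameter("config_file").as_string();
    return config_file;
}
```

One difference from getParam() in ROS1: if the parameter is not declared first, the single-argument get_parameter() in rclcpp throws unless undeclared parameters are explicitly allowed on the node.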
The yaml file I am trying to read is as follows:
```yaml
%YAML:1.0

#common parameters
imu_topic: "/imu0"
image_topic: "/cam0/image_raw"
output_path: "/home/ylab/output/"

#camera calibration
model_type: PINHOLE
camera_name: camera
image_width: 752
image_height: 480
distortion_parameters:
   k1: -2.917e-01
   k2: 8.228e-02
   p1: 5.333e-05
   p2: -1.578e-04
projection_parameters:
   fx: 4.616e+02
   fy: 4.603e+02
   cx: 3.630e+02
   cy: 2.481e+02

# Extrinsic parameter between IMU and Camera.
estimate_extrinsic: 0   # 0  Have an accurate extrinsic parameters. We will trust the following imu^R_cam, imu^T_cam, don't change it.
                        # 1  Have an initial guess about extrinsic parameters. We will optimize around your initial guess.
                        # 2  Don't know anything about extrinsic parameters. You don't need to give R,T. We will try to calibrate it. Do some rotation movement at beginning.
#If you choose 0 or 1, you should write down the following matrix.
#Rotation from camera frame to imu frame, imu^R_cam
extrinsicRotation: !!opencv-matrix
   rows: 3
   cols: 3
   dt: d
   data: [0.0148655429818, -0.999880929698, 0.00414029679422,
          0.999557249008, 0.0149672133247, 0.025715529948,
          -0.0257744366974, 0.00375618835797, 0.999660727178]
#Translation from camera frame to imu frame, imu^T_cam
extrinsicTranslation: !!opencv-matrix
   rows: 3
   cols: 1
   dt: d
   data: [-0.0216401454975, -0.064676986768, 0.00981073058949]

#feature traker paprameters
max_cnt: 150            # max feature number in feature tracking
min_dist: 30            # min distance between two features
freq: 10                # frequence (Hz) of publish tracking result. At least 10Hz for good estimation. If set 0, the frequence will be same as raw image
F_threshold: 1.0        # ransac threshold (pixel)
show_track: 1           # publish tracking image as topic
equalize: 1             # if image is too dark or light, trun on equalize to find enough features
fisheye: 0              # if using fisheye, trun on it. A circle mask will be loaded to remove edge noisy points

#optimization parameters
max_solver_time: 0.04   # max solver itration time (ms), to guarantee real time
max_num_iterations: 8   # max solver itrations, to guarantee real time
keyframe_parallax: 10.0 # keyframe selection threshold (pixel)

#imu parameters       The more accurate parameters you provide, the better performance
acc_n: 0.08             # accelerometer measurement noise standard deviation. #0.2   0.04
gyr_n: 0.004            # gyroscope measurement noise standard deviation.     #0.05  0.004
acc_w: 0.00004          # accelerometer bias random work noise standard deviation.  #0.02
gyr_w: 2.0e-6           # gyroscope bias random work noise standard deviation.      #4.0e-5
g_norm: 9.81007         # gravity magnitude

#loop closure parameters
loop_closure: 1                    # start loop closure
load_previous_pose_graph: 0        # load and reuse previous pose graph; load from 'pose_graph_save_path'
fast_relocalization: 0             # useful in real-time and large project
pose_graph_save_path: "/home/ylab/output/pose_graph/" # save and load path

#unsynchronization parameters
estimate_td: 0                     # online estimate time offset between camera and imu
td: 0.0                            # initial value of time offset. unit: s. readed image clock + td = real image clock (IMU clock)

#rolling shutter parameters
rolling_shutter: 0                 # 0: global shutter camera, 1: rolling shutter camera
rolling_shutter_tr: 0              # unit: s. rolling shutter read out time per frame (from data sheet).

#visualization parameters
save_image: 1                      # save image in pose graph for visualization prupose; you can close this function by setting 0
visualize_imu_forward: 0           # output imu forward propogation to achieve low latency and high frequence results
visualize_camera_size: 0.4         # size of camera marker in RVIZ
```
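As far as I understand, this yaml (the %YAML:1.0 header and the !!opencv-matrix nodes) is the OpenCV FileStorage format rather than a ROS2 parameter file, and in the original ROS1 code only the path of this file goes through the parameter server while the contents are parsed with cv::FileStorage. A rough sketch of that read path, just to show what I mean (the path below is only an example):

```cpp
// Minimal sketch of reading this OpenCV-style yaml with cv::FileStorage.
// The field names match the file above; the path is an example only.
#include <opencv2/core.hpp>
#include <cstdio>
#include <string>

int main()
{
    std::string config_file = "/home/ylab/euroc_config.yaml";  // example path

    cv::FileStorage fs(config_file, cv::FileStorage::READ);
    if (!fs.isOpened()) {
        std::fprintf(stderr, "cannot open %s\n", config_file.c_str());
        return 1;
    }

    std::string imu_topic;
    fs["imu_topic"] >> imu_topic;          // plain scalar / string fields

    int max_cnt = fs["max_cnt"];           // implicit conversion to int

    cv::Mat extrinsic_rotation;
    fs["extrinsicRotation"] >> extrinsic_rotation;  // !!opencv-matrix -> cv::Mat

    fs.release();
    std::printf("imu_topic=%s max_cnt=%d\n", imu_topic.c_str(), max_cnt);
    return 0;
}
```

So my question is whether this kind of file can stay as-is in ROS2, or whether it has to be converted into a ROS2 parameter yaml for the launch.py to load it.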