[1] SMITH R C, CHEESEMAN P. On the representation and estimation of spatial uncertainty[J]. International Journal of Robotics Research, 1986, 5(4): 56-68.
[2] ZUO X X, GENEVA P, LEE W, et al. LIC-fusion: LiDAR-inertial-camera odometry[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2019: 5848-5854.
[3] ZUO X X, YANG Y L, GENEVA P, et al. LIC-fusion 2.0: LiDAR-inertial-camera odometry with sliding-window plane feature tracking[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2020: 5112-5119.
[4] SHAN T X, ENGLOT B, RATTI C, et al. LVI-SAM: Tightly-coupled Lidar-visual-inertial odometry via smoothing and mapping[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2021: 5692-5698.
[5] QIN T, LI P L, SHEN S J. VINS-Mono: A robust and versatile monocular visual-inertial state estimator[J]. IEEE Transactions on Robotics, 2018, 34(4): 1004-1020.
[6] LV J J, LANG X L, XU J H, et al. Continuous-time fixed-lag smoothing for LiDAR-inertial-camera SLAM[J]. IEEE/ASME Transactions on Mechatronics, 2023, 28(4): 2259-2270.
[7] FURGALE P, BARFOOT T D, SIBLEY G. Continuous-time batch estimation using temporal basis functions[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2012: 2088-2095.
[8] FURGALE P, TONG C H, BARFOOT T D, et al. Continuous-time batch trajectory estimation using temporal basis functions[J]. International Journal of Robotics Research, 2015, 34(14): 1688-1710.
[9] REHDER J, SIEGWART R, FURGALE P. A general approach to spatiotemporal calibration in multisensor systems[J]. IEEE Transactions on Robotics, 2016, 32(2): 383-398.
[10] CIOFFI G, CIESLEWSKI T, SCARAMUZZA D. Continuous-time vs. discrete-time vision-based SLAM: A comparative study[J]. IEEE Robotics and Automation Letters, 2022, 7(2): 2399-2406.
[11] FURGALE P, REHDER J, SIEGWART R. Unified temporal and spatial calibration for multi-sensor systems[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2013: 1280-1286.
[12] SIBLEY D, MEI C, REID I, et al. Adaptive relative bundle adjustment[C]//Robotics: Science and Systems V. Cambridge, USA: MIT Press, 2009. DOI: 10.15607/RSS.2009.V.023.
[13] SOMMER C, USENKO V, SCHUBERT D, et al. Efficient derivative computation for cumulative B-splines on Lie groups[C]//IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway, USA: IEEE, 2020: 11148-11156.
[14] HUG D, CHLI M. HyperSLAM: A generic and modular approach to sensor fusion and simultaneous localization and mapping in continuous-time[C]//International Conference on 3D Vision. Piscataway, USA: IEEE, 2020: 978-986.
[15] SOMMER H, FORBES J R, SIEGWART R, et al. Continuous-time estimation of attitude using B-splines on Lie groups[J]. Journal of Guidance, Control, and Dynamics, 2016, 39(2): 242-261.
[16] OVRÉN H, FORSSÉN P E. Trajectory representation and landmark projection for continuous-time structure from motion[J]. International Journal of Robotics Research, 2019, 38(6): 686-701.
[17] HAARBACH A, BIRDAL T, ILIC S. Survey of higher order rigid body motion interpolation methods for keyframe animation and continuous-time trajectory estimation[C]//International Conference on 3D Vision. Piscataway, USA: IEEE, 2018: 381-389.
[18] OTH L, FURGALE P, KNEIP L, et al. Rolling shutter camera calibration[C]//IEEE Conference on Computer Vision and Pattern Recognition. Piscataway, USA: IEEE, 2013: 1360-1367.
[19] OVRÉN H, FORSSÉN P E. Spline error weighting for robust visual-inertial fusion[C]//IEEE/CVF Conference on Computer Vision and Pattern Recognition. Piscataway, USA: IEEE, 2018: 321-329.
[20] REHDER J, NIKOLIC J, SCHNEIDER T, et al. Extending kalibr: Calibrating the extrinsics of multiple IMUs and of individual axes[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2016: 4304-4311.
[21] LV J J, XU J H, HU K W, et al. Targetless calibration of LiDAR-IMU system based on continuous-time batch estimation[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2020: 9968-9975.
[22] LV J J, ZUO X X, HU K W, et al. Observability-aware intrinsic and extrinsic calibration of LiDAR-IMU systems[J]. IEEE Transactions on Robotics, 2022, 38(6): 3734-3753.
[23] ZHI X Y, HOU J W, LU Y R, et al. Multical: Spatiotemporal calibration for multiple IMUs, cameras and LiDARs[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2022: 2446-2453.
[24] HUANG K, WANG Y F, KNEIP L. Dynamic event camera calibration[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2021: 7021-7028.
[25] REHDER J, BEARDSLEY P, SIEGWART R, et al. Spatio-temporal laser to visual/inertial calibration with applications to hand-held, large scale scanning[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2014: 459-465.
[26] CHEN S L, LI X X, LI S Y, et al. Targetless spatiotemporal calibration for multiple heterogeneous cameras and IMUs based on continuous-time trajectory estimation[J]. IEEE Transactions on Instrumentation and Measurement, 2023, 72. DOI: 10.1109/TIM.2023.3328688.
[27] MAYE J, FURGALE P, SIEGWART R. Self-supervised calibration for robotic systems[C]//IEEE Intelligent Vehicles Symposium. Piscataway, USA: IEEE, 2013: 473-480.
[28] ALISMAIL H, BAKER L D, BROWNING B. Continuous trajectory estimation for 3D SLAM from actuated lidar[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2014: 6096-6101.
[29] QUENZEL J, BEHNKE S. Real-time multi-adaptive-resolution-surfel 6D LiDAR odometry using continuous-time trajectory optimization[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2021: 5499-5506.
[30] CONG Y Z, CHEN C, YANG B S, et al. 3D-CSTM: A 3D continuous spatio-temporal mapping method[J]. ISPRS Journal of Photogrammetry and Remote Sensing, 2022, 186: 232-245.
[31] ZHENG X, ZHU J K. Traj-LO: In defense of LiDAR-only odometry using an effective continuous-time trajectory[J]. IEEE Robotics and Automation Letters, 2024, 9(2): 1961-1968.
[32] LOVEGROVE S, PATRON-PEREZ A, SIBLEY G. Spline fusion: A continuous-time representation for visual-inertial fusion with application to rolling shutter cameras[C]//Proceedings of the British Machine Vision Conference. 2013. DOI: 10.5244/C.27.93.
[33] KERL C, STUCKLER J, CREMERS D. Dense continuous-time tracking and mapping with rolling shutter RGB-D cameras[C]//IEEE International Conference on Computer Vision. Piscataway, USA: IEEE, 2015: 2264-2272.
[34] MUEGGLER E, GALLEGO G, SCARAMUZZA D. Continuous-time trajectory estimation for event-based vision sensors[C]//Robotics: Science and Systems XI. 2015. DOI: 10.15607/RSS.2015.XI.036.
[35] YANG A J, CUI C, BÂRSAN I A, et al. Asynchronous multi-view SLAM[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2021: 5669-5676.
[36] LV J J, HU K W, XU J H, et al. CLINS: Continuous-time trajectory estimation for LiDAR-inertial system[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2021: 6657-6663.
[37] RAMEZANI M, KHOSOUSSI K, CATT G, et al. Wildcat: Online continuous-time 3D lidar-inertial SLAM[DB/OL]. (2022-05-25) [2019-08-13]. https://arxiv.org/abs/2205.12595.
[38] HE B, DAI W C, WAN Z Y, et al. Continuous-time LiDAR-inertial-vehicle odometry method with lateral acceleration constraint[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2023: 3997-4003.
[39] NGUYEN T M, XU X, JIN T, et al. Eigen is all you need: Efficient lidar-inertial continuous-time odometry with internal association[DB/OL]. (2024-02-04) [2024-06-05]. https://arxiv.org/abs/2402.02337.
[40] MUEGGLER E, GALLEGO G, REBECQ H, et al. Continuous-time visual-inertial odometry for event cameras[J]. IEEE Transactions on Robotics, 2018, 34(6): 1425-1440.
[41] HUG D, BÄNNINGER P, ALZUGARAY I, et al. Continuous-time stereo-inertial odometry[J]. IEEE Robotics and Automation Letters, 2022, 7(3): 6455-6462.
[42] LANG X, LV J J, HUANG J, et al. Ctrl-VIO: Continuous-time visual-inertial odometry for rolling shutter cameras[J]. IEEE Robotics and Automation Letters, 2022, 7(4): 11537-11544.
[43] LOWE T, KIM S, COX M. Complementary perception for handheld SLAM[J]. IEEE Robotics and Automation Letters, 2018, 3(2): 1104-1111.
[44] LANG X, CHEN C, TANG K, et al. Coco-LIC: Continuous-time tightly-coupled LiDAR-inertial-camera odometry using non-uniform B-spline[J]. IEEE Robotics and Automation Letters, 2023, 8(11): 7074-7081.
[45] DING W C, GAO W L, WANG K X, et al. An efficient B-spline-based kinodynamic replanning framework for quadrotors[J]. IEEE Transactions on Robotics, 2019, 35(6): 1287-1306.
[46] NGUYEN N T, SCHILLING L, ANGERN M S, et al. B-spline path planner for safe navigation of mobile robots[C]//IEEE/RSJ International Conference on Intelligent Robots and Systems. Piscataway, USA: IEEE, 2021: 339-345.
[47] ZHENG X, LI M Y, MOURIKIS A I. Decoupled representation of the error and trajectory estimates for efficient pose estimation[C]//Robotics: Science and Systems XI. 2015. DOI: 10.15607/RSS.2015.XI.009.
[48] ANDERSON S, MACTAVISH K, BARFOOT T D. Relative continuous-time SLAM[J]. International Journal of Robotics Research, 2015, 34(12): 1453-1479.
[49] DUBÉ R, SOMMER H, GAWEL A, et al. Non-uniform sampling strategies for continuous correction based trajectory estimation[C]//IEEE International Conference on Robotics and Automation. Piscataway, USA: IEEE, 2016: 4792-4798.