
ABOUT US

Welcome to the Mechatronics and Robotic Systems (MaRS) Laboratory. We are part
of the Department of Mechanical Engineering at the University of Hong Kong
(HKU). Our lab focuses on general mechatronic systems and robotics, with an
emphasis on their practical use in everyday life and industry. Our current
focus is on aerial robot design, planning, and control, and on lidar-based
simultaneous localization and mapping (SLAM).

We are hiring new MPhil and Ph.D. students to work on UAV design, planning,
control, and LiDAR SLAM. Prospective students can contact Dr. Zhang at
fuzhang@hku.hk about the positions.


PROJECT HIGHLIGHTS

FAST-LIO2: FAST DIRECT LIDAR-INERTIAL ODOMETRY



FAST-LIO2 is computationally efficient (e.g., up to 100 Hz odometry and mapping
in large outdoor environments), robust (e.g., reliable pose estimation in
cluttered indoor environments with rotations up to 1000 deg/s), and versatile
(i.e., applicable to both multi-line spinning and solid-state LiDARs, UAV and
handheld platforms, and Intel- and ARM-based processors), while still achieving
accuracy higher than or comparable to existing methods.

Authors: Wei Xu, Yixi Cai, Dongjiao He, Jiarong Lin, Fu Zhang
Videos: video 1, video 2
Code: https://github.com/hku-mars/FAST_LIO

FAST AND ACCURATE EXTRINSIC CALIBRATION FOR MULTIPLE LIDARS AND CAMERAS



We propose a fast, accurate, and targetless extrinsic calibration method for
multiple LiDARs and cameras based on adaptive voxelization. On the theory
level, we formulate the LiDAR extrinsic calibration as a bundle adjustment
problem and derive the second-order derivatives of the cost function w.r.t. the
extrinsic parameters to accelerate the optimization. On the implementation
level, we apply adaptive voxelization to dynamically segment the LiDAR point
cloud into voxels of non-identical sizes, which reduces the computation time of
feature correspondence matching.

Authors: Xiyuan Liu, Chongjian Yuan, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/mlcc
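As a rough illustration of the adaptive voxelization idea, the sketch below recursively splits a cubic voxel into octants until the points inside are approximately planar, producing voxels of non-identical sizes. All names, thresholds, and the planarity test (smallest covariance eigenvalue) are illustrative assumptions, not the repository's actual implementation.

```python
import numpy as np

def is_planar(points, eig_thresh=1e-4):
    """A voxel is kept when its points lie on a plane: the smallest
    eigenvalue of the point covariance is then near zero."""
    if len(points) < 5:
        return True
    cov = np.cov(points.T)
    return np.linalg.eigvalsh(cov)[0] < eig_thresh

def adaptive_voxelize(points, center, size, min_size=0.25):
    """Recursively split a cubic voxel into 8 octants until the points
    inside are planar (or the voxel reaches the minimum size).
    Returns a list of (center, size, points) leaf voxels."""
    if size <= min_size or is_planar(points):
        return [(center, size, points)]
    leaves = []
    half = size / 2.0
    for dx in (-1, 1):
        for dy in (-1, 1):
            for dz in (-1, 1):
                c = center + np.array([dx, dy, dz]) * half / 2.0
                mask = np.all(np.abs(points - c) <= half / 2.0, axis=1)
                if mask.any():
                    leaves += adaptive_voxelize(points[mask], c, half, min_size)
    return leaves
```

A planar patch stays in one large voxel, while unstructured points are subdivided until the minimum size is reached, which is what keeps correspondence matching cheap on planar surfaces.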

IKD-TREE



ikd-Tree is an incremental k-d tree designed for robotic applications. The
ikd-Tree updates an existing k-d tree with newly arriving points only, leading
to much lower computation time than rebuilding a static k-d tree. Besides
point-wise operations, the ikd-Tree supports features such as box-wise
operations and down-sampling that are practically useful in robotic
applications.

Authors: Yixi Cai, Wei Xu, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/ikd-Tree
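To make the incremental idea concrete, here is a toy incremental k-d tree: points are inserted one at a time rather than rebuilding the tree, and a box-wise delete lazily marks points inside an axis-aligned box. This is a simplified sketch of the concept only; the real ikd-Tree additionally re-balances subtrees and supports down-sampling, and all names here are invented.

```python
import numpy as np

class Node:
    def __init__(self, point, axis):
        self.point = point      # 3D point stored at this node
        self.axis = axis        # split dimension (0, 1, or 2)
        self.left = None
        self.right = None
        self.deleted = False    # lazy-deletion flag for box-wise removal

class IncrementalKDTree:
    def __init__(self):
        self.root = None

    def insert(self, point):
        """Incremental update: descend and attach the new point as a leaf."""
        point = np.asarray(point, dtype=float)
        if self.root is None:
            self.root = Node(point, 0)
            return
        node = self.root
        while True:
            side = 'left' if point[node.axis] < node.point[node.axis] else 'right'
            child = getattr(node, side)
            if child is None:
                setattr(node, side, Node(point, (node.axis + 1) % 3))
                return
            node = child

    def delete_box(self, lo, hi):
        """Box-wise operation: mark every point inside [lo, hi] as deleted."""
        def visit(node):
            if node is None:
                return
            if np.all(node.point >= lo) and np.all(node.point <= hi):
                node.deleted = True
            visit(node.left)
            visit(node.right)
        visit(self.root)

    def nearest(self, query):
        """Return the closest non-deleted point to `query`."""
        query = np.asarray(query, dtype=float)
        best = [None, np.inf]
        def visit(node):
            if node is None:
                return
            d = np.linalg.norm(node.point - query)
            if not node.deleted and d < best[1]:
                best[0], best[1] = node.point, d
            diff = query[node.axis] - node.point[node.axis]
            near, far = (node.left, node.right) if diff < 0 else (node.right, node.left)
            visit(near)
            if abs(diff) < best[1]:  # far side may still hold a closer point
                visit(far)
        visit(self.root)
        return best[0]
```

In an odometry loop, each new scan's points would be inserted incrementally, and box-wise deletion would prune map regions that move out of the local window.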

R2LIVE: A ROBUST, REAL-TIME, LIDAR-INERTIAL-VISUAL TIGHTLY-COUPLED STATE
ESTIMATOR AND MAPPING



R2LIVE is a robust, real-time, tightly-coupled multi-sensor fusion framework,
which fuses measurements from LiDAR, inertial, and visual camera sensors to
achieve robust and accurate state estimation. Taking advantage of measurements
from all individual sensors, our algorithm is robust to various visual-failure
and LiDAR-degenerated scenarios, and is able to run in real time on an onboard
computation platform, as shown by extensive experiments conducted in indoor,
outdoor, and mixed environments of different scales.

Authors: Jiarong Lin, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/r2live

R3LIVE: A ROBUST, REAL-TIME, RGB-COLORED, LIDAR-INERTIAL-VISUAL TIGHTLY-COUPLED
STATE ESTIMATION AND MAPPING PACKAGE



R3LIVE is a novel LiDAR-inertial-visual sensor fusion framework, which takes
advantage of measurements from LiDAR, inertial, and visual sensors to achieve
robust and accurate state estimation. R3LIVE consists of two subsystems: a
LiDAR-inertial odometry (LIO) and a visual-inertial odometry (VIO). The LIO
subsystem (FAST-LIO) uses the measurements from the LiDAR and inertial sensors
to build the geometric structure of the global map (i.e., the positions of the
3D points). The VIO subsystem utilizes the data from the visual-inertial
sensors to render the map's texture (i.e., the colors of the 3D points).

Authors: Jiarong Lin, Fu Zhang
Videos: video 1, video 2
Code: https://github.com/hku-mars/r3live
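The texture-rendering step of the VIO subsystem can be pictured as projecting each geometric map point into the camera image and reading off a pixel color. The sketch below shows that projection step in isolation, with a plain pinhole model; the function and parameter names are illustrative, not R3LIVE's actual API.

```python
import numpy as np

def colorize_points(points_w, R_cw, t_cw, K, image):
    """Assign each world-frame map point the color of the pixel it
    projects to. Points behind the camera or outside the image keep
    a color of None.
    points_w : (N, 3) map points in the world frame
    R_cw, t_cw : extrinsic taking world points into the camera frame
    K : (3, 3) pinhole intrinsic matrix
    image : (H, W, 3) RGB image
    """
    colors = [None] * len(points_w)
    h, w = image.shape[:2]
    for i, p in enumerate(points_w):
        pc = R_cw @ p + t_cw            # world -> camera frame
        if pc[2] <= 0:                  # behind the camera
            continue
        uv = K @ (pc / pc[2])           # pinhole projection
        u, v = int(round(uv[0])), int(round(uv[1]))
        if 0 <= u < w and 0 <= v < h:
            colors[i] = tuple(image[v, u])
    return colors
```

In the full system the LIO subsystem supplies `points_w` and the estimated pose supplies `(R_cw, t_cw)` per frame, so the map accumulates color as the camera moves.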

AVOIDING DYNAMIC SMALL OBSTACLES WITH ONBOARD SENSING AND COMPUTING ON AERIAL
ROBOTS



This repository provides dynamic small-obstacle avoidance for UAVs. It is a
complete system for lidar-based UAVs, including FAST-LIO SLAM, time-accumulated
KD-Tree mapping, and kinodynamic A* search modules. Running at 50 Hz, it can
avoid small dynamic obstacles (down to 20 mm diameter bars).

Authors: Fanze Kong, Wei Xu, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/dyn_small_obs_avoidance
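The search module above builds on A*. As a simplified stand-in for the kinodynamic A* planner (which searches over motion primitives rather than grid cells), the sketch below shows plain 4-connected grid A* with obstacle cells; it conveys the search structure only, not the repository's actual planner.

```python
import heapq

def astar(grid, start, goal):
    """Grid A*: `grid[r][c] == 1` marks an obstacle cell.
    Returns the path as a list of (row, col) cells, or None."""
    rows, cols = len(grid), len(grid[0])

    def h(cell):  # Manhattan heuristic (admissible for 4-connectivity)
        return abs(cell[0] - goal[0]) + abs(cell[1] - goal[1])

    open_heap = [(h(start), 0, start)]
    came_from = {start: None}
    g_cost = {start: 0}
    while open_heap:
        _, g, cell = heapq.heappop(open_heap)
        if cell == goal:
            path = []
            while cell is not None:
                path.append(cell)
                cell = came_from[cell]
            return path[::-1]
        if g > g_cost[cell]:
            continue  # stale heap entry
        r, c = cell
        for nb in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = nb
            if 0 <= nr < rows and 0 <= nc < cols and grid[nr][nc] == 0:
                ng = g + 1
                if ng < g_cost.get(nb, float('inf')):
                    g_cost[nb] = ng
                    came_from[nb] = cell
                    heapq.heappush(open_heap, (ng + h(nb), ng, nb))
    return None
```

A kinodynamic variant replaces the 4-connected moves with short dynamically-feasible trajectory segments and costs them by control effort and time, but the open-list/priority-queue machinery is the same.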

PIXEL-LEVEL EXTRINSIC SELF-CALIBRATION OF HIGH-RESOLUTION LIDAR AND CAMERA IN
TARGETLESS ENVIRONMENTS



livox_camera_calib is a robust, high-accuracy extrinsic calibration tool for a
high-resolution LiDAR (e.g., Livox) and a camera in targetless environments.
Our algorithm can run in both indoor and outdoor scenes, and requires only edge
information in the scene. If the scene is suitable, we can achieve pixel-level
accuracy similar to or even beyond that of target-based methods.

Authors: Chongjian Yuan, Xiyuan Liu, Xiaoping Hong, Fu Zhang
Videos: video
Code: https://github.com/hku-mars/livox_camera_calib
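One way to picture edge-based targetless calibration: project LiDAR edge points into the image with a candidate extrinsic and penalize their pixel distance to the nearest image edge; an optimizer then refines the extrinsic to drive these residuals to zero. The sketch below shows only such a residual function under a plain pinhole model; names and the nearest-point matching are illustrative assumptions, not the tool's actual formulation.

```python
import numpy as np

def edge_residuals(lidar_edges, image_edges, R, t, K):
    """Reprojection residuals for edge-based LiDAR-camera calibration:
    project LiDAR edge points with the candidate extrinsic (R, t) and
    measure the pixel distance to the closest image edge point.
    lidar_edges : (N, 3) edge points extracted from the LiDAR cloud
    image_edges : (M, 2) edge pixels (u, v) extracted from the image
    """
    residuals = []
    for p in lidar_edges:
        pc = R @ p + t                     # LiDAR -> camera frame
        if pc[2] <= 0:                     # behind the camera
            continue
        uv = (K @ (pc / pc[2]))[:2]        # pinhole projection
        d = np.min(np.linalg.norm(image_edges - uv, axis=1))
        residuals.append(d)
    return np.array(residuals)
```

With the correct extrinsic the projected LiDAR edges land on the image edges and the residuals vanish; a perturbed extrinsic produces nonzero residuals, which is the signal the calibration minimizes.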




MECHATRONICS AND ROBOTIC SYSTEMS LAB

SITE MAP

 * News Center
 * Research
 * People
 * Publications

CONTACT

 * LG-02, Haking Wong Building,
 * University of Hong Kong,
 * Pokfulam, Hong Kong

© 2022 Mechatronics and Robotic Systems Lab of Mechanical Engineering Dept. at
University of Hong Kong