3D Vision Research
Research Paper | Problem Being Solved | Thought Process | Actions Taken | Analysis and Takeaways | Summary and Resources
---|---|---|---|---|---
1 | Hand-object pose estimation (HOPE) is extremely difficult given the varied orientations and the dexterity of the human hand. ArtiBoost attempts to solve this problem. | ArtiBoost is an online data-enhancement method that constructs a CVV-space and generates synthetic hand-object poses through exploration and synthesis; these are fed into the model along with real data. | The construction of the CVV-space involves complex statistics, but the general idea is to train the model and feed the losses back into the exploration step. | A model trained this way outperforms one trained only on real-world hand-object poses. The synthetic poses train the model better when they are more diverse, rather than when they are of higher visual quality. |
2 | Building local and global 3D maps for navigation in autonomous land vehicles (ALVs). | Employ a binocular stereo vision system, using the parallax between two cameras to calculate depth. | A matching algorithm generates a disparity map, which is transformed into a new coordinate system for 3D map building. | The real-time global 3D map could be useful for mapping and navigation; however, the approach requires two cameras as well as GPS/INS on the vehicle. |
3 | Providing ground operators with a more immersive, stereo-vision-based system for controlling a drone. | The ground operator controls the UAV using a controller and a VR headset; the drone carries a stereo vision camera that streams real-time, low-latency video to the headset. | A stereo vision camera was mounted on the UAV, and the ground operator used an Oculus Rift with a controller. Dedicated hardware was used for processing the video feed and relaying it to the ground operator. | The research is very in-depth and could be useful to implement on the WARG UAV: depth estimation for data gathering, or live video streamed into an FPV-based control system. |
4 | Presenting a stereo vision mapping algorithm that finds safe regions for navigation by detecting objects, inclines, and drop-offs. | A localized, annotated map must be generated to describe the robot's surroundings; safe and unsafe areas are analyzed so that the robot can navigate through 3D space. | Stereo vision is employed to calculate depth. The depth data is used to generate a 3D grid, which is segmented into levels and inclines; a 2D local safety map is then generated for navigating the robot's surroundings. | The research provides good insight into the steps we should take while constructing our own 3D vision mapping model. Since it uses only a camera for mapping, the idea transfers easily to the WARG UAV. | https://web.eecs.umich.edu/~kuipers/papers/Murarka-iros-09.pdf
5 | Creating a semantic 3D map of an urban environment with a 3D LiDAR and a camera for robot navigation. | A 2D image is segmented pixel-wise, with each pixel associated with a voxel derived from the LiDAR data. | The segmented 2D map went through error-correction steps to produce a labelled and fairly accurate semantic 3D map of the environment. | This research is useful for the WARG UAV, as it does a good job of labelling different parts of a 2D camera image. Combined with the LiDAR, it could generate useful information for mapping. |
6 (Map Construction Based on LiDAR Vision Inertial Multi-Sensor Fusion) | Creating a high-precision global 3D map by fusing SLAM with visual images, LiDAR, and odometry data. | Data from the live camera feed is fused with LiDAR point clouds and IMU data to create a global 3D map. | After gathering all the necessary sensor data, outlier points were removed to collect candidate points, which were then optimized using a factor graph optimization process. | This research is highly mathematical, with many resources provided for easy incorporation onto the WARG UAV. It is highly precise and performs well. |
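Papers 2, 3, and 4 all rest on the same stereo geometry: once a matching algorithm produces a disparity for a pixel, depth follows from the focal length and camera baseline. A minimal sketch of that conversion (the function name and the ValueError policy for zero disparity are our own choices, not from any of the papers):

```python
def disparity_to_depth(disparity_px, focal_length_px, baseline_m):
    """Convert a stereo disparity (in pixels) to metric depth.

    Uses the standard rectified-stereo relation Z = f * B / d, where
    f is the focal length in pixels, B is the camera baseline in
    metres, and d is the disparity in pixels.
    """
    if disparity_px <= 0:
        # Zero disparity corresponds to a point at infinity.
        raise ValueError("disparity must be positive for a finite depth")
    return focal_length_px * baseline_m / disparity_px
```

For example, with an 800 px focal length and a 12 cm baseline, a 64 px disparity corresponds to a point 1.5 m away; halving the disparity doubles the depth, which is why stereo depth accuracy degrades with range.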
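The safety-map idea in paper 4 (segmenting a grid into traversable and non-traversable cells) can be sketched by thresholding height differences between neighbouring cells of a 2D height grid. The `max_step` threshold and the `"safe"`/`"unsafe"` labels are illustrative assumptions, not values from the paper:

```python
def safety_map(heights, max_step=0.15):
    """Label each cell of a 2D height grid as 'safe' or 'unsafe'.

    A cell is marked unsafe if the height difference to any
    4-connected neighbour exceeds max_step (metres) -- a crude proxy
    for the inclines and drop-offs the paper detects. The threshold
    is an assumed value for illustration.
    """
    rows, cols = len(heights), len(heights[0])
    labels = [["safe"] * cols for _ in range(rows)]
    for r in range(rows):
        for c in range(cols):
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nr, nc = r + dr, c + dc
                if 0 <= nr < rows and 0 <= nc < cols:
                    if abs(heights[r][c] - heights[nr][nc]) > max_step:
                        labels[r][c] = "unsafe"
    return labels
```

A planner would then run over the resulting 2D label grid instead of the full 3D data, which is the main efficiency win of the local-safety-map representation.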
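Paper 6 removes outlier points before running factor graph optimization. A simplified stand-in for that filtering step is a statistical distance filter: discard points whose distance to the cloud centroid is more than `k` standard deviations above the mean. The centroid-based criterion and the value of `k` are our assumptions; the paper's actual method may differ:

```python
import math

def filter_outliers(points, k=2.0):
    """Drop points far from the cloud centroid.

    Keeps points whose distance to the centroid is within k standard
    deviations of the mean distance -- a simplified sketch of an
    outlier-removal step, not the paper's exact algorithm.
    """
    n = len(points)
    cx = sum(p[0] for p in points) / n
    cy = sum(p[1] for p in points) / n
    cz = sum(p[2] for p in points) / n
    dists = [math.dist(p, (cx, cy, cz)) for p in points]
    mean = sum(dists) / n
    std = math.sqrt(sum((d - mean) ** 2 for d in dists) / n)
    return [p for p, d in zip(points, dists) if d <= mean + k * std]
```

The surviving points would then serve as the candidate points fed into the factor graph optimizer.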