# Roadmap
* GS-Net filter aligned with VGN
# Updated installation steps for my PC environment
```sh
# Install Active Grasp
sudo apt install liborocos-kdl-dev
mkdir -p ws/src && cd ws/src
git clone https://github.com/0nhc/active_grasp.git
conda create -n active_grasp python=3.8
cd active_grasp && conda activate active_grasp
pip install -r requirements.txt
conda install libffi==3.3
conda install conda-forge::python-orocos-kdl
cd ..
git clone https://github.com/0nhc/vgn.git -b devel
cd vgn
pip install -r requirements.txt
cd ..
git clone https://github.com/0nhc/robot_helpers.git
cd ..
rosdep install --from-paths src --ignore-src -r -y
catkin build
# Install Active Perception
cd <path-to-your-ws>/src/active_grasp/src/active_grasp/active_perception/modules/module_lib/pointnet2_utils/pointnet2
pip install -e .
```
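After building, it can help to sanity-check that the cloned Python packages resolve inside the `active_grasp` conda environment. This is only a sketch; the module names (`vgn`, `robot_helpers`, `pointnet2`) are assumed to match the repositories cloned above and may differ:

```python
# Check which of the assumed package names are importable in this environment.
import importlib.util

def check_packages(names):
    """Return a {name: bool} map of which modules can be found."""
    return {n: importlib.util.find_spec(n) is not None for n in names}

if __name__ == "__main__":
    for name, ok in check_packages(["vgn", "robot_helpers", "pointnet2"]).items():
        print(f"{name}: {'ok' if ok else 'missing'}")
```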
# Updated Features
* Added our baseline: `src/active_grasp/active_perception_policy.py`
* Added RGB and segmentation image publishers. The segmentation ID 1 corresponds to the grasping target object.
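Since segmentation ID 1 marks the grasping target, downstream code can recover a boolean target mask from a published segmentation image. A minimal NumPy sketch (the ROS topic/message handling is omitted; only the ID convention comes from the notes above):

```python
import numpy as np

TARGET_ID = 1  # segmentation ID of the grasping target, per the publisher notes

def target_mask(seg_image: np.ndarray) -> np.ndarray:
    """Boolean mask of pixels belonging to the grasp target."""
    return seg_image == TARGET_ID

# Toy 3x3 segmentation image (0 = background, 2 = some other object):
seg = np.array([[0, 1, 1],
                [0, 1, 0],
                [2, 2, 0]])
mask = target_mask(seg)
print(int(mask.sum()))  # number of target pixels -> 3
```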
# Closed-Loop Next-Best-View Planning for Target-Driven Grasping
This repository contains the implementation of our IROS 2022 submission, _"Closed-Loop Next-Best-View Planning for Target-Driven Grasping"_. [[Paper](http://arxiv.org/abs/2207.10543)][[Video](https://youtu.be/67W_VbSsAMQ)]
## Setup
The experiments were conducted with a Franka Emika Panda arm and a RealSense D435 attached to the wrist of the robot. The code was developed and tested on Ubuntu 20.04 with ROS Noetic. It depends on the following external packages:
- [MoveIt](https://github.com/ros-planning/panda_moveit_config)
- [robot_helpers](https://github.com/mbreyer/robot_helpers)
- [TRAC-IK](http://wiki.ros.org/trac_ik)
- [VGN](https://github.com/ethz-asl/vgn/tree/devel)
- franka_ros and realsense2_camera (only required for hardware experiments)
Additional Python dependencies can be installed with
```
pip install -r requirements.txt
```
Run `catkin build active_grasp` to build the package.
Finally, download the [assets folder](https://drive.google.com/file/d/1xJF9Cd82ybCH3nCdXtQRktTr4swDcNFD/view) and extract it inside the repository.
## Experiments
Start a roscore.
```
roscore
```
To run simulation experiments:
```
roslaunch active_grasp env.launch sim:=true
python3 scripts/run.py nbv
```
To run real-world experiments:
```
roslaunch active_grasp hw.launch
roslaunch active_grasp env.launch sim:=false
python3 scripts/run.py nbv --wait-for-input
```