Pascal Apple Harvesting Control

A continuation of the ROS 2 package for the OSU apple-harvest repo, providing drivers, control, vision, and general software integration for the group's robotic arms. This fork contains (mostly) MoveIt changes that enable trajectory caching/execution, as well as a start on hybrid planning/control, though the latter has not been fully developed yet.

Most changes are localized in /ur_moveit_config, and include:

  1. Planning group edits to support RRTstar as the default planner for external caching.
  2. Starter hybrid-planner code (global planning with RRTConnect and local planning with forward collision checking). To test this (with the hardware launch), the MoveIt hybrid planning binary must be installed. The starter changes outline two planning groups: one for normal OMPL planning, and another intended for hybrid control, where the global planner first consults the trajectory cache (in CSV) and then falls back to OMPL. However, only 'move_group' is currently used.
  3. Slight edits to URDF naming (the ur prefix changed to ur5e; hardcoded, but it fixed a few errors during development).

Notes

  • Some of these changes are flat-out not that good if you aren't working with the UR5e arm. For example, I hard-coded config/launch prefixes where I was getting use/build errors (e.g., from 'ur' to 'ur5e', change #3 above) so that my MoveIt nodes would work properly with the URDF/SRDF files.
  • Another planning group, called ompl, was created for use with hybrid planning. The name is a bit misleading; it was never intended to use purely OMPL.

What my vision of the hybrid-planning cycle looks like:

  1. Global planner - on the first call, pulls from the precomputed trajectory cache (CSV) to move the arm to its desired voxel as a first step.
  2. Local planner - always running. Performs a 'forward check' for obstacles in front of the robot and triggers a global replan when one is found. This is the default behavior, and it works well for this use case.
  3. Global planner (replan) - the cache is now useless, since the arm has moved away from the cached starting joint state. Instead, it makes sense to trigger OMPL planning from the current joint state toward the same final desired pose.

As of recent testing, the local planner works as expected, while the global planner only calls OMPL and never consults the CSV cache. A custom planner plugin will need to be written to implement this idea in MoveIt.
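The cache-first step of the intended global planner can be sketched as follows. This is a minimal illustration, not code from this repo: the CSV layout (one waypoint per row, first row being the start joint state), the function names, and the tolerance are all assumptions.

```python
import csv


def load_cached_trajectory(cache_path, start_state, tol=1e-2):
    """Return the cached joint-space trajectory if its first waypoint
    matches start_state within tol per joint; otherwise return None,
    signalling the caller to fall back to OMPL planning.

    Assumed CSV layout: one waypoint per row, comma-separated joint
    values, first row = starting joint state of the cached trajectory.
    """
    with open(cache_path, newline="") as f:
        waypoints = [[float(v) for v in row] for row in csv.reader(f) if row]
    if not waypoints:
        return None
    first = waypoints[0]
    if len(first) != len(start_state):
        return None
    # Accept the cache only if every joint is within tolerance of the start.
    if all(abs(a - b) <= tol for a, b in zip(first, start_state)):
        return waypoints
    return None
```

On a replan (step 3 above), the current joint state will no longer match the cached start, so this helper returns None and the caller would invoke OMPL instead.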

Original README

This repository contains code to detect and localize apples from RGB-D data and operate a UR5e manipulator, using either real hardware or simulated data.

The vision code has only been set up for a RealSense D435i (real hardware) or a Microsoft Azure Kinect (simulated data).

There are currently two main control schemes: apple harvesting (real or simulated) and tree templating (tested only in simulation). The apple harvesting pipeline includes nodes for additional UR controllers and data recording, while the tree templating method runs a minimal set of nodes for control in simulation.

Apple Harvesting:

  1. In the first terminal:

    Simulated

    ros2 launch harvest_control arm_control.launch.py ur_type:=ur5e robot_ip:=yyy.yyy.yyy.yyy use_fake_hardware:=true launch_rviz:=true

    Real Hardware

    ros2 launch harvest_control arm_control.launch.py ur_type:=ur5e robot_ip:=169.254.177.230 launch_rviz:=true headless_mode:=true
  2. Getting apple locations:

    A. If predicting apple locations via YOLO, run the vision node in another terminal (make sure the palm camera connects to the proper device index):

    ros2 launch harvest launch_vision.launch.py palm_camera_device_num:=<camera port idx>

    B. If recording manual apple locations, run position recording in another terminal (free-drive the robot to probe apple locations):

    ros2 run harvest_control get_manual_apple_locations.py --ros-args \
      -p output_directory:=/absolute/path/to/data
  3. In a third terminal run the control script. You can now tweak these parameters at launch with --ros-args -p <name>:=<value>:

    ros2 run harvest start_harvest.py --ros-args \
      -p pick_pattern:=force-heuristic \
      -p event_sensitivity:=0.43 \
      -p recording_startup_delay:=0.5 \
      -p base_data_dir:=/absolute/path/to/data \
      -p enable_recording:=true \
      -p enable_visual_servo:=true \
      -p enable_apple_prediction:=true \
      -p enable_pressure_servo:=true \
      -p enable_picking:=true

    Defaults (if you don’t pass -p flags):

    pick_pattern:            force-heuristic        # pick controller selection (force-heuristic, pull-twist, linear-pull)
    event_sensitivity:       0.43                   # event detection sensitivity (0.0–1.0)
    recording_startup_delay: 0.5                    # secs to wait after record start before action
    base_data_dir:           <workspace_root>/data  # where to store batch_<N> directories
    enable_recording:        true                   # enable/disable any rosbag recording stages
    enable_visual_servo:     true                   # enable/disable the visual-servo stage
    enable_apple_prediction: true                   # enable/disable live apple-prediction
    enable_pressure_servo:   true                   # enable/disable pressure servoing stage
    enable_picking:          true                   # enable/disable picking phase 
    • base_data_dir: by default this creates (if needed) a data/ folder next to your src/ and writes batch_1/, batch_2/, … under it.
    • All stages (approach, visual‑servo, pressure‑servo, pick, release) will automatically record topics (if enable_recording) under <base_data_dir>/<stage>_<batch>/<prefix>*.bag.
    • Use pick_pattern to switch between your three pick‑controller strategies without code changes.

With that, you can flexibly tune your harvest run from the command line, point recordings wherever you like, and turn subsystems on or off without editing any Python.
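The batch and recording directory layout described above can be reproduced with a small path helper. This is a sketch of the naming convention only; the function names are placeholders, not identifiers from the actual nodes.

```python
from pathlib import Path


def next_batch_dir(base_data_dir):
    """Return the next unused batch_<N> directory under base_data_dir,
    mirroring the batch_1/, batch_2/, ... numbering described above.
    The directory is not created here; the caller decides when to mkdir."""
    base = Path(base_data_dir)
    n = 1
    while (base / f"batch_{n}").exists():
        n += 1
    return base / f"batch_{n}"


def recording_dir(base_data_dir, stage, batch):
    """Build <base_data_dir>/<stage>_<batch>/, where each stage's rosbags
    land as <prefix>*.bag when enable_recording is true."""
    return Path(base_data_dir) / f"{stage}_{batch}"
```

For example, `recording_dir("/abs/data", "pick", 2)` yields `/abs/data/pick_2`, matching the `<base_data_dir>/<stage>_<batch>/` pattern above.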

Tree Templating:

Currently only tested in simulation

  1. In the first terminal:

    ros2 launch harvest_control arm_control_templating.launch.py ur_type:=ur5e robot_ip:=yyy.yyy.yyy.yyy use_fake_hardware:=true launch_rviz:=true
  2. In a second terminal run the vision:

    ros2 launch harvest launch_vision_templating.launch.py
  3. In a third terminal run the control script to execute actions from the running nodes:

    ros2 run harvest orchard_templating.py

Details

  • The default kinematics plugin is KDLKinematicsPlugin
  • Each of the three launch files has specific parameters that can be updated for the desired use case; each parameter is documented within the associated launch file.
  • If running tree templating in simulation with presaved vision data:
    • Saved vision data files are stored in harvest_vision/data/, under the directory prosser_a/
    • The letter after prosser_ corresponds to the vision_experiment parameter passed when launching the vision package
    • If adding a new data directory, the setup.py file will need to be updated
    • Within this directory are three files needed for experiments: color_raw.png, depth_to_color.png, and pointcloud.ply
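Resolving the presaved vision files for a given experiment letter can be sketched as below. This is an illustration of the directory convention, assuming a data root pointing at harvest_vision/data/; the helper name is hypothetical.

```python
from pathlib import Path

# The three files each experiment directory must contain, per the README.
REQUIRED_FILES = ("color_raw.png", "depth_to_color.png", "pointcloud.ply")


def vision_data_paths(data_root, vision_experiment):
    """Map an experiment letter (e.g. 'a' -> prosser_a/) to the three
    presaved vision files, raising if any required file is missing."""
    exp_dir = Path(data_root) / f"prosser_{vision_experiment}"
    paths = {name: exp_dir / name for name in REQUIRED_FILES}
    missing = [str(p) for p in paths.values() if not p.exists()]
    if missing:
        raise FileNotFoundError(f"missing vision data files: {missing}")
    return paths
```

A check like this at startup fails fast with a clear message instead of a mid-experiment error when a new data directory is incomplete (remember that setup.py must also be updated for new directories).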

About

ROS2 packages needed to run apple harvesting trials.
