
Calibrating my Gripper and Kinect2 mounted on UR #32

Closed
chandanlal92 opened this issue Mar 21, 2019 · 29 comments

@chandanlal92

No description provided.

@chandanlal92
Author

Is it possible to define my tool with your calibration and find out the transformation between the sensor and the tool center point?

@marcoesposito1988
Collaborator

Hi @chandanlal92,

I am not sure if I understand what you mean. This library is for hand-eye calibration, that is, to find the transformation between the robot arm (flange or base) and a tracking system (camera, etc).

Finding the transformation between the flange (robot arm) and a notable point of the gripper is a different problem. For that you will need something else, e.g. a pivot calibration.

The two problems are distinct and independent. Once you have performed both calibrations, you can find the position of the gripper w.r.t. the tracking system/sensor.
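
For illustration, a minimal sketch of composing the two results, assuming the eye-on-hand calibration gives the camera pose expressed in the flange frame and the pivot calibration gives the tool center point in the flange frame (the names and conventions below are illustrative, not part of this package):

import numpy as np

# T_flange_camera: 4x4 homogeneous transform, camera frame expressed in the flange frame
#                  (an eye-on-hand hand-eye calibration result)
# p_tip_in_flange: 3-vector, tool center point expressed in the flange frame
#                  (a pivot calibration result)
def tip_in_camera_frame(T_flange_camera, p_tip_in_flange):
    """Return the tool center point expressed in the camera/sensor frame."""
    T_camera_flange = np.linalg.inv(T_flange_camera)    # camera <- flange
    p_h = np.append(np.asarray(p_tip_in_flange), 1.0)   # homogeneous coordinates
    return T_camera_flange.dot(p_h)[:3]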

@chandanlal92
Author

Is it possible to check the ArUco marker detection before running the algorithm with the robot? I am trying to run the algorithm with my Kinect v2 sensor, but I am not able to view the marker detection, and moreover the GUI closes if I open the image viewer. I just want to make sure that the Kinect is detecting the marker before running the algorithm.

@chandanlal92
Author

I am unable to connect to the robot. Does this work with the UR10? I am able to detect the marker, but I am unable to connect to my robot even though I can ping it.

@chandanlal92
Author

This is my launch file:

<arg name="robot_ip" doc="The IP address of the UR10 robot" />

<arg name="marker_size" doc="Size of the ArUco marker used, in meters" />
<arg name="marker_ip" doc="The ID of the ArUco marker used" />

<!-- start the Kinect -->
<include file="$(find kinect2_bridge)/launch/kinect2_bridge.launch" >
    <arg name="depth_device" value="0" />
</include>

<!-- start ArUco -->
<node name="aruco_tracker" pkg="aruco_ros" type="single">
    <remap from="/camera_info" to="/kinect2/qhd/camera_info" />
    <remap from="/image" to="/kinect2/qhd/image_color_rect" />
    <param name="image_is_rectified" value="True"/>
    <param name="marker_size"        value="0.08"/>
    <param name="marker_id"          value="26"/>
    <param name="reference_frame"    value="kinect2_link"/>
    <param name="camera_frame"       value="kinect2_rgb_optical_frame"/>
    <param name="marker_frame"       value="camera_marker" />
</node>
<node pkg="aruco_ros" type="marker_publisher" name="aruco_marker_publisher">
    <remap from="/camera_info" to="/kinect2/qhd/camera_info" />
    <remap from="/image" to="/kinect2/qhd/image_color_rect" />
    <param name="image_is_rectified" value="True"/>
    <param name="marker_size"        value="0.08"/>
    <param name="reference_frame"    value="kinect2_link"/>   <!-- frame in which the marker pose will be refered -->
    <param name="camera_frame"       value="kinect2_rgb_optical_frame"/>
</node>

<!-- start the robot -->
<include file="$(find ur_bringup)/launch/ur10_bringup.launch">
      <arg name="limited" value="true" />
    <arg name="robot_ip" value="10.4.1.55" />
</include>
<include file="$(find ur10_moveit_config)/launch/ur10_moveit_planning_execution.launch">
    <arg name="limited" value="true" />
    
</include>

<!-- start easy_handeye -->
<include file="$(find easy_handeye)/launch/calibrate.launch" >
    <arg name="namespace_prefix" value="$(arg namespace_prefix)" />
    <arg name="eye_on_hand" value="True" />

    <arg name="tracking_base_frame" value="camera_link" />
    <arg name="tracking_marker_frame" value="camera_marker" />
    <arg name="robot_base_frame" value="base_link" />
    <arg name="robot_effector_frame" value="wrist_3_link" />

    <arg name="freehand_robot_movement" value="false" />
    <arg name="robot_velocity_scaling" value="0.5" />
    <arg name="robot_acceleration_scaling" value="0.2" />
</include>

@chandanlal92
Author

Now I am able to connect to the robot after installing ur_modern_driver.

@chandanlal92
Author

I am not able to take a measurement; the GUI crashes.

@chandanlal92
Author

This is the error that crashes the GUI:

[ERROR] [1554206653.429518]: Error processing request: canTransform: source_frame camera_marker does not exist.
['Traceback (most recent call last):\n', ' File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_service.py", line 625, in _handle_request\n response = convert_return_to_response(self.handler(request), self.response_class)\n', ' File "/home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/src/easy_handeye/handeye_server.py", line 38, in take_sample\n self.calibrator.take_sample()\n', ' File "/home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/src/easy_handeye/handeye_calibrator.py", line 140, in take_sample\n transforms = self._get_transforms()\n', ' File "/home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/src/easy_handeye/handeye_calibrator.py", line 123, in _get_transforms\n time = self._wait_for_transforms()\n', ' File "/home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/src/easy_handeye/handeye_calibrator.py", line 111, in _wait_for_transforms\n self.listener.waitForTransform(self.tracking_base_frame, self.tracking_marker_frame, now, rospy.Duration(10))\n', ' File "/opt/ros/kinetic/lib/python2.7/dist-packages/tf/listener.py", line 76, in waitForTransform\n raise tf2_ros.TransformException(error_msg or "no such transformation: \"%s\" -> \"%s\"" % (source_frame, target_frame))\n', 'TransformException: canTransform: source_frame camera_marker does not exist.\n']
Traceback (most recent call last):
File "/home/chandanlal/catkin_ws/src/easy_handeye/rqt_easy_handeye/src/rqt_easy_handeye/rqt_easy_handeye.py", line 104, in handle_take_sample
sample_list = self.client.take_sample()
File "/home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/src/easy_handeye/handeye_client.py", line 34, in take_sample
return self.take_sample_proxy().samples
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_service.py", line 435, in call
return self.call(*args, *kwds)
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_service.py", line 515, in call
responses = transport.receive_once()
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_base.py", line 727, in receive_once
p.read_messages(b, msg_queue, sock)
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_service.py", line 353, in read_messages
self._read_ok_byte(b, sock)
File "/opt/ros/kinetic/lib/python2.7/dist-packages/rospy/impl/tcpros_service.py", line 336, in _read_ok_byte
raise ServiceException("service [%s] responded with an error: %s"%(self.resolved_name, str))
rospy.service.ServiceException: service [/ur10_kinect_handeyecalibration_eye_on_hand/take_sample] responded with an error: error processing request: canTransform: source_frame camera_marker does not exist.
[ur10_kinect_handeyecalibration_eye_on_hand/namespace_chandanlal_GL63_8SE_23844_7451937924869058901_rqt-16] process has died [pid 24083, exit code 1, cmd /opt/ros/kinetic/lib/rqt_gui/rqt_gui --clear-config --perspective-file /home/chandanlal/catkin_ws/src/easy_handeye/easy_handeye/launch/rqt_easy_handeye.perspective __name:=namespace_chandanlal_GL63_8SE_23844_7451937924869058901_rqt __log:=/home/chandanlal/.ros/log/416ffd4a-553f-11e9-b2be-00d86105d02e/ur10_kinect_handeyecalibration_eye_on_hand-namespace_chandanlal_GL63_8SE_23844_7451937924869058901_rqt-16.log].
log file: /home/chandanlal/.ros/log/416ffd4a-553f-11e9-b2be-00d86105d02e/ur10_kinect_handeyecalibration_eye_on_hand-namespace_chandanlal_GL63_8SE_23844_7451937924869058901_rqt-16.log
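
(A quick way to see whether the camera_marker frame from the error above is being published at all is to list the frames known to tf; a minimal sketch:)

import rospy
import tf

rospy.init_node('list_tf_frames')
listener = tf.TransformListener()
rospy.sleep(2.0)  # give the listener time to fill its buffer
frames = listener.getFrameStrings()
print('camera_marker present: %s' % ('camera_marker' in frames))
print('known frames: %s' % sorted(frames))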

@chandanlal92
Author

Thank you for the code! Everything is working after launching rqt_image_view while calibrating.
I have only one doubt: the camera marker in rviz is shown inverted, above the robot, but it is actually below. I need to check whether the calibration is correct.

Here is the picture
Screenshot from 2019-04-02 15-45-39

@chandanlal92
Author

camera_marker is shown above the robot. This seems weird.

@marcoesposito1988
Collaborator

In order to find the origin of the problem, I suggest you look at each component individually.

First, start just aruco_ros with the Kinect, and check that the marker tracking works; in particular, that the size of the marker is the same as the one you pass to aruco as an argument.

Then check the robot driver: the position you get in tf should be the same as the one on the robot panel.

Finally, check that you configured easy_handeye correctly, in particular the eye_on_hand parameter and the reference frames. In the case of the Kinect, you should provide the optical camera center as the tracking base frame and the root link of the Kinect's tf tree as the reference frame.
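
A minimal sketch of those three checks in one script (the topic and frame names are taken from the launch file above and are assumptions to adapt to your setup):

import rospy
import tf
from geometry_msgs.msg import PoseStamped

rospy.init_node('handeye_sanity_check')

# 1) Marker tracking: wait for one pose from the aruco_ros "single" node
#    (assuming its pose topic is /aruco_tracker/pose).
marker_pose = rospy.wait_for_message('/aruco_tracker/pose', PoseStamped, timeout=10.0)
print('marker pose in %s: %s' % (marker_pose.header.frame_id, marker_pose.pose.position))

# 2) Robot driver: this translation should match the pose shown on the robot panel.
listener = tf.TransformListener()
listener.waitForTransform('base_link', 'tool0', rospy.Time(0), rospy.Duration(10.0))
trans, rot = listener.lookupTransform('base_link', 'tool0', rospy.Time(0))
print('base_link -> tool0 translation [m]: %s' % (trans,))

# 3) Frame configuration: the transform that easy_handeye samples must be available.
listener.waitForTransform('kinect2_rgb_optical_frame', 'camera_marker',
                          rospy.Time(0), rospy.Duration(10.0))
print('kinect2_rgb_optical_frame -> camera_marker transform is available')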

@chandanlal92
Author

The issue is that my aruco_ros is not able to get the Kinect2 optical frame. The marker is detected properly, and the marker ID and size are fine.

@chandanlal92
Author

chandanlal92 commented Apr 3, 2019

The marker is tracked properly, but the Kinect2 optical frame is not available.

@chandanlal92
Author

This worked for me; I just need to verify it.

<arg name="robot_ip" doc="The IP address of the UR10 robot" />

<arg name="marker_size" doc="Size of the ArUco marker used, in meters" />
<arg name="marker_ip" doc="The ID of the ArUco marker used" />

<!-- start the Kinect -->
<include file="$(find kinect2_bridge)/launch/kinect2_bridge.launch" >
    <arg name="depth_device" value="0" />
</include>

<!-- start ArUco -->
<node name="aruco_tracker" pkg="aruco_ros" type="single">
    <remap from="/camera_info" to="/kinect2/qhd/camera_info" />
    <remap from="/image" to="/kinect2/qhd/image_color_rect" />
    <param name="image_is_rectified" value="true"/>
    <param name="marker_size"        value="0.143"/>
    <param name="marker_id"          value="26"/>
    <param name="reference_frame"    value="kinect2_rgb_optical_frame"/>
    <param name="camera_frame"       value="kinect2_rgb_optical_frame"/>
    <param name="marker_frame"       value="camera_marker" />
</node>
<node pkg="aruco_ros" type="marker_publisher" name="aruco_marker_publisher">
    <remap from="/camera_info" to="/kinect2/qhd/camera_info" />
    <remap from="/image" to="/kinect2/qhd/image_color_rect" />
    <param name="image_is_rectified" value="true"/>
    <param name="marker_size"        value="0.143"/>
    <param name="reference_frame"    value="kinect2_rgb_optical_frame"/>   <!-- frame in which the marker pose will be refered -->
    <param name="camera_frame"       value="kinect2_rgb_optical_frame"/>
</node>

<!-- start the robot -->
<include file="$(find ur_modern_driver)/launch/ur10_bringup.launch">
      <arg name="limited" value="true" />
    <arg name="robot_ip" value="10.4.1.55" />
</include>
<include file="$(find ur10_moveit_config)/launch/ur10_moveit_planning_execution.launch">
    <arg name="limited" value="true" />

</include>

<!-- start easy_handeye -->
<include file="$(find easy_handeye)/launch/calibrate.launch" >
    <arg name="namespace_prefix" value="$(arg namespace_prefix)" />
    <arg name="eye_on_hand" value="true" />

    <arg name="tracking_base_frame" value="kinect2_rgb_optical_frame" />
    <arg name="tracking_marker_frame" value="camera_marker" />
    <arg name="robot_base_frame" value="base_link" />
    <arg name="robot_effector_frame" value="tool0" />

    <arg name="freehand_robot_movement" value="true" />
    <arg name="robot_velocity_scaling" value="0.25" />
    <arg name="robot_acceleration_scaling" value="0.2" />
</include>

Screenshot from 2019-04-03 16-45-13

@chandanlal92
Author

Can you tell me in which convention the pose from this calibration result is expressed? I have to convert the obtained pose into a homogeneous transformation matrix, which is why I have this doubt.

@marcoesposito1988
Collaborator

In the GUI, after pressing "Compute", you can see the result of the calibration as a translation vector and a rotation quaternion in XYZW format. The direction is from the robot to the tracking system.

You can also use some of the source code of this package to automatically extract the values from the yaml file in a script.
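
For example, a minimal sketch of turning the numbers shown in the GUI into a 4x4 homogeneous matrix (the values below are placeholders for your own result; tf.transformations expects the quaternion in x, y, z, w order):

from tf import transformations as tfs

# Placeholder values: substitute the translation and the XYZW quaternion from the GUI.
tx, ty, tz = 0.1, 0.2, 0.3
qx, qy, qz, qw = 0.0, 0.0, 0.0, 1.0

T = tfs.quaternion_matrix([qx, qy, qz, qw])  # 4x4 with the rotation part filled in
T[0:3, 3] = [tx, ty, tz]                     # insert the translation
# As stated above, the transform goes from the robot frame to the tracking system frame.
print(T)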

@chandanlal92
Author

Just for clarification: "the direction is from the robot to the tracking system" means the pose of the flange with respect to the tracking system, or vice versa?

@chandanlal92
Author

I am getting the wrong hand pose: the signs of the X-axis and Y-axis are inverted.

@chandanlal92
Author

The hand-world pose of the robot which I am receiving has the wrong signs of X and Y.
I am getting
X = -0.467
Y = -0.641
whereas the actual pose shown on the robot panel is
X = 0.467
Y = -0.641

This is affecting my calibration results; the direction changes. Please help me out with this.

@marcoesposito1988
Collaborator

Do you mean the pose between the robot base and the robot hand? So you have the problem also when starting the robot alone, without easy_handeye or the tracking system?

In this case, this is a problem of the robot driver: what it is publishing on tf does not agree with the state according to the panel. This is not related to easy_handeye.
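
A minimal way to do that check with the robot alone (a sketch; the frame names are the ones from the launch file above, and note that the URDF's base_link convention may differ from the base frame used by the teach pendant):

import rospy
import tf

rospy.init_node('check_robot_tf')
listener = tf.TransformListener()
listener.waitForTransform('base_link', 'tool0', rospy.Time(0), rospy.Duration(10.0))
trans, rot = listener.lookupTransform('base_link', 'tool0', rospy.Time(0))
print('base_link -> tool0 translation [m]: %s' % (trans,))
print('base_link -> tool0 rotation (xyzw): %s' % (rot,))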

@chandanlal92
Author

Yes, I meant the robot base and the hand. I checked with rviz: the relative position of tool0_controller exactly matches the robot panel poses. But the hand->world translation which I get when computing seems to have different signs. I do not understand this.

@chandanlal92
Author

The calibration results are wrong for camera-in-hand. I am getting a huge error: X = 3 cm, Y = 2 cm.
Is this wrong calibration due to a wrong definition of the frames in the launch file?

<arg name="tracking_base_frame" value="kinect2_rgb_optical_frame" />
<arg name="tracking_marker_frame" value="camera_marker" />
<arg name="robot_base_frame" value="base_link" />
<arg name="robot_effector_frame" value="tool0" />

Is this the correct way of defining them? Could you please explain whether the parameters are correctly set in the launch files? Do you have any documentation?


@chandanlal92
Author

Is this the correct description? I am getting the marker frame, but I am not sure whether it is correct. I can see a 2 mm error in all directions when checking aruco_ros/result and aruco_tracker/results.

@chandanlal92
Author

I can see a consistent change in the calibration results for the translation in X.

@marcoesposito1988
Collaborator

marcoesposito1988 commented May 17, 2019

I am afraid that I ran out of advice. I can only stress the importance of the fact that accurate tracking is a prerequisite for a successful calibration.

If, without calibrating, the relative displacements in Cartesian space reported by the tracking system and by the robot disagree, that is a red flag. A misconfiguration of the tracking system or a faulty camera calibration are the most probable causes. However, I am only maintaining this project, on a best-effort basis.
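
A sketch of that consistency check (frame names from the launch file above; move the robot by a pure translation, keeping the orientation fixed, so that the two distances are directly comparable):

import numpy as np
import rospy
import tf

rospy.init_node('displacement_check')
listener = tf.TransformListener()

def snapshot():
    listener.waitForTransform('base_link', 'tool0', rospy.Time(0), rospy.Duration(10.0))
    robot_t, _ = listener.lookupTransform('base_link', 'tool0', rospy.Time(0))
    listener.waitForTransform('kinect2_rgb_optical_frame', 'camera_marker',
                              rospy.Time(0), rospy.Duration(10.0))
    marker_t, _ = listener.lookupTransform('kinect2_rgb_optical_frame', 'camera_marker',
                                           rospy.Time(0))
    return np.array(robot_t), np.array(marker_t)

raw_input('Move the robot to the first pose and press Enter...')   # Python 2, as on Kinetic
r1, m1 = snapshot()
raw_input('Move the robot to the second pose and press Enter...')
r2, m2 = snapshot()

# With a pure translation of the flange, these two distances should agree closely.
print('robot displacement  [m]: %.4f' % np.linalg.norm(r2 - r1))
print('marker displacement [m]: %.4f' % np.linalg.norm(m2 - m1))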

@chandanlal92
Author

Sorry for the inconvenience. Everything is working fine; it is a very good calibration tool. Thank you very much for the code.
