Calibrating my Gripper and Kinect v2 mounted on UR #32
Is it possible to define my tool with your calibration and find out the transformation between the sensor and the tool center point?
Hi @chandanlal92, I am not sure if I understand what you mean. This library is for hand-eye calibration, that is, to find the transformation between the robot arm (flange or base) and a tracking system (camera, etc). Finding the transformation between the flange (robot arm) and a notable point of the gripper is a different problem. For that you will need something else, e.g. a pivot calibration. The two problems are distinct and independent. Once you have performed both calibrations, you can find the position of the gripper w.r.t. the tracking system/sensor.
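The pivot calibration mentioned above can be sketched as a least-squares problem: while the tool tip is held on a fixed point, each recorded flange pose (R_i, p_i) satisfies R_i · t_tip + p_i = p_pivot. This is a generic illustration, not part of easy_handeye; the `pivot_calibration` helper is hypothetical:

```python
import numpy as np

def pivot_calibration(rotations, translations):
    """Least-squares pivot calibration.

    Given flange poses (R_i, p_i) recorded while the tool tip rests on a
    fixed pivot point, solve R_i @ t_tip + p_i = p_pivot for the tip
    offset t_tip (in the flange frame) and the pivot point p_pivot
    (in the base frame).
    """
    n = len(rotations)
    A = np.zeros((3 * n, 6))
    b = np.zeros(3 * n)
    for i, (R, p) in enumerate(zip(rotations, translations)):
        A[3*i:3*i+3, :3] = R             # coefficients of t_tip
        A[3*i:3*i+3, 3:] = -np.eye(3)    # coefficients of p_pivot
        b[3*i:3*i+3] = -np.asarray(p)
    x, *_ = np.linalg.lstsq(A, b, rcond=None)
    return x[:3], x[3:]  # t_tip, p_pivot
```

At least a few poses with distinct orientations are needed, otherwise the system is rank-deficient and the solution is not unique.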
Is it possible to check the ArUco marker detection before running the algorithm with the robot? I am trying to run the algorithm with my Kinect v2 sensor, but I am not able to view the marker detection, and moreover the GUI closes if I open the image viewer. I just want to make sure that the Kinect is detecting the marker before running the algorithm.
I am unable to connect to the robot UR10. Does this work with a UR10? I am able to detect the marker but unable to connect to my robot, even though I can ping it.
This is my launch file
Now I am able to connect to the robot after installing ur_modern_driver.
I am not able to get the measurement; the GUI crashes.
This is the error crashing the GUI: [ERROR] [1554206653.429518]: Error processing request: canTransform: source_frame camera_marker does not exist.
camera_marker is shown above the robot. This seems weird.
In order to find the origin of the problem, I suggest you look at each component individually. First, just start aruco_ros with the Kinect and check that the marker tracking works; in particular, that the size of the marker is the same as the one you pass to aruco as an argument. Then check the robot driver: that the position you get in tf is the same as the one on the robot panel. Finally, you can check that you configured easy_handeye correctly, in particular the eye_on_hand parameter and the reference frames. In the case of the Kinect, you should provide the optical camera center as tracking base frame and the root link of the Kinect's tf tree as reference frame.
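The component-by-component check above can be done from the command line with standard ROS tools. The topic and frame names below are assumptions (the aruco debug topic and the UR frame names depend on your launch files; camera_marker is the marker frame from the error above):

```shell
# 1. Marker tracking alone: view the debug image published by aruco_ros
#    (topic name is an assumption; check `rostopic list` for yours)
rosrun image_view image_view image:=/aruco_tracker/result

# 2. Robot driver alone: compare the tf pose of the end effector with
#    the pose shown on the teach pendant (frame names are assumptions)
rosrun tf tf_echo base tool0_controller

# 3. Tracking output as tf: the marker frame must exist in tf before
#    easy_handeye can sample transforms
rosrun tf tf_echo camera_link camera_marker
```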
The issue is that my aruco_ros is not able to get the Kinect2_optical frame. The marker is detected properly, and the marker ID and size are fine.
The marker is tracked properly, but the Kinect2_optical frame is not available.
This worked for me... just need to verify...
Can you tell me in which convention the pose obtained from this calibration is expressed? I have to convert the obtained pose into a homogeneous transformation matrix, which is why I have this doubt.
In the GUI, after pressing "Compute", you can see the result of the calibration as a translation vector and a rotation quaternion, in XYZW format. The direction is from the robot to the tracking system. You can also use some of the source code of this package to automatically extract the values from the yaml file in a script.
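The XYZW quaternion and translation vector from the GUI can be turned into a homogeneous transformation matrix with a few lines of numpy. A minimal sketch; the `quat_xyzw_to_matrix` helper is hypothetical, not part of easy_handeye:

```python
import numpy as np

def quat_xyzw_to_matrix(x, y, z, w, tx=0.0, ty=0.0, tz=0.0):
    """Build a 4x4 homogeneous transform from an XYZW quaternion and a
    translation vector, using the standard quaternion-to-rotation formula."""
    # Normalize to guard against rounding in the stored values.
    n = np.sqrt(x*x + y*y + z*z + w*w)
    x, y, z, w = x/n, y/n, z/n, w/n
    return np.array([
        [1 - 2*(y*y + z*z), 2*(x*y - z*w),     2*(x*z + y*w),     tx],
        [2*(x*y + z*w),     1 - 2*(x*x + z*z), 2*(y*z - x*w),     ty],
        [2*(x*z - y*w),     2*(y*z + x*w),     1 - 2*(x*x + y*y), tz],
        [0.0,               0.0,               0.0,               1.0],
    ])
```

For example, the identity quaternion (0, 0, 0, 1) with translation (1, 2, 3) yields an identity rotation block with that translation in the last column.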
Just for clarification: "the direction is from the robot to the tracking system" means the pose of the flange with respect to the tracking system, or vice versa?
I am getting the wrong hand pose: the signs of the X and Y axes are inverted.
The hand->world pose of the robot which I am receiving has wrong signs on X and Y, and this affects my calibration results; the direction changes. Please help me out with this.
Do you mean the pose between the robot base and the robot hand? So you have the problem also when starting the robot alone, without easy_handeye or the tracking system? In this case, this is a problem of the robot driver: what it is publishing on tf does not agree with the state according to the panel. This is not related to easy_handeye.
Yes, I meant robot base and hand. I checked with RViz: the relative position of tool_0 controller exactly matches the robot panel poses. But the hand->world translation which I get while computing seems to have different signs. I do not understand this.
Calibration results are wrong for camera-in-hand. I am getting a huge error: X = 3 cm, Y = 2 cm.
Is this the correct way of defining it? Could you please explain whether the parameters are correctly set in the launch files? Do you have any documentation?
Is this the correct description? I am getting the marker frame but am not sure if it is correct. I can see a 2 mm error in all directions when checking aruco_ros/result and aruco_tracker/results.
I can see a consistent change in the calibration results for the translation in X.
I am afraid that I ran out of advice. I can only stress the importance of the fact that accurate tracking is a prerequisite for a successful calibration. If, without calibrating, the relative displacement in Cartesian space reported by the tracking system and the robot disagree, that is a red flag. A misconfiguration of the tracking system or faulty camera calibration are the most probable causes. However, I am only maintaining this project, on a best effort basis.
Sorry for the inconvenience; everything is working fine. It is a very good calibration tool. Thank you very much for the code.