Search the Community

Showing results for tags 'sawyer'.




Found 30 results

  1. Atmaraaj

    Change ROS Master on Sawyer

    I would like to have my ROS Master running on a different system. Is it possible to set another system as the ROS Master instead of Sawyer?
  2. Hi everyone! I'm trying to set up a motion planning pipeline with MoveIt! on my Sawyer, but the only MoveIt! tutorials are in C++. When I run that tutorial I get a syntax error (even though I'm sure the code is okay). Does anyone have code or a tutorial for this in Python? Or does anyone know what I need to do to compile the C++ code? Can anyone help me? Thanks a lot!
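A hedged Python sketch of the pipeline question 2 asks about, using the moveit_commander API instead of the C++ tutorial. The group name "right_arm" and the pose numbers are assumptions, not verified values for any particular setup; check the group name with robot.get_group_names().

```python
# Sketch only: assumes a running ROS master, Sawyer's MoveIt config loaded,
# and a "right_arm" planning group (verify with robot.get_group_names()).

def main():
    import sys
    import rospy
    import moveit_commander
    import geometry_msgs.msg

    moveit_commander.roscpp_initialize(sys.argv)
    rospy.init_node('sawyer_moveit_sketch', anonymous=True)

    robot = moveit_commander.RobotCommander()
    group = moveit_commander.MoveGroupCommander("right_arm")

    # Cartesian pose goal; the numbers here are placeholders.
    target = geometry_msgs.msg.Pose()
    target.position.x = 0.5
    target.position.y = 0.0
    target.position.z = 0.3
    target.orientation.w = 1.0

    group.set_pose_target(target)
    group.go(wait=True)          # plan and execute in one call
    group.stop()
    group.clear_pose_targets()

# To run (needs a live robot or simulation with MoveIt up):
#     main()
```

On the C++ side: C++ tutorial nodes have to be built with catkin (catkin_make or catkin build) inside a workspace before rosrun can find the executable, whereas Python scripts run directly.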
  3. Arunava Nag

    C++ MoveIt Demo for Sawyer

    Do we have demos in C++ for Sawyer, especially ones using MoveIt?
  4. A new issue has arisen: none of the example programs work, even after running the enable command $ rosrun intera_interface enable_robot.py -e. For example, rosrun intera_examples joint_torque_springs.py does not move the robot to a neutral pose, and rosrun intera_examples camera_display.py returns the following in the terminal:

        Initializing node...
        [WARN] [WallTime: 1484625635.265185] Timed out waiting for command acknowlegment...
        [WARN] [WallTime: 1484625640.309822] Timed out waiting for node interface valid...
        [ERROR] [WallTime: 1484625640.310784] Failed to stop right_hand_camera from streaming on this robot. Unable to start streaming from head_camera
        [INFO] [WallTime: 1484625640.319965] Camera_display node running. Ctrl-c to quit

    I am able to get the joint states, but none of the example code works. Do you know why this is happening?
  5. jorgegalds_DON

    Sawyer not connecting to ROS

    I followed the networking steps on the Intera website and tried to run the examples, but the message "unable to communicate with master" always appeared. The intera.sh file has been correctly updated, and I don't know what other approach to take. Are there any extra packages that need to be downloaded, or steps to take, beyond the ones described on the website?
  6. Hi everyone, I have a question and hope to get your help. We currently have one Sawyer arm in our lab, but our tasks require two arms working simultaneously, so we want to buy a second Sawyer to make them work together. My question is whether it is feasible to control two arms simultaneously from one laptop running ROS. The tutorial documentation only has examples for one arm. How should two arms be connected to one laptop via network cable in hardware? And is there an example of setting up an environment for controlling two Sawyer arms synchronously? Thank you.
  7. nselby

    Individual Joint Impedance Control

    I want to be able to move the end effector freely along a circle about the base of my Sawyer with my hand. Is there a script similar to set_interaction_options.py or constrained_zeroG.py that will allow me to constrain individual joints?
  8. Asif Rana

    DH parameters for Sawyer

    Hi all, I need the DH parameters of the Sawyer robot for my research project. Does anyone out here happen to have them calculated already? -Asif
  9. What exactly is the effort array in the messages on the /robot/joint_states topic, and how does it relate to the torque at each joint?
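A small sketch for inspecting the values question 9 asks about: in sensor_msgs/JointState the name, position, velocity and effort arrays are index-aligned, so effort[i] belongs to name[i]. Reading the effort entries as measured joint torques in N·m is an assumption to confirm against the Intera SDK docs for your version.

```python
# Pair each joint name with its effort entry from a JointState message.
def efforts_by_name(names, efforts):
    return dict(zip(names, efforts))

def main():
    import rospy
    from sensor_msgs.msg import JointState

    def callback(msg):
        torques = efforts_by_name(msg.name, msg.effort)
        # Report one joint as an example.
        rospy.loginfo("right_j1 effort: %s", torques.get("right_j1"))

    rospy.init_node("effort_listener")
    rospy.Subscriber("/robot/joint_states", JointState, callback)
    rospy.spin()

# To run (needs a live ROS master connected to Sawyer):
#     main()
```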
  10. Nawid

    Sawyer SDK license

    Changing to SDK mode directs me to rethinkrobotics.com/sdkforum to get a license code, but I cannot find any online form there. Where do I get the license code? I have attached an image of the screen I get.
  11. jorgegalds_DON

    Change Sawyer 'eyes'

    I wanted to know if there is a way to change the 'eyes' image to a pre-selected image when booting in SDK mode.
  12. Hello, I have a Sawyer. My system runs ROS Kinetic on Ubuntu 16.04. I tried using the intera.sh script; it SSHes me in, but I cannot communicate with the ROS master. I also tried a virtual machine running ROS Indigo on Ubuntu 14.04 and still couldn't communicate with the ROS master. In all cases I could ping the robot from my workstation. Also, in SDK mode it throws a camera calibration error; I have attached a screenshot of that. Kindly help.
  13. Ulysse

    Execute Trajectory using MoveIt!

    Hi all, I am facing an issue trying to use the MoveIt! package to control the Sawyer. I followed all the tutorials on how to set up ROS Kinetic, the Sawyer SDK packages and the MoveIt! package, and I am now working my way through the MoveIt! tutorial. I can successfully load the robot inside RViz, move the end effector in the interface and plan a trajectory. However, when I try to execute it, RViz shows a "failed" message and in the terminal output I can read:

        Failed to validate trajectory: couldn't receive full current joint state within 1s

    I also tried to use MoveIt! through the Python interface, but when I execute the following:

        import sys
        import copy
        import rospy
        import moveit_commander
        import moveit_msgs.msg
        import geometry_msgs.msg
        from std_msgs.msg import String

        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node('move_group_python_interface_tutorial', anonymous=True)
        robot = moveit_commander.RobotCommander()
        scene = moveit_commander.PlanningSceneInterface()
        group = moveit_commander.MoveGroupCommander("right_arm")
        display_trajectory_publisher = rospy.Publisher('/move_group/display_planned_path', moveit_msgs.msg.DisplayTrajectory, queue_size=20)
        print "============ Reference frame: %s" % group.get_planning_frame()
        print "============ End effector: %s" % group.get_end_effector_link()
        print "============ Robot Groups:"
        print robot.get_group_names()
        print "============ Printing robot state"
        print robot.get_current_state()
        print "============"

    I get something like:

        ============ Reference frame: /base
        ============ End effector: right_hand
        ============ Robot Groups: ['right_arm']
        ============ Printing robot state
        [ WARN] [1526377733.002742750]: Joint values for monitored state are requested but the full state is not known
        joint_state:
          header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: "/base"}
          name: [right_j0, head_pan, right_j1, right_j2, right_j3, right_j4, right_j5, right_j6]
          position: [0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0, 0.0]
          velocity: []
          effort: []
        multi_dof_joint_state:
          header: {seq: 0, stamp: {secs: 0, nsecs: 0}, frame_id: "/base"}
          joint_names: []
          transforms: []
          twist: []
          wrench: []
        attached_collision_objects: []
        is_diff: False
        ============

    This is basically the same problem as in the ROS Q&A question "MoveIt: troubles with joint values"; unfortunately that question did not get any useful answer. The weird thing is that the joint state is actually published. When I echo the /robot/joint_states topic I can see the values being updated:

        header: {seq: 600867, stamp: {secs: 1526364559, nsecs: 890697828}, frame_id: ''}
        name: [head_pan, right_j0, right_j1, right_j2, right_j3, right_j4, right_j5, right_j6, torso_t0]
        position: [-4.0742021484375, -0.2243544921875, -1.1203125, 0.4582724609375, 2.05134375, -0.29490625, 0.69954296875, -4.114162109375, 0.0]
        velocity: [-0.001, -0.001, -0.001, -0.001, -0.001, -0.001, -0.001, -0.001, 0.0]
        effort: [-0.012, 0.052, -20.068, -1.06, -5.996, 0.572, 0.104, -0.284, 0.0]

    And when I query information about the /robot/joint_states topic, I can see that both the /realtime_loop and /move_group nodes are publishing/subscribing to it:

        Type: sensor_msgs/JointState
        Publishers:
         * /realtime_loop (http://192.168.212.60:42358/)
        Subscribers:
         * /robot_state_publisher (http://192.168.212.60:38760/)
         * /robot/environment_server (http://192.168.212.60:59368/)
         * /calibrate_arm (http://192.168.212.60:57953/)
         * /PositionKinematicsNode_right (http://192.168.212.60:56399/)
         * /JointMotionPathNode_right (http://192.168.212.60:58612/)
         * /motion_controller (http://192.168.212.60:37187/)
         * /sdk_position_w_id_joint_trajectory_action_server_right (http://192.168.212.160:37229/)
         * /move_group (http://192.168.212.160:43623/)

    Any idea what is going on, or how I can investigate the error further? Thanks for your help.
  14. lr101095

    sawyer inverse kinematics

    I'm trying to write a program that subscribes to the endpoint states of Sawyer through rostopics. I've opted to subscribe to each individual endpoint_state value; my program is based around ik_service_client, hence why I decided to subscribe to the individual values. I have the following:

        sub_p1 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/position/x", String)
        sub_p2 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/position/y", String)
        sub_p3 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/position/z", String)
        sub_q1 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/orientation/x", String)
        sub_q2 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/orientation/y", String)
        sub_q3 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/orientation/z", String)
        sub_q0 = rospy.Subscriber("/robot/limb/right/endpoint_state/pose/orientation/w", String)

        poses = {'right': PoseStamped(
            header=hdr,
            pose=Pose(
                position=Point(x=sub_p1, y=sub_p2, z=sub_p3),
                orientation=Quaternion(x=sub_q1, y=sub_q2, z=sub_q3, w=sub_q0),
            ),
        )}

    However, as a test, I print "poses" and get the following output:

        pose:
          position:
            x: <rospy.topics.Subscriber object at 0x7fe9f9c31c50>
            y: <rospy.topics.Subscriber object at 0x7fe9f9c31d50>
            z: <rospy.topics.Subscriber object at 0x7fe9f9c31d90>
          orientation:
            x: <rospy.topics.Subscriber object at 0x7fe9f9bd7050>
            y: <rospy.topics.Subscriber object at 0x7fe9f9c31e10>
            z: <rospy.topics.Subscriber object at 0x7fe9f9bd7710>
            w: <rospy.topics.Subscriber object at 0x7fe9f9bd79d0>

    How can I print/access the actual data, so that I get actual numbers I can then pass on to perform inverse kinematics? I'm not sure if I'm using the correct data type (i.e. String or otherwise) to pass on the values. Any corrections, suggestions or solutions would be much appreciated. - Luke
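The Subscriber objects printed in question 14 are expected behaviour: rospy.Subscriber returns a subscription handle, not data, and values arrive through a callback. The pose fields are also not individual topics, so the usual pattern is to subscribe once to the whole endpoint_state topic and unpack the numbers from the message. A sketch, assuming the intera_core_msgs/EndpointState message type from the Intera SDK:

```python
# Extract plain numbers from an EndpointState-like message.
def pose_from_endpoint(msg):
    p, q = msg.pose.position, msg.pose.orientation
    return {'position': (p.x, p.y, p.z),
            'orientation': (q.x, q.y, q.z, q.w)}

def main():
    import rospy
    from intera_core_msgs.msg import EndpointState

    def callback(msg):
        pose = pose_from_endpoint(msg)
        # `pose` now holds floats that can be fed to the IK service.
        rospy.loginfo("endpoint: %s", pose)

    rospy.init_node("endpoint_listener")
    rospy.Subscriber("/robot/limb/right/endpoint_state", EndpointState, callback)
    rospy.spin()

# To run (needs a live ROS master connected to Sawyer):
#     main()
```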
  15. lr101095

    vibrating model

    While running the Sawyer simulation in Intera 5.2, I can see that the model is vibrating. Calling rostopic echo /robot/limb/right/gravity_compensation_torques, you can see that the values of actual_position, actual_velocity, actual_effort and gravity_only change constantly. I compared the same topic on the real robot and saw no such changes in the output. Does anyone know why this happens in the simulation, or how to resolve it?
  16. lr101095

    joint angles in simulation

    I was comparing the joint angles and joint limits in the simulation and on the real robot, and I don't think the correct joint limits have been integrated into the simulation. I understand that there may be some variance on the real robot, but I just wanted to make sure I wasn't observing something incorrectly. I am trying to run a program, written for the real robot, in simulation. It works as expected on the real robot, but I am running into several discrepancies in simulation. One concern is that the joint limits do not match those of the real robot; because of this, the simulated robot isn't in the correct pose. For example, in simulation joint 6 (wrist) has a range of -3.14 to 3.14, while the real Sawyer has a range of -4.714 to 4.709 in the same joint. I'm not sure whether that is why the simulation fails to run the program as expected, but I suspect it may be, because the reduced "flexibility" of the simulated robot prevents the required pose from being reached: the program requires a joint angle of 3.373 in joint 6, which is outside the range provided in the simulation, so the required pose is not achieved. It is worth noting that I have only observed this difference in joint 6, and I haven't yet checked whether there are discrepancies in the other joints. If anyone has any advice on how to resolve this, or if there is an upcoming update that addresses it, please let me know.
  17. lr101095

    SolvePositionIK and Transformations

    Can someone please explain how SolvePositionIK works in terms of path planning? I want to know how to solve for a trajectory by inputting the point position (and maybe orientation) in order to achieve a desired pose relative to a reference point. I'm using a Kinect in simulation to scan the environment, so I want to be able to take a point in the camera frame and calculate its position in the global frame, so that the transformation required for IK to solve for the pose can be derived. Can anyone suggest a way of achieving that? Thanks in advance - Luke
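The frame math behind the camera-to-global step in question 17 can be sketched without ROS: a point p expressed in the camera frame maps into the base frame as p_base = R·p_cam + t, where (R, t) is the camera's pose in the base frame. In a live setup that transform would normally come from tf2 (a lookup_transform between the two frames); the values below are hand-written placeholders just to show the arithmetic.

```python
# Apply p_base = R * p_cam + t with plain lists (R is a row-major 3x3).
def transform_point(R, t, p):
    return tuple(sum(R[i][j] * p[j] for j in range(3)) + t[i]
                 for i in range(3))

# Placeholder pose: camera rotated 180 degrees about z, 1 m along base x.
R = [[-1.0,  0.0, 0.0],
     [ 0.0, -1.0, 0.0],
     [ 0.0,  0.0, 1.0]]
t = [1.0, 0.0, 0.0]

print(transform_point(R, t, [0.2, 0.1, 0.5]))  # approximately (0.8, -0.1, 0.5)
```

The transformed point (plus a desired orientation) is then what gets packed into the PoseStamped sent to SolvePositionIK.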
  18. lr101095

    Sawyer Path Planning

    Just wanted to ask if anyone can suggest a method of path planning for Sawyer (both on the robot and in simulation). There is no collision avoidance as of yet, and it is not yet necessary for what I'm trying to achieve; I just want to be able to dictate the path of the robot between two points (initial end-effector location and desired location, including end-effector orientation). I know MoveIt has a path planner, but I'm not sure how to use it. To my understanding, MoveIt can plan a trajectory between two poses and execute it, taking joint limits etc. into account. I want to be able to determine a path just by inputting the xyz and normal of a point. For example, I want to move the end effector to a surface (such as a table surface) with the end effector's z-axis parallel to the normal of the surface. Assuming the xyz (global frame) of the table point is known, I want to control Sawyer to move from an arbitrary start point to the desired location with the desired orientation. Related to this, can someone also explain how inverse kinematics works with Sawyer? I know there is the IK service example involving ik_service_client and SolvePositionIK, but I don't really understand what is going on in that example, nor what it does. Any help would be much appreciated. Cheers, Luke
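For the "z-axis parallel to the surface normal" requirement in question 18, the goal orientation can be built as the shortest-arc rotation taking the tool z-axis onto the normal; that quaternion plus the xyz of the surface point is exactly what a MoveIt pose goal or an IK request consumes. A self-contained sketch of the math (no ROS dependencies; the normal must be unit length):

```python
import math

def quat_align_z(normal):
    """Shortest-arc quaternion (x, y, z, w) rotating (0, 0, 1) onto `normal`."""
    nx, ny, nz = normal
    if nz < -0.999999:
        # Pointing straight down: any 180-degree flip works; pick the x-axis.
        return (1.0, 0.0, 0.0, 0.0)
    # Unnormalised quaternion is (cross(z, n), 1 + dot(z, n)); normalising
    # it gives the closed form below.
    w = math.sqrt((1.0 + nz) / 2.0)
    s = 1.0 / (2.0 * w)
    return (-ny * s, nx * s, 0.0, w)
```

For a horizontal table with upward normal (0, 0, 1) this returns the identity quaternion; to point the tool down at the table you would align with the inward normal (0, 0, -1) instead.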
  19. lr101095

    vibrating model

    I'm currently running Intera 5.2 on my PC, trying to run a Sawyer simulation with a force/torque sensor on the end effector, and using rqt_plot to read the f/t sensor. But I'm getting unwanted noise in the signal even when the sensor is not in contact with anything (see first screenshot). When I hold the f/t sensor statically against a surface (end effector pushed toward the surface and kept in contact), the signal oscillates violently between a maximum force and 0 (see second screenshot). Zooming into the models of both Sawyer and the f/t sensor, I can see that both are vibrating slightly; I can tell because the shadow oscillates a little. I'm posting this question here to ask whether there have been any changes to the simulation since 5.2.2 was released, and whether anyone has been able to resolve this "error".
  20. lr101095

    Backwards Compatibility

    I recently updated to Intera 5.2 on my personal PC. However, I work in a lab where Sawyer is shared among other lab techs, and the robot itself is still on Intera 5.1.2. Is there any way I can still run the robot with 5.2 on my PC while the robot runs 5.1.2? I don't want to update Sawyer just yet because it might affect performance for other users, and I also don't want to downgrade my laptop to 5.1 because I'm making use of the Sawyer simulation. Please advise; any help is much appreciated.
  21. I am doing research that requires accurate coordinates and joint values from Sawyer, and I have the following questions about Sawyer's joint accuracy. What are the accuracies of each limb length, of the joint angle values published via ROS, and of the kinematic solvers? Is it possible to directly get the poses (coordinates) of any joint of choice? For now we just calculate these values from the limb lengths and the published joint values, hence the first question. Where is the virtual gripper/end-effector point relative to Sawyer's arm? As I understand it, the end-effector point is needed to calculate the inverse kinematics, but I am unsure where exactly this point is. If it is possible to obtain this end-effector position, how can I tune/calibrate it when a different gripper is attached? I know the Intera UI allows some configuration, but is this possible with the Intera SDK too? I would gladly explain more about my aims if more explanation is required. Thank you.
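On the "poses of any joint of choice" part of question 21: every link's transform relative to the base frame is broadcast on TF, so it can be read directly instead of being recomputed from limb lengths. A hedged sketch using tf2; the link name right_l4 is an assumption about Sawyer's URDF, so list the real frame names first (e.g. with rosrun tf view_frames).

```python
# Look up a link's pose relative to the base frame via tf2.
def link_pose(target_link="right_l4", base="base"):
    import rospy
    import tf2_ros

    buf = tf2_ros.Buffer()
    tf2_ros.TransformListener(buf)       # fills the buffer in the background
    t = buf.lookup_transform(base, target_link,
                             rospy.Time(0),        # latest available
                             rospy.Duration(4.0))  # wait up to 4 s
    return t.transform.translation, t.transform.rotation

# To run (needs a live ROS master connected to Sawyer):
#     import rospy
#     rospy.init_node("link_pose_lookup")
#     print(link_pose())
```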
  22. Danfei

    Change URDF at run time

    Hi guys, we are trying to update the robot's URDF with external obstacles in real time for collision checking. Is there a way to overwrite the existing collision avoidance state? Thanks!
  23. StuweXY

    Programming Sawyer

    Hello everyone, my goal is to program Sawyer with Python. Ideally I could first run a simulation of my program to check that everything is correct. To that end I familiarized myself with some of the provided example programs, did the MoveIt tutorial, and also looked a bit further into RViz. My problem now is that I checked out the RViz tutorial, but its commands have little to do with the ones used in the examples. So my questions are: is it better to orient myself on the example programs rather than on the general ROS material? And how can I test my program, e.g. by sending it into RViz? I already wrote a small one for initialisation, copied it into the examples and tried running it, but I was told it wasn't an executable. Best regards
  24. Murilo

    Modified URDF

    Dear forum, we removed the Sawyer from its wheeled base and mounted it on a large workbench. We have an updated URDF model without the wheel base, and I was wondering what the most appropriate way is to have the Sawyer use it. I can overwrite the /robot_description parameter with the new URDF, but the Sawyer does not take the updated parameter into account (as expected). I could start a robot_state_publisher from the workstation to publish the missing TF, killing Sawyer's respective node, but that seems more of a workaround than a solution. Any suggestions would be greatly appreciated! Thanks a lot!
  25. StuweXY

    Unable to ping Sawyer

    Hi, I've never worked with Ubuntu and I'm now rather lost. Before describing the problem, here is what I already did: 1) installed Ubuntu 14.04.05 as a VM; 2) set up ROS (the tutorials work); 3) followed http://sdk.rethinkrobotics.com/intera/Workstation_Setup and http://sdk.rethinkrobotics.com/intera/Networking to connect directly to Sawyer. The first tutorial fails at the command "rostopic list" (ERROR: Unable to communicate with master!), and in the second tutorial I can't ping the robot. env | grep ROS returns, among other things, ROS_MASTER_URI=http://<serial number>.local:11311 and ROS_HOSTNAME set to my computer. I can ping my own computer, but when I try to ping Sawyer with ping <serial number>.local it returns "unknown host". A colleague has Sawyer connected via a router and it works fine. So what is the problem with the direct connection?