senaa

Everything posted by senaa

  1. senaa

    Gazebo Support?

    Is there a timeline for supporting Sawyer simulation through Gazebo? I noticed a Rethink internship advertised earlier in the year with this listed as one of the tasks, so I'm hoping there has been some progress.
  2. senaa

    Gazebo Support?

    Note: it's the newer examples which won't work; the pre-5.3.0 examples should still work, as long as you use the 5.2.0 packages. Run git checkout release-5.2.0 in the intera_common, intera_sdk, sawyer_robot, and sawyer_simulator packages, then catkin_make the workspace. The simulator should work after that.
  3. senaa

    Zero-G Mode Button Unresponsive

    As the title suggests, it seems our Zero-G button is no longer responsive. Has anyone else experienced this? Is there a way to trigger Zero-G mode through the SDK? Aran
  4. senaa

    sawyer inverse kinematics

    Ah OK, I see. Yes, this gives you position and orientation data, as the ee_pose variable is of type geometry_msgs/Pose. For example, to access the x position of the pose and the w component of the orientation:

    ee_pose.position.x
    ee_pose.orientation.w

    The other elements are retrieved in a similar way.
  5. senaa

    sawyer inverse kinematics

    ROS subscribers run using callbacks: http://wiki.ros.org/ROS/Tutorials/WritingPublisherSubscriber(python)

    Example for getting and using the endpoint state:

    import rospy
    from intera_core_msgs.msg import EndpointState
    from geometry_msgs.msg import Pose

    ee_pose = Pose()

    def ee_cb(msg):
        # store the latest end-effector pose from the endpoint state message
        global ee_pose
        ee_pose = msg.pose

    if __name__ == '__main__':
        rospy.init_node('ee_node')
        rospy.Subscriber("/robot/limb/right/endpoint_state", EndpointState, callback=ee_cb)
        r = rospy.Rate(100)
        while not rospy.is_shutdown():
            print ee_pose             # do whatever you need with the ee_pose
            print ee_pose.position.x  # for example
            r.sleep()

    The question, though, is why you then need to perform IK on the endpoint state, when the joint positions are already published on the /robot/joint_states topic?
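    For completeness, here's a minimal, untested sketch of reading the joint positions directly from /robot/joint_states (I'm assuming the arm joints are named right_j0 to right_j6; check the name field of the message on your robot):

    import rospy
    from sensor_msgs.msg import JointState

    joint_positions = {}

    def js_cb(msg):
        # JointState carries parallel name/position/velocity/effort lists
        global joint_positions
        joint_positions = dict(zip(msg.name, msg.position))

    if __name__ == '__main__':
        rospy.init_node('joint_state_listener')
        rospy.Subscriber("/robot/joint_states", JointState, js_cb)
        r = rospy.Rate(10)
        while not rospy.is_shutdown():
            if 'right_j0' in joint_positions:
                print joint_positions['right_j0']  # first arm joint angle, in radians
            r.sleep()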
  6. senaa

    Sawyer Vector Images

    There are 3D files in the downloads section that might do the trick - e.g. you could use them in SolidWorks to export a DXF, then import that into Inkscape?
  7. senaa

    Gazebo Support?

    Thanks Luke, that fixed it! I'm not seeing a /motion/motion_command topic, but the pick and place example does work. I'm still not able to run go_to_joint_angles.py though; I get the same error messages as before.
  8. senaa

    Gazebo Support?

    Thanks Mike! I'm having some trouble with the simulator, maybe I'm missing something? It seems the motion controller isn't starting with the sim? Is this something that needs to be explicitly launched? I can't see a /motion/motion_command topic, and when I try to run the joint position example it times out trying to connect to the motion controller server:

    rosrun intera_examples go_to_joint_angles.py -q -0.2 0.1 0.1 0.2 -0.3 0.2 0.4 -s 0.9 -a 0.1
    [ERROR] [1521112525.570134, 1038.170000]: Timed out waiting for Motion Controller Server to connect. Check your ROS networking and/or reboot the robot.
    [ERROR] [1521112526.200106, 1038.179000]: Trajectory FAILED to send

    I've tried restarting the system and checked that the robot is enabled. I can run other examples like the joint_torque_springs example, so it seems specific to the motion controller server.

    When I try to run the pick and place sim example, the robot loads without a gripper, causing various errors. Is there some way to tell the sim that a gripper should be connected when loading?
  9. senaa

    Robot Storage

    Thanks Don, I've also just found that there's a ROS param for the shipping pose. For others' reference, you can get it using:

    rosparam get /named_poses/right/poses/shipping

    Or directly from .../src/sawyer_robot/sawyer_description/params/named_poses.yaml
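    If you want to send the arm there from a script, something like the untested sketch below should work. It assumes the param holds seven joint angles in the same order as Limb.joint_names() (right_j0 to right_j6), and that the robot is already enabled (e.g. rosrun intera_interface enable_robot.py -e):

    import rospy
    import intera_interface

    if __name__ == '__main__':
        rospy.init_node('go_to_shipping_pose')
        limb = intera_interface.Limb('right')
        # assumed: the param is a list of seven joint angles ordered j0..j6
        shipping_angles = rospy.get_param('/named_poses/right/poses/shipping')
        target = dict(zip(limb.joint_names(), shipping_angles))
        limb.move_to_joint_positions(target)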
  10. senaa

    Robot Storage

    Is there a way to move Sawyer into its storage configuration, like the way it was configured when shipped? Or is it just a case of manually moving it into a suitable configuration?
  11. senaa

    Accuracy of limb data and data published via ROS

    Have you looked into whether the robot's tf data can give you what you need?
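    For example, something along these lines will give you the end-effector pose relative to the base via tf. This is an untested sketch, and the frame names 'base' and 'right_hand' are what I'd expect on Sawyer - check your tf tree (rosrun tf view_frames) if they differ:

    import rospy
    import tf

    if __name__ == '__main__':
        rospy.init_node('tf_lookup_example')
        listener = tf.TransformListener()
        # frame names assumed; adjust to match your tf tree
        listener.waitForTransform('base', 'right_hand', rospy.Time(0), rospy.Duration(4.0))
        trans, rot = listener.lookupTransform('base', 'right_hand', rospy.Time(0))
        print trans  # [x, y, z] translation in metres
        print rot    # [x, y, z, w] quaternion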
  12. senaa

    Access to Sawyer's Cameras Simultaneously

    I've since answered my own question, thought it might be useful to share. If I've missed an easier way to do this please let me know (or if there's some important reason why you shouldn't have both streaming at once).

    Commands to activate/deactivate the cameras are sent to them via a service as JSON. The snippet below shows how to activate/deactivate the camera streams; to get both streaming simultaneously, just set the status to True and call it for each camera. I'm wondering if there are other commands for controlling focus, exposure, the flash, etc...

    #!/usr/bin/env python
    # Aran Sena 2018
    #
    # Code example only, provided without guarantees
    #
    # Example for how to get both cameras streaming together
    ####
    import rospy
    from intera_core_msgs.srv import IOComponentCommandSrv
    from intera_core_msgs.msg import IOComponentCommand


    def camera_command_client(camera, status, timeout=0.0):
        rospy.wait_for_service('/io/internal_camera/' + camera + '/command')
        try:
            cam_control = rospy.ServiceProxy('/io/internal_camera/' + camera + '/command',
                                             IOComponentCommandSrv)
            cmd = IOComponentCommand()
            cmd.time = rospy.Time.now()
            cmd.op = 'set'
            if status:
                cmd.args = '{"signals": {"camera_streaming": {"data": [true], "format": {"type": "bool"}}}}'
            else:
                cmd.args = '{"signals": {"camera_streaming": {"data": [false], "format": {"type": "bool"}}}}'
            resp = cam_control(cmd, timeout)
            print resp
        except rospy.ServiceException as e:
            print "Service call failed: %s" % e


    if __name__ == '__main__':
        rospy.init_node('camera_command_client')
        camera_command_client(camera='head_camera', status=True)
        camera_command_client(camera='right_hand_camera', status=True)
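    Once both streams are active you should be able to subscribe to both image topics at the same time. A quick, untested way to check (I'm assuming the topics are /io/internal_camera/head_camera/image_raw and /io/internal_camera/right_hand_camera/image_raw - confirm with rostopic list on your robot):

    import rospy
    from sensor_msgs.msg import Image

    def make_cb(name):
        def cb(msg):
            # just report that frames are arriving from each camera
            rospy.loginfo_throttle(5, "%s: %dx%d image received" % (name, msg.width, msg.height))
        return cb

    if __name__ == '__main__':
        rospy.init_node('dual_camera_listener')
        # topic names assumed; adjust to whatever rostopic list shows
        rospy.Subscriber('/io/internal_camera/head_camera/image_raw', Image, make_cb('head_camera'))
        rospy.Subscriber('/io/internal_camera/right_hand_camera/image_raw', Image, make_cb('right_hand_camera'))
        rospy.spin()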
  13. senaa

    Access to Sawyer's Cameras Simultaneously

    I'm trying to access both of Sawyer's cameras at the same time, from separate nodes, but I'm experiencing an issue where it appears only one of the cameras is producing data. If I run the nodes individually, the cameras work fine, but if both are accessed simultaneously one of them becomes unresponsive.

    I've also noticed in RViz that if I set up one camera view, video streams through okay, but if I then change the topic to the other camera nothing comes through. If you try to have two camera views, again one of the cameras becomes unresponsive.

    Overall, running both cameras together, or attempting to switch between the cameras individually, results in one of them becoming unresponsive (sometimes the right hand camera, sometimes the head camera). Is there a topic or service I should be calling to 'activate' the camera I would like? Is there a way to stream from both cameras simultaneously?
  14. /robot/joint_states contains effort values (in Nm, I think) for each joint - is this what you want? Did you find out how to focus the camera?
  15. senaa

    Change URDF at run time

    If you're using a depth sensor, you can use MoveIt's built-in octomap functionality. You modify the planning scene rather than the URDF itself: http://docs.ros.org/kinetic/api/moveit_tutorials/html/doc/pr2_tutorials/planning/src/doc/perception_configuration.html
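    If you just need to add known obstacles rather than sensor data, you can also modify the planning scene directly from Python. A minimal, untested sketch using moveit_commander (the object name, size, pose and 'base' frame here are made up for illustration):

    import sys
    import rospy
    import moveit_commander
    from geometry_msgs.msg import PoseStamped

    if __name__ == '__main__':
        moveit_commander.roscpp_initialize(sys.argv)
        rospy.init_node('add_collision_box')
        scene = moveit_commander.PlanningSceneInterface()
        rospy.sleep(1.0)  # give the scene interface time to connect

        # hypothetical obstacle: a 30 cm cube in front of the robot
        box_pose = PoseStamped()
        box_pose.header.frame_id = 'base'
        box_pose.pose.position.x = 0.7
        box_pose.pose.position.z = 0.15
        box_pose.pose.orientation.w = 1.0
        scene.add_box('example_box', box_pose, size=(0.3, 0.3, 0.3))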
  16. senaa

    Camera Configuration Failed

    Hi, at start-up of our Sawyer we are getting the message "Camera Configuration Failed. Please contact Rethink Support". The robot boots to the Sawyer SDK page successfully, and we can still interact with it through the SDK. I've tried restarting the robot a few times but this doesn't seem to be going away - are there any suggestions on how to resolve this error message? Thanks