Commit 8191f12a authored by armar4-demo

Merge branch 'master' of https://gitlab.com/ArmarX/RobotAPI

parents 96b4ddc4 092c22d6
etc/doxygen/images/Tutorial-RobotAPI-Rect-1.png (153 KiB)
etc/doxygen/images/Tutorial-RobotAPI-Rect-2.png (151 KiB)
etc/doxygen/images/Tutorial-RobotAPI-Rect-3.png (156 KiB)

@@ -4,5 +4,6 @@
Tutorials related to RobotAPI
\li \subpage RobotAPI-Tutorial-MoveRobotWithGui
\li \subpage RobotAPI-Tutorial-MoveRobotArmAlongRectangle
*/
/**
\page RobotAPI-Tutorial-MoveRobotArmAlongRectangle Moving the robot's hand along a predefined trajectory
Prerequisites: ArmarXCore tutorials (statechart and scenario handling)
In this tutorial you will be given a short description of the task that you need to accomplish and some
implementation hints. The expected results will be presented in the form of screenshots and a video.
Since this tutorial is meant to be solved by the reader, the full implementation will not be shown.
\section RobotAPI-Tutorial-MoveArmRect-sec-task Description of the task
The main goal is to make Armar3 use its right hand to trace a rectangular trajectory using velocity control.
Since you will be using velocity control, the TCP will not precisely follow the given trajectory.
Therefore you need to employ a control policy which periodically checks and corrects the position of the TCP.
You are expected to follow these constraints:
\li Use the TCPControlUnit in velocity mode to control the robot's arm.
\li Use the kinematic chain "HipYawRightArm"
\li The rectangular trajectory should be configurable using the StatechartEditor
\li Show the result in the Armar3Simulation scenario (package ArmarXSimulation)
\li Visualize at least the target trajectory using the DebugDrawer
\section RobotAPI-Tutorial-MoveArmRect-sec-hints Implementation hints
\subsection RobotAPI-Tutorial-MoveArmRect-sec-hints-proxies Declaring the proxies
Your statechart group will need access to the following proxies:
- [RobotAPIInterfaces] TCPControlUnit: Control the TCP's velocity
- [RobotAPIInterfaces] Robot State Component: Query the current state of the robot (e.g. the current TCP position)
- [RobotAPIInterfaces] Debug Drawer Topic: Visualize the target trajectory and other information
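Once these proxies are declared in your statechart group, the statechart context typically provides accessor
methods for them. The code fragments in the following hints use the accessors below (only a sketch; the exact
names depend on how you declare the proxies):
\code{.cpp}
// Proxy accessors as used in the code fragments of this tutorial
// (the names depend on your proxy declarations in the statechart group)
auto tcpControl = getTcpControlUnit();               // TCPControlUnit, velocity control of the TCP
auto robotStateComponent = getRobotStateComponent(); // Robot State Component, current robot state
auto debugDrawer = getDebugDrawerTopic();            // Debug Drawer topic, visualization
\endcode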
\subsection RobotAPI-Tutorial-MoveArmRect-sec-hints-start-pos Determining the start position
The first action the robot should perform is to move its arm into a start configuration.
To find a suitable pose you can use the GUI plugin RobotIK. To use this plugin,
make sure to start the scenario Armar3Simulation. Then add two widgets to the ArmarXGui:
\li ```RobotControl -> KinematicUnitGUI```
\li ```RobotControl -> RobotIK```
In the RobotIK GUI select the kinematic chain "HipYawRightArm" and move the TCP to a suitable position.
Then open the KinematicUnitGUI and read off the values of the joints belonging to the kinematic chain.
The resulting joint value map can be represented as a "Map(float)" in your statechart. An example follows:
\code{.js}
{
"Hip Yaw": -0.0230325,
"Shoulder 1 R": -0.223568,
"Shoulder 2 R": -0.0231917,
"Upperarm R": 0.431719,
"Elbow R": -0.107462,
"Underarm R": -0.264145,
"Wrist 1 R": 0.353429,
"Wrist 2 R": 0.0210833
}
\endcode
You can use the existing statechart JointPositionControl, which is part of the MotionControlGroup, to implement
the initial movement. Make sure to wait a little bit after reaching the starting pose.
Otherwise the robot may not have fully stopped moving.
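If you implement this delay in code, a simple sleep is sufficient (a minimal sketch; the duration is an
arbitrary example value):
\code{.cpp}
// Give the robot a short moment to come to a complete stop
// after the JointPositionControl statechart has finished.
TimeUtil::MSSleep(500);
\endcode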
\subsection RobotAPI-Tutorial-MoveArmRect-sec-hints-robot-state Querying the current robot state
You can use the Robot State Component proxy to access the current state of the robot.
You can create a local VirtualRobot model which is periodically synchronized with the real robot.
\code{.cpp}
// Create a local clone of the robot model
auto robotPrx = getRobotStateComponent()->getSynchronizedRobot();
VirtualRobot::RobotPtr robot = RemoteRobot::createLocalClone(robotPrx);
//...
while (!isRunningTaskStopped())
{
    // During each control iteration synchronize the local and the remote robot model
    RemoteRobot::synchronizeLocalClone(robot, robotPrx);
    // And query the current position of the TCP
    VirtualRobot::RobotNodePtr tcp = robot->getRobotNodeSet(kinematicChainName)->getTCP();
    Eigen::Vector3f currentPosition = tcp->getPositionInRootFrame();
    // ...
}
\endcode
\subsection RobotAPI-Tutorial-MoveArmRect-sec-hints-tcp-velocity-control Using velocity control for the TCP
In this tutorial you are required to use the TCPControlUnit to control the arm.
Before you can use the unit you have to request it once. Do this during the onEnter() method of your statechart:
\code{.cpp}
getTcpControlUnit()->request();
\endcode
Make sure to set the velocity to zero and release the unit before exiting your statechart:
\code{.cpp}
getTcpControlUnit()->setTCPVelocity(kinematicChainName, tcpName,
                                    new FramedDirection(Eigen::Vector3f::Zero(), tcpName, getRobot()->getName()),
                                    new FramedDirection(Eigen::Vector3f::Zero(), tcpName, getRobot()->getName()));
// Wait a little so that the command to set the velocity to zero reaches the control unit
TimeUtil::MSSleep(100);
getTcpControlUnit()->release();
\endcode
To set a velocity relative to the current robot position you can use this code fragment:
\code{.cpp}
Eigen::Vector3f velocity = ... // To be calculated
FramedDirectionPtr velocityPtr = new FramedDirection(velocity, rootName, robot->getName());
velocityPtr->changeFrame(robot, tcpName);
getTcpControlUnit()->setTCPVelocity(kinematicChainName, tcpName, velocityPtr,
                                    new FramedDirection(Eigen::Vector3f::Zero(), tcpName, robot->getName()));
\endcode
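How the velocity itself is calculated is part of the task. One possible policy (a sketch, not the reference
solution; all variable names and gain values are illustrative assumptions) is to project the current TCP
position onto the current edge of the rectangle and combine a feed-forward term along the edge with a
proportional correction towards it:
\code{.cpp}
// All positions are expressed in the robot root frame.
// lineStart, lineEnd:  corners of the current rectangle edge (illustrative names)
// currentPosition:     TCP position queried from the local robot model (see above)
Eigen::Vector3f edge = lineEnd - lineStart;
Eigen::Vector3f edgeDir = edge.normalized();

// Closest point on the current edge to the TCP
float t = (currentPosition - lineStart).dot(edgeDir);
t = std::max(0.0f, std::min(t, edge.norm()));
Eigen::Vector3f closestPointOnLine = lineStart + t * edgeDir;

// Feed-forward motion along the edge plus a proportional correction towards the edge
const float feedForwardSpeed = 50.0f; // example value
const float correctionGain = 1.0f;    // example value
Eigen::Vector3f velocity = feedForwardSpeed * edgeDir
                           + correctionGain * (closestPointOnLine - currentPosition);
\endcode
Once the TCP is sufficiently close to lineEnd you can switch to the next edge of the rectangle.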
\subsection RobotAPI-Tutorial-MoveArmRect-sec-hints-debug-drawer Using the debug drawer
You can use the debug drawer to visualize the trajectory and other information.
This example code draws a yellow sphere at the closest point on the line.
Keep in mind that the debug drawer expects coordinates in the global frame whereas most of your
calculations will take place in the robot frame.
\code{.cpp}
getDebugDrawerTopic()->setSphereVisu(DebugDrawerLayer, "ClosestPoint",
                                     new Vector3(globalClosestPointOnLine),
                                     { 1.0f, 1.0f, 0.0f, 1.0f }, 7.0f);
\endcode
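The conversion from the robot root frame into the global frame can be done with the local VirtualRobot model
from the earlier hint, for example (a sketch, assuming the root-frame point closestPointOnLine from the
velocity calculation above):
\code{.cpp}
// Transform the closest point from the robot root frame into the global frame
// before passing it to the debug drawer
Eigen::Vector3f globalClosestPointOnLine =
    robot->getRootNode()->toGlobalCoordinateSystemVec(closestPointOnLine);
\endcode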
Feel free to explore other visualization possibilities by examining the methods of the debug drawer.
\section RobotAPI-Tutorial-MoveArmRect-sec-result Expected result
The robot should first move its right hand to the start pose and then follow a rectangular trajectory as shown
in the following video. You will see that the target trajectory is highlighted in pink. The yellow sphere shows
the point on the current line segment that is closest to the TCP.
\htmlonly
<iframe width="800" height="451" src="https://www.youtube.com/embed/vDkjXCMqrkI?rel=0" frameborder="0" allowfullscreen></iframe>
\endhtmlonly
Here are some screenshots of the simulator if you cannot watch the video:
\image html Tutorial-RobotAPI-Rect-1.png
\image html Tutorial-RobotAPI-Rect-2.png
\image html Tutorial-RobotAPI-Rect-3.png
*/