Discussion on the Robot abstraction #33
Comments
@diegoferigo, could joint torques be added?
I thought they were already in the list — of course, joint torques. Joint forces can already be applied, but not yet read.
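To make the apply/read asymmetry concrete, here is a minimal hedged sketch of what read-back of applied torques could look like. The class and method names (`JointTorqueMixin`, `apply_joint_torques`, `joint_torques`) are illustrative assumptions, not the actual gympp or gym_ignition API.

```python
class JointTorqueMixin:
    """Hypothetical sketch: joint torques can be applied and read back."""

    def __init__(self, dofs: int):
        # Last torques applied to the simulated joints
        self._tau = [0.0] * dofs

    def apply_joint_torques(self, torques) -> None:
        # Store the torques that would be forwarded to the physics engine
        if len(torques) != len(self._tau):
            raise ValueError("wrong number of joints")
        self._tau = [float(t) for t in torques]

    def joint_torques(self):
        # Read back the most recently applied torques
        return list(self._tau)
```

In this sketch the "read" side simply mirrors the last applied values; a real implementation would instead query the measured joint efforts from the simulator.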
Excluding sensors, all the other abstractions have been implemented in the new ScenarI/O bindings #158. Closing.
Currently the Robot interface only supports getting and setting joint values. This is enough for the simple environments we have developed so far, but it definitely has to be extended. I am opening this issue as a placeholder for the discussion on how to structure the interface.
An initial list of features we need is the following:
The existing interfaces are the following: gympp::Robot and gym_ignition.base.Robot. Currently the robot interface is a single monolithic class. This is likely not scalable, since it would accumulate too many methods. We should think about how to split it into multiple interfaces while maintaining consistency between Python and C++.
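One way the split could look is a set of small abstract interfaces that the full robot class composes. This is a minimal sketch, assuming hypothetical interface names (`RobotJoints`, `RobotContacts`) that are not part of the existing gympp or gym_ignition code:

```python
import abc


class RobotJoints(abc.ABC):
    """Hypothetical joint-related slice of the Robot interface."""

    @abc.abstractmethod
    def joint_positions(self): ...

    @abc.abstractmethod
    def set_joint_positions(self, positions) -> None: ...


class RobotContacts(abc.ABC):
    """Hypothetical contact-related slice of the Robot interface."""

    @abc.abstractmethod
    def links_in_contact(self): ...


class Robot(RobotJoints, RobotContacts):
    """The full robot is just the composition of the smaller interfaces."""


class DummyRobot(Robot):
    """Toy implementation used only to show the composed interface."""

    def __init__(self):
        self._q = [0.0, 0.0]

    def joint_positions(self):
        return list(self._q)

    def set_joint_positions(self, positions) -> None:
        self._q = [float(q) for q in positions]

    def links_in_contact(self):
        return []
```

A matching split on the C++ side (pure-virtual classes combined by multiple inheritance) would keep the two language APIs consistent, since each slice can be reviewed and bound independently.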
Reminder: the Python interface will switch to numpy arrays soon.
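For illustration, a numpy-backed joint getter/setter could look like the sketch below. The class name `NumpyRobot` and its methods are assumptions for this example, not the planned API:

```python
import numpy as np


class NumpyRobot:
    """Hypothetical joint storage backed by numpy arrays."""

    def __init__(self, dofs: int):
        self._q = np.zeros(dofs)

    def joint_positions(self) -> np.ndarray:
        # Return a copy so callers cannot mutate the internal state
        return self._q.copy()

    def set_joint_positions(self, positions) -> None:
        q = np.asarray(positions, dtype=float)
        if q.shape != self._q.shape:
            raise ValueError("wrong number of joints")
        self._q = q
```

Accepting any array-like via `np.asarray` keeps the setter backward compatible with the plain Python lists used today.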