C++ API and example

1. Introduction

This page exists in order to extract the examples from the Doxygen documentation. Please have a look at the end of this page for all the examples.

2. C++ API and example

template<typename Action, typename Observation>
class BasePyBulletFingerDriver : public robot_interfaces::RobotDriver<Action, Observation>

Base pyBullet driver for both the single Finger and the TriFinger.

Implements all methods of RobotDriver except initialize, which needs to be implemented by the child class, as there are differences between the single Finger and the TriFinger.

All other methods are generic and only need to be templated with the proper types for actions/observations.

Template Parameters:
  • Action – Action type used for the specific robot.

  • Observation – Observation type used for the specific robot.

Public Types

typedef Observation::JointVector JointVector

Public Functions

inline BasePyBulletFingerDriver(bool real_time_mode, bool visualize)
inline Observation get_latest_observation() override
inline Action apply_action(const Action &desired_action) override
inline std::optional<std::string> get_error() override
inline void shutdown() override

Protected Attributes

bool real_time_mode_

If true, step simulation at 1 kHz, otherwise as fast as possible.

bool visualize_

If true, pyBullet GUI for visualization is started.

py::object sim_finger_

Instance of the Python class SimFinger that implements the pyBullet simulation of the finger robots.

This needs to be initialized by the child class!

struct CalibrationParameters

Parameters related to calibration.

Public Members

Vector endstop_search_torques_Nm = Vector::Zero()

Torque that is used to find the end stop.

uint32_t move_steps = 0

Number of time steps for reaching the initial position.

struct Config

Configuration of the robot that can be changed by the user.

Public Types

typedef std::array<std::string, N_MOTOR_BOARDS> CanPortArray

Public Functions

bool is_within_hard_position_limits(const Vector &position) const

Check if the given position is within the hard limits.

Parameters:

position – Joint positions.

Returns:

True if hard_position_limits_lower <= position <= hard_position_limits_upper.

void print() const

Print the given configuration in a human-readable way.

Public Members

CanPortArray can_ports

List of CAN port names used by the robot.

For each motor control board used by the robot, this specifies the CAN port through which it is connected.

Example: {"can0", "can1"}

double max_current_A = 0.0

Maximum current that can be sent to the motor [A].

bool has_endstop = false

Whether the joints have physical end stops or not.

This is relevant, for example, for homing: if this value is set to true, all joints move until they hit the end stop in order to determine their absolute position.

Note that not having end stops does not mean that the joint can rotate freely in general.

HomingMethod homing_method = HomingMethod::NONE

Which method to use for homing.

struct robot_fingers::NJointBlmcRobotDriver<Observation, N_JOINTS, N_MOTOR_BOARDS>::Config::CalibrationParameters calibration
double move_to_position_tolerance_rad = 0.0

Tolerance for reaching the target with NJointBlmcRobotDriver::move_to_position.

Vector safety_kd = Vector::Constant(0.1)

D-gain to dampen velocity. Set to zero to disable damping.

struct robot_fingers::NJointBlmcRobotDriver<Observation, N_JOINTS, N_MOTOR_BOARDS>::Config::PositionControlGains position_control_gains
Vector hard_position_limits_lower = Vector::Zero()

Hard lower limits for joint position.

Exceeding this limit results in an error and robot shutdown.

Vector hard_position_limits_upper = Vector::Zero()

Hard upper limits for joint position.

Exceeding this limit results in an error and robot shutdown.

Vector soft_position_limits_lower = Vector::Constant(-std::numeric_limits<double>::infinity())

Soft lower limits for joint position.

Exceeding this limit results in the action being adjusted to move the joint back inside the limits.

Vector soft_position_limits_upper = Vector::Constant(std::numeric_limits<double>::infinity())

Soft upper limits for joint position.

Exceeding this limit results in the action being adjusted to move the joint back inside the limits.

Vector home_offset_rad = Vector::Zero()

Offset between home position and zero.

Vector initial_position_rad = Vector::Zero()

Initial position to which the robot moves after initialization.

std::vector<TrajectoryStep> shutdown_trajectory

Trajectory which is executed in the shutdown method.

Use this to move the robot to a “rest position” during shutdown of the robot driver. It can consist of arbitrarily many steps. Leave it empty to not move during shutdown.

std::vector<std::string> run_duration_logfiles

List of files to which run duration logs are written.

You can specify multiple files here if you want to log the runtime of different independent components separately. For example, on a robot with multiple manipulators you can have a separate log for each manipulator, so that if one of them is replaced, only the log file of that manipulator needs to be changed.

Public Static Functions

static Config load_config(const std::string &config_file_name)

Load driver configuration from file.

Load the configuration from the specified YAML file. The file is expected to have the same structure/key naming as the Config struct. If a value cannot be read from the file, the application exits with an error message.

Parameters:

config_file_name – Path/name of the configuration YAML file.

Returns:

Configuration

static inline HomingMethod parse_homing_method_name(const std::string method_name)

Parse a homing method name.

Parameters:

method_name – Homing method name.

Throws:

std::invalid_argument – if the given string does not represent a valid homing method.

Returns:

The corresponding homing method.

static inline std::string get_homing_method_name(HomingMethod method)

Get the name of the specified homing method.
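
A minimal usage sketch for the configuration (not taken from the original documentation): it assumes the robot_fingers/trifinger_driver.hpp header is available; the file name "trifinger.yml" and the checked position are illustrative.

#include <robot_fingers/trifinger_driver.hpp>

int main()
{
    using Driver = robot_fingers::TriFingerDriver;

    // Load the user configuration from a YAML file (illustrative file name).
    // The application exits with an error message if a value cannot be read.
    Driver::Config config = Driver::Config::load_config("trifinger.yml");
    config.print();

    // Check an arbitrary joint position vector against the configured hard limits.
    Driver::Vector position = Driver::Vector::Zero();
    if (!config.is_within_hard_position_limits(position))
    {
        // handle a position outside of the hard limits
    }

    return 0;
}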

Private Static Functions

template<typename T>
static void set_config_value(const YAML::Node &user_config, const std::string &name, T *var)

Set value from user configuration to var if specified.

Checks if a field name exists in user_config. If yes, its value is written to var, otherwise var is unchanged.

Parameters:
  • user_config[in] YAML node containing the user configuration.

  • name[in] Name of the configuration entry.

  • var[out] Variable to which configuration is written. Value is unchanged if the specified field name does not exist in user_config, i.e. it can be initialized with a default value.

class FakeFingerDriver : public robot_interfaces::RobotDriver<robot_interfaces::MonoFingerTypes::Action, robot_interfaces::MonoFingerTypes::Observation>

Public Types

typedef robot_interfaces::MonoFingerTypes::Action Action
typedef robot_interfaces::MonoFingerTypes::Observation Observation
typedef robot_interfaces::MonoFingerTypes::Action::Vector Vector

Public Functions

inline FakeFingerDriver()
inline Observation get_latest_observation() override
inline Action apply_action(const Action &desired_action) override
inline std::optional<std::string> get_error() override
inline void shutdown() override
inline void initialize() override

Public Members

int data_generating_index_ = 0
struct MotorParameters

Parameters related to the motor.

Public Members

double torque_constant_NmpA

Torque constant K_t of the motor [Nm/A].

double gear_ratio

Gear ratio between motor and joint.

For an n:1 ratio (i.e. one joint revolution requires n motor revolutions) set this value to n.
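
As an illustration (not from the original documentation) of how these parameters relate motor current to joint torque, assuming an ideal gearbox and that MotorParameters lives in namespace robot_fingers; the function name is hypothetical:

// Joint torque produced by a given motor current, assuming an ideal n:1 gearbox.
double joint_torque_from_current(const robot_fingers::MotorParameters &params,
                                 double motor_current_A)
{
    // torque at the motor shaft [Nm]
    const double motor_torque_Nm = params.torque_constant_NmpA * motor_current_A;
    // torque at the joint after the gear reduction [Nm]
    return motor_torque_Nm * params.gear_ratio;
}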

template<size_t N_FINGERS>
class NFingerDriver : public robot_fingers::NJointBlmcRobotDriver<N_FINGERS>

Driver for the Finger robots.

This is a generic driver for the CAN-based BLMC Finger Robots. It works for both a single finger and a set of multiple fingers.

Template Parameters:

N_FINGERS – Number of fingers on the robot.

Public Types

typedef robot_interfaces::NFingerObservation<N_FINGERS> Observation

Public Functions

inline virtual Observation get_latest_observation() override
template<typename Observation, size_t N_JOINTS, size_t N_MOTOR_BOARDS>
class NJointBlmcRobotDriver : public robot_interfaces::RobotDriver<robot_interfaces::NJointAction<N_JOINTS>, Observation>

Base class for n-joint BLMC robots.

This is a generic base class to easily implement drivers for simple BLMC robots that consist of N_JOINTS joints.

Template Parameters:
  • N_JOINTS – Number of joints.

  • N_MOTOR_BOARDS – Number of motor control boards that are used.

Subclassed by robot_fingers::NFingerDriver< 1 >, robot_fingers::NFingerDriver< 3 >, robot_fingers::NFingerDriver< N_FINGERS >, robot_fingers::SimpleNJointBlmcRobotDriver< 8, 4 >, robot_fingers::SimpleNJointBlmcRobotDriver< 2, 1 >, robot_fingers::SimpleNJointBlmcRobotDriver< 1, 1 >

Public Types

typedef robot_interfaces::NJointAction<N_JOINTS> Action
typedef robot_interfaces::RobotInterfaceTypes<Action, Observation> Types
typedef Action::Vector Vector
typedef std::array<std::shared_ptr<blmc_drivers::MotorInterface>, N_JOINTS> Motors
typedef std::array<std::shared_ptr<blmc_drivers::CanBusMotorBoard>, N_MOTOR_BOARDS> MotorBoards

Public Functions

inline NJointBlmcRobotDriver(const MotorBoards &motor_boards, const Motors &motors, const MotorParameters &motor_parameters, const Config &config)
inline Vector get_max_torques() const
void pause_motors()
Vector get_measured_index_angles() const
void initialize() override

Find home position of all joints and move to start position.

Homes all joints using home_on_index_after_negative_end_stop. When finished, the joints are moved to the starting position (defined in config_.initial_position_rad).

Action get_idle_action() override

Provide position command to initial position as idle action.

This will hold the joints in place after the initialisation.

virtual Observation get_latest_observation() override = 0
Action apply_action(const Action &desired_action) override
std::optional<std::string> get_error() override
void shutdown() override
bool is_within_hard_position_limits(const Observation &observation) const

Check if the joint position is within the hard limits.

See also

Config::is_within_hard_position_limits

Public Members

const bool has_endstop_

True if the joints have mechanical end stops, false if not.

If set to true, it is assumed that all joints of the robot have end stops that mechanically prevent them from moving out of the valid range.

If present, the end stops are used for a fully automated homing procedure in which the joints first move until they hit the end stop before starting the encoder index search. This way it is ensured that the correct index is used for homing without any need for manual setup.

Public Static Functions

static MotorBoards create_motor_boards(const std::array<std::string, N_MOTOR_BOARDS> &can_ports)
static Action process_desired_action(const Action &desired_action, const Observation &latest_observation, const double max_torque_Nm, const Vector &safety_kd, const Vector &default_position_control_kp, const Vector &default_position_control_kd, bool is_position_limit_enabled, const Vector &lower_position_limits = Vector::Constant(-std::numeric_limits<double>::infinity()), const Vector &upper_position_limits = Vector::Constant(std::numeric_limits<double>::infinity()))

Process the desired action provided by the user.

Takes the desired action from the user and does the following processing:

Public Static Attributes

static constexpr size_t num_joints = N_JOINTS
static constexpr size_t num_motor_boards = N_MOTOR_BOARDS

Protected Functions

Action apply_action_uninitialized(const Action &desired_action)
void _initialize()

Actual initialization that is called in a real-time thread in initialize().

void move_until_blocking(Vector torques_Nm)

Move with constant torque until all joints are blocking.

Applies a constant torque until all joints are reporting a velocity close to zero. This can be used to move against the end-stops.

Parameters:

torques_Nm – Torques that are applied to the joints.

bool homing()

Homing of all joints, based on the robot configuration.

Procedure for finding an absolute zero position when using relative encoders.

The method for finding the home position depends on the “homing_method” setting in the configuration. See Config::HomingMethod for the different options.

By default, the zero position after homing is the same as the home position. The optional configuration parameter Config::home_offset_rad provides a means to move the zero position relative to the home position. The zero position is computed as

zero position = home position + home offset

Returns:

True if the homing was successful, false if not. In case of a failure, an error message with more information is printed to stdout.

bool move_to_position(const Vector &goal_pos, const double tolerance, const uint32_t time_steps)

Move to given goal position with a minimum jerk trajectory.

Use a series of position actions to move to the given goal position on a minimum jerk trajectory.

Parameters:
  • goal_pos – Angular goal position for each joint.

  • tolerance – Allowed position error for reaching the goal. This is checked per joint, i.e. the maximum possible error is +/-tolerance on each joint.

  • time_steps – Number of control loop cycles for reaching the goal. The lower the number of steps, the faster the robot will move.

Returns:

True if goal position is reached, false if timeout is exceeded.
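
A minimal sketch (not from the original documentation) of how a derived driver could call this protected method, using only members documented on this page; the number of steps is illustrative:

// Inside a method of a class derived from NJointBlmcRobotDriver:
// move to the configured initial position within the configured tolerance.
bool reached = move_to_position(config_.initial_position_rad,
                                config_.move_to_position_tolerance_rad,
                                2000);  // illustrative number of control cycles
if (!reached)
{
    // the goal was not reached before the timeout
}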

inline void enable_position_limits()

Enable the position limits.

inline void disable_position_limits()

Disable the position limits.

Protected Attributes

blmc_drivers::BlmcJointModules<N_JOINTS> joint_modules_
MotorBoards motor_boards_
MotorParameters motor_parameters_

Fixed motor parameters (assuming all joints use same setup).

double max_torque_Nm_

Maximum torque allowed on each joint.

Config config_

User-defined configuration of the driver.

Contains all configuration values that can be modified by the user (via the configuration file).

bool is_initialized_ = true
uint32_t action_counter_ = 0

Counter for the number of actions sent to the robot.

bool is_position_limit_enabled_ = false

Whether or not position limits (both soft and hard) are checked.

class OneJointDriver : public robot_fingers::SimpleNJointBlmcRobotDriver<1, 1>

Driver for a single joint.

Driver for a single BLMC joint. Mostly intended for testing purposes.

Public Functions

inline OneJointDriver(const Config &config)

Private Functions

inline OneJointDriver(const MotorBoards &motor_boards, const Config &config)

Private Static Functions

static inline Motors create_motors(const MotorBoards &motor_boards)
struct PositionControlGains

Default control gains for the position PD controller.

Public Members

Vector kp = Vector::Zero()
Vector kd = Vector::Zero()
class PyBulletSingleFingerDriver : public trifinger_simulation::BasePyBulletFingerDriver<robot_interfaces::MonoFingerTypes::Action, robot_interfaces::MonoFingerTypes::Observation>

pyBullet driver for the single Finger.

Public Functions

inline void initialize() override
inline BasePyBulletFingerDriver(bool real_time_mode, bool visualize)
class PyBulletTriFingerDriver : public trifinger_simulation::BasePyBulletFingerDriver<robot_interfaces::TriFingerTypes::Action, robot_interfaces::TriFingerTypes::Observation>

pyBullet driver for the TriFinger.

Public Functions

inline void initialize() override
inline BasePyBulletFingerDriver(bool real_time_mode, bool visualize)
class RealFingerDriver : public robot_fingers::NFingerDriver<1>

Public Functions

inline RealFingerDriver(const Config &config)

Private Functions

inline RealFingerDriver(const MotorBoards &motor_boards, const Config &config)

Private Static Functions

static inline Motors create_motors(const MotorBoards &motor_boards)
template<size_t N_JOINTS, size_t N_MOTOR_BOARDS = (N_JOINTS + 1) / 2>
class SimpleNJointBlmcRobotDriver : public robot_fingers::NJointBlmcRobotDriver<robot_interfaces::NJointObservation<N_JOINTS>, N_JOINTS, (N_JOINTS + 1) / 2>

Simple n-joint robot driver that uses NJointObservation.

Template Parameters:
  • N_JOINTS – Number of joints

  • N_MOTOR_BOARDS – Number of motor boards.

Public Types

typedef robot_interfaces::NJointObservation<N_JOINTS> Observation

Public Functions

virtual Observation get_latest_observation() override
class SoloEightDriver : public robot_fingers::SimpleNJointBlmcRobotDriver<8, 4>

Driver for Solo 8.

Driver consisting of four double BLMC joint modules (eight joints in total).

Public Functions

inline SoloEightDriver(const Config &config)

Private Functions

inline SoloEightDriver(const MotorBoards &motor_boards, const Config &config)

Private Static Functions

static inline Motors create_motors(const MotorBoards &motor_boards)
template<typename CameraObservation_t>
class T_TriFingerPlatformFrontend

Combined frontend for the TriFinger Platform.

This class combines the frontends for robot and cameras in one class using unified time indices.

Internally the different frontends all have their own time indices which are unrelated to each other. In this combined class, the time index used is the one that belongs to the robot frontend. When accessing observations of the other frontends, it also takes this index t and internally matches it to the time index t_o that was active in the other frontend at the time of t.

Todo:

Methods to get timestamp from camera or object tracker?

Public Types

typedef robot_interfaces::TriFingerTypes::Action Action
typedef robot_interfaces::TriFingerTypes::Observation RobotObservation
typedef robot_interfaces::Status RobotStatus
typedef CameraObservation_t CameraObservation

Public Functions

inline T_TriFingerPlatformFrontend(robot_interfaces::TriFingerTypes::BaseDataPtr robot_data, std::shared_ptr<robot_interfaces::SensorData<CameraObservation>> camera_data)

Initialize with data instances for all internal frontends.

Parameters:
  • robot_data – RobotData instance used by the robot frontend.

  • camera_data – SensorData instance used by the camera frontend.

inline T_TriFingerPlatformFrontend()

Initialize with default data instances.

Creates for each internal frontend a corresponding multi-process data instance with the default shared memory ID for the corresponding data type.

inline time_series::Index append_desired_action(const Action &desired_action)

Append a desired robot action to the action queue.

See also

robot_interfaces::TriFingerTypes::Frontend::append_desired_action

Returns:

The index of the time step at which this action is going to be executed.

inline RobotObservation get_robot_observation(const time_series::Index &t) const

Get robot observation of the time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::get_observation

inline Action get_desired_action(const time_series::Index &t) const

Get desired action of time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::get_desired_action

inline Action get_applied_action(const time_series::Index &t) const

Get actually applied action of time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::get_applied_action

inline RobotStatus get_robot_status(const time_series::Index &t) const

Get robot status of time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::get_status

inline time_series::Timestamp get_timestamp_ms(const time_series::Index &t) const

Get timestamp (in milliseconds) of time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::get_timestamp_ms

inline time_series::Index get_current_timeindex() const

Get the current time index.

See also

robot_interfaces::TriFingerTypes::Frontend::get_current_timeindex

inline void wait_until_timeindex(const time_series::Index &t) const

Wait until time step t.

See also

robot_interfaces::TriFingerTypes::Frontend::wait_until_timeindex

inline CameraObservation get_camera_observation(const time_series::Index t) const

Get camera images of time step t.

Parameters:

t – Time index of the robot time series. This is internally mapped to the corresponding time index of the camera time series.

Returns:

Camera images of time step t.
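
A minimal sketch (not from the original documentation) of using the combined frontend: it assumes a backend is already running on the default shared-memory IDs, that robot_fingers/trifinger_platform_frontend.hpp provides the TriFingerPlatformFrontend typedef, and that Action::Zero() is the zero-torque factory from robot_interfaces.

#include <robot_fingers/trifinger_platform_frontend.hpp>

void example_step()
{
    robot_fingers::TriFingerPlatformFrontend frontend;

    // Send a zero-torque action (Action::Zero() is assumed from robot_interfaces);
    // t is the time step at which the action is going to be executed.
    auto action = robot_fingers::TriFingerPlatformFrontend::Action::Zero();
    auto t = frontend.append_desired_action(action);

    // Robot and camera observations are accessed through the same (robot) time index.
    auto robot_observation = frontend.get_robot_observation(t);
    auto camera_observation = frontend.get_camera_observation(t);
}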

Private Functions

template<typename FrontendType>
inline time_series::Index find_matching_timeindex(const FrontendType &other_frontend, const time_series::Index t_robot) const

Find time index of frontend that matches with the given robot time index.

The given time index t_robot refers to the robot data time series. To provide the correct observation from the other frontend for this time step, find the highest time index t_other of the other frontend where

 timestamp(t_other) <= timestamp(t_robot)

Note that this is not always the one that is closest w.r.t. the timestamp, i.e.

 t_other != argmin(|timestamp(t_other) - timestamp(t_robot)|)

The latter would not be deterministic: the outcome could change when called twice with the same t_robot if a new “other” observation arrived in between the calls.

Todo:

The implementation below is very naive. It simply does a linear search starting from the latest time index. So worst case performance is O(n) where n is the number of “other” observations over the period that is covered by the buffer of the robot data.

Options to speed this up:

  • binary search (?)

  • estimate time step size based on last observations

  • store matched indices of last call

Note, however, that t_robot is very likely the latest time index in most cases. In this case the match for t_other will also be the latest index of the corresponding time series. In this case, the complexity is O(1). So even when implementing a more complex search algorithm, the first candidate for t_other that is checked should always be the latest one.

Template Parameters:

FrontendType – Type of the frontend. This is templated so that the same implementation can be used for both camera and object tracker frontend.

Parameters:
  • other_frontend – The frontend for which a matching time index needs to be found.

  • t_robot – Time index of the robot frontend.

Returns:

Time index for other_frontend which is/was active at the time of t_robot.
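
A sketch (not the actual implementation) of the naive linear search described in the todo above, under the assumption that the other frontend exposes get_current_timeindex() and get_timestamp_ms() analogously to the robot frontend:

// Walk backwards from the newest "other" observation until its timestamp
// is no longer newer than the timestamp of the given robot time step.
time_series::Index t_other = other_frontend.get_current_timeindex();
while (t_other > 0 &&
       other_frontend.get_timestamp_ms(t_other) > get_timestamp_ms(t_robot))
{
    --t_other;
}
return t_other;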

Private Members

robot_interfaces::TriFingerTypes::Frontend robot_frontend_
robot_interfaces::SensorFrontend<CameraObservation> camera_frontend_
template<typename CameraObservation_t>
class T_TriFingerPlatformLog

Load robot and camera log and match observations like during runtime.

The robot and camera observations are provided asynchronously. To access both through a common time index, the TriFingerPlatformFrontend class maps “robot time indices” to the corresponding camera observations based on the time stamps. This mapping is not explicitly saved in the log files. Therefore, the TriFingerPlatformLog class provides an interface to load robot and camera logs together and performs the mapping from robot to camera time index in the same way as it is happening in TriFingerPlatformFrontend.

Public Types

typedef robot_interfaces::TriFingerTypes::Action Action
typedef robot_interfaces::TriFingerTypes::Observation RobotObservation
typedef robot_interfaces::Status RobotStatus
typedef CameraObservation_t CameraObservation

Public Functions

inline T_TriFingerPlatformLog(const std::string &robot_log_file, const std::string &camera_log_file)
inline const robot_interfaces::TriFingerTypes::BinaryLogReader &get_robot_log() const

Access the robot log.

inline const robot_interfaces::SensorLogReader<CameraObservation> &get_camera_log() const

Access the camera log.

inline const std::vector<int> &get_map_robot_to_camera_index() const

Access the index mapping from robot to camera log.

Note that the robot observation index does not necessarily match with the time index!

inline RobotObservation get_robot_observation(const time_series::Index &t) const

Get robot observation of the time step t.

inline Action get_desired_action(const time_series::Index &t) const

Get desired action of time step t.

inline Action get_applied_action(const time_series::Index &t) const

Get actually applied action of time step t.

inline RobotStatus get_robot_status(const time_series::Index &t) const

Get robot status of time step t.

inline time_series::Timestamp get_timestamp_ms(const time_series::Index &t) const

Get timestamp (in milliseconds) of time step t.

inline CameraObservation get_camera_observation(const time_series::Index t) const

Get camera images of time step t.

Parameters:

t – Time index of the robot time series. This is internally mapped to the corresponding time index of the camera time series.

Returns:

Camera images of time step t.

inline time_series::Index get_first_timeindex() const

Get the time index of the first time step in the log.

inline time_series::Index get_last_timeindex() const

Get the time index of the last time step in the log.
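
A minimal sketch (not from the original documentation) of iterating over a combined log; the log file names are illustrative.

#include <robot_fingers/trifinger_platform_log.hpp>

void read_log()
{
    robot_fingers::TriFingerPlatformLog log("robot_data.dat", "camera_data.dat");

    for (auto t = log.get_first_timeindex(); t <= log.get_last_timeindex(); t++)
    {
        auto robot_observation = log.get_robot_observation(t);
        // Camera observation matched to the same robot time index, as during runtime.
        auto camera_observation = log.get_camera_observation(t);
    }
}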

Private Members

robot_interfaces::TriFingerTypes::BinaryLogReader robot_log_
robot_interfaces::SensorLogReader<CameraObservation> camera_log_
time_series::Index robot_log_start_index_
std::vector<int> map_robot_to_camera_index_
struct TrajectoryStep

A sub-goal of a trajectory.

Public Members

Vector target_position_rad = Vector::Zero()

Target position to which the joints should move.

uint32_t move_steps = 0

Number of time steps for reaching the target position.
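
A minimal sketch (not from the original documentation) of adding a one-step shutdown trajectory to a driver configuration; the target and the number of steps are illustrative, and the element type is deduced from the config member to avoid assuming where TrajectoryStep is nested:

robot_fingers::TriFingerDriver::Config config;

// One step that moves all joints to the zero position over 2000 control cycles.
decltype(config.shutdown_trajectory)::value_type rest_step;
rest_step.target_position_rad = robot_fingers::TriFingerDriver::Vector::Zero();
rest_step.move_steps = 2000;  // illustrative

config.shutdown_trajectory.push_back(rest_step);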

class TriFingerDriver : public robot_fingers::NFingerDriver<3>

Public Functions

inline TriFingerDriver(const Config &config)

Private Functions

inline TriFingerDriver(const MotorBoards &motor_boards, const Config &config)

Private Static Functions

static inline Motors create_motors(const MotorBoards &motor_boards)
class TwoJointDriver : public robot_fingers::SimpleNJointBlmcRobotDriver<2, 1>

Driver for two joints.

Driver for a double BLMC joint. Mostly intended for testing purposes.

Public Functions

inline TwoJointDriver(const Config &config)

Private Functions

inline TwoJointDriver(const MotorBoards &motor_boards, const Config &config)

Private Static Functions

static inline Motors create_motors(const MotorBoards &motor_boards)
namespace literals
namespace robot_fingers

Typedefs

template<size_t N_FINGERS>
using _NFingerDriverBase = NJointBlmcRobotDriver<robot_interfaces::NFingerObservation<N_FINGERS>, N_FINGERS * robot_interfaces::JOINTS_PER_FINGER, N_FINGERS * robot_interfaces::BOARDS_PER_FINGER>
typedef T_TriFingerPlatformFrontend<trifinger_object_tracking::TriCameraObjectObservation> TriFingerPlatformWithObjectFrontend
typedef T_TriFingerPlatformFrontend<trifinger_cameras::TriCameraObservation> TriFingerPlatformFrontend
typedef T_TriFingerPlatformLog<trifinger_cameras::TriCameraObservation> TriFingerPlatformLog
typedef T_TriFingerPlatformLog<trifinger_object_tracking::TriCameraObjectObservation> TriFingerPlatformWithObjectLog

Functions

template<typename Vector>
Vector clamp(const Vector &vector, const double lower_limit, const double upper_limit)

Clamp an arbitrary Eigen vector.

Parameters:
  • vector – The vector that is to be clamped.

  • lower_limit – Lower limit.

  • upper_limit – Upper limit.

Returns:

Copy of vector where values below or above the limits are set to the corresponding limit values.
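
A short usage sketch (not from the original documentation), assuming Eigen is available:

#include <Eigen/Eigen>
#include <robot_fingers/clamp.hpp>

Eigen::Vector3d torques(0.5, -1.2, 0.1);
// Element-wise clamping to the range [-0.3, 0.3].
Eigen::Vector3d clamped = robot_fingers::clamp(torques, -0.3, 0.3);
// clamped is now (0.3, -0.3, 0.1)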

robot_interfaces::MonoFingerTypes::BackendPtr create_fake_finger_backend(robot_interfaces::MonoFingerTypes::BaseDataPtr robot_data)
template<typename Driver>
Driver::Types::BackendPtr create_backend(typename Driver::Types::BaseDataPtr robot_data, const typename Driver::Config &config, const double first_action_timeout = std::numeric_limits<double>::infinity(), const uint32_t max_number_of_actions = 0)

Create backend using the specified driver.

Template Parameters:

Driver – Type of the driver. Expected to inherit from NJointBlmcRobotDriver.

Parameters:
  • robot_data – Instance of RobotData used for communication.

  • config – Driver configuration.

  • first_action_timeout – Duration for which the backend waits for the first action to arrive. If exceeded, the backend shuts down.

  • max_number_of_actions – Number of actions after which the backend automatically shuts down.

Returns:

A RobotBackend instance with a driver of the specified type.

template<typename Driver>
Driver::Types::BackendPtr create_backend(typename Driver::Types::BaseDataPtr robot_data, const std::string &config_file_path, const double first_action_timeout = std::numeric_limits<double>::infinity(), const uint32_t max_number_of_actions = 0)

Create backend using the specified driver.

Overloaded version that takes a path to a configuration YAML file instead of a Config instance.
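
A minimal sketch (not from the original documentation) of creating a backend from a configuration file; the includes are omitted, and the shared-memory data class (TriFingerTypes::MultiProcessData), its constructor arguments, the shared-memory ID and the file name are assumptions based on robot_interfaces conventions:

// Robot data shared with frontends in other processes (data class and ID are assumed).
auto robot_data = std::make_shared<
    robot_interfaces::TriFingerTypes::MultiProcessData>("trifinger", true);

// Backend driving the TriFinger robot, configured from a YAML file (illustrative name).
auto backend = robot_fingers::create_backend<robot_fingers::TriFingerDriver>(
    robot_data, "trifinger.yml");

backend->initialize();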

template<typename Driver>
void bind_driver_config(pybind11::module &m, const std::string &name)
template<typename Driver>
void bind_create_backend(pybind11::module &m, const std::string &name)
namespace robot_interfaces
namespace trifinger_simulation

Functions

template<typename Types, typename Driver>
Types::BackendPtr create_finger_backend(typename Types::BaseDataPtr robot_data, const bool real_time_mode, const bool visualize, const double first_action_timeout = std::numeric_limits<double>::infinity(), const uint32_t max_number_of_actions = 0)

Create a Finger/TriFinger-backend using pyBullet.

Template Parameters:
  • Types – The struct providing the types for action, observation, etc.

  • Driver – pyBullet-Driver class for either single Finger or TriFinger.

Parameters:
  • robot_data – RobotData instance for the backend.

  • real_time_mode – If true, step the simulation in real time, otherwise as fast as possible.

  • visualize – If true, pyBullet’s GUI is started for visualization.

  • first_action_timeout – See RobotBackend

  • max_number_of_actions – See RobotBackend

Returns:

Backend using a driver of the specified type.
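
A minimal sketch (not from the original documentation) of creating a single-Finger pyBullet backend; the includes are omitted and the data class name (MonoFingerTypes::SingleProcessData) is an assumption based on robot_interfaces conventions:

// Single-process robot data (data class name is assumed).
auto robot_data = std::make_shared<
    robot_interfaces::MonoFingerTypes::SingleProcessData>();

// Backend stepping the pyBullet simulation in real time, without the GUI.
auto backend = trifinger_simulation::create_finger_backend<
    robot_interfaces::MonoFingerTypes,
    trifinger_simulation::PyBulletSingleFingerDriver>(
    robot_data, /*real_time_mode=*/true, /*visualize=*/false);

backend->initialize();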

file clamp.hpp

Copyright

Copyright (c) 2020, New York University & Max Planck Gesellschaft.

file fake_finger_driver.hpp
#include <chrono>
#include <thread>
#include <Eigen/Eigen>
#include <robot_fingers/n_joint_blmc_robot_driver.hpp>
#include <robot_interfaces/finger_types.hpp>
file n_finger_driver.hpp
#include <robot_fingers/n_joint_blmc_robot_driver.hpp>
#include <robot_interfaces/finger_types.hpp>

Driver for the Finger Robots.

License:

BSD 3-clause

Copyright

2020, Max Planck Gesellschaft. All rights reserved.

file n_joint_blmc_robot_driver.hpp
#include <array>
#include <cmath>
#include <iterator>
#include <string>
#include <fmt/format.h>
#include <yaml-cpp/yaml.h>
#include <Eigen/Eigen>
#include <robot_interfaces/monitored_robot_driver.hpp>
#include <robot_interfaces/n_joint_robot_types.hpp>
#include <robot_interfaces/robot_driver.hpp>
#include <yaml_utils/yaml_eigen.hpp>
#include <blmc_drivers/blmc_joint_module.hpp>
#include <robot_fingers/clamp.hpp>

Base driver for a generic n-joint BLMC robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

Enums

enum class HomingMethod

Different homing methods that can be selected.

Values:

file n_joint_blmc_robot_driver.hxx

Base driver for a generic n-joint BLMC robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

Defines

TPL_NJBRD
NJBRD
file one_joint_driver.hpp
#include <robot_fingers/n_joint_blmc_robot_driver.hpp>

Driver for a “one-joint” robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

file pybullet_driver.hpp
#include <chrono>
#include <thread>
#include <pybind11/eigen.h>
#include <pybind11/embed.h>
#include <pybind11/pybind11.h>
#include <pybind11/stl_bind.h>
#include <robot_interfaces/finger_types.hpp>

C++ wrappers to use pyBullet simulation of fingers as robot_interfaces::RobotDriver.

file real_finger_driver.hpp
#include "n_finger_driver.hpp"

The hardware wrapper of the real Finger robot.

Author

Manuel Wuthrich

Date

2018

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

file solo_eight_driver.hpp
#include <robot_fingers/n_joint_blmc_robot_driver.hpp>

Driver for a “Solo 8” robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

file trifinger_driver.hpp
#include "n_finger_driver.hpp"

The hardware wrapper of the real TriFinger robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

file trifinger_platform_frontend.hpp
#include <robot_interfaces/finger_types.hpp>
#include <robot_interfaces/sensors/sensor_frontend.hpp>
#include <trifinger_cameras/tricamera_observation.hpp>
#include <trifinger_object_tracking/tricamera_object_observation.hpp>

Combined frontend for the TriFinger Platform.

License:

BSD 3-clause

Copyright

2020, Max Planck Gesellschaft. All rights reserved.

file trifinger_platform_log.hpp
#include <robot_interfaces/finger_types.hpp>
#include <robot_interfaces/sensors/sensor_log_reader.hpp>
#include <trifinger_cameras/tricamera_observation.hpp>
#include <trifinger_object_tracking/tricamera_object_observation.hpp>

Access combined log of a TriFinger platform.

License:

BSD 3-clause

Copyright

2020, Max Planck Gesellschaft. All rights reserved.

file two_joint_driver.hpp
#include <robot_fingers/n_joint_blmc_robot_driver.hpp>

Driver for a “two-joint” robot.

Copyright

Copyright (c) 2019, New York University and Max Planck Gesellschaft.

file generic_driver_bindings.hpp
#include <pybind11/eigen.h>
#include <pybind11/pybind11.h>
#include <pybind11/stl.h>
#include <robot_interfaces/n_joint_robot_types.hpp>

Helper functions for Python-binding templated types.

License:

BSD 3-clause

Copyright

2019, Max Planck Gesellschaft. All rights reserved.

file py_one_joint.cpp
#include <pybind11/pybind11.h>
#include <robot_fingers/one_joint_driver.hpp>

Functions

PYBIND11_MODULE(py_one_joint, m)
file py_real_finger.cpp
#include <pybind11/pybind11.h>
#include <robot_fingers/fake_finger_driver.hpp>
#include <robot_fingers/real_finger_driver.hpp>

Functions

PYBIND11_MODULE(py_real_finger, m)
file py_solo_eight.cpp
#include <pybind11/eigen.h>
#include <pybind11/embed.h>
#include <pybind11/pybind11.h>
#include <pybind11/stl_bind.h>
#include <robot_fingers/solo_eight_driver.hpp>

Functions

PYBIND11_MODULE(py_solo_eight, m)
file py_trifinger.cpp
#include <pybind11/eigen.h>
#include <pybind11/embed.h>
#include <pybind11/pybind11.h>
#include <pybind11/stl_bind.h>
#include <robot_fingers/trifinger_driver.hpp>
#include <robot_fingers/trifinger_platform_frontend.hpp>
#include <robot_fingers/trifinger_platform_log.hpp>

Python bindings for TriFinger-related classes/functions.

License:

BSD 3-clause

Copyright

2020, Max Planck Gesellschaft. All rights reserved.

Functions

template<typename T>
void pybind_trifinger_platform_frontend(pybind11::module &m, const std::string &name)
template<typename T>
void pybind_trifinger_platform_log(pybind11::module &m, const std::string &name)
PYBIND11_MODULE(py_trifinger, m)
file py_two_joint.cpp
#include <pybind11/pybind11.h>
#include <robot_fingers/two_joint_driver.hpp>

Functions

PYBIND11_MODULE(py_two_joint, m)
file pybullet_drivers.cpp
#include <pybind11/embed.h>
#include <pybind11/pybind11.h>
#include <pybind11/stl_bind.h>
#include <robot_fingers/pybullet_driver.hpp>
#include <robot_interfaces/finger_types.hpp>

Functions

PYBIND11_MODULE(pybullet_drivers, m)
page license

File generic_driver_bindings.hpp

BSD 3-clause

File n_finger_driver.hpp

BSD 3-clause

File py_trifinger.cpp

BSD 3-clause

File trifinger_platform_frontend.hpp

BSD 3-clause

File trifinger_platform_log.hpp

BSD 3-clause

page todo

Class robot_fingers::T_TriFingerPlatformFrontend< CameraObservation_t >

Methods to get timestamp from camera or object tracker?

Member robot_fingers::T_TriFingerPlatformFrontend< CameraObservation_t >::find_matching_timeindex  (const FrontendType &other_frontend, const time_series::Index t_robot) const

The implementation below is very naive. It simply does a linear search starting from the latest time index. So worst case performance is O(n) where n is the number of “other” observations over the period that is covered by the buffer of the robot data.

dir include
dir install/robot_fingers/include
dir install
dir include/robot_fingers
dir install/robot_fingers
dir install/robot_fingers/include/robot_fingers
dir srcpy