Migrate repo from hbuurmei github
hbuurmei committed Sep 23, 2024
1 parent ada8fa1 commit 14d5eb1
Showing 267 changed files with 63,634 additions and 2 deletions.
31 changes: 31 additions & 0 deletions .github/workflows/ci.yml
Original file line number Diff line number Diff line change
@@ -0,0 +1,31 @@
name: ci
on:
push:
branches:
- master
- main
permissions:
contents: write
jobs:
deploy:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- name: Configure Git Credentials
run: |
git config user.name github-actions[bot]
git config user.email 41898282+github-actions[bot]@users.noreply.github.com
- uses: actions/setup-python@v5
with:
python-version: 3.x
- run: echo "cache_id=$(date --utc '+%V')" >> $GITHUB_ENV
- uses: actions/cache@v4
with:
key: mkdocs-material-${{ env.cache_id }}
path: .cache
restore-keys: |
mkdocs-material-
- run: pip install mkdocs-material
- run: pip install mkdocs-git-committers-plugin-2
- run: pip install mkdocs-git-revision-date-localized-plugin
- run: mkdocs gh-deploy --force
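The `cache_id` step above keys the MkDocs cache on the ISO week number, so the cache rolls over weekly while `restore-keys` falls back to the most recent week's cache. A quick local sketch (not part of the workflow) of the key it will produce:

```shell
# Preview the weekly cache key computed in the workflow step above.
cache_id=$(date --utc '+%V')      # ISO week number, always two digits
echo "mkdocs-material-${cache_id}"
```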
2 changes: 2 additions & 0 deletions .gitignore
@@ -0,0 +1,2 @@
.DS_Store
.cache
13 changes: 13 additions & 0 deletions .vscode/settings.json
@@ -0,0 +1,13 @@
{
"yaml.schemas": {
"https://squidfunk.github.io/mkdocs-material/schema.json": "mkdocs.yml"
},
"yaml.customTags": [
"!ENV scalar",
"!ENV sequence",
"!relative scalar",
"tag:yaml.org,2002:python/name:material.extensions.emoji.to_svg",
"tag:yaml.org,2002:python/name:material.extensions.emoji.twemoji",
"tag:yaml.org,2002:python/name:pymdownx.superfences.fence_code_format"
]
}
5 changes: 3 additions & 2 deletions README.md
@@ -1,2 +1,3 @@
# trunk-stack
ASL Trunk software stack.
# Trunk Stack
Welcome to the ASL Trunk robot software stack.
This repository contains all code to run experiments on the Trunk platform, including [documentation](https://stanfordasl.github.io/asl_trunk/) and configuration files.
13 changes: 13 additions & 0 deletions docs/3dprinting.md
@@ -0,0 +1,13 @@
# 3D Printing

Many of the components for the trunk robot are custom and 3D printed. All of the 3D-printed components in the assembly can easily be printed on any commercial or hobbyist FDM printer. We used a Bambu X1C with AMS to print all components in PLA, but other materials could alternatively be used.

## Printer Settings
On the Bambu X1C, we used default print settings (infill of 15% and 2 wall loops) on the Bambu Cool Plate, with PLA as the primary material, PLA as the support material, and support PLA as the support/raft interface material on all prints except those listed below:
- Pulleys: Printed with PLA-CF, PLA-CF support material, infill of 100% and 5 wall loops
- Trunk Disks: Infill of 30% and 5 wall loops

We found that an offset of 7 thousandths of an inch (0.18 mm) works well for a friction fit between manufacturer parts and 3D-printed parts. For example, if a manufacturer part with an outer diameter of 1.000" is friction fit into a custom 3D-printed part, our 3D-printed part would have an internal diameter of 1.007". This tolerance may change on other 3D printers, but we found it to be consistent across prints on the Bambu X1C.
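The offset rule can be captured in a tiny helper (our own sketch, not part of the repository):

```shell
# Apply the 0.007" friction-fit offset to a manufacturer part's
# outer diameter to get the bore diameter for the 3D-printed part.
bore_for_od() {
  awk -v od="$1" 'BEGIN { printf "%.3f", od + 0.007 }'
}
bore_for_od 1.000   # prints 1.007
```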

## Assets
All 3D-printed assets are available in the full CAD assembly. All assets are editable and fully configurable in Fusion 360. Please email [email protected] if you need access to STL files or other file types.
Binary file added docs/assets/asl-white.png
Binary file added docs/assets/circuitdiagram_081524.jpg
Binary file added docs/assets/favicon-robot.png
11 changes: 11 additions & 0 deletions docs/collecting_data.md
@@ -0,0 +1,11 @@
# Collecting Data
After setting up the robot using the [motion capture](./mocap.md) and [motor control](./motor_control.md) instructions, follow the steps below to collect data with the Trunk robot.

## Usage
Essentially, everything you need to run is contained in:
```bash
cd asl_trunk_ws
./scripts/run_data_collection.sh
```
Currently, this script only collects steady-state data, according to the control inputs specified in [control_inputs.csv](https://github.com/hbuurmei/asl_trunk/blob/main/asl_trunk/asl_trunk_ws/data/trajectories/steady_state/control_inputs.csv) in the `asl_trunk_ws` workspace.
The data will be saved in the `data/steady_state/` directory.
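Before a run, it can be useful to sanity-check how many steady-state control inputs will be commanded. A minimal sketch (the six-motor header row below is our assumption; consult the real `control_inputs.csv` for its actual columns):

```shell
# Count data rows (excluding the header) in a control-inputs CSV.
count_inputs() { tail -n +2 "$1" | wc -l | tr -d ' '; }

csv=$(mktemp)   # stand-in file with an assumed six-motor header
printf 'u1,u2,u3,u4,u5,u6\n0.10,0.00,0.00,0.00,0.00,0.00\n' > "$csv"
count_inputs "$csv"   # prints 1
```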
50 changes: 50 additions & 0 deletions docs/contributing.md
@@ -0,0 +1,50 @@
# Contributing to the ASL Trunk robot project

Contributions are welcome! Here are some guidelines to follow when contributing to the project.

## Getting started
Start by cloning this repository using the following command:
```bash
gh repo clone hbuurmei/asl_trunk
```
where the [GitHub CLI](https://cli.github.com/) is required to use the `gh` command (highly recommended).

## Project layout
The project is organized as follows:

asl_trunk/
README.md # The project README file.
asl_trunk/ # The main package.
asl_trunk_ws/ # The main ROS2 workspace, incl. data collection etc.
mocap_ws/ # The ROS2 workspace for interfacing with the motion capture system.
motor_control_ws/ # The ROS2 workspace for controlling the motors.
docs/
mkdocs.yml # The website configuration file.
docs/
index.md # The documentation homepage.
contributing.md # This file.
... # Other markdown pages, images and other files.

## Code contributions
All the ROS2 packages are located in the `asl_trunk/` directory, and each workspace is its own repository.
These are added via git subtrees to have everything in one place.
Therefore, just contribute to the respective workspace repository, which will most likely be the [asl_trunk_ws](https://github.com/hbuurmei/asl_trunk_ws) repository.
Afterwards, the main repository can be updated with the new changes using the following command:
```bash
git subtree pull --prefix=asl_trunk/asl_trunk_ws https://github.com/hbuurmei/asl_trunk_ws.git main
```

## Contributing to the documentation
After cloning this repository, one can make updates to the documentation by editing the files in the `docs/` directory.
The documentation is built using [MkDocs](https://www.mkdocs.org/), a static site generator that's geared towards project documentation.
Specifically, we use the [Material for MkDocs](https://squidfunk.github.io/mkdocs-material/) theme.
This should be installed using the following command:
```bash
pip install mkdocs-material
```
**Note:** The documentation is built automatically using GitHub Actions, so there is no need to build it locally. Always push to the `main` branch.
In case you want to preview the updates locally, simply use:
```bash
mkdocs serve
```
in the main directory, and open the browser as instructed.
8 changes: 8 additions & 0 deletions docs/electrical_design.md
@@ -0,0 +1,8 @@
# Electrical Design


## System description
The trunk is actuated by 6 CIM 12V motors, each with a Talon SRX controller and an encoder. The CIM motors are powered by a 12V, 100A power supply. A 20A circuit breaker is in series with the positive terminal of each motor controller to protect against current spikes. Low-level motor commands are handled by a Raspberry Pi 4, which has its own 5V power supply. CAN is the protocol used to communicate commands from the Raspberry Pi to the motor controllers, via a CANable 1.0 device. The gripper servo has its own 6V power supply. The grounds of all power supplies are connected to a common ground, which is connected to the frame.

## Circuit diagram
![Circuit Diagram](assets/circuitdiagram_081524.jpg)
19 changes: 19 additions & 0 deletions docs/index.md
@@ -0,0 +1,19 @@
# Welcome to the ASL Trunk robot documentation
This documentation provides a detailed guide to the setup, configuration, and use of the Trunk Robot. It is intended for internal use by the team involved in the development, deployment, and maintenance of the robot.

## Contents
- Design: Documentation for full-system software, electrical, and mechanical design. Includes BOM, CAD assets, circuit diagram, and design considerations.
- Robot Setup: Instructions for setting up the trunk hardware and software.
- Usage: Instructions for using an assembled trunk robot.
- Motion Capture: Instructions for setting up and using the motion capture system with the Trunk Robot.
- Motor Control: Instructions for enabling motor control of the Trunk Robot.
- Collecting Data: Instructions for collecting data using the Trunk Robot, including data collection scripts and procedures.
- Video Streaming: Instructions for using the video streaming capabilities of the Trunk Robot.
- Telemetry Viewer: Description on how to view and visualize telemetry data.
- Teleoperation: Instructions for teleoperating the robot with an Apple Vision Pro to collect data.
- Visuomotor Rollout: Instructions for rolling out a visuomotor policy on the Trunk Robot hardware.
- Contributing: Guidelines for contributing to the development of the Trunk Robot project.

This documentation is intended as a practical resource to support your work with the Trunk Robot, ensuring that all ASL members have access to the necessary information to effectively manage the system.

<!-- WIP! Some details on the design/features of the robot. BOM? CAD? Pictures? May need to be re-taken. -->
19 changes: 19 additions & 0 deletions docs/javascripts/mathjax.js
@@ -0,0 +1,19 @@
window.MathJax = {
tex: {
inlineMath: [["\\(", "\\)"]],
displayMath: [["\\[", "\\]"]],
processEscapes: true,
processEnvironments: true
},
options: {
ignoreHtmlClass: ".*|",
processHtmlClass: "arithmatex"
}
};

document$.subscribe(() => {
MathJax.startup.output.clearCache()
MathJax.typesetClear()
MathJax.texReset()
MathJax.typesetPromise()
})
26 changes: 26 additions & 0 deletions docs/mechanical_design.md
@@ -0,0 +1,26 @@
# Mechanical Design
The ASL Trunk robot is a low-cost, highly customizable, open-source desktop soft robot hardware platform. The trunk is powered by 6 motors, which control 12 tendons that terminate at 3 disks along the length of the robot. Custom pretensioning mechanisms keep the antagonistic tendons in tension, and the actuation unit routes the tendons into the trunk.

[//]: # (TODO: self-link on this page and add citation, add BOM)

## Full BOM
The full working bill of materials is available [here](https://docs.google.com/spreadsheets/d/1P72TMokWnYh4jPumQLBwXQ3-0UVnGI-0Cx1MCqVu7Cc/edit?usp=sharing).

## Full CAD
<iframe src="https://stanford2289.autodesk360.com/shares/public/SH30dd5QT870c25f12fcd9883a938c317f7f?mode=embed" width="1024" height="768" allowfullscreen="true" webkitallowfullscreen="true" mozallowfullscreen="true" frameborder="0"></iframe>

## Trunk
The flexible body of the trunk is a standard vacuum hose, cut to length for our application. 3 custom 3D-printed disks, each with 12 radially symmetric channels, divide the trunk into 3 equal-length segments. At each disk, 4 tendons terminate. Each disk also has a unique arrangement of motion capture markers, so OptiTrack Motive can easily distinguish them from each other for pose estimation. A custom parallel jaw gripper, adapted from [this design](https://www.youtube.com/watch?v=Qfd0ikdnAsg), is mounted on the end effector, driven by a small servo housed within the trunk body. The jaws of the gripper are easily swappable for different applications, including carrying large amounts of weight (up to 600 g).
<iframe src="https://stanford2289.autodesk360.com/shares/public/SH30dd5QT870c25f12fcc3451d73cc86f268?mode=embed" width="1024" height="768" allowfullscreen="true" webkitallowfullscreen="true" mozallowfullscreen="true" frameborder="0"></iframe>

## Actuation Unit
The actuation unit routes 12 tendons from their respective pretensioning mechanisms to the trunk. The main structure is a custom 3D-printed mount, which connects to the frame. 6 entry holes with 6 corresponding shafts hold 12 small pulleys, which route the tendons with minimal friction and no overlap. The bottom of the actuation unit has a snap-fit attachment for the top of the trunk.
<iframe src="https://stanford2289.autodesk360.com/shares/public/SH30dd5QT870c25f12fc22629b72247d4398?mode=embed" width="1024" height="768" allowfullscreen="true" webkitallowfullscreen="true" mozallowfullscreen="true" frameborder="0"></iframe>

## Pretensioning Mechanism (PTM)
The pretensioning mechanism is heavily inspired by [Yeshmukhametov et al., 2019](https://doi.org/10.3390/robotics8030051). A pretensioning mechanism is necessary to drive two antagonistic cables with the same motor, such that when one is pulled by the motor, the other does not go slack. Our design consists of a "sled" that passively tensions a tendon using two compression springs in series on the lower linear rail.
<iframe src="https://stanford2289.autodesk360.com/shares/public/SH30dd5QT870c25f12fc3f4bc1ea84dd9bed?mode=embed" width="800" height="600" allowfullscreen="true" webkitallowfullscreen="true" mozallowfullscreen="true" frameborder="0"></iframe>

## Motor Assemblies
Each motor assembly is centered around a CIM 12V motor. We use the CIM12V mount along with a custom 3D-printed mount to securely attach the motor and Talon SRX controller to the frame. A custom 3D-printed pulley is connected to the motor shaft using a shaft key, and the tendons are secured to the top of the pulley. The Talon encoder is mounted to the frame using a custom 3D-printed mount.
<iframe src="https://stanford2289.autodesk360.com/shares/public/SH30dd5QT870c25f12fca5cfc22cb92d7282?mode=embed" width="1024" height="768" allowfullscreen="true" webkitallowfullscreen="true" mozallowfullscreen="true" frameborder="0"></iframe>
28 changes: 28 additions & 0 deletions docs/mocap.md
@@ -0,0 +1,28 @@
# Motion Capture System


## Usage
First, make sure the robot is turned on.
The motion capture cameras should show numbers 1-4.
The Windows laptop must be connected to the OptiHub via USB and running the [Motive 2](https://docs.optitrack.com/v/v2.3) software.
Then, on the remote computer run the following command:
```bash
cd mocap_ws
source install/setup.bash
ros2 launch mocap4r2_optitrack_driver optitrack2.launch.py
```
and in a new terminal run:
```bash
cd mocap_ws
source install/setup.bash
ros2 lifecycle set /mocap4r2_optitrack_driver_node activate
ros2 run converter converter_node
```
You can choose whether to use *markers* or *rigid bodies* by changing the `type` parameter, i.e.
```bash
ros2 run converter converter_node --ros-args -p type:='markers' # or 'rigid_bodies' (default)
```

## Troubleshooting
In the Motive 2 software, make sure that the rigid bodies are correctly set up.
There should be a rigid body for each segment of the robot, and the markers should be correctly assigned to the rigid bodies (5 or 6 markers per rigid body).
47 changes: 47 additions & 0 deletions docs/motor_control.md
@@ -0,0 +1,47 @@
# Motor Control
The motor controllers are connected to the Raspberry Pi 4 (4GB RAM), which has Ubuntu 20.04 installed. The motor controllers are controlled using the [ROS2 Foxy](https://docs.ros.org/en/foxy/index.html) framework (note: Foxy, not the Humble distro!), which is already installed on the Pi.

## Usage
In the first terminal, run:
```bash
cd Phoenix-Linux-SocketCAN-Example
sudo ./canableStart.sh # start the CANable interface
cd ../motor_control_ws
source install/setup.bash
ros2 launch ros_phoenix trunk.launch.py
```
In the second terminal, run:
```bash
cd motor_control_ws
source install/setup.bash
ros2 run converter converter_node # optionally add --ros-args -p debug:=true
```

## Motor control modes
The motor control modes are as follows:

| Mode | Value |
|---------------------------|-------|
| `PERCENT_OUTPUT` | 0 |
| `POSITION` | 1 |
| `VELOCITY` | 2 |
| `CURRENT` | 3 |
| `FOLLOWER` | 5 |
| `MOTION_PROFILE` | 6 |
| `MOTION_MAGIC` | 7 |
| `MOTION_PROFILE_ARC` | 10 |

To run a command once to test the motor control, use:

```bash
ros2 topic pub --once /all_motors_control interfaces/msg/AllMotorsControl "{motors_control: [{mode: 0, value: 0.25},{mode: 0, value: 0},{mode: 0, value: 0},{mode: 0, value: 0},{mode: 0, value: 0},{mode: 0, value: 0}]}"
```

## Motor control limits
The motor control limits are empirically established as follows:

$$
\operatorname{norm}\left(0.75\left(\vec{u}_3+\vec{u}_4\right)+1.0\left(\vec{u}_2+\vec{u}_5\right)+1.25\left(\vec{u}_1+\vec{u}_6\right)\right) \leq 0.6
$$

Going beyond this limit can result in the robot going outside of the workspace, which can be dangerous.
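As a rough pre-flight check, the limit can be evaluated before publishing a command. The sketch below is our own simplification: it treats each $u_i$ as a scalar percent-output command and interprets the norm as an absolute value (the real $\vec{u}_i$ may be vector quantities):

```shell
# Check the empirical motor limit for scalar commands u1..u6
# (simplified: norm taken as absolute value of the weighted sum).
check_limit() {
  awk -v u1="$1" -v u2="$2" -v u3="$3" -v u4="$4" -v u5="$5" -v u6="$6" \
    'BEGIN {
       s = 0.75*(u3 + u4) + 1.0*(u2 + u5) + 1.25*(u1 + u6)
       if (s < 0) s = -s
       if (s <= 0.6) print "ok"; else print "limit exceeded"
     }'
}
check_limit 0.25 0 0 0 0 0      # prints ok (weighted sum 0.3125)
check_limit 0.30 0 0 0 0 0.30   # prints limit exceeded (0.75)
```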
10 changes: 10 additions & 0 deletions docs/optitrack.md
@@ -0,0 +1,10 @@
# OptiTrack System

## System Overview
What's needed to get the OptiTrack system up and running?

## Camera Setup

## Motive 2.0

### Calibration
16 changes: 16 additions & 0 deletions docs/software_design.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,16 @@
# Software Design

## ROS Graph

## Teleoperation

### Overview
The trunk robot is teleoperated by a user wearing an Apple Vision Pro. We designed an augmented reality app, written in Swift, that initializes a virtual 3D, 3-link spherical pendulum overlaid on the user's real-world view. Once the virtual trunk is initialized, the user can calibrate its position and orientation to the hardware system. After calibration, the user can look at one of the disks on the trunk, which lights up to denote its selection. The user can pinch their thumb and forefinger to select the disk, after which the position of the virtual disk mirrors the position of their hand. The virtual disk positions can optionally be streamed over WiFi to a ROS listener, which publishes the 3D positions of the 3 disks on the trunk to the desired-positions topic. A controller node subscribes to this topic and calculates the motor outputs necessary to attain that pose. The updated motor outputs are published to the motors, which causes the hardware trunk to mirror the virtual trunk. Streaming of desired trunk positions is done at 10 Hz, and all of the other ROS functions run at 100 Hz.

### Swift App Design
The Apple Vision Pro teleoperation app was written in Swift 5 using the Xcode 16 beta for visionOS 2.0 beta. Beta versions of Xcode and visionOS were used because some functionality necessary for our app was only available in beta versions.

Our 3D assets were programmatically generated with standard hierarchical RealityKit Entities. The entities are placed into a MeshResource.Skeleton, upon which a custom IKComponent is added. A corresponding IKSolver smoothly solves the inverse kinematics of the 3 spherical pendulum joints when the position of the end effector is commanded with a gesture. The disk selection gestures are created with DragGestures. The streaming functionality for our app was heavily inspired by [VisionProTeleop](https://github.com/Improbable-AI/VisionProTeleop), using GRPC to stream disk positions over WiFi.

Source code for the app can be found in this [GitHub repository](https://github.com/StanfordASL/trunk-teleop).

3 changes: 3 additions & 0 deletions docs/stylesheets/extra.css
@@ -0,0 +1,3 @@
:root {
--md-primary-fg-color: #3066ac;
}
19 changes: 19 additions & 0 deletions docs/telemetry_viewer.md
@@ -0,0 +1,19 @@
# Telemetry Viewer

For visualizing telemetry data, such as the motor output currents, motor controller temperatures, and webcam stream, we use the free [Foxglove](https://docs.foxglove.dev/docs/connecting-to-data/frameworks/ros2/) tool.

## Usage
You can run the Foxglove server/ROS2 node by running the following command in any ROS2 workspace:
```bash
ros2 launch foxglove_bridge foxglove_bridge.launch.xml
```
Note that Foxglove is installed for a particular ROS2 distribution, but it can be installed for any distribution; see below.
Then, just open the Foxglove web interface via their website and connect.
You will be able to visualize almost any data type.
Finally, particular settings, such as topics to listen to, are stored and can be found [in the repo](https://github.com/hbuurmei/asl_trunk/tree/main/asl_trunk/asl_trunk_ws/foxglove).

## Installing Foxglove
The only thing to do to run Foxglove is to install the WebSocket server. This can be done by running the following command:
```bash
sudo apt install ros-$ROS_DISTRO-foxglove-bridge
```
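The package name follows the usual `ros-<distro>-<package>` apt convention, so the same command works on any ROS2 installation once the right distro is sourced. For example (Foxy here is just an illustration):

```shell
# The apt package name is derived from the sourced ROS 2 distro.
ROS_DISTRO=foxy   # normally set when sourcing /opt/ros/<distro>/setup.bash
echo "ros-${ROS_DISTRO}-foxglove-bridge"   # prints ros-foxy-foxglove-bridge
```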