Commit

update
jamjamjon committed Jul 24, 2024
1 parent 0901ab3 commit 4d8dc29
Showing 10 changed files with 220 additions and 304 deletions.
47 changes: 0 additions & 47 deletions CHANGELOG.md

This file was deleted.

6 changes: 3 additions & 3 deletions Cargo.toml
@@ -1,13 +1,13 @@
[package]
name = "usls"
version = "0.0.6"
version = "0.0.7"
edition = "2021"
description = "A Rust library integrated with ONNXRuntime, providing a collection of ML models."
repository = "https://github.com/jamjamjon/usls"
authors = ["Jamjamjon <[email protected]>"]
license = "MIT"
readme = "README.md"
exclude = ["assets/*", "examples/*"]
exclude = ["assets/*", "examples/*", "scripts/*", "runs/*"]

[dependencies]
clap = { version = "4.2.4", features = ["derive"] }
@@ -44,4 +44,4 @@ ab_glyph = "0.2.23"
geo = "0.28.0"
prost = "0.12.4"
human_bytes = "0.4.3"
fast_image_resize = { version = "4.0.0", git = "https://github.com/jamjamjon/fast_image_resize", branch = "dev" , features = ["image"]}
fast_image_resize = { version = "4.2.0", features = ["image"]}
91 changes: 1 addition & 90 deletions README.md
@@ -2,7 +2,7 @@

[![Static Badge](https://img.shields.io/crates/v/usls.svg?style=for-the-badge&logo=rust)](https://crates.io/crates/usls) ![Static Badge](https://img.shields.io/crates/d/usls?style=for-the-badge) [![Static Badge](https://img.shields.io/badge/Documents-usls-blue?style=for-the-badge&logo=docs.rs)](https://docs.rs/usls) [![Static Badge](https://img.shields.io/badge/GitHub-black?style=for-the-badge&logo=github)](https://github.com/jamjamjon/usls)

A Rust library integrated with **ONNXRuntime**, providing a collection of **Computer Vision** and **Vision-Language** models including [YOLOv5](https://github.com/ultralytics/yolov5), [YOLOv8](https://github.com/ultralytics/ultralytics), [YOLOv9](https://github.com/WongKinYiu/yolov9), [YOLOv10](https://github.com/THU-MIG/yolov10), [RTDETR](https://arxiv.org/abs/2304.08069), [CLIP](https://github.com/openai/CLIP), [DINOv2](https://github.com/facebookresearch/dinov2), [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM), [YOLO-World](https://github.com/AILab-CVC/YOLO-World), [BLIP](https://arxiv.org/abs/2201.12086), [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR), [Depth-Anything](https://github.com/LiheYoung/Depth-Anything), [MODNet](https://github.com/ZHKKKe/MODNet) and others.
A Rust library integrated with **ONNXRuntime**, providing a collection of **Computer Vision** and **Vision-Language** models including [YOLOv5](https://github.com/ultralytics/yolov5), [YOLOv6](https://github.com/meituan/YOLOv6), [YOLOv7](https://github.com/WongKinYiu/yolov7), [YOLOv8](https://github.com/ultralytics/ultralytics), [YOLOv9](https://github.com/WongKinYiu/yolov9), [YOLOv10](https://github.com/THU-MIG/yolov10), [RTDETR](https://arxiv.org/abs/2304.08069), [CLIP](https://github.com/openai/CLIP), [DINOv2](https://github.com/facebookresearch/dinov2), [FastSAM](https://github.com/CASIA-IVA-Lab/FastSAM), [YOLO-World](https://github.com/AILab-CVC/YOLO-World), [BLIP](https://arxiv.org/abs/2201.12086), [PaddleOCR](https://github.com/PaddlePaddle/PaddleOCR), [Depth-Anything](https://github.com/LiheYoung/Depth-Anything), [MODNet](https://github.com/ZHKKKe/MODNet) and others.

| Monocular Depth Estimation |
| :--------------------------------------------------------------: |
@@ -75,92 +75,3 @@ Or you can use specific commit
```Shell
usls = { git = "https://github.com/jamjamjon/usls", rev = "???sha???"}
```
### 2. Build model
```Rust
let options = Options::default()
.with_yolo_version(YOLOVersion::V5) // YOLOVersion: V5, V6, V7, V8, V9, V10, RTDETR
.with_yolo_task(YOLOTask::Classify) // YOLOTask: Classify, Detect, Pose, Segment, Obb
.with_model("xxxx.onnx")?;
let mut model = YOLO::new(options)?;
```
- If you want to run your model with TensorRT or CoreML
```Rust
let options = Options::default()
    .with_trt(0); // TensorRT on device 0 (CUDA is used by default)
    // .with_coreml(0) // or use CoreML instead
```
- If your model has dynamic shapes
```Rust
let options = Options::default()
.with_i00((1, 2, 4).into()) // dynamic batch
.with_i02((416, 640, 800).into()) // dynamic height
    .with_i03((416, 640, 800).into()); // dynamic width
```
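The `(min, opt, max)` tuples above presumably convert via `.into()` into a small min/opt/max dimension descriptor. A minimal self-contained sketch of that pattern (the struct name and fields are illustrative, not the actual usls type):

```rust
// Hypothetical stand-in for a dynamic-dimension descriptor, mirroring the
// `.with_i00((1, 2, 4).into())` calls above. Not the real usls type.
#[derive(Debug, Clone, Copy, PartialEq, Eq)]
struct MinOptMax {
    min: isize, // smallest value the dimension may take
    opt: isize, // value the engine is optimized for
    max: isize, // largest value the dimension may take
}

impl From<(isize, isize, isize)> for MinOptMax {
    fn from((min, opt, max): (isize, isize, isize)) -> Self {
        Self { min, opt, max }
    }
}

fn main() {
    // Dynamic batch: at least 1, optimized for 2, at most 4.
    let batch: MinOptMax = (1, 2, 4).into();
    assert_eq!(batch, MinOptMax { min: 1, opt: 2, max: 4 });
}
```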
- If you want to set a confidence for each category
```Rust
let options = Options::default()
    .with_confs(&[0.4, 0.15]); // class_0: 0.4, others: 0.15
```
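Per the comment above, the confidence slice is indexed by class id, with the last entry acting as the fallback for all remaining classes. An illustrative lookup with those semantics (not usls internals):

```rust
// Resolve a per-class confidence threshold from a table like `&[0.4, 0.15]`:
// class 0 -> 0.4, class 1 -> 0.15, and every later class falls back to the
// last entry (0.15). Sketch only; the real logic lives inside usls.
fn conf_for_class(confs: &[f32], class_id: usize) -> f32 {
    *confs
        .get(class_id)
        .unwrap_or_else(|| confs.last().expect("confs must be non-empty"))
}

fn main() {
    let confs = [0.4, 0.15];
    assert_eq!(conf_for_class(&confs, 0), 0.4); // class_0
    assert_eq!(conf_for_class(&confs, 7), 0.15); // any other class
}
```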
- See [Options](src/core/options.rs) for more model options.
### 3. Load images
- Build `DataLoader` to load images
```Rust
let dl = DataLoader::default()
.with_batch(model.batch.opt as usize)
.load("./assets/")?;

for (xs, _paths) in dl {
let _y = model.run(&xs)?;
}
```
- Or simply read one image
```Rust
let x = vec![DataLoader::try_read("./assets/bus.jpg")?];
let y = model.run(&x)?;
```
### 4. Annotate and save
```Rust
let annotator = Annotator::default().with_saveout("YOLO");
annotator.annotate(&x, &y);
```
### 5. Get results
The inference outputs of the provided models are returned as a `Vec<Y>`.
- You can get detection bboxes with `y.bboxes()`:
```Rust
let ys = model.run(&xs)?;
for y in ys {
// bboxes
if let Some(bboxes) = y.bboxes() {
for bbox in bboxes {
println!(
"Bbox: {}, {}, {}, {}, {}, {}",
bbox.xmin(),
bbox.ymin(),
bbox.xmax(),
bbox.ymax(),
bbox.confidence(),
bbox.id(),
)
}
}
}
```
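A common follow-up is filtering boxes by score before drawing or counting them. The sketch below uses a minimal stand-in struct exposing the same accessors as the snippet above (`confidence()`, `id()`); the struct itself is illustrative, not the usls type:

```rust
// Minimal stand-in for a detection box, with the accessors used in the
// snippet above. Illustrative only; the real type is defined in usls.
struct Bbox {
    xmin: f32,
    ymin: f32,
    xmax: f32,
    ymax: f32,
    confidence: f32,
    id: isize,
}

impl Bbox {
    fn confidence(&self) -> f32 {
        self.confidence
    }
    fn id(&self) -> isize {
        self.id
    }
}

// Keep only boxes whose score reaches the threshold.
fn keep_confident(bboxes: Vec<Bbox>, min_conf: f32) -> Vec<Bbox> {
    bboxes
        .into_iter()
        .filter(|b| b.confidence() >= min_conf)
        .collect()
}

fn main() {
    let boxes = vec![
        Bbox { xmin: 0.0, ymin: 0.0, xmax: 10.0, ymax: 10.0, confidence: 0.9, id: 0 },
        Bbox { xmin: 5.0, ymin: 5.0, xmax: 8.0, ymax: 9.0, confidence: 0.2, id: 1 },
    ];
    let kept = keep_confident(boxes, 0.5);
    assert_eq!(kept.len(), 1);
    assert_eq!(kept[0].id(), 0);
}
```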
- Other: [Docs](https://docs.rs/usls/latest/usls/struct.Y.html)
5 changes: 5 additions & 0 deletions build.rs
@@ -0,0 +1,5 @@
fn main() {
// Need this for CoreML. See: https://ort.pyke.io/perf/execution-providers#coreml
#[cfg(target_os = "macos")]
println!("cargo:rustc-link-arg=-fapple-link-rtlib");
}
30 changes: 15 additions & 15 deletions examples/yolo/README.md
@@ -25,29 +25,29 @@
```Shell

# Classify
cargo run -r --example yolo -- --task classify --version v5 # YOLOv5
cargo run -r --example yolo -- --task classify --version v8 # YOLOv8
cargo run -r --example yolo -- --task classify --ver v5 # YOLOv5
cargo run -r --example yolo -- --task classify --ver v8 # YOLOv8

# Detect
cargo run -r --example yolo -- --task detect --version v5 # YOLOv5
cargo run -r --example yolo -- --task detect --version v6 # YOLOv6
cargo run -r --example yolo -- --task detect --version v7 # YOLOv7
cargo run -r --example yolo -- --task detect --version v8 # YOLOv8
cargo run -r --example yolo -- --task detect --version v9 # YOLOv9
cargo run -r --example yolo -- --task detect --version v10 # YOLOv10
cargo run -r --example yolo -- --task detect --version rtdetr # YOLOv8-RTDETR
cargo run -r --example yolo -- --task detect --version v8 --model yolov8s-world-v2-shoes.onnx # YOLOv8-world
cargo run -r --example yolo -- --task detect --ver v5 # YOLOv5
cargo run -r --example yolo -- --task detect --ver v6 # YOLOv6
cargo run -r --example yolo -- --task detect --ver v7 # YOLOv7
cargo run -r --example yolo -- --task detect --ver v8 # YOLOv8
cargo run -r --example yolo -- --task detect --ver v9 # YOLOv9
cargo run -r --example yolo -- --task detect --ver v10 # YOLOv10
cargo run -r --example yolo -- --task detect --ver rtdetr # YOLOv8-RTDETR
cargo run -r --example yolo -- --task detect --ver v8 --model yolov8s-world-v2-shoes.onnx # YOLOv8-world

# Pose
cargo run -r --example yolo -- --task pose --version v8 # YOLOv8-Pose
cargo run -r --example yolo -- --task pose --ver v8 # YOLOv8-Pose

# Segment
cargo run -r --example yolo -- --task segment --version v5 # YOLOv5-Segment
cargo run -r --example yolo -- --task segment --version v8 # YOLOv8-Segment
cargo run -r --example yolo -- --task segment --version v8 --model FastSAM-s-dyn-f16.onnx # FastSAM
cargo run -r --example yolo -- --task segment --ver v5 # YOLOv5-Segment
cargo run -r --example yolo -- --task segment --ver v8 # YOLOv8-Segment
cargo run -r --example yolo -- --task segment --ver v8 --model FastSAM-s-dyn-f16.onnx # FastSAM

# Obb
cargo run -r --example yolo -- --task obb --version v8 # YOLOv8-Obb
cargo run -r --example yolo -- --task obb --ver v8 # YOLOv8-Obb
```

<details close>
18 changes: 9 additions & 9 deletions examples/yolo/main.rs
@@ -16,7 +16,7 @@ pub struct Args {
pub task: YOLOTask,

#[arg(long, value_enum, default_value_t = YOLOVersion::V8)]
pub version: YOLOVersion,
pub ver: YOLOVersion,

#[arg(long, default_value_t = 224)]
pub width_min: isize,
@@ -69,7 +69,7 @@ fn main() -> Result<()> {

// version & task
let options =
match args.version {
match args.ver {
YOLOVersion::V5 => {
match args.task {
YOLOTask::Classify => options
@@ -79,20 +79,20 @@ fn main() -> Result<()> {
}
YOLOTask::Segment => options
.with_model(&args.model.unwrap_or("yolov5n-seg-dyn.onnx".to_string()))?,
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
}
}
YOLOVersion::V6 => match args.task {
YOLOTask::Detect => options
.with_model(&args.model.unwrap_or("yolov6n-dyn.onnx".to_string()))?
.with_nc(args.nc),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
},
YOLOVersion::V7 => match args.task {
YOLOTask::Detect => options
.with_model(&args.model.unwrap_or("yolov7-tiny-dyn.onnx".to_string()))?
.with_nc(args.nc),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
},
YOLOVersion::V8 => {
match args.task {
@@ -112,22 +112,22 @@ fn main() -> Result<()> {
YOLOVersion::V9 => match args.task {
YOLOTask::Detect => options
.with_model(&args.model.unwrap_or("yolov9-c-dyn-f16.onnx".to_string()))?,
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
},
YOLOVersion::V10 => match args.task {
YOLOTask::Detect => {
options.with_model(&args.model.unwrap_or("yolov10n.onnx".to_string()))?
}
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
},
YOLOVersion::RTDETR => match args.task {
YOLOTask::Detect => {
options.with_model(&args.model.unwrap_or("rtdetr-l-f16.onnx".to_string()))?
}
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.version),
t => anyhow::bail!("Task: {t:?} is unsupported for {:?}", args.ver),
},
}
.with_yolo_version(args.version)
.with_yolo_version(args.ver)
.with_yolo_task(args.task);

// device
4 changes: 1 addition & 3 deletions src/core/engine.rs
@@ -41,7 +41,7 @@ impl OrtEngine {
let model_proto = Self::load_onnx(&config.onnx_path)?;
let graph = match &model_proto.graph {
Some(graph) => graph,
None => anyhow::bail!("No graph found in this proto"),
None => anyhow::bail!("Failed to parse ONNX model: no graph found in this proto"),
};

// model params & mems
@@ -499,14 +499,12 @@ impl OrtEngine {
let tensor_type = match Self::ort_dtype_from_onnx_dtype_id(tensor_type) {
Some(dtype) => dtype,
None => continue,
// None => anyhow::bail!("DType not supported"),
};
dtypes.push(tensor_type);

let shapes = match &tensor.shape {
Some(shapes) => shapes,
None => continue,
// None => anyhow::bail!("DType has no shapes"),
};
let mut shape_: Vec<isize> = Vec::new();
for shape in shapes.dim.iter() {
